pihnn.nn.ComplexLinear#

class pihnn.nn.ComplexLinear(stack_features, in_features, out_features, n_domains=None, has_bias=True)#

Bases: torch.nn.Module

Extension of torch.nn.Linear to complex values.

A complex linear layer performs the operation

\[y = Wx + b,\]

where the input \(x\) is a complex vector of dimension in_features, the output \(y\) and the bias \(b\) are complex vectors of dimension out_features, and the complex-valued weight tensor \(W\) has shape (out_features, in_features), consistent with the matrix-vector product above.
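
For intuition, the same operation can be written out with plain PyTorch complex tensors. This is a minimal sketch of the formula above with arbitrary placeholder dimensions, not the library's internal implementation:

    import torch

    in_features, out_features = 3, 2

    # Complex-valued weight W and bias b, as in y = W x + b.
    W = torch.randn(out_features, in_features, dtype=torch.cfloat)
    b = torch.randn(out_features, dtype=torch.cfloat)
    x = torch.randn(in_features, dtype=torch.cfloat)  # complex input

    y = W @ x + b  # complex matrix-vector product plus complex bias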

Parameters:
  • in_features (int) – Number of input units at the current layer.

  • out_features (int) – Number of output units at the current layer.

  • has_bias (bool) – True if the current layer includes the bias term.

init(scaling)#

Re-initialization of weights and bias.

Initialization is defined as a scaled complex-valued He initialization (Trabelsi [2018]):

\[\begin{split}\text{Re}(w),\,\text{Im}(w) &\sim \mathcal{N}\left(0,\frac{\texttt{scaling}}{2\,\texttt{in\_features}}\right), \\ b &= 0.\end{split}\]

This allows us to easily include the initialization strategy from Calafà et al. [2024], Section 3.2.4.

Parameters:

scaling (float) – Scaling factor in the He initialization.
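
Written out, the rule amounts to drawing real and imaginary parts of each weight with standard deviation \(\sqrt{\texttt{scaling}/(2\,\texttt{in\_features})}\) (the second argument of \(\mathcal{N}(\cdot,\cdot)\) above being the variance) and zeroing the bias. The following is a minimal sketch of that formula, not the library's actual initialization code; the tensor shapes are illustrative assumptions:

    import math
    import torch

    def scaled_complex_he_init(in_features, out_features, scaling=1.0):
        # Re(w), Im(w) ~ N(0, scaling / (2 * in_features)), so the
        # per-component standard deviation is the square root of that variance.
        std = math.sqrt(scaling / (2 * in_features))
        weight = torch.complex(torch.randn(out_features, in_features) * std,
                               torch.randn(out_features, in_features) * std)
        bias = torch.zeros(out_features, dtype=torch.cfloat)  # bias = 0
        return weight, bias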

weight()#

Returns:

weight (torch.tensor) - The complex-valued weight tensor.

bias()#

Returns:

bias (torch.tensor) - The complex-valued bias vector.

forward(input)#

Forward step \(y=Wx+b\).

Parameters:

input (torch.tensor) – The input vector \(x\).

Returns:

output (torch.tensor) - The output vector \(y\).
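
As a usage sketch, the layer can be constructed and applied as below. The argument values and the input shape are assumptions for illustration; in particular, how stack_features and n_domains affect the expected input shape is not specified here, so treat this as a sketch rather than a verified call sequence:

    import torch
    import pihnn.nn

    # Hypothetical sizes; stack_features=1 is assumed for a single stacked network.
    layer = pihnn.nn.ComplexLinear(stack_features=1, in_features=4, out_features=2)
    layer.init(scaling=1.0)  # scaled complex He re-initialization

    x = torch.randn(4, dtype=torch.cfloat)  # complex input of dimension in_features
    y = layer(x)  # forward step y = W x + b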