pihnn.nn.PIHNN#

class pihnn.nn.PIHNN(PDE, units, material={'lambda': 1, 'mu': 1}, activation=torch.exp, has_bias=True)#

Bases: torch.nn.Module

Main class implementing physics-informed holomorphic neural networks (PIHNNs).

PIHNNs are able to solve 4 types of problems, where \(\varphi,\psi\) denote the holomorphic output(s) of the network:

  • 2D Laplace problem (‘laplace’):
    \[\nabla^2u=0 \Leftrightarrow u=\text{Re}(\varphi).\]
  • 2D biharmonic problem with Goursat representation (‘biharmonic’):
    \[\nabla^4u=0 \Leftrightarrow u=\text{Re}((x-iy)\varphi + \psi).\]
  • 2D linear elasticity with Kolosov-Muskhelishvili representation (‘km’):

    \(\sigma_{xx},\sigma_{yy},\sigma_{xy},u_x,u_y\) solve the 2D linear elasticity problem \(\Leftrightarrow\)

\[\begin{split}\begin{cases} \sigma_{xx} + \sigma_{yy} = 4 \text{Re}(\varphi'), \\ \sigma_{yy} - \sigma_{xx} + 2i\sigma_{xy} = 2(\overline{z}\varphi''+\psi'), \\ 2\mu(u_x + iu_y) = \gamma \varphi - z \overline{\varphi'} - \overline{\psi}, \end{cases}\end{split}\]

    where \(\mu\) is the shear modulus and \(\gamma\) is the Kolosov constant.

  • 2D linear elasticity with Kolosov-Muskhelishvili representation, stress-only (‘km-so’):

    \(\sigma_{xx},\sigma_{yy},\sigma_{xy}\) solve the 2D linear elasticity problem \(\Leftrightarrow\)

\[\begin{split}\begin{cases} \sigma_{xx} + \sigma_{yy} = 4 \text{Re}(\varphi), \\ \sigma_{yy} - \sigma_{xx} + 2i\sigma_{xy} = 2(\overline{z}\varphi'+\psi). \end{cases}\end{split}\]

For the Laplace problem, the output of the network is therefore the single scalar function \(\varphi_{NN}\approx\varphi\). For the other problems, the PIHNN consists of 2 stacked networks \(\varphi_{NN},\psi_{NN}\approx\varphi,\psi\).

Parameters:
  • PDE (str) – Problem to solve, either ‘laplace’, ‘biharmonic’, ‘km’ or ‘km-so’.

  • units (list of int) – List containing number of units at each layer, e.g., [1,10,10,1].

  • material (dict) – Properties of the material, dictionary with ‘lambda’ (first Lamé coefficient), ‘mu’ (second Lamé coefficient).

  • activation (callable) – Activation function, by default the complex exponential.

  • has_bias (bool) – True if the linear layers include bias vectors.
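
A minimal construction sketch based on the signature above (the argument values are the documented defaults except for PDE and units; the import alias is illustrative):

    import torch
    import pihnn.nn as nn

    # PIHNN for 2D linear elasticity (Kolosov-Muskhelishvili representation),
    # i.e. two stacked networks, each with layer sizes [1, 10, 10, 1].
    model = nn.PIHNN(
        PDE='km',
        units=[1, 10, 10, 1],
        material={'lambda': 1, 'mu': 1},  # first and second Lamé coefficients
        activation=torch.exp,             # complex exponential (default)
        has_bias=True,
    )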

forward(z, real_output=False)#

Forward step, i.e., compute for \(j=1,2\):

\[\mathcal{L}_{N,j} \circ \phi \circ \mathcal{L}_{N-1,j} \circ \phi \dots \circ \mathcal{L}_{1,j} (z)\]

where \(z\) is the input, \(\phi\) the activation function and \(\{\mathcal{L}_{i,j}\}\) the complex linear layers (pihnn.nn.ComplexLinear) for each layer \(i\) and stacked network \(j\).

Parameters:
  • z (torch.tensor) – Input of the network, typically a batch of coordinates from the domain boundary.

  • real_output (bool) – Whether to provide the output in the real-valued representation.

Returns:

phi (torch.tensor) - Output of the network. As mentioned above, it has the same shape as the input for the Laplace problem and twice the size for the other problems.
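
A sketch of a forward evaluation, continuing the construction example above and assuming z is a 1D batch of complex boundary coordinates:

    z = torch.tensor([0.5 + 0.5j, 1.0 + 0.2j, -0.3 + 0.8j], dtype=torch.complex64)
    phi = model(z)                     # holomorphic output(s): same shape as z for 'laplace', doubled otherwise
    vars = model(z, real_output=True)  # real-valued representation (see apply_real_transformation below)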

initialize_weights(method, beta=0.5, sample=None, gauss=None)#

Initialization of PIHNNs. Implemented methods:

  • Complex-valued He initialization (Trabelsi [2018]):

    \(\text{Re}(w),\text{Im}(w)\sim \mathcal{N}\left(0,\frac{1}{2 \texttt{in_features}}\right), \hspace{3mm} bias=0\).

  • Scaled complex-valued He initialization:

\(\text{Re}(w),\text{Im}(w)\sim \mathcal{N}\left(0,\frac{\beta}{2 \texttt{in_features}}\right), \hspace{3mm} bias=0\).

  • PIHNNs ad-hoc initialization with exponential activations:

    See Calafà et al. [2024], Section 3.2.4.

Parameters:
  • method (str) – Either ‘he’, ‘he_scaled’, or ‘exp’; see description above.

  • beta (float) – Scaling coefficient in the scaled He initialization and \(\beta\) coefficient in the Calafà initialization; not used in the standard He initialization.

  • sample (torch.tensor) – Initial sample \(x_0\) in the Calafà initialization, not used in the other methods.

  • gauss (int) – \(M_e\) coefficient in the Calafà initialization, not used in the other methods.
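
Illustrative calls, continuing the example above; the value of gauss is arbitrary here, and sample and gauss are only needed by the ‘exp’ method:

    # Ad-hoc initialization for exponential activations (Calafà et al. [2024], Section 3.2.4)
    z0 = torch.tensor([0.5 + 0.5j, 1.0 + 0.2j], dtype=torch.complex64)
    model.initialize_weights('exp', beta=0.5, sample=z0, gauss=3)

    # Complex-valued He initialization, plain and scaled
    model.initialize_weights('he')
    model.initialize_weights('he_scaled', beta=0.5)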

apply_real_transformation(z, phi)#

Based on the type of PDE, this method returns the real-valued output from the holomorphic potentials. We refer to the documentation of pihnn.nn.PIHNN for a review of the 4 types of problems and their associated representations.

For PDE = ‘laplace’ and ‘biharmonic’, \(u\) is evaluated at \(z\). For PDE = ‘km’ and ‘km-so’, \(\sigma_{xx},\sigma_{yy},\sigma_{xy},u_x,u_y\) are stacked in a single tensor. Finally, in ‘km-so’, \(u_x,u_y\) are identically zero.

Parameters:
  • z (torch.tensor) – Input of the model, typically a batch of coordinates from the domain boundary.

  • phi (torch.tensor) – Complex-valued output of the network.

Returns:

vars (torch.tensor) - Tensor containing the real-valued variable(s) evaluated at \(z\).
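
A usage sketch, continuing the example above; the final shortcut is presumed equivalent based on the real_output flag documented in forward():

    phi = model(z)                                  # complex output(s) of the stacked networks
    vars = model.apply_real_transformation(z, phi)  # for 'km': sigma_xx, sigma_yy, sigma_xy, u_x, u_y
    # presumably equivalent to: model(z, real_output=True)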