easygraph.nn.convs.hypergraphs.hgnnp_conv module#

class easygraph.nn.convs.hypergraphs.hgnnp_conv.HGNNPConv(*args: Any, **kwargs: Any)[source]#

Bases: Module

The HGNN+ convolution layer proposed in the paper HGNN+: General Hypergraph Neural Networks (IEEE T-PAMI 2022).

Sparse Format:

\[\begin{split}\left\{ \begin{aligned} m_{\beta}^{t} &=\sum_{\alpha \in \mathcal{N}_{v}(\beta)} M_{v}^{t}\left(x_{\alpha}^{t}\right) \\ y_{\beta}^{t} &=U_{e}^{t}\left(w_{\beta}, m_{\beta}^{t}\right) \\ m_{\alpha}^{t+1} &=\sum_{\beta \in \mathcal{N}_{e}(\alpha)} M_{e}^{t}\left(x_{\alpha}^{t}, y_{\beta}^{t}\right) \\ x_{\alpha}^{t+1} &=U_{v}^{t}\left(x_{\alpha}^{t}, m_{\alpha}^{t+1}\right) \\ \end{aligned} \right.\end{split}\]
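Read element-wise, these equations first let every hyperedge \(\beta\) aggregate messages from its incident vertices and then let every vertex \(\alpha\) aggregate the resulting hyperedge features. A minimal sketch of that two-stage scheme, assuming identity message/update functions, sum aggregation, and illustrative names (`two_stage_message_passing`, `e_list`, `w_e`):

```python
import torch

def two_stage_message_passing(X, e_list, w_e):
    """Sketch of the vertex -> hyperedge -> vertex scheme above, with identity
    message/update functions and sum aggregation (names are illustrative)."""
    # Stage 1: each hyperedge beta sums the features of its incident vertices,
    # scaled by its weight w_beta.
    Y = torch.stack([w * X[list(e)].sum(dim=0) for e, w in zip(e_list, w_e)])
    # Stage 2: each vertex alpha sums the features of the hyperedges it belongs to.
    out = torch.zeros_like(X)
    for beta, e in enumerate(e_list):
        for alpha in e:
            out[alpha] += Y[beta]
    return out
```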

Matrix Format:

\[\mathbf{X}^{\prime} = \sigma \left( \mathbf{D}_v^{-1} \mathbf{H} \mathbf{W}_e \mathbf{D}_e^{-1} \mathbf{H}^\top \mathbf{X} \mathbf{\Theta} \right).\]
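The following dense-tensor sketch reproduces this matrix form directly; it is not the layer's internal (sparse) implementation, and `hgnnp_propagate`, `w_e`, and `Theta` are illustrative names. Vertex degrees \(d(v)=\sum_e w(e)\,H(v,e)\) and hyperedge degrees \(d(e)=\sum_v H(v,e)\) are assumed:

```python
import torch

def hgnnp_propagate(X, H, w_e, Theta, sigma=torch.relu):
    """Dense sketch of X' = sigma(D_v^{-1} H W_e D_e^{-1} H^T X Theta)."""
    d_v = H @ w_e                      # (|V|,) weighted vertex degrees
    d_e = H.sum(dim=0)                 # (|E|,) hyperedge degrees
    Dv_inv = torch.diag(1.0 / d_v)     # D_v^{-1}
    De_inv = torch.diag(1.0 / d_e)     # D_e^{-1}
    W_e = torch.diag(w_e)              # diagonal hyperedge weight matrix
    return sigma(Dv_inv @ H @ W_e @ De_inv @ H.T @ X @ Theta)

# Toy incidence matrix: 4 vertices, 2 hyperedges e0={v0,v1,v2}, e1={v2,v3}.
H = torch.tensor([[1., 0.],
                  [1., 0.],
                  [1., 1.],
                  [0., 1.]])
w_e = torch.ones(2)
X = torch.rand(4, 16)
Theta = torch.randn(16, 8)
print(hgnnp_propagate(X, H, w_e, Theta).shape)   # torch.Size([4, 8])
```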
Parameters:
  • in_channels (int) – Number of input channels, \(C_{in}\).

  • out_channels (int) – Number of output channels, \(C_{out}\).

  • bias (bool) – If set to False, the layer will not learn the bias parameter. Defaults to True.

  • use_bn (bool) – If set to True, the layer will use batch normalization. Defaults to False.

  • drop_rate (float) – If set to a positive number, the layer will use dropout. Defaults to 0.5.

  • is_last (bool) – If set to True, the layer will not apply the final activation and dropout functions. Defaults to False.
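Using the parameters above, a layer stack might be configured as in the sketch below; the channel sizes (16 → 32 → 4) are illustrative, and only the final layer sets `is_last=True` so that the output skips the activation and dropout:

```python
import torch.nn as nn
from easygraph.nn.convs.hypergraphs.hgnnp_conv import HGNNPConv

# Two stacked HGNN+ convolutions; keyword arguments follow the parameter
# list documented above.
layers = nn.ModuleList([
    HGNNPConv(in_channels=16, out_channels=32, use_bn=True, drop_rate=0.5),
    HGNNPConv(in_channels=32, out_channels=4, is_last=True),
])
```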

forward(X: torch.Tensor, hg: Hypergraph) → torch.Tensor[source]#

The forward function.

Parameters:
  • X (torch.Tensor) – Input vertex feature matrix. Size \((|\mathcal{V}|, C_{in})\).

  • hg (eg.Hypergraph) – The hypergraph structure that contains \(|\mathcal{V}|\) vertices.
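A minimal end-to-end sketch of the forward call on a toy hypergraph; the `eg.Hypergraph(num_v, e_list)` constructor call is an assumption about the EasyGraph API and may need adapting to the installed version:

```python
import torch
import easygraph as eg
from easygraph.nn.convs.hypergraphs.hgnnp_conv import HGNNPConv

# Toy hypergraph with 5 vertices and 3 hyperedges (constructor signature
# assumed to be eg.Hypergraph(num_v, e_list)).
hg = eg.Hypergraph(5, [[0, 1, 2], [1, 3], [2, 3, 4]])

X = torch.rand(5, 16)                   # (|V|, C_in) vertex features
conv = HGNNPConv(in_channels=16, out_channels=8)
Y = conv(X, hg)                         # (|V|, C_out) updated vertex features
print(Y.shape)                          # torch.Size([5, 8])
```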