easygraph.nn.convs.hypergraphs.hypergcn_conv module#

class easygraph.nn.convs.hypergraphs.hypergcn_conv.HyperGCNConv(*args: Any, **kwargs: Any)[source]#

Bases: Module

The HyperGCN convolution layer proposed in the paper HyperGCN: A New Method of Training Graph Convolutional Networks on Hypergraphs (NeurIPS 2019).

Parameters:
  • in_channels (int) – \(C_{in}\) is the number of input channels.

  • out_channels (int) – \(C_{out}\) is the number of output channels.

  • use_mediator (bool) – Whether to use a mediator when transforming hyperedges into graph edges. Defaults to False.

  • bias (bool) – If set to False, the layer will not learn the bias parameter. Defaults to True.

  • use_bn (bool) – If set to True, the layer will use batch normalization. Defaults to False.

  • drop_rate (float) – If set to a positive number, the layer will use dropout. Defaults to 0.5.

  • is_last (bool) – If set to True, the layer will not apply the final activation and dropout functions. Defaults to False.

forward(X: torch.Tensor, hg: Hypergraph, cached_g: Graph | None = None) torch.Tensor[source]#

The forward function.

Parameters:
  • X (torch.Tensor) – Input vertex feature matrix. Size \((N, C_{in})\).

  • hg (eg.Hypergraph) – The hypergraph structure that contains \(N\) vertices.

  • cached_g (eg.Graph) – The graph structure pre-transformed from the hypergraph, containing the same \(N\) vertices. If not provided, the transformation is re-computed on every forward pass. Defaults to None.
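The hyperedge-to-edge transformation mentioned above follows HyperGCN's idea of collapsing each hyperedge to the pair of member vertices whose features disagree the most. A self-contained sketch of that reduction in plain PyTorch (illustrative only, not the library's internal implementation):

```python
import torch

def hypergcn_edges(X, hyperedges):
    """Collapse each hyperedge to the pair of member vertices with the
    largest feature disagreement (HyperGCN's mediator-free reduction)."""
    edges = []
    for e in hyperedges:
        feats = X[torch.tensor(e)]                    # (|e|, C) member features
        diff = feats[:, None, :] - feats[None, :, :]  # pairwise differences
        d2 = (diff ** 2).sum(-1)                      # (|e|, |e|) squared distances
        i, j = divmod(int(d2.argmax()), len(e))       # most-disagreeing pair
        edges.append((e[i], e[j]))
    return edges

# Toy example: 4 vertices with 2-dim features, two hyperedges.
X = torch.tensor([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
E = [[0, 1, 2], [1, 2, 3]]
print(hypergcn_edges(X, E))  # → [(1, 2), (1, 3)]
```

Passing a pre-built cached_g avoids repeating this per-hyperedge reduction on every forward call, which is the point of the cached_g parameter.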