easygraph.model.hypergraphs.setgnn module#

class easygraph.model.hypergraphs.setgnn.SetGNN(*args: Any, **kwargs: Any)[source]#

Bases: Module

The SetGNN model proposed in the You are AllSet: A Multiset Function Framework for Hypergraph Neural Networks paper (ICLR 2022).

Parameters:
  • num_features (int) – The dimension of node features.

  • num_classes (int) – The number of classes in the classification task.

  • Classifier_hidden (int) – Decoder hidden units.

  • Classifier_num_layers (int) – Layers of decoder.

  • MLP_hidden (int) – Encoder hidden units.

  • MLP_num_layers (int) – Layers of encoder.

  • dropout (float, optional) – Dropout ratio. Defaults to 0.5.

  • aggregate (str) – The aggregation method. Defaults to add.

  • normalization (str) – The normalization method. Defaults to ln.

  • deepset_input_norm (bool) – Defaults to True.

  • heads (int) – Defaults to 1.

  • PMA (bool) – Defaults to True.

  • GPR (bool) – Defaults to False.

  • LearnMask (bool) – Defaults to False.

  • norm (Tensor) – The weights of the edges in the bipartite graph, corresponding to data.edge_index.

forward(data)[source]#

The input data should contain the following:

  • data.x: node features.

  • data.edge_index: edge list of size (2, |E|), where data.edge_index[0] contains node ids and data.edge_index[1] contains hyperedge ids. Note that each self-loop must be assigned a new (hyper)edge id, and that (hyper)edge ids start at 0 (like node ids).

  • data.norm: the weights of the edges in the bipartite graph, corresponding to data.edge_index.

Note that the model outputs the final node representations; the loss should be defined outside.
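As a sketch of the bipartite format described above, the following hypothetical helper (not part of EasyGraph) converts a hypergraph given as a list of hyperedges into the (2, |E|) edge_index layout, assigning hyperedge ids from 0 and giving each self-loop a fresh hyperedge id:

```python
def to_edge_index(hyperedges, num_nodes, self_loop=False):
    """Return edge_index as [nodes_row, hyperedges_row] of size (2, |E|).

    Hypothetical helper for illustration: hyperedge ids start at 0 (like
    node ids), and each self-loop is assigned a *new* hyperedge id rather
    than being appended to an existing hyperedge.
    """
    nodes, edges = [], []
    for e_id, members in enumerate(hyperedges):
        for v in members:
            nodes.append(v)
            edges.append(e_id)
    if self_loop:
        next_id = len(hyperedges)  # fresh ids for the self-loop hyperedges
        for v in range(num_nodes):
            nodes.append(v)
            edges.append(next_id + v)
    return [nodes, edges]

# Example: two hyperedges {0, 1, 2} and {2, 3} over four nodes.
edge_index = to_edge_index([[0, 1, 2], [2, 3]], num_nodes=4, self_loop=True)
# edge_index[0] lists node ids, edge_index[1] the hyperedge each entry joins.
```

In a real pipeline the two rows would typically be stacked into a tensor before being attached to data.edge_index.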

generate_edge_index(dataset, self_loop=False)[source]#
reset_parameters()[source]#