easygraph.model.hypergraphs.setgnn module#
- class easygraph.model.hypergraphs.setgnn.SetGNN(*args: Any, **kwargs: Any)[source]#
Bases: Module
The SetGNN model proposed in the You Are AllSet: A Multiset Function Framework for Hypergraph Neural Networks paper (ICLR 2022).
- Parameters:
  - num_features (int) – The dimension of node features.
  - num_classes (int) – The number of classes in the classification task.
  - Classifier_hidden (int) – Decoder hidden units.
  - Classifier_num_layers (int) – Number of decoder layers.
  - MLP_hidden (int) – Encoder hidden units.
  - MLP_num_layers (int) – Number of encoder layers.
  - dropout (float, optional) – Dropout ratio. Defaults to 0.5.
  - aggregate (str) – The aggregation method. Defaults to add.
  - normalization (str) – The normalization method. Defaults to ln.
  - deepset_input_norm (bool) – Defaults to True.
  - heads (int) – Defaults to 1.
  - PMA (bool) – Defaults to True.
  - GPR (bool) – Defaults to False.
  - LearnMask (bool) – Defaults to False.
  - norm (Tensor) – The weight for edges in the bipartite graph, corresponding to data.edge_index.
- forward(data)[source]#
The data should contain the following:
  - data.x: node features.
  - data.edge_index: edge list of size (2, |E|), where data.edge_index[0] contains node ids and data.edge_index[1] contains hyperedge ids. Note that each self-loop should be assigned a new (hyper)edge id, and that (hyper)edge ids should start at 0 (akin to node ids).
  - data.norm: the weight for edges in the bipartite graph, corresponding to data.edge_index.
  Note that the model outputs the final node representations; the loss should be defined outside.
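The bipartite edge-list format described above can be sketched in plain Python (no easygraph or torch dependency; the toy hypergraph and uniform weights are hypothetical, chosen only to illustrate the id conventions):

```python
# Build the (2, |E|) bipartite edge_index expected by forward(data).
# Toy hypergraph (hypothetical): e0 = {0, 1}, e1 = {1, 2}, plus one
# self-loop per node, each assigned a NEW hyperedge id (e2, e3, e4).
hyperedges = [[0, 1], [1, 2]]  # hyperedge ids start at 0, akin to node ids
num_nodes = 3

nodes, edges = [], []
for eid, members in enumerate(hyperedges):
    for v in members:
        nodes.append(v)   # row 0: node ids
        edges.append(eid)  # row 1: hyperedge ids

# Self-loops must use fresh hyperedge ids, not reuse existing ones.
next_eid = len(hyperedges)
for v in range(num_nodes):
    nodes.append(v)
    edges.append(next_eid)
    next_eid += 1

# edge_index has shape (2, |E|): edge_index[0] = nodes, edge_index[1] = hyperedges.
edge_index = [nodes, edges]
# data.norm carries one weight per incidence; uniform weights as a placeholder.
norm = [1.0] * len(nodes)

print(edge_index)  # [[0, 1, 1, 2, 0, 1, 2], [0, 0, 1, 1, 2, 3, 4]]
```

In a real pipeline these lists would become tensors on data.edge_index and data.norm; the key points are that hyperedge ids start at 0 and every self-loop gets its own new hyperedge id.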