easygraph.nn.convs.hypergraphs.dhcf_conv module#
- class easygraph.nn.convs.hypergraphs.dhcf_conv.JHConv(*args: Any, **kwargs: Any)[source]#
Bases: Module
The Jump Hypergraph Convolution layer proposed in the Dual Channel Hypergraph Collaborative Filtering paper (KDD 2020).
Matrix Format:
\[\mathbf{X}^{\prime} = \sigma \left( \mathbf{D}_v^{-\frac{1}{2}} \mathbf{H} \mathbf{W}_e \mathbf{D}_e^{-1} \mathbf{H}^\top \mathbf{D}_v^{-\frac{1}{2}} \mathbf{X} \mathbf{\Theta} + \mathbf{X} \right).\]
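The matrix form can be read off directly with dense tensors. The sketch below is an illustration of the formula only, not the layer's actual implementation (which operates on an `eg.Hypergraph` and its sparse operators); `jhconv_dense`, `H`, and `Theta` are hypothetical names, and \(\mathbf{W}_e\) is taken as the identity for simplicity.

```python
import torch

# Illustrative dense reading of the matrix form above (not the layer's
# actual implementation). W_e is taken as the identity.
def jhconv_dense(X: torch.Tensor, H: torch.Tensor, Theta: torch.Tensor) -> torch.Tensor:
    # H: (N, M) incidence matrix; X: (N, C); Theta: (C, C).
    D_v = torch.diag(H.sum(dim=1).pow(-0.5))   # D_v^{-1/2}, vertex degrees
    D_e = torch.diag(H.sum(dim=0).pow(-1.0))   # D_e^{-1}, hyperedge degrees
    L = D_v @ H @ D_e @ H.T @ D_v              # normalized propagation operator
    return torch.relu(L @ X @ Theta + X)       # "jump" (residual) term: + X

# Toy hypergraph with 5 vertices and 3 hyperedges.
edges = [[0, 1, 2], [1, 3], [2, 3, 4]]
H = torch.zeros(5, len(edges))
for j, e in enumerate(edges):
    H[e, j] = 1.0

X = torch.rand(5, 8)
Theta = torch.rand(8, 8)                 # square so the residual "+ X" is well defined
print(jhconv_dense(X, H, Theta).shape)   # torch.Size([5, 8])
```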
  - in_channels (int) – \(C_{in}\) is the number of input channels.
  - out_channels (int) – \(C_{out}\) is the number of output channels.
  - bias (bool) – If set to False, the layer will not learn the bias parameter. Defaults to True.
  - use_bn (bool) – If set to True, the layer will use batch normalization. Defaults to False.
  - drop_rate (float) – If set to a positive number, the layer will use dropout. Defaults to 0.5.
  - is_last (bool) – If set to True, the layer will not apply the final activation and dropout functions. Defaults to False.
- forward(X: torch.Tensor, hg: Hypergraph) → torch.Tensor[source]#
The forward function.
- Parameters:
  - X (torch.Tensor) – Input vertex feature matrix. Size \((N, C_{in})\).
  - hg (eg.Hypergraph) – The hypergraph structure that contains \(N\) vertices.