easygraph.nn package
Subpackages
- easygraph.nn.convs package
  - Subpackages
    - easygraph.nn.convs.hypergraphs package
      - Submodules
        - easygraph.nn.convs.hypergraphs.dhcf_conv module
        - easygraph.nn.convs.hypergraphs.halfnlh_conv module
        - easygraph.nn.convs.hypergraphs.hgnn_conv module
        - easygraph.nn.convs.hypergraphs.hgnnp_conv module
        - easygraph.nn.convs.hypergraphs.hnhn_conv module
        - easygraph.nn.convs.hypergraphs.hypergcn_conv module
        - easygraph.nn.convs.hypergraphs.unignn_conv module
      - Module contents
  - Submodules
    - easygraph.nn.convs.common module
    - easygraph.nn.convs.pma module
  - Module contents
Submodules
easygraph.nn.loss module
- class easygraph.nn.loss.BPRLoss(alpha: float = 1.0, beta: float = 1.0, activation: str = 'sigmoid_then_log')[source]
Bases: Module
This criterion computes the Bayesian Personalized Ranking (BPR) loss between the positive scores and the negative scores.
- Parameters:
  - alpha (float, optional) – The weight for the positive scores in the BPR loss. Defaults to 1.0.
  - beta (float, optional) – The weight for the negative scores in the BPR loss. Defaults to 1.0.
  - activation (str, optional) – The activation function to use; one of "sigmoid_then_log" or "softplus". Defaults to "sigmoid_then_log".
- forward(pos_scores: Tensor, neg_scores: Tensor)[source]
The forward function of BPRLoss.
- Parameters:
  - pos_scores (torch.Tensor) – The positive scores.
  - neg_scores (torch.Tensor) – The negative scores.
- training: bool
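A minimal usage sketch (not part of the generated reference): the batch size and random score tensors below are illustrative stand-ins for scores produced by a ranking or recommendation model, where each positive item is paired with a sampled negative item.

```python
import torch

from easygraph.nn.loss import BPRLoss

criterion = BPRLoss(alpha=1.0, beta=1.0, activation="sigmoid_then_log")

# Stand-in scores; in practice these come from the model, one positive
# and one sampled negative score per training pair.
pos_scores = torch.randn(32, requires_grad=True)
neg_scores = torch.randn(32, requires_grad=True)

loss = criterion(pos_scores, neg_scores)
loss.mean().backward()  # .mean() is a no-op if the criterion already returns a scalar
```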
easygraph.nn.regularization module
- class easygraph.nn.regularization.EmbeddingRegularization(p: int = 2, weight_decay: float = 0.0001)[source]
Bases: Module
Regularization function for embeddings.
- Parameters:
  - p (int) – The power to use in the regularization. Defaults to 2.
  - weight_decay (float) – The weight of the regularization. Defaults to 1e-4.
- forward(*embs: List[Tensor])[source]
The forward function.
- Parameters:
  - embs (List[torch.Tensor]) – The input embeddings.
- training: bool
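A minimal usage sketch (the embedding names and shapes below are illustrative, not part of the documented API): the returned regularization term, weighted by weight_decay, is typically added to the main task loss before backpropagation.

```python
import torch

from easygraph.nn.regularization import EmbeddingRegularization

reg = EmbeddingRegularization(p=2, weight_decay=1e-4)

# Stand-in embedding tables; in practice these are the learnable user/item
# (or node) embeddings of the model being regularized.
user_emb = torch.randn(100, 64, requires_grad=True)
item_emb = torch.randn(200, 64, requires_grad=True)

reg_loss = reg(user_emb, item_emb)  # forward(*embs) accepts one or more tensors
print(reg_loss)  # regularization term on the given embeddings, weighted by weight_decay
```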