easygraph.nn package

Submodules

easygraph.nn.loss module

class easygraph.nn.loss.BPRLoss(alpha: float = 1.0, beta: float = 1.0, activation: str = 'sigmoid_then_log')[source]

Bases: Module

This criterion computes the Bayesian Personalized Ranking (BPR) loss between the positive scores and the negative scores.

Parameters:
  • alpha (float, optional) – The weight for the positive scores in the BPR loss. Defaults to 1.0.

  • beta (float, optional) – The weight for the negative scores in the BPR loss. Defaults to 1.0.

  • activation (str, optional) – The activation function to apply; one of "sigmoid_then_log" or "softplus". Defaults to "sigmoid_then_log".

forward(pos_scores: Tensor, neg_scores: Tensor)[source]

The forward function of BPRLoss. Computes the BPR loss from the positive and negative scores.

Parameters:
  • pos_scores (torch.Tensor) – The positive scores.

  • neg_scores (torch.Tensor) – The negative scores.

training: bool
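The reference above does not show the loss formula itself. Below is a minimal pure-Python sketch of the standard BPR loss, -log(sigmoid(pos - neg)) averaged over score pairs, under the assumption that alpha and beta scale the positive and negative scores respectively; the actual easygraph implementation may differ in how it applies these weights.

```python
import math


def bpr_loss(pos_scores, neg_scores, alpha=1.0, beta=1.0):
    """Sketch of BPR loss with the "sigmoid_then_log" activation.

    For each (pos, neg) pair, compute -log(sigmoid(alpha*pos - beta*neg))
    and average over all pairs. (Assumed weighting; not easygraph's code.)
    """
    losses = []
    for pos, neg in zip(pos_scores, neg_scores):
        diff = alpha * pos - beta * neg
        # sigmoid followed by log, as the activation name suggests
        losses.append(-math.log(1.0 / (1.0 + math.exp(-diff))))
    return sum(losses) / len(losses)
```

When positive and negative scores are equal, the difference is 0, the sigmoid is 0.5, and the loss is log 2; as the positive score pulls ahead of the negative one, the loss decreases toward 0.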

easygraph.nn.regularization module

class easygraph.nn.regularization.EmbeddingRegularization(p: int = 2, weight_decay: float = 0.0001)[source]

Bases: Module

Regularization function for embeddings.

Parameters:
  • p (int) – The exponent of the norm used for regularization. Defaults to 2.

  • weight_decay (float) – The weight applied to the regularization term. Defaults to 1e-4.

forward(*embs: List[Tensor])[source]

The forward function. Computes the regularization term over the input embeddings.

Parameters:
  • embs (List[torch.Tensor]) – The input embeddings.

training: bool
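To make the parameters above concrete, here is a minimal pure-Python sketch of one common form of embedding regularization: weight_decay times the sum of the p-norms of each embedding matrix. This is an illustrative assumption; easygraph's EmbeddingRegularization may normalize or aggregate differently.

```python
def embedding_regularization(embs, p=2, weight_decay=1e-4):
    """Sketch of p-norm embedding regularization.

    For each embedding matrix (a list of rows), compute its p-norm over
    all entries, sum the norms, and scale by weight_decay.
    (Assumed formula; not easygraph's code.)
    """
    total = 0.0
    for emb in embs:
        flat = [x for row in emb for x in row]
        total += sum(abs(x) ** p for x in flat) ** (1.0 / p)
    return weight_decay * total
```

With p=2 this penalizes the Euclidean norm of each embedding table, discouraging large embedding magnitudes during training.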

Module contents