PyTorch Classification
Binary Classification
Sigmoid
torch.nn.Sigmoid()
torch.nn.functional.sigmoid(x) (deprecated in recent PyTorch versions; use torch.sigmoid(x) instead)
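A minimal sketch showing that the module and functional forms agree (values are illustrative):

```python
import torch
import torch.nn as nn

logits = torch.tensor([-2.0, 0.0, 3.0])

# Module form: instantiate once, then call like a function
probs_module = nn.Sigmoid()(logits)

# Functional form
probs_fn = torch.sigmoid(logits)

# Both map logits to (0, 1): tensor([0.1192, 0.5000, 0.9526])
assert torch.allclose(probs_module, probs_fn)
```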
Loss functions
loss = criterion(input, target)
torch.nn.BCELoss(weight: Optional[torch.Tensor] = None, reduction: str = 'mean')
torch.nn.BCEWithLogitsLoss(weight: Optional[torch.Tensor] = None, reduction: str = 'mean')
The target should contain probabilities, i.e. values between 0 and 1. Type torch.float
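A minimal usage sketch with illustrative values, showing the float targets both losses expect:

```python
import torch
import torch.nn as nn

logits = torch.tensor([0.8, -1.2, 2.5])   # raw model outputs
target = torch.tensor([1.0, 0.0, 1.0])    # float targets in [0, 1]

# BCELoss expects probabilities, so apply a sigmoid first
loss_bce = nn.BCELoss()(torch.sigmoid(logits), target)

# BCEWithLogitsLoss takes the raw logits directly
loss_bcel = nn.BCEWithLogitsLoss()(logits, target)
```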
Relationships
The following are equivalent
Sigmoid + BCELoss
BCEWithLogitsLoss
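A quick numerical check of the equivalence (BCEWithLogitsLoss fuses the sigmoid into the loss, which is more numerically stable):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4)
target = torch.rand(4)          # any float values in [0, 1] are valid targets

loss_a = nn.BCELoss()(nn.Sigmoid()(logits), target)   # Sigmoid + BCELoss
loss_b = nn.BCEWithLogitsLoss()(logits, target)       # fused version

assert torch.allclose(loss_a, loss_b)
```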
Multi-class Classification
Softmax and LogSoftmax
Softmax
torch.nn.Softmax(dim: Optional[int] = None)
torch.nn.functional.softmax(x, dim: Optional[int] = None)
LogSoftmax
torch.nn.LogSoftmax(dim: Optional[int] = None)
torch.nn.functional.log_softmax(x, dim: Optional[int] = None)
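A minimal sketch of the module and functional forms and the dim argument (random values, for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(2, 5)   # (minibatch, C)

probs = nn.Softmax(dim=1)(logits)         # each row sums to 1
log_probs = nn.LogSoftmax(dim=1)(logits)  # more stable than torch.log(softmax(x))

# Functional forms give the same results
assert torch.allclose(probs, F.softmax(logits, dim=1))
assert torch.allclose(log_probs, F.log_softmax(logits, dim=1))
assert torch.allclose(probs.sum(dim=1), torch.ones(2))
```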
Loss functions
loss = criterion(input, target)
Negative Log-Likelihood Loss
torch.nn.NLLLoss(weight: Optional[torch.Tensor] = None, reduction: str = 'mean')
The input given through a forward call is expected to contain log-probabilities of each class. The input has to be a Tensor of size either $(\text{minibatch}, C)$ or $(\text{minibatch}, C, d_1, d_2, \ldots, d_K)$ with $K \geq 1$ for the K-dimensional case.
The target that this loss expects should be a class index in the range $[0, C-1]$, where $C$ is the number of classes. Type torch.long
Loss for an instance
It simply takes the negative of the log-probability corresponding to the ground-truth class: for input log-probabilities $x$ and target class $y$, the per-instance loss is $\ell = -x_y$ (with the default weight of 1). The goal is to make the predicted probability distribution fit the ground-truth probability distribution.
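A sketch verifying this per-instance definition against nn.NLLLoss (shapes and values are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(3, 5)                # (minibatch, C)
log_probs = F.log_softmax(logits, dim=1)  # NLLLoss expects log-probabilities
target = torch.tensor([1, 0, 4])          # class indices in [0, C-1], torch.long

loss = nn.NLLLoss()(log_probs, target)

# With the default reduction='mean', this is the mean of -log_probs[i, target[i]]
manual = -log_probs[torch.arange(3), target].mean()
assert torch.allclose(loss, manual)
```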
Cross Entropy Loss
torch.nn.CrossEntropyLoss(weight: Optional[torch.Tensor] = None, reduction: str = 'mean')
Relationships
The following are equivalent
Softmax + Log + NLLLoss
LogSoftmax + NLLLoss
CrossEntropyLoss
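A quick numerical check that the three formulations agree:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 3)
target = torch.tensor([0, 2, 1, 2])

nll = nn.NLLLoss()
loss_a = nll(torch.log(nn.Softmax(dim=1)(logits)), target)  # Softmax + Log + NLLLoss
loss_b = nll(nn.LogSoftmax(dim=1)(logits), target)          # LogSoftmax + NLLLoss
loss_c = nn.CrossEntropyLoss()(logits, target)              # raw logits go in directly

assert torch.allclose(loss_a, loss_b) and torch.allclose(loss_b, loss_c)
```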
Notes
It is possible to use CrossEntropyLoss for binary classification by treating it as a two-class problem, but the target should then be ground-truth class indices (0 or 1) of type torch.long, not probabilities.
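A minimal sketch of the two-class setup (shapes are illustrative):

```python
import torch
import torch.nn as nn

logits = torch.randn(4, 2)           # two logits per sample, one per class
target = torch.tensor([0, 1, 1, 0])  # class indices, torch.long -- not 0/1 floats

loss = nn.CrossEntropyLoss()(logits, target)
```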