Cross Entropy Loss

Cross entropy is a measure of the difference between two probability distributions. Thus, in a classification model, the cross entropy between the one-hot encoded label and the softmax output of the network measures how good a single prediction is; because the label is one-hot, this reduces to the negative log of the probability the network assigns to the true class. Averaging those values over a batch or a dataset yields the cross entropy loss.
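As a minimal sketch (assuming PyTorch and made-up logits and labels), the snippet below computes the loss manually from softmax probabilities and compares it with the built-in criterion, which takes raw logits and applies log-softmax internally:

```python
import torch
import torch.nn.functional as F

# Hypothetical raw network outputs (logits) for 3 samples and 4 classes,
# plus the integer class labels.
logits = torch.tensor([[2.0, 0.5, 0.1, -1.0],
                       [0.2, 3.0, 0.1,  0.0],
                       [1.0, 0.0, 2.5,  0.3]])
labels = torch.tensor([0, 1, 2])

# Manual computation: softmax the logits, then take the negative log of the
# probability assigned to the true class, averaged over the batch.
probs = F.softmax(logits, dim=1)
manual_loss = -torch.log(probs[torch.arange(len(labels)), labels]).mean()

# Built-in criterion: expects raw logits (log-softmax is applied internally).
builtin_loss = F.cross_entropy(logits, labels)

print(manual_loss.item(), builtin_loss.item())  # the two values agree
```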
Related concepts:
One-Hot Encoding, Softmax
External reference:
https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html