Softmax

Softmax is a vector-to-vector function that exponentiates each element of the input and divides each result by the sum of the exponentials of all the elements. Every element of the output is positive and the elements sum to 1 -- thus the output can be interpreted as a probability distribution. The name "softmax" comes from the fact that the function acts as a smooth ("soft") approximation of the arg max: it amplifies the relative weight of the largest element, and scaling all inputs up pushes the output toward a one-hot vector. It is very commonly used as the last activation function in neural networks set up for classification. A minimal sketch appears below.
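As a concrete illustration of the definition above, here is a minimal NumPy sketch, i.e. softmax(x)_i = exp(x_i) / sum_j exp(x_j). The max-subtraction step is a standard numerical-stability trick added here, not part of the definition itself:

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Exponentiate each element, then normalize by the sum of exponentials."""
    # Subtracting the max leaves the result unchanged (it cancels in the
    # ratio) but prevents overflow in exp for large inputs.
    shifted = x - np.max(x)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
print(probs)        # approximately [0.659, 0.242, 0.099]
print(probs.sum())  # 1.0
```

Note how the largest input (2.0) receives a disproportionately large share of the probability mass relative to its linear share of the inputs.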
Related concepts:
Logits