- Hands-On Neural Networks
- Leonardo De Marchi Laura Mitchell
Softmax
The softmax function is a generalization of the sigmoid function. While the sigmoid gives us the probability of a binary outcome, softmax transforms an un-normalized vector into a probability distribution: it outputs a vector whose values all lie between 0 and 1 and sum to 1.
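A minimal sketch of this idea in NumPy (the function name and example scores are illustrative, not from the book): exponentiate each entry of the input vector and divide by the total, which yields a valid probability distribution.

```python
import numpy as np

def softmax(z):
    """Transform an un-normalized vector into a probability distribution."""
    # Subtracting the max before exponentiating improves numerical stability;
    # softmax is unchanged by shifting all inputs by the same constant.
    exp_z = np.exp(z - np.max(z))
    return exp_z / exp_z.sum()

scores = np.array([2.0, 1.0, 0.1])   # hypothetical un-normalized scores
probs = softmax(scores)
print(probs)         # every value lies between 0 and 1
print(probs.sum())   # the values sum to 1
```

Note that the largest input score receives the largest probability, so softmax preserves the ranking of the original scores.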