- Hands-On Neural Networks
- Leonardo De Marchi, Laura Mitchell
Activation functions
So far, you have seen two different activation functions: a step function and a sigmoid. But there are many others that, depending on the task, may be more or less useful.
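As a reminder, here is a minimal NumPy sketch of these two functions (assuming a threshold of 0 for the step function):

```python
import numpy as np

def step(x):
    # Heaviside step: outputs 1 when the input is non-negative, 0 otherwise
    # (assuming a threshold of 0)
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into the (0, 1) range
    return 1.0 / (1.0 + np.exp(-x))

print(step(np.array([-2.0, 0.0, 3.0])))     # [0. 1. 1.]
print(sigmoid(np.array([-2.0, 0.0, 3.0])))  # [0.119  0.5  0.953] (approx.)
```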
Activation functions are usually used to introduce non-linearity. Without them, the network would only compute a linear combination of its inputs passed through another linear function, which collapses into a single linear transformation no matter how many layers we stack.
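To see why, here is a small NumPy sketch (with arbitrary, made-up weights) showing that two stacked linear layers are equivalent to a single linear layer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation function, arbitrary weights and biases
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=3)
W2, b2 = rng.normal(size=(3, 2)), rng.normal(size=2)

x = rng.normal(size=4)

# Passing the input through both linear layers in sequence
two_layers = (x @ W1 + b1) @ W2 + b2

# The same result from a single combined linear layer:
# x @ (W1 @ W2) + (b1 @ W2 + b2)
W, b = W1 @ W2, b1 @ W2 + b2
one_layer = x @ W + b

print(np.allclose(two_layers, one_layer))  # True
```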
We will now look in detail at a few activation functions and how to use them in Keras.
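As a preview, this is a minimal sketch (assuming TensorFlow 2.x with tf.keras; layer sizes and activations are arbitrary) of the two most common ways of attaching an activation function to a layer in Keras:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    # Activation passed as a string argument to the layer
    layers.Dense(64, activation="sigmoid", input_shape=(10,)),
    # Activation added as a separate layer
    layers.Dense(32),
    layers.Activation("relu"),
    layers.Dense(1, activation="sigmoid"),
])

model.summary()
```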