<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/NeuralNet/ActivationFunctions/SoftPlus.php">[source]</a></span>

# Soft Plus
A smooth approximation of the piecewise linear [ReLU](relu.md) activation function.[^1]

$$
{\displaystyle \operatorname{Soft\,Plus}(x) = \log \left(1+e^{x}\right)}
$$
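As a quick sanity check, the formula above can be evaluated directly in plain PHP. The `softPlus` function below is a standalone sketch for illustration, not the library's actual implementation:

```php
// Standalone sketch of the Soft Plus formula (not Rubix ML's implementation).
function softPlus(float $x) : float
{
    return log(1.0 + exp($x));
}

// Soft Plus(0) = log(2) ≈ 0.6931, and for large x the output approaches
// x itself, mirroring the ReLU function it approximates.
echo softPlus(0.0), "\n";  // ≈ 0.6931
echo softPlus(10.0), "\n"; // ≈ 10.0000
```

Note that unlike ReLU, the output is strictly positive for all inputs, including negative ones.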

## Parameters
This activation function does not have any parameters.

## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\SoftPlus;

$activationFunction = new SoftPlus();
```

## References
[^1]: X. Glorot et al. (2011). Deep Sparse Rectifier Neural Networks.