ncdl.nn.Softplus

class ncdl.nn.Softplus(*args: Any, **kwargs: Any)

Applies the Softplus function \(\text{Softplus}(x) = \frac{1}{\beta} * \log(1 + \exp(\beta * x))\) element-wise.

Softplus is a smooth approximation of the ReLU function and can be used to constrain the output of a model to always be positive.

For numerical stability, the implementation reverts to the linear function when \(\text{input} \times \beta > \text{threshold}\).
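The reversion can be sketched in plain PyTorch (a minimal reference illustration of the formula above, not NCDL's actual implementation; `softplus_ref` is a hypothetical helper name):

```python
import torch

def softplus_ref(x: torch.Tensor, beta: float = 1.0, threshold: float = 20.0) -> torch.Tensor:
    """Reference Softplus: (1/beta) * log(1 + exp(beta * x)).

    Where beta * x > threshold, exp(beta * x) would overflow, so the
    function reverts to the identity, which Softplus approaches anyway.
    """
    scaled = beta * x
    return torch.where(scaled > threshold, x, torch.log1p(torch.exp(scaled)) / beta)

x = torch.tensor([-1.0, 0.0, 25.0])
print(softplus_ref(x))  # last element passes through unchanged (25.0 > threshold)
```

`torch.log1p` is used instead of `log(1 + ...)` to keep precision for small inputs, where `exp(beta * x)` is close to zero.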

Parameters:
  • beta – the \(\beta\) value for the Softplus formulation. Default: 1

  • threshold – values above this revert to a linear function. Default: 20

Shape:
  • Input: \((*)\), where \(*\) means any number of dimensions.

  • Output: \((*)\), same shape as the input.

(Plot of the Softplus activation: scripts/activation_images/Softplus.png)

Examples:

>>> import torch
>>> import ncdl.nn as nn
>>> m = nn.Softplus()
>>> input = torch.randn(2)
>>> output = m(input)

__init__(beta: int = 1, threshold: int = 20) → None

Methods
  • __init__([beta, threshold])

  • extra_repr()

  • forward(input)

Attributes
  • beta

  • threshold