ncdl.nn.ReLU

class ncdl.nn.ReLU(*args: Any, **kwargs: Any)

Applies the rectified linear unit function element-wise:

\(\text{ReLU}(x) = (x)^+ = \max(0, x)\)

Parameters:

inplace – if True, the operation is performed in-place on the input tensor. Default: False
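
A minimal sketch of what the inplace flag changes, shown with plain torch tensors as in the examples below; with inplace=True the input tensor's own storage is overwritten, which saves memory but discards the original values:

  >>> import torch
  >>> from torch import nn
  >>> x = torch.tensor([-1.0, 2.0])
  >>> y = nn.ReLU(inplace=True)(x)  # negative entries are clamped in x's own storage
  >>> x is y  # True: the input tensor itself is returned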

Shape:
  • Input: \((*)\), where \(*\) means any number of dimensions.

  • Output: \((*)\), same shape as the input.
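
Since \((*)\) means the module is shape-agnostic, a quick check with an arbitrary 3-d tensor confirms the shape passes through unchanged:

  >>> import torch
  >>> from torch import nn
  >>> nn.ReLU()(torch.randn(4, 3, 2)).shape
  torch.Size([4, 3, 2])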

[Plot of the ReLU activation function]

Examples:

  >>> import torch
  >>> from torch import nn
  >>> m = nn.ReLU()
  >>> input = torch.randn(2)
  >>> output = m(input)  # negative entries are zeroed


An implementation of CReLU (https://arxiv.org/abs/1603.05201):

  >>> import torch
  >>> from torch import nn
  >>> m = nn.ReLU()
  >>> input = torch.randn(2).unsqueeze(0)
  >>> output = torch.cat((m(input), m(-input)))  # concatenate the positive and negative parts
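
Since ReLU(input) and ReLU(-input) are concatenated, the concatenation dimension (dim 0 above) doubles in size; applied along the channel dimension, CReLU doubles a layer's channel count.
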
__init__(inplace: bool = False)

Methods

  • __init__([inplace]) – initialize the module, optionally enabling in-place operation.

  • extra_repr() – return the extra representation string (reports the inplace setting).

  • forward(input) – apply the rectified linear unit to input element-wise.

Attributes

  • inplace – whether the operation is performed in-place.