Mojo module

activations

This module contains implementations of activation functions.

Functions

  • elu: Compute the ELU op using the equation `z if z >= 0 else alpha * (e^z - 1)`.
  • leaky_relu: Compute the Leaky ReLU using the equation `max(x, 0) + negative_slope * min(x, 0)`.
  • relu: Compute the ReLU op using the equation `max(x, 0)`.
  • relu_n1: Compute the ReLU N1 op using the equation `max(min(x, 1), -1)`.
  • sign: Compute the sign (0, 1) of the input value.
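The equations above can be sketched elementwise in plain Python. This is an illustrative reimplementation, not the module's actual Mojo code; the default values for `alpha` and `negative_slope` are assumptions, since the listing does not specify them.

```python
import math

def elu(z: float, alpha: float = 1.0) -> float:
    # ELU: z if z >= 0 else alpha * (e^z - 1)
    # (alpha = 1.0 is an assumed default)
    return z if z >= 0 else alpha * (math.exp(z) - 1)

def leaky_relu(x: float, negative_slope: float = 0.01) -> float:
    # Leaky ReLU: max(x, 0) + negative_slope * min(x, 0)
    # (negative_slope = 0.01 is an assumed default)
    return max(x, 0.0) + negative_slope * min(x, 0.0)

def relu(x: float) -> float:
    # ReLU: max(x, 0)
    return max(x, 0.0)

def relu_n1(x: float) -> float:
    # ReLU N1: max(min(x, 1), -1), i.e. clamp to [-1, 1]
    return max(min(x, 1.0), -1.0)
```

In the module itself these ops apply elementwise over tensors; the scalar versions above only demonstrate the formulas.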