Mojo module

activations

The module contains implementations of activation functions.

Functions

  • elu: Compute the Elu Op using the equation z if z >= 0 else alpha * (e^z - 1).
  • gelu: Compute the GELU Op using the equation 0.5 * x * (1 + erf(x / sqrt(2))).
  • gelu_approximate: Compute the approximate GELU Op using the equation 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3))).
  • relu: Compute the Relu Op using the equation max(0, x).
  • relu_n1: Compute the Relu N1 Op using the equation max(min(x, 1), -1).
  • sign: Compute the sign (0, 1) of the input value.
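For reference, the equations listed above can be written as plain scalar functions. The sketch below is not part of this module's API: the function names, signatures, and the use of Float64 scalars are illustrative assumptions (the module's ops apply these formulas element-wise to tensors), and the literal constants stand in for 1/sqrt(2) and sqrt(2/pi).

```mojo
from math import erf, exp, tanh

# Hypothetical scalar versions of the listed equations; names and
# signatures are illustrative only, not this module's actual ops.

fn elu(z: Float64, alpha: Float64 = 1.0) -> Float64:
    # z if z >= 0 else alpha * (e^z - 1)
    return z if z >= 0.0 else alpha * (exp(z) - 1.0)

fn gelu(x: Float64) -> Float64:
    # 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + erf(x * 0.7071067811865476))

fn gelu_approximate(x: Float64) -> Float64:
    # 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + tanh(0.7978845608028654 * (x + 0.044715 * x * x * x)))

fn relu(x: Float64) -> Float64:
    # max(0, x)
    return max(0.0, x)

fn relu_n1(x: Float64) -> Float64:
    # max(min(x, 1), -1)
    return max(min(x, 1.0), -1.0)

fn main():
    print(elu(-2.0))     # ~ -0.8647
    print(gelu(1.0))     # ~ 0.8413
    print(relu_n1(3.0))  # 1.0
```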
