activation
Custom activation functions.
Classes
fastvideo.layers.activation.GeluAndMul
GeluAndMul(approximate: str = 'none')
Bases: CustomOp
An activation function for GeGLU.
The function computes x -> GELU(x[:d]) * x[d:] where d = x.shape[-1] // 2.
Shapes:
    x: (batch_size, seq_len, 2 * d) or (num_tokens, 2 * d)
    return: (batch_size, seq_len, d) or (num_tokens, d)
Source code in fastvideo/layers/activation.py
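The GeGLU computation above can be sketched in pure Python (exact GELU via the Gaussian CDF, operating on a flat list rather than a tensor; this illustrates the formula, not fastvideo's actual kernel):

```python
import math

def gelu(v: float) -> float:
    # Exact GELU (the approximate='none' case): x * Phi(x), via the error function.
    return 0.5 * v * (1.0 + math.erf(v / math.sqrt(2.0)))

def gelu_and_mul(x: list[float]) -> list[float]:
    # Split the last dimension in half and gate: GELU(x[:d]) * x[d:].
    d = len(x) // 2
    return [gelu(x[i]) * x[d + i] for i in range(d)]

out = gelu_and_mul([1.0, -1.0, 2.0, 3.0])  # input of width 2*d=4 -> output of width d=2
```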
fastvideo.layers.activation.NewGELU
fastvideo.layers.activation.QuickGELU
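NewGELU and QuickGELU are listed without docstrings here. Assuming they follow the standard formulations of those names (the tanh-based GPT-2 approximation and the sigmoid-based x * sigmoid(1.702 * x), respectively; this is an assumption, since the source is not shown), a pure-Python sketch comparing them to exact GELU:

```python
import math

def new_gelu(v: float) -> float:
    # Tanh approximation of GELU (the GPT-2 formulation).
    return 0.5 * v * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (v + 0.044715 * v ** 3)))

def quick_gelu(v: float) -> float:
    # Sigmoid approximation: x * sigmoid(1.702 * x).
    return v / (1.0 + math.exp(-1.702 * v))

def exact_gelu(v: float) -> float:
    return 0.5 * v * (1.0 + math.erf(v / math.sqrt(2.0)))

# Both approximations stay close to exact GELU for moderate inputs.
pts = [-2.0, -1.0, 0.0, 1.0, 2.0]
err_new = max(abs(new_gelu(v) - exact_gelu(v)) for v in pts)
err_quick = max(abs(quick_gelu(v) - exact_gelu(v)) for v in pts)
```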
fastvideo.layers.activation.SiluAndMul
Bases: CustomOp
An activation function for SwiGLU.
The function computes x -> silu(x[:d]) * x[d:] where d = x.shape[-1] // 2.
Shapes:
    x: (num_tokens, 2 * d) or (batch_size, seq_len, 2 * d)
    return: (num_tokens, d) or (batch_size, seq_len, d)
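SwiGLU follows the same split-and-gate pattern with SiLU as the gate; a pure-Python sketch of the formula (not fastvideo's actual kernel):

```python
import math

def silu(v: float) -> float:
    # SiLU (a.k.a. swish): x * sigmoid(x).
    return v / (1.0 + math.exp(-v))

def silu_and_mul(x: list[float]) -> list[float]:
    # Split the last dimension in half and gate: silu(x[:d]) * x[d:].
    d = len(x) // 2
    return [silu(x[i]) * x[d + i] for i in range(d)]

out = silu_and_mul([1.0, -1.0, 2.0, 3.0])  # input of width 2*d=4 -> output of width d=2
```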
Functions
fastvideo.layers.activation.get_act_and_mul_fn
get_act_and_mul_fn(act_fn_name: str) -> Module
Get an activation-and-mul function (e.g. SiluAndMul) by name.
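A hedged sketch of how such a name-based lookup might work. The registry keys, error handling, and helper names below are illustrative assumptions, not fastvideo's actual implementation (which returns torch Modules):

```python
import math

def _silu(v):
    # SiLU gate: x * sigmoid(x).
    return v / (1.0 + math.exp(-v))

def _gelu(v):
    # Exact GELU gate: x * Phi(x).
    return 0.5 * v * (1.0 + math.erf(v / math.sqrt(2.0)))

def _make_act_and_mul(act):
    # Wrap a scalar activation into the split-gate-multiply pattern.
    def act_and_mul(x):
        d = len(x) // 2
        return [act(x[i]) * x[d + i] for i in range(d)]
    return act_and_mul

# Hypothetical registry: the key names fastvideo actually accepts may differ.
_ACT_AND_MUL_REGISTRY = {
    "silu": _make_act_and_mul(_silu),
    "gelu": _make_act_and_mul(_gelu),
}

def get_act_and_mul_fn(act_fn_name: str):
    try:
        return _ACT_AND_MUL_REGISTRY[act_fn_name]
    except KeyError:
        raise ValueError(f"Activation function {act_fn_name!r} is not supported.")
```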