fastvideo.v1.layers.mlp
Module Contents#
Classes#
MLP | MLP for DiT blocks, NO gated linear units
API#
- class fastvideo.v1.layers.mlp.MLP(input_dim: int, mlp_hidden_dim: int, output_dim: Optional[int] = None, bias: bool = True, act_type: str = 'gelu_pytorch_tanh', dtype: Optional[torch.dtype] = None, prefix: str = '')[source]#
Bases: torch.nn.Module
MLP for DiT blocks, NO gated linear units
Initialization
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- forward(x: torch.Tensor) → torch.Tensor[source]#
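The listing above gives only the constructor signature, so a minimal sketch may help. The following is an assumed reference implementation matching the documented parameters (it is not the fastvideo source): a plain two-layer MLP with no gated linear units, where `output_dim` defaults to `input_dim` and `act_type='gelu_pytorch_tanh'` is taken to mean GELU with the tanh approximation. The `prefix` argument (used by fastvideo for parameter naming) is accepted but unused here.

```python
from typing import Optional

import torch
from torch import nn


class MLP(nn.Module):
    """Two-layer MLP for DiT blocks, no gated linear units (sketch)."""

    def __init__(self,
                 input_dim: int,
                 mlp_hidden_dim: int,
                 output_dim: Optional[int] = None,
                 bias: bool = True,
                 act_type: str = "gelu_pytorch_tanh",
                 dtype: Optional[torch.dtype] = None,
                 prefix: str = "") -> None:
        super().__init__()
        # Assumption: output_dim falls back to input_dim when not given.
        output_dim = output_dim if output_dim is not None else input_dim
        self.fc_in = nn.Linear(input_dim, mlp_hidden_dim, bias=bias, dtype=dtype)
        # Assumption: 'gelu_pytorch_tanh' selects GELU with the tanh approximation.
        if act_type == "gelu_pytorch_tanh":
            self.act = nn.GELU(approximate="tanh")
        else:
            self.act = nn.GELU()
        self.fc_out = nn.Linear(mlp_hidden_dim, output_dim, bias=bias, dtype=dtype)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Expand to the hidden width, apply the activation, project back.
        return self.fc_out(self.act(self.fc_in(x)))


# Usage: a typical DiT feed-forward expands the channel dim by ~4x.
x = torch.randn(2, 16, 64)           # (batch, tokens, channels)
mlp = MLP(input_dim=64, mlp_hidden_dim=256)
y = mlp(x)                           # shape is preserved: (2, 16, 64)
```

Since `output_dim` defaults to `input_dim`, the block preserves the token shape, which lets it drop into a residual connection without any extra projection.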