fastvideo.v1.pipelines.lora_pipeline#

Module Contents#

Classes#

LoRAPipeline

Pipeline that supports injecting LoRA adapters into the diffusion transformer. TODO: support training.

Data#

API#

class fastvideo.v1.pipelines.lora_pipeline.LoRAPipeline(*args, **kwargs)[source]#

Bases: fastvideo.v1.pipelines.composed_pipeline_base.ComposedPipelineBase

Pipeline that supports injecting LoRA adapters into the diffusion transformer. TODO: support training.

Initialization

Initialize the pipeline. After init, the pipeline should be ready to use. The pipeline should be stateless and not hold any batch state.

convert_to_lora_layers() → None[source]#

Converts the transformer to a LoRA transformer.
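The idea behind this conversion can be illustrated with a minimal NumPy sketch of a LoRA-wrapped linear layer (shapes, `alpha`, and variable names here are illustrative assumptions, not the pipeline's actual internals, which wrap `torch.nn.Linear` modules with `BaseLayerWithLoRA`):

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, rank, alpha = 8, 8, 2, 4.0
W = rng.standard_normal((d_out, d_in))  # frozen base weight
A = rng.standard_normal((rank, d_in))   # LoRA down-projection
B = np.zeros((d_out, rank))             # LoRA up-projection, initialized to zero
scale = alpha / rank

def lora_forward(x):
    # Base path plus the low-rank update: W x + scale * B (A x)
    return W @ x + scale * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B at zero, the wrapped layer behaves exactly like the original:
assert np.allclose(lora_forward(x), W @ x)

# After an adapter provides B, the update can equivalently be merged into W:
B = rng.standard_normal((d_out, rank))
W_merged = W + scale * (B @ A)
assert np.allclose(lora_forward(x), W_merged @ x)
```

This is why conversion is safe to do eagerly: until adapter weights are loaded, the LoRA branch contributes nothing to the forward pass.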

cur_adapter_name: str = <Multiline-String>[source]#
device: torch.device[source]#

‘device(…)’

exclude_lora_layers: List[str][source]#

[]

fastvideo_args: fastvideo.v1.fastvideo_args.FastVideoArgs[source]#

None

is_target_layer(module_name: str) → bool[source]#
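A predicate like this is typically implemented as substring matching on the module's dotted name. The sketch below is a hypothetical stand-in (the target module names are assumptions; the real method consults the pipeline's own target list and `exclude_lora_layers`):

```python
def is_target_layer(module_name: str,
                    target_modules=("to_q", "to_k", "to_v", "to_out"),
                    exclude_layers=("embed",)) -> bool:
    # A module is wrapped with LoRA only if its dotted name matches a
    # target substring and matches none of the excluded substrings.
    if any(excluded in module_name for excluded in exclude_layers):
        return False
    return any(target in module_name for target in target_modules)

assert is_target_layer("blocks.0.attn.to_q")
assert not is_target_layer("patch_embed.proj")
```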
lora_adapters: Dict[str, Dict[str, torch.Tensor]][source]#

‘defaultdict(…)’

lora_layers: Dict[str, fastvideo.v1.layers.lora.linear.BaseLayerWithLoRA][source]#

None

set_lora_adapter(lora_nickname: str, lora_path: Optional[str] = None)[source]#

Loads a LoRA adapter into the pipeline and applies it to the transformer.

Parameters:
  • lora_nickname – The nickname used to reference the adapter within the pipeline.

  • lora_path – The path to the adapter, either a local path or a Hugging Face repo id.
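The `lora_adapters` and `cur_adapter_name` attributes suggest adapters are cached by nickname, so a path is only needed the first time an adapter is set. The sketch below illustrates that load-once, switch-by-nickname flow with stand-in objects (`FakeLoRALayer`, `lora_A`/`lora_B`, and the in-memory `state_dict` are hypothetical, not the real fastvideo API, which loads tensors from `lora_path`):

```python
lora_adapters = {}  # nickname -> per-layer (A, B) weights

class FakeLoRALayer:
    # Stand-in for a LoRA-wrapped layer holding its low-rank factors.
    def __init__(self):
        self.lora_A = None
        self.lora_B = None

lora_layers = {"blocks.0.attn.to_q": FakeLoRALayer()}

def set_lora_adapter(lora_nickname, state_dict=None):
    if state_dict is not None:
        lora_adapters[lora_nickname] = state_dict  # load and cache on first use
    weights = lora_adapters[lora_nickname]         # later calls reuse the cache
    for name, layer in lora_layers.items():
        if name in weights:
            layer.lora_A, layer.lora_B = weights[name]

set_lora_adapter("style_v1", {"blocks.0.attn.to_q": ([[1.0]], [[2.0]])})
assert lora_layers["blocks.0.attn.to_q"].lora_A == [[1.0]]
```

After the first call, `set_lora_adapter("style_v1")` with no path is enough to re-apply the cached adapter.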

fastvideo.v1.pipelines.lora_pipeline.logger[source]#

‘init_logger(…)’