tests¶
Modules¶
fastvideo.tests.conftest¶
Functions¶
fastvideo.tests.conftest.distributed_setup¶
Fixture to set up and tear down the distributed environment for tests.
This ensures proper cleanup even if tests fail.
Source code in fastvideo/tests/conftest.py
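The setup/teardown-with-cleanup pattern this fixture describes can be sketched as a yield-style generator; the environment variables and cleanup steps below are illustrative assumptions, not the actual code in fastvideo/tests/conftest.py:

```python
import os

# Sketch of a yield-style setup/teardown fixture; in conftest.py a
# generator like this would be registered with @pytest.fixture.
# The MASTER_ADDR / MASTER_PORT values are illustrative assumptions.
def distributed_setup_sketch():
    os.environ["MASTER_ADDR"] = "127.0.0.1"  # setup runs before the test body
    os.environ["MASTER_PORT"] = "29500"
    try:
        yield  # the test body executes at this point
    finally:
        # The finally block runs even if the test raised,
        # so the environment is always restored.
        os.environ.pop("MASTER_ADDR", None)
        os.environ.pop("MASTER_PORT", None)
```

Because the teardown lives in a `finally` block after the `yield`, it executes whether the test passes or fails, which is what guarantees the "proper cleanup even if tests fail" behavior.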
fastvideo.tests.utils¶
Functions¶
fastvideo.tests.utils.compare_folders¶
Compare videos with the same filename between reference_folder and generated_folder.

Example usage:

```python
results = compare_folders(reference_folder, generated_folder,
                          args.use_ms_ssim)
for video_name, ssim_value in results.items():
    if ssim_value is not None:
        print(
            f"{video_name}: {ssim_value[0]:.4f}, Min SSIM: {ssim_value[1]:.4f}, Max SSIM: {ssim_value[2]:.4f}"
        )
    else:
        print(f"{video_name}: Error during comparison")

valid_ssims = [v for v in results.values() if v is not None]
if valid_ssims:
    avg_ssim = np.mean([v[0] for v in valid_ssims])
    print(f"\nAverage SSIM across all videos: {avg_ssim:.4f}")
else:
    print("No valid SSIM values to average")
```
Source code in fastvideo/tests/utils.py
fastvideo.tests.utils.compute_video_ssim_torchvision¶
Compute SSIM between two videos.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `video1_path` | | Path to the first video. | *required* |
| `video2_path` | | Path to the second video. | *required* |
| `use_ms_ssim` | | Whether to use Multi-Scale Structural Similarity (MS-SSIM) instead of SSIM. | `True` |
Source code in fastvideo/tests/utils.py
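The compare_folders example above unpacks a three-element result, which suggests this function aggregates per-frame scores into a (mean, min, max) tuple. That inferred aggregation step can be sketched as (the return shape is an assumption, not documented here):

```python
import numpy as np

# Assumed aggregation: collapse per-frame SSIM scores into the
# (mean, min, max) tuple that compare_folders appears to consume.
def aggregate_frame_ssim(per_frame_scores):
    scores = np.asarray(per_frame_scores, dtype=np.float64)
    return float(scores.mean()), float(scores.min()), float(scores.max())
```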
fastvideo.tests.utils.write_ssim_results¶
write_ssim_results(output_dir, ssim_values, reference_path, generated_path, num_inference_steps, prompt)
Write SSIM results to a JSON file in the same directory as the generated videos.