Repository Details
HelloGitHub Rating: 10.0 (1 rating)
Training-free DiT Model Caching Acceleration Framework
Stars: 465
Chinese: Yes
Language: Python
Active: Yes
Contributors: 2
Issues: 32
Organization: Yes
Latest: 1.0.9
Forks: 14
License: None
[Image: cache-dit]
This project is a framework that provides unified caching acceleration for Diffusers. It supports almost all DiT-based diffusion models, including Qwen-Image-Lightning, Qwen-Image, HunyuanImage, Wan, and FLUX. With just a few lines of code, it delivers efficient caching acceleration that significantly speeds up inference without retraining the model.
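
As a rough illustration of what such a "few lines of code" integration typically looks like, the sketch below loads a standard Diffusers pipeline and enables caching on it. The `cache_dit.enable_cache` call, the FLUX checkpoint name, and the default settings are assumptions based on the project's documented one-call usage and may differ between versions.

```python
# Minimal sketch (assumption: cache-dit exposes a one-call `enable_cache`
# API for Diffusers pipelines, as described in the project README).
import torch
from diffusers import FluxPipeline

import cache_dit

# Load a supported DiT pipeline from Diffusers as usual.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    torch_dtype=torch.bfloat16,
).to("cuda")

# Enable training-free caching acceleration on the pipeline.
# No retraining or model conversion is required.
cache_dit.enable_cache(pipe)

# Inference runs exactly as before; cached intermediate DiT computations
# are reused across denoising steps to reduce per-step cost.
image = pipe(
    "a photo of an astronaut riding a horse",
    num_inference_steps=28,
).images[0]
image.save("astronaut.png")
```

The general idea behind this family of methods is to reuse intermediate transformer-block outputs between adjacent denoising steps, trading a small amount of redundant computation for a noticeable reduction in end-to-end latency.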
