Repository Details
HelloGitHub Rating: 10.0 (1 rating)
Microsoft's Open-Source 1-bit Large Model Inference Framework
Free · MIT
Stars: 21.1k
Chinese: No
Language: Python
Active: Yes
Contributors: 14
Issues: 149
Organization: Yes
Latest: None
Forks: 2k
License: MIT
BitNet image
This is an inference framework from Microsoft built for running extremely compressed (low-bit) large models locally on CPUs. It provides efficient, low-power inference for 1-bit/1.58-bit quantized models, works with models such as BitNet, Llama3-8B-1.58, and Falcon3, and is well suited to running large-model inference on local machines or edge devices without a GPU.
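In practice the project ships CLI scripts for fetching a quantized model and then running it on the CPU. The sketch below shows one way to drive them from Python; the script names, flags, and example model repo (setup_env.py, run_inference.py, --hf-repo, -q, -m, -p, -n, -t, HF1BitLLM/Llama3-8B-1.58-100B-tokens) follow the project's README as of this writing and may change between releases, so treat them as assumptions and check the repository for the current interface.

```python
# Minimal sketch of driving bitnet.cpp's CLI scripts from Python.
# Assumes you are in a cloned microsoft/BitNet checkout with its
# dependencies installed; flags below may differ in newer versions.
import subprocess

# 1. Download a 1.58-bit model from Hugging Face and convert it to the
#    quantized GGUF format that the CPU kernels expect.
subprocess.run(
    [
        "python", "setup_env.py",
        "--hf-repo", "HF1BitLLM/Llama3-8B-1.58-100B-tokens",  # example 1.58-bit model
        "-q", "i2_s",                                         # quantization kernel type
    ],
    check=True,
)

# 2. Run CPU-only inference on the converted model; no GPU is required.
subprocess.run(
    [
        "python", "run_inference.py",
        "-m", "models/Llama3-8B-1.58-100B-tokens/ggml-model-i2_s.gguf",
        "-p", "Q: What does 1-bit quantization mean? A:",  # prompt
        "-n", "128",  # tokens to generate
        "-t", "8",    # CPU threads
    ],
    check=True,
)
```

The setup step only needs to run once per model; later runs can call the inference script directly against the converted GGUF file.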

Comments

No comments yet