Repository Details
Microsoft's Open-Source 1-bit Large Model Inference Framework
Free · MIT
Stars: 19.6k
Chinese: No
Language: C++
Active: Yes
Contributors: 14
Issues: 123
Organization: Yes
Latest: None
Forks: 1k
License: MIT
[Image: BitNet]
This is Microsoft's inference framework for running extremely compressed (low-bit) large models locally on CPUs. It provides efficient, low-power inference for 1-bit/1.58-bit quantized models and is compatible with models such as BitNet, Llama3-8B-1.58, and Falcon3, making it suitable for running large-model inference on local machines or edge devices without a GPU.
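The "1.58-bit" refers to ternary weights restricted to {-1, 0, +1} (log2(3) ≈ 1.58 bits per weight), which turns matrix multiplication into additions and subtractions, a good fit for CPUs. Below is a minimal C++ sketch of the absmean ternary quantization scheme described in the BitNet b1.58 paper; the struct and function names are illustrative only and are not part of the bitnet.cpp API, whose real kernels use packed weight formats and SIMD-optimized routines.

```cpp
#include <cmath>
#include <cstdint>
#include <iostream>
#include <vector>

// Illustrative sketch of absmean ternary quantization (BitNet b1.58 style):
// each weight is scaled by the mean absolute value of the tensor, then
// rounded and clipped to {-1, 0, +1}. Names here are hypothetical.
struct TernaryTensor {
    std::vector<int8_t> q;  // quantized weights in {-1, 0, +1}
    float scale;            // per-tensor scale used to dequantize
};

TernaryTensor quantize_absmean(const std::vector<float>& w) {
    float mean_abs = 0.0f;
    for (float x : w) mean_abs += std::fabs(x);
    mean_abs /= static_cast<float>(w.size());
    if (mean_abs == 0.0f) mean_abs = 1e-8f;  // avoid division by zero

    TernaryTensor t;
    t.scale = mean_abs;
    t.q.reserve(w.size());
    for (float x : w) {
        // round(w / mean_abs), clipped to [-1, 1], gives the ternary value
        float r = std::round(x / mean_abs);
        r = std::max(-1.0f, std::min(1.0f, r));
        t.q.push_back(static_cast<int8_t>(r));
    }
    return t;
}

// With ternary weights, a dot product needs only adds/subtracts, no multiplies.
float ternary_dot(const TernaryTensor& w, const std::vector<float>& x) {
    float acc = 0.0f;
    for (size_t i = 0; i < w.q.size(); ++i) {
        if (w.q[i] == 1)       acc += x[i];
        else if (w.q[i] == -1) acc -= x[i];
    }
    return acc * w.scale;
}

int main() {
    std::vector<float> weights = {0.8f, -0.05f, -1.2f, 0.4f};
    std::vector<float> activations = {1.0f, 2.0f, 3.0f, 4.0f};
    TernaryTensor q = quantize_absmean(weights);
    std::cout << "dot = " << ternary_dot(q, activations) << "\n";
    return 0;
}
```

The sketch only illustrates why 1.58-bit weights remove multiplications from the inner loop; the framework itself ships optimized CPU kernels and conversion scripts for the supported model families.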
