Repository Details
HelloGitHub Rating: 10.0 (1 rating)
Train a Small Language Model from Scratch
Free · Apache-2.0
Stars: 13.3k
Chinese: No
Language: Python
Active: Yes
Contributors: 4
Issues: 36
Organization: No
Latest: minimind-1
Forks: 1k
License: Apache-2.0
[Image: minimind]
This is not only an implementation of a mini language model, but also an introductory tutorial for LLMs, aimed at lowering the barrier to learning and getting started with them. It provides complete code and tutorials covering the entire pipeline, from data preprocessing through model training, fine-tuning, and inference. The smallest model has only 0.02B parameters and can easily be run on an ordinary GPU.
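To put the 0.02B figure in perspective, here is a minimal sketch of how the parameter count of a tiny decoder-only Transformer adds up. The hyperparameters below (vocabulary size, hidden dimension, layer count, feed-forward width) are illustrative assumptions, not values taken from the minimind repository, and biases, norm weights, and positional parameters are ignored.

```python
# A rough parameter-count estimate for a tiny decoder-only Transformer.
# All hyperparameters are illustrative assumptions, not the actual
# minimind configuration.

def transformer_param_count(vocab_size: int, d_model: int,
                            n_layers: int, d_ff: int) -> int:
    """Approximate parameter count, ignoring biases and norm weights."""
    embedding = vocab_size * d_model      # token embedding (often tied with the output head)
    attention = 4 * d_model * d_model     # Q, K, V and output projections per layer
    feed_forward = 2 * d_model * d_ff     # up- and down-projection per layer
    return embedding + n_layers * (attention + feed_forward)


if __name__ == "__main__":
    # Hypothetical tiny configuration: 6.4k vocab, 512-dim hidden size, 8 layers.
    total = transformer_param_count(vocab_size=6400, d_model=512,
                                    n_layers=8, d_ff=1536)
    print(f"~{total / 1e9:.3f}B parameters")  # on the order of 0.02B
```

A model of this size fits comfortably in a few hundred megabytes of memory even in full precision, which is why it can be trained and run on a single consumer GPU.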
