Repository Details
Stars: 16k
Chinese: No
Language: Jupyter Notebook
Active: No
Contributors: 24
Issues: 263
Organization: No
Latest: 0.2.8
Forks: 2k
License: MIT

Inspired by the Kolmogorov-Arnold representation theorem, this project applies the theorem to neural network design. The key difference between a KAN (Kolmogorov-Arnold Network) and a traditional Multilayer Perceptron (MLP) is where the activation functions live: an MLP applies fixed activation functions at the nodes, whereas a KAN places learnable activation functions (parameterized as splines) on the edges, i.e. on the weights. This can make the network more accurate and more interpretable than an MLP, though often slower to train.
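The edge-activation idea above can be sketched in a few lines of NumPy. This is not the project's actual implementation (which uses B-splines on a trainable grid); it is a minimal illustration where each learnable edge function phi_ij is a linear combination of fixed Gaussian bumps, and the function names and shapes are the author's own choices for the sketch:

```python
import numpy as np

def kan_layer(x, coeffs, centers, width=0.5):
    """One KAN-style layer: output[i] = sum_j phi_ij(x[j]).

    Each edge function phi_ij is a learnable combination of fixed
    basis functions (Gaussian bumps here, standing in for B-splines).
    x: (n_in,); coeffs: (n_out, n_in, n_basis); centers: (n_basis,)
    """
    # Evaluate every basis function on every input: shape (n_in, n_basis).
    basis = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    # phi[i, j] = dot(coeffs[i, j, :], basis[j, :]) -> shape (n_out, n_in).
    phi = np.einsum("ijk,jk->ij", coeffs, basis)
    # Sum each output's incoming edge functions, as in the KA theorem.
    return phi.sum(axis=1)

rng = np.random.default_rng(0)
centers = np.linspace(-1.0, 1.0, 5)
coeffs = rng.normal(size=(3, 2, 5))  # 3 outputs, 2 inputs, 5 basis fns
y = kan_layer(np.array([0.2, -0.4]), coeffs, centers)
print(y.shape)  # (3,)
```

Training a KAN then means fitting the `coeffs` of every edge function by gradient descent, rather than fitting scalar weights as in an MLP.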