Repository Details
Stars: 15k
Chinese: No
Language: Jupyter Notebook
Active: Yes
Contributors: 21
Issues: 214
Organization: No
Latest: 0.2.7
Forks: 1k
License: MIT
Inspired by the Kolmogorov-Arnold representation theorem, this project applies it to the design of neural networks. The main difference between a KAN (Kolmogorov-Arnold Network) and the traditional Multilayer Perceptron (MLP) is where the nonlinearity lives: in a KAN, learnable activation functions sit on the edges in place of fixed linear weights, while nodes simply sum their inputs. This can make KANs more accurate and interpretable than MLPs, though often slower to train.
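The edge-versus-node distinction can be sketched in a few lines of NumPy. This is an illustrative toy, not the project's actual implementation: the per-edge functions are parameterized here as weighted sums of Gaussian bumps standing in for the B-splines the paper uses, and there is no training loop.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_layer(x, W, b):
    """MLP layer: a learned linear map followed by one fixed nonlinearity."""
    return np.tanh(W @ x + b)

def kan_layer(x, coeffs, centers, width=1.0):
    """KAN-style layer: every edge (i -> o) carries its own learnable
    univariate function phi_{o,i}, built from a shared set of basis bumps.

    x:       (in_dim,) input vector
    coeffs:  (out_dim, in_dim, n_basis) learnable per-edge coefficients
    centers: (n_basis,) basis-function centers shared across edges
    """
    # Evaluate each basis function at each input coordinate: (in_dim, n_basis)
    basis = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    # phi_{o,i}(x_i) = sum_b coeffs[o, i, b] * basis[i, b]  ->  (out_dim, in_dim)
    edge_out = np.einsum("oib,ib->oi", coeffs, basis)
    # Nodes just sum incoming edge outputs; no further nonlinearity needed
    return edge_out.sum(axis=1)

in_dim, out_dim, n_basis = 3, 2, 5
x = rng.standard_normal(in_dim)
y_mlp = mlp_layer(x, rng.standard_normal((out_dim, in_dim)), np.zeros(out_dim))
y_kan = kan_layer(x, rng.standard_normal((out_dim, in_dim, n_basis)),
                  np.linspace(-2, 2, n_basis))
print(y_mlp.shape, y_kan.shape)  # both (2,)
```

In the MLP, the only nonlinearity is the fixed `tanh` applied after the linear map; in the KAN sketch, each of the `out_dim * in_dim` edges has its own trainable curve, which is what makes the learned functions inspectable edge by edge.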