Repository Details
HelloGitHub Rating: 0 ratings
A Brand-New Neural Network Architecture
Free | MIT
Stars: 15k
Chinese: No
Language: Jupyter Notebook
Active: Yes
Contributors: 21
Issues: 214
Organization: No
Latest: 0.2.7
Forks: 1k
License: MIT
[Image: pykan]
Inspired by the Kolmogorov-Arnold representation theorem, this project applies it to the design of neural networks. The main difference between a KAN (Kolmogorov-Arnold Network) and the traditional multilayer perceptron (MLP) is where the nonlinearity lives: instead of fixed activation functions on the nodes, a KAN places learnable activation functions on the edges (the weights). This can make the network more accurate and more interpretable than an MLP, although it is often slower to train.
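The contrast can be seen in a minimal sketch. The snippet below does not use the pykan API; it is an illustrative PyTorch comparison where the KAN-style layer gives every edge its own learnable one-dimensional function built from a simple Gaussian basis (the class names, the basis choice, and the parameter `n_basis` are assumptions for illustration; pykan itself parameterizes the edge functions with B-splines).

```python
# Illustrative sketch only (not the pykan implementation): an MLP layer applies
# a fixed activation after a linear map, while a KAN-style layer evaluates a
# learnable univariate function on every edge and sums the results per output.
import torch
import torch.nn as nn


class MLPLayer(nn.Module):
    """y = act(W x + b): learnable weights, fixed activation on the nodes."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.act = nn.SiLU()

    def forward(self, x):
        return self.act(self.linear(x))


class KANStyleLayer(nn.Module):
    """y_j = sum_i phi_{j,i}(x_i): a learnable 1-D function on every edge."""
    def __init__(self, d_in, d_out, n_basis=8):
        super().__init__()
        # One coefficient vector per edge: shape (d_out, d_in, n_basis).
        self.coef = nn.Parameter(torch.randn(d_out, d_in, n_basis) * 0.1)
        # Fixed centers for a simple Gaussian basis (a stand-in for B-splines).
        self.register_buffer("centers", torch.linspace(-2.0, 2.0, n_basis))

    def forward(self, x):
        # x: (batch, d_in) -> basis features: (batch, d_in, n_basis)
        basis = torch.exp(-(x.unsqueeze(-1) - self.centers) ** 2)
        # Evaluate every edge function and sum over the input dimension.
        return torch.einsum("bik,oik->bo", basis, self.coef)


if __name__ == "__main__":
    x = torch.randn(4, 3)
    print(MLPLayer(3, 2)(x).shape)       # torch.Size([4, 2])
    print(KANStyleLayer(3, 2)(x).shape)  # torch.Size([4, 2])
```

The extra parameters per edge are what make KANs potentially more expressive and interpretable (each learned curve can be inspected), and also what tends to make them slower to train than an MLP of similar width.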

Comments

No comments yet