4 results found
End-to-end training of sparse deep neural networks with little-to-no performance loss.
Created 2019-11-25; 93 commits to master branch, last one 2 years ago
Always sparse. Never dense. But never say never. A Sparse Training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, i.e. Sparse Evolutionary Training, to boos...
Topics: sparsity, scalability, deep-learning, randomization, classification, neuroevolution, sparse-training, complex-networks, generative-models, deep-learning-papers, deep-neural-networks, multi-layer-perceptron, scalable-deep-learning, sparse-neural-networks, evolutionary-algorithms, deep-learning-algorithms, artificial-neural-networks, adaptive-sparse-connectivity, restricted-boltzmann-machine, sparse-evolutionary-training
Created 2018-03-02; 26 commits to master branch, last one 3 years ago
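The second result's description names Sparse Evolutionary Training (SET), whose core mechanism is a per-epoch prune-and-regrow step on each layer's sparse weight mask: a fraction of the weakest connections is removed and the same number is regrown at random inactive positions. The following is a minimal NumPy sketch of that step under common assumptions (magnitude-based pruning of a fraction zeta of active weights, uniform random regrowth); the function name, parameters, and defaults are illustrative and not taken from the repository's code.

# Hedged sketch of one SET-style prune-and-regrow step on a single weight matrix.
# All names here are illustrative, not the repository's own API.
import numpy as np

def set_prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """Drop the fraction `zeta` of active connections with the smallest
    magnitude, then regrow the same number at random inactive positions."""
    rng = rng or np.random.default_rng()
    active = np.flatnonzero(mask)
    n_prune = int(zeta * active.size)

    # Prune: remove the active weights closest to zero.
    order = np.argsort(np.abs(weights.flat[active]))
    pruned = active[order[:n_prune]]
    mask.flat[pruned] = 0
    weights.flat[pruned] = 0.0

    # Regrow: activate the same number of currently inactive connections,
    # chosen uniformly at random, with small random initial values.
    inactive = np.flatnonzero(mask == 0)
    regrown = rng.choice(inactive, size=n_prune, replace=False)
    mask.flat[regrown] = 1
    weights.flat[regrown] = rng.normal(0.0, 0.01, size=n_prune)
    return weights, mask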
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Created 2021-05-28; 9 commits to main branch, last one about a year ago
[ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy
Created 2021-06-10; 70 commits to main branch, last one about a year ago