9 results found

《李宏毅深度学习教程》 (Lee Hung-yi's Deep Learning Tutorial, recommended by Prof. Hung-yi Lee 👍). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Created 2019-07-02
520 commits to master branch, last one 5 days ago
798 · 4.3k · apache-2.0 · 132
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
This repository has been archived.
Created 2018-04-24
643 commits to master branch, last one about a year ago
354 · 2.0k · other · 47
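As a rough illustration of the kind of technique Distiller automates, below is a minimal sketch of magnitude-based weight pruning in plain PyTorch. It does not use Distiller's API; the layer sizes and the 0.8 sparsity target are arbitrary assumptions.

```python
# Minimal sketch of magnitude-based weight pruning (not Distiller's API).
# Zeros out the smallest-magnitude weights of a linear layer in one shot.
import torch
import torch.nn as nn

def magnitude_prune_(module: nn.Linear, sparsity: float = 0.5) -> torch.Tensor:
    """Zero out the `sparsity` fraction of weights with smallest magnitude."""
    with torch.no_grad():
        w = module.weight
        k = int(sparsity * w.numel())
        if k == 0:
            return torch.ones_like(w)
        threshold = w.abs().flatten().kthvalue(k).values  # k-th smallest |w|
        mask = (w.abs() > threshold).float()
        w.mul_(mask)  # apply the pruning mask in place
    return mask

layer = nn.Linear(128, 64)          # toy layer (assumed sizes)
mask = magnitude_prune_(layer, sparsity=0.8)
print(f"achieved sparsity: {1 - mask.mean().item():.2f}")
```

In practice such masks are usually applied gradually and combined with fine-tuning rather than in a single shot.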
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Created 2020-04-21
1,919 commits to develop branch, last one a day ago
42 · 268 · apache-2.0 · 23
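For context on what a quantization toolkit such as AIMET simulates, here is a minimal sketch of symmetric per-tensor fake quantization (round to an int8 grid, then dequantize). This is not the AIMET API; the 8-bit width and symmetric range are assumptions.

```python
# Minimal sketch of simulated ("fake") uniform quantization of a tensor.
import torch

def fake_quantize(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Quantize to signed integers and dequantize back to float."""
    qmax = 2 ** (num_bits - 1) - 1          # e.g. 127 for 8 bits
    scale = x.abs().max() / qmax            # symmetric per-tensor scale
    q = torch.clamp(torch.round(x / scale), -qmax - 1, qmax)
    return q * scale                        # dequantized values

w = torch.randn(64, 128)                    # toy weight tensor (assumed shape)
w_q = fake_quantize(w, num_bits=8)
print("max abs quantization error:", (w - w_q).abs().max().item())
```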
Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. This project provides researchers, developers, and engineers adva...
Created 2021-06-21
1,135 commits to main branch, last one 2 days ago
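As a complement to the per-tensor sketch above, the snippet below illustrates per-channel symmetric weight quantization, a common refinement in hardware-aware toolkits of this kind. It is not the MCT API; the shapes and bit-width are assumptions.

```python
# Minimal sketch of per-channel symmetric weight quantization:
# one scale per output channel instead of one scale for the whole tensor.
import torch

def per_channel_fake_quantize(w: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """w: (out_channels, ...) weight tensor; quantize each output channel."""
    qmax = 2 ** (num_bits - 1) - 1
    flat = w.flatten(1)                                   # (out_channels, rest)
    scale = flat.abs().amax(dim=1).clamp_min(1e-8) / qmax
    q = torch.clamp(torch.round(flat / scale[:, None]), -qmax - 1, qmax)
    return (q * scale[:, None]).view_as(w)

w = torch.randn(32, 16, 3, 3)                             # toy conv weight
print((w - per_channel_fake_quantize(w)).abs().max().item())
```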
Group Fisher Pruning for Practical Network Compression (ICML 2021)
Created 2021-05-14
6 commits to main branch, last one about a year ago
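The sketch below illustrates the general idea of Fisher-style channel importance scoring: rank channels by a squared gradient term of their BatchNorm scale. It is only an illustrative proxy, not the paper's exact algorithm; the toy model and loss are assumptions.

```python
# Minimal sketch of Fisher-style channel importance scoring.
# A channel's importance is approximated by (gamma * dL/dgamma)^2.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU())
x, y = torch.randn(8, 3, 32, 32), torch.randn(8, 16, 32, 32)   # dummy data

loss = nn.functional.mse_loss(model(x), y)   # dummy task loss
loss.backward()

bn = model[1]
importance = (bn.weight * bn.weight.grad).pow(2).detach()
prune_order = importance.argsort()           # least important channels first
print("channels to prune first:", prune_order[:4].tolist())
```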
Using ideas from product quantization for state-of-the-art neural network compression.
Created 2020-10-23
6 commits to main branch, last one 3 years ago
16 · 73 · bsd-3-clause · 9
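To illustrate the underlying idea, below is a minimal sketch of product quantization applied to a weight matrix: column blocks are clustered with k-means and each row of a block is replaced by its nearest centroid. The block size, codebook size, and the tiny k-means routine are illustrative assumptions, not the repository's implementation.

```python
# Minimal sketch of product quantization (PQ) of a weight matrix.
import torch

def kmeans(x, k, iters=20):
    """Tiny Lloyd's k-means; x: (n, d). Returns centroids (k, d) and labels."""
    centroids = x[torch.randperm(x.size(0))[:k]].clone()
    for _ in range(iters):
        labels = torch.cdist(x, centroids).argmin(dim=1)
        for j in range(k):
            pts = x[labels == j]
            if len(pts) > 0:
                centroids[j] = pts.mean(dim=0)
    return centroids, labels

def pq_compress(w, block=4, k=16):
    """Quantize each `block`-wide column slice of w with its own codebook."""
    out = torch.empty_like(w)
    for start in range(0, w.size(1), block):
        sub = w[:, start:start + block]                   # (rows, block)
        centroids, labels = kmeans(sub, k)
        out[:, start:start + block] = centroids[labels]   # replace by centroids
    return out

w = torch.randn(256, 64)
w_pq = pq_compress(w)
print("relative reconstruction error:", (w - w_pq).norm().item() / w.norm().item())
```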
MUSCO: MUlti-Stage COmpression of neural networks
Created 2019-05-08
59 commits to master branch, last one 3 years ago
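As a sketch of the basic building block behind multi-stage low-rank compression, the snippet below factorizes a linear layer with a truncated SVD into two smaller layers. The rank is an arbitrary assumption, and MUSCO itself alternates such decompositions with fine-tuning; this is not its API.

```python
# Minimal sketch of low-rank factorization of a linear layer via truncated SVD.
import torch
import torch.nn as nn

def factorize_linear(layer: nn.Linear, rank: int) -> nn.Sequential:
    """Replace W (out x in) with factors of shapes (rank x in) and (out x rank)."""
    U, S, Vh = torch.linalg.svd(layer.weight.detach(), full_matrices=False)
    first = nn.Linear(layer.in_features, rank, bias=False)
    second = nn.Linear(rank, layer.out_features, bias=layer.bias is not None)
    first.weight.data = S[:rank].sqrt()[:, None] * Vh[:rank]   # (rank, in)
    second.weight.data = U[:, :rank] * S[:rank].sqrt()         # (out, rank)
    if layer.bias is not None:
        second.bias.data = layer.bias.data.clone()
    return nn.Sequential(first, second)

layer = nn.Linear(512, 512)
compressed = factorize_linear(layer, rank=64)
x = torch.randn(4, 512)
# For a random matrix the rank-64 error is large; trained layers tend to be
# much closer to low-rank, which is what makes this compression useful.
print("max output difference:", (layer(x) - compressed(x)).abs().max().item())
```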
Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression. CVPR 2020.
Created 2020-03-14
19 commits to master branch, last one 2 years ago
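The sketch below shows a plain group-sparsity (group lasso) penalty over conv filters, the kind of structured regularizer this line of work builds on: each output filter forms a group, so the penalty drives whole filters toward zero. The layer sizes and regularization strength are assumptions, and the Hinge method's actual formulation is more involved.

```python
# Minimal sketch of a group lasso penalty over the output filters of a conv layer.
import torch
import torch.nn as nn

def group_lasso(conv: nn.Conv2d) -> torch.Tensor:
    """Sum of L2 norms of each output filter (one group per filter)."""
    return conv.weight.flatten(1).norm(dim=1).sum()

conv = nn.Conv2d(16, 32, 3)
x, target = torch.randn(4, 16, 8, 8), torch.randn(4, 32, 6, 6)   # dummy data
lam = 1e-3                                   # regularization strength (assumed)

loss = nn.functional.mse_loss(conv(x), target) + lam * group_lasso(conv)
loss.backward()                              # gradients now include the penalty
```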
This is the official implementation of "DHP: Differentiable Meta Pruning via HyperNetworks".
Created 2020-03-30
23 commits to master branch, last one 2 years ago
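As a minimal sketch of the hypernetwork idea behind DHP, the toy module below generates a linear layer's weights from a small latent vector, so pruning can act on the latent representation rather than on the weights directly. The class name, sizes, and single-linear hypernetwork are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of a hypernetwork that generates the weights of a linear layer.
import torch
import torch.nn as nn

class HyperLinear(nn.Module):
    def __init__(self, in_features: int, out_features: int, latent_dim: int = 8):
        super().__init__()
        self.in_features, self.out_features = in_features, out_features
        self.latent = nn.Parameter(torch.randn(latent_dim))
        # Hypernetwork: maps the latent vector to a full weight matrix.
        self.hyper = nn.Linear(latent_dim, in_features * out_features)

    def forward(self, x):
        w = self.hyper(self.latent).view(self.out_features, self.in_features)
        return nn.functional.linear(x, w)

layer = HyperLinear(64, 32)
out = layer(torch.randn(4, 64))
print(out.shape)          # torch.Size([4, 32])
```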