9 results found

- 《李宏毅深度学习教程》 (Li Hongyi Deep Learning Tutorial; recommended by Prof. Li Hongyi 👍, known as the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
  Created 2019-07-02 · 584 commits to master branch, last one 2 days ago · 800 forks · 4.4k stars · apache-2.0 · 132 open issues
- Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller — this repository has been archived.
  Created 2018-04-24 · 643 commits to master branch, last one about a year ago · 385 forks · 2.2k stars · other license · 51 open issues
- AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
  Created 2020-04-21 · 2,305 commits to develop branch, last one 22 hours ago · 53 forks · 333 stars · apache-2.0 · 24 open issues
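The AIMET entry above centers on quantization of trained models. As a minimal illustration of the idea (symmetric per-tensor INT8 post-training quantization, sketched in plain NumPy — the function names here are illustrative assumptions, not AIMET's actual API):

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor INT8 quantization: w ~= scale * q.
    Illustrative sketch only; not AIMET's API."""
    # One scale for the whole tensor, chosen so the max magnitude maps to 127.
    scale = max(float(np.abs(w).max()) / 127.0, 1e-12)  # guard against all-zero w
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from the INT8 codes."""
    return q.astype(np.float32) * scale
```

The round-trip error of this scheme is bounded by half the scale per element, which is the usual correctness check for a uniform quantizer.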
- Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. This project provides researchers, developers, and engineers adva…
  Created 2021-06-21 · 1,282 commits to main branch, last one 5 hours ago
- Group Fisher Pruning for Practical Network Compression (ICML 2021).
  Created 2021-05-14 · 6 commits to main branch, last one about a year ago
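Several entries in this list (Group Fisher Pruning, the Hinge paper, DHP) are filter-pruning methods: they remove whole convolutional filters ranked by some importance score. The sketch below uses a simple L1-magnitude criterion rather than the Fisher-information criterion of the paper above — it shows only the shared mechanics of scoring and slicing filters:

```python
import numpy as np

def prune_filters(W, keep_ratio=0.5):
    """Keep the keep_ratio fraction of conv filters with the largest
    L1 norm. W has shape (out_channels, in_channels, kh, kw).
    L1 magnitude is a stand-in criterion, not Group Fisher's."""
    scores = np.abs(W).sum(axis=(1, 2, 3))        # L1 norm per output filter
    n_keep = max(1, int(round(keep_ratio * W.shape[0])))
    keep = np.sort(np.argsort(scores)[-n_keep:])  # retained filter indices, in order
    return W[keep], keep
```

In a real network, the `keep` indices would also be used to slice the input channels of the following layer so shapes stay consistent.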
- Using ideas from product quantization for state-of-the-art neural network compression.
  Created 2020-10-23 · 6 commits to main branch, last one 3 years ago · 16 forks · 71 stars · bsd-3-clause · 9 open issues
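Product quantization, as referenced by the entry above, compresses a weight matrix by splitting each row into sub-vectors and replacing every sub-vector with the nearest centroid from a small per-block codebook, so only codebooks and integer codes need to be stored. A minimal NumPy sketch of the idea (the k-means step, shapes, and function names are illustrative assumptions, not this repository's API):

```python
import numpy as np

def product_quantize(W, n_sub=4, n_centroids=16, iters=10):
    """Compress W (rows are d-dim vectors) by quantizing each of
    n_sub column blocks against its own codebook of n_centroids."""
    n, d = W.shape
    assert d % n_sub == 0, "row dimension must split evenly into blocks"
    sub = d // n_sub
    rng = np.random.default_rng(0)
    codebooks, codes = [], []
    for b in range(n_sub):
        X = W[:, b * sub:(b + 1) * sub]
        # Plain k-means on this block: init from random rows, then iterate.
        C = X[rng.choice(n, size=n_centroids, replace=False)].copy()
        for _ in range(iters):
            dist = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
            assign = dist.argmin(1)
            for k in range(n_centroids):
                pts = X[assign == k]
                if len(pts):
                    C[k] = pts.mean(0)
        codebooks.append(C)
        codes.append(assign)
    return codebooks, codes

def reconstruct(codebooks, codes):
    """Rebuild the approximate matrix from codebooks and integer codes."""
    return np.hstack([C[a] for C, a in zip(codebooks, codes)])
```

Storage drops from `n * d` floats to `n_sub` codebooks of `n_centroids * sub` floats plus `n * n_sub` small integer codes, at the cost of some reconstruction error.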
- MUSCO: MUlti-Stage COmpression of neural networks.
  Created 2019-05-08 · 59 commits to master branch, last one 3 years ago
- Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression (CVPR 2020).
  Created 2020-03-14 · 19 commits to master branch, last one 2 years ago
- The official implementation of "DHP: Differentiable Meta Pruning via HyperNetworks".
  Created 2020-03-30 · 23 commits to master branch, last one 2 years ago