9 results found

Hung-yi Lee's Deep Learning Tutorial (recommended by Prof. Hung-yi Lee 👍, nicknamed the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Created 2019-07-02
597 commits to master branch, last one 7 days ago
804 · 4.4k · apache-2.0 · 130
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
This repository has been archived.
Created 2018-04-24
643 commits to master branch, last one about a year ago
400 · 2.3k · other · 49
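Distiller implements a range of compression methods, including sparsity-inducing pruning. As a rough, hypothetical sketch of one such technique (unstructured magnitude pruning, illustrated with NumPy and not Distiller's actual API), the idea is to zero out the smallest-magnitude weights:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries so that a `sparsity`
    fraction of the tensor becomes zero (ties at the threshold are
    also zeroed)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64))
pruned = magnitude_prune(w, 0.5)  # roughly half the entries are now zero
```

Real toolkits typically prune iteratively with fine-tuning between steps rather than in one shot as above.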
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Created 2020-04-21
2,686 commits to develop branch, last one 24 hours ago
64 · 383 · apache-2.0 · 22
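The core idea behind the quantization techniques such libraries provide can be sketched as uniform affine (scale/zero-point) quantization. The snippet below is a minimal illustration of that scheme in NumPy, not AIMET's API:

```python
import numpy as np

def quantize_dequantize(x, num_bits=8):
    """Simulate uniform affine quantization: map floats to integers in
    [0, 2**num_bits - 1] with a scale and zero-point, then map back.
    The round trip shows the quantization error a model would see."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    if scale == 0:  # constant tensor: nothing to quantize
        return x.copy()
    zero_point = round(qmin - x.min() / scale)
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax)
    return (q - zero_point) * scale

x = np.linspace(-1.0, 1.0, 101)
x_hat = quantize_dequantize(x, num_bits=8)  # close to x, within one step
```

The per-element error is bounded by roughly half a quantization step, which is why 8-bit quantization is often nearly lossless for well-scaled tensors.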
Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. This project provides researchers, developers, and engineers adva...
Created 2021-06-21
1,377 commits to main branch, last one 11 hours ago
Group Fisher Pruning for Practical Network Compression (ICML 2021)
Created 2021-05-14
6 commits to main branch, last one about a year ago
Using ideas from product quantization for state-of-the-art neural network compression.
Created 2020-10-23
6 commits to main branch, last one 4 years ago
16 · 71 · bsd-3-clause · 8
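Product quantization compresses a weight matrix by splitting it into small sub-vectors and replacing each with its nearest entry in a learned codebook. A minimal sketch of the idea (a plain Lloyd's k-means over sub-blocks, written for illustration and not taken from this repository):

```python
import numpy as np

def product_quantize(W, block_size=4, n_codewords=16, iters=10):
    """Split rows of W into sub-vectors of length `block_size`, cluster
    them with k-means, and return (codebook, assignments). Storage drops
    from one float per weight to one small index per sub-vector."""
    rows, cols = W.shape
    assert cols % block_size == 0
    blocks = W.reshape(-1, block_size)  # every sub-vector as a row
    rng = np.random.default_rng(0)
    codebook = blocks[rng.choice(len(blocks), n_codewords, replace=False)]
    for _ in range(iters):
        # assign each block to its nearest codeword (squared L2)
        d = ((blocks[:, None, :] - codebook[None]) ** 2).sum(-1)
        assign = d.argmin(1)
        # move each codeword to the mean of its assigned blocks
        for k in range(n_codewords):
            members = blocks[assign == k]
            if len(members):
                codebook[k] = members.mean(0)
    return codebook, assign.reshape(rows, cols // block_size)

def reconstruct(codebook, assign, block_size):
    """Rebuild an approximate weight matrix from codebook indices."""
    rows, nblocks = assign.shape
    return codebook[assign].reshape(rows, nblocks * block_size)
```

State-of-the-art variants (as in the repository above) additionally choose codewords to preserve layer outputs rather than raw weights, which is where most of the accuracy gains come from.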
MUSCO: MUlti-Stage COmpression of neural networks
Created 2019-05-08
59 commits to master branch, last one 4 years ago
Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression (CVPR 2020)
Created 2020-03-14
19 commits to master branch, last one 3 years ago
This is the official implementation of "DHP: Differentiable Meta Pruning via HyperNetworks".
Created 2020-03-30
23 commits to master branch, last one 3 years ago