Trending repositories for topic neural-architecture-search
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning (a minimal usage sketch appears after this list).
A curated list of automated machine learning papers, articles, tutorials, slides and projects
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning; [NeurIPS 2022] MCUNetV3: On-Device Training Under 256K...
NASLib is a Neural Architecture Search (NAS) library for facilitating NAS research for the community by providing interfaces to several state-of-the-art NAS search spaces and optimizers.
Automated Deep Learning: Neural Architecture Search Is Not the End (a curated list of AutoDL resources and an in-depth analysis)
This is a list of interesting papers and projects about TinyML.
Differentiable architecture search for convolutional and recurrent networks (see the architecture-cell sketch after this list).
[ICCV 2019] "AutoGAN: Neural Architecture Search for Generative Adversarial Networks" by Xinyu Gong, Shiyu Chang, Yifan Jiang and Zhangyang Wang
[NeurIPS 2020] MCUNet: Tiny Deep Learning on IoT Devices; [NeurIPS 2021] MCUNetV2: Memory-Efficient Patch-based Inference for Tiny Deep Learning
Transforming Neural Architecture Search (NAS) into multi-objective optimization problems. A benchmark suite for testing evolutionary algorithms in deep learning.
Evolutionary Neural Architecture Search on Transformers for RUL Prediction
[ACL-IJCNLP 2021] Automated Concatenation of Embeddings for Structured Prediction
PyTorch implementation of "Efficient Neural Architecture Search via Parameter Sharing"
Unified Architecture Search with Convolution, Transformer, and MLP (ECCV 2022)
Neural Pipeline Search (NePS): Helps deep learning experts find the best neural pipeline.
This directory contains code necessary to run the GraphNAS algorithm.
A curated list of awesome resources combining Transformers with Neural Architecture Search
A Full-Pipeline Automated Time Series (AutoTS) Analysis Toolkit.
An autoML framework & toolkit for machine learning on graphs.
Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
Official PyTorch implementation of "DiffusionNAG: Predictor-guided Neural Architecture Generation with Diffusion Models" (ICLR 2024)
[ISPRS 2024] LoveNAS: Towards Multi-Scene Land-Cover Mapping via Hierarchical Searching Adaptive Network
FedML - The Research and Production Integrated Federated Learning Library: https://fedml.ai
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.
This is a collection of our research on efficient AI, covering hardware-aware NAS and model compression.
A scalable graph learning toolkit for extremely large graph datasets. (WWW'22, 🏆 Best Student Paper Award)
A toolbox for receptive field analysis and visualizing neural network architectures
Official PyTorch Implementation of HELP: Hardware-adaptive Efficient Latency Prediction for NAS via Meta-Learning (NeurIPS 2021 Spotlight)
[AAAI '23] PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor
A paper collection about automated graph learning
Official repository for PocketNet: Extreme Lightweight Face Recognition Network using Neural Architecture Search and Multi-Step Knowledge Distillation
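A minimal sketch of what a hyper-parameter tuning trial looks like with the open source AutoML toolkit listed above (NNI), assuming its standard trial API (nni.get_next_parameter / nni.report_final_result); the dataset and model here are placeholders chosen for illustration, not part of the toolkit.

```python
# Sketch of an NNI trial: ask the tuner for hyper-parameters, train, report the result.
import nni
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def main():
    # Hyper-parameters proposed by the NNI tuner for this trial (empty dict when run standalone).
    params = nni.get_next_parameter()
    n_estimators = params.get("n_estimators", 100)
    max_depth = params.get("max_depth", None)

    X, y = load_digits(return_X_y=True)
    X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

    model = RandomForestClassifier(n_estimators=n_estimators, max_depth=max_depth)
    model.fit(X_train, y_train)

    # Report the trial's final metric back to the tuner.
    nni.report_final_result(model.score(X_val, y_val))


if __name__ == "__main__":
    main()
```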
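And a minimal sketch of the core idea behind the differentiable architecture search entry (DARTS): each edge of the searched cell computes a softmax-weighted mixture of candidate operations, so the architecture parameters can be optimized by gradient descent alongside the network weights. This is illustrative only, with a small assumed operation set; it is not the repository's code.

```python
# One edge of a DARTS-style cell: a continuous relaxation over candidate operations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Candidate operations on this edge (a small illustrative set).
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.MaxPool2d(3, stride=1, padding=1),
        ])
        # One learnable architecture parameter per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        # Softmax over architecture parameters gives the mixing weights.
        weights = F.softmax(self.alpha, dim=0)
        # Weighted sum over all candidates; after search, keep only the argmax op.
        return sum(w * op(x) for w, op in zip(weights, self.ops))


x = torch.randn(2, 16, 32, 32)
edge = MixedOp(16)
print(edge(x).shape)  # torch.Size([2, 16, 32, 32])
```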