# Training-Free-NAS

Training-Free Neural Architecture Search: A Review

Training-Free Neural Architecture Search (NAS) aims to streamline the discovery of high-performing neural network architectures without relying on the conventional train-and-evaluate paradigm.

Unlike conventional NAS methods, which iteratively train and evaluate large numbers of candidate architectures to identify the best-performing one, training-free NAS scores architectures at initialization. It typically leverages training-free score functions, zero-cost proxies, or analytical methods to estimate or directly predict an architecture's final performance, significantly reducing computational overhead and resource requirements.
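As a concrete illustration, here is a minimal PyTorch sketch of one simple zero-cost proxy, the gradient-norm score studied as a baseline in "Zero-Cost Proxies for Lightweight NAS": a randomly initialized network is scored by the total gradient norm from a single backward pass on one random mini-batch. The toy ConvNet and the `score_grad_norm` helper are illustrative, not taken from any paper's code.

```python
import torch
import torch.nn as nn

def score_grad_norm(model: nn.Module, inputs: torch.Tensor,
                    targets: torch.Tensor) -> float:
    """Sum of per-parameter gradient L2 norms after one backward pass."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    return sum(p.grad.norm().item() for p in model.parameters()
               if p.grad is not None)

# Toy candidate architecture (a stand-in for a sampled NAS cell).
candidate = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10),
)

x = torch.randn(8, 3, 32, 32)            # random inputs; no real dataset needed
y = torch.randint(0, 10, (8,))           # random labels
print(score_grad_norm(candidate, x, y))  # higher score => more promising candidate
```

In a search loop, a score like this replaces full training: each sampled architecture is evaluated with one forward/backward pass, and the highest-scoring candidates are kept.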

These methods accelerate the architecture search process, making it more efficient, cost-effective, and accessible for researchers and practitioners. Training-free NAS advances deep learning by expediting the discovery of novel architectures and easing their deployment across domains such as computer vision and natural language processing.

This repository is a curated collection of research papers and benchmarks dedicated to training-free NAS, intended as a comprehensive resource for understanding, exploring, and evaluating these approaches.

## Training-Free NAS Papers

The **Type** column uses the following abbreviations:

| RS | BO | RL | DSA | MA |
| --- | --- | --- | --- | --- |
| Random Search | Bayesian Optimization | Reinforcement Learning | Differentiable Search Algorithm | Metaheuristic Algorithm |
| Title | Venue | Type | Code | Year |
| --- | --- | --- | --- | --- |
| A Training-Free Neural Architecture Search Algorithm based on Search Economics | TEVC | MA | GitHub | 2023 |
| Connection Sensitivity Matters for Training-free DARTS: From Architecture-Level Scoring to Operation-Level Sensitivity Analysis | arXiv | DSA | - | 2023 |
| FreeREA: Training-Free Evolution-Based Architecture Search | WACV | MA | GitHub | 2023 |
| Zero-Cost Operation Scoring in Differentiable Architecture Search | arXiv | DSA | GitHub | 2023 |
| Auto-Scaling Vision Transformers without Training | ICLR | RL | GitHub | 2022 |
| GradSign: Model Performance Inference with Theoretical Insights | ICLR | RS | GitHub | 2022 |
| Global Convergence of MAML and Theory-Inspired Neural Architecture Search for Few-Shot Learning | CVPR | DSA | GitHub | 2022 |
| Improving Neural Architecture Search by Mixing a FireFly Algorithm with a Training Free Evaluation | IJCNN | MA | GitHub | 2022 |
| LiteTransformerSearch: Training-free On-device Search for Efficient Autoregressive Language Models | AutoML-Conf | MA | GitHub | 2022 |
| MAE-DET: Revisiting Maximum Entropy Principle in Zero-Shot NAS for Efficient Object Detection | ICML | MA | GitHub | 2022 |
| NASI: Label- and Data-agnostic Neural Architecture Search at Initialization | ICLR | DSA | SM | 2022 |
| SpiderNet: Hybrid Differentiable-Evolutionary Architecture Search via Train-Free Metrics | CVPR | MA, DSA | GitHub | 2022 |
| A Training-Free Genetic Neural Architecture Search | ICEA | MA | - | 2021 |
| A Feature Fusion Based Indicator for Training-Free Neural Architecture Search | IEEE Access | MA | - | 2021 |
| Bayesian Neural Architecture Search using a Training-Free Performance Metric | ASOC | BO | GitHub | 2021 |
| EPE-NAS: Efficient Performance Estimation Without Training for Neural Architecture Search | ICANN | RS | GitHub | 2021 |
| KNAS: Green Neural Architecture Search | ICML | RS | GitHub | 2021 |
| Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective | ICLR | DSA | GitHub | 2021 |
| Neural Architecture Search without Training | ICML | RS | GitHub | 2021 |
| Reliable and Fast Recurrent Neural Network Architecture Optimization | arXiv | MA | - | 2021 |
| Training-Free Multi-objective Evolutionary Neural Architecture Search via Neural Tangent Kernel and Number of Linear Regions | ICONIP | MA | GitHub | 2021 |
| Training-Free Hardware-Aware Neural Architecture Search with Reinforcement Learning | JBE | RL | - | 2021 |
| Understanding and Accelerating Neural Architecture Search with Training-Free and Theory-Grounded Metrics | arXiv | MA, RL | GitHub | 2021 |
| Zen-NAS: A Zero-Shot NAS for High-Performance Image Recognition | ICCV | MA | GitHub | 2021 |
| Zero-Cost Proxies for Lightweight NAS | ICLR | RS, RL, MA | GitHub | 2021 |
| Towards NNGP-guided Neural Architecture Search | arXiv | RS | GitHub | 2020 |
| Low-Cost Recurrent Neural Network Expected Performance Evaluation | arXiv | RS | - | 2018 |

## NAS Benchmarks

| Title | Venue | Unique Architectures | Code |
| --- | --- | --- | --- |
| NAS-Bench-101: Towards Reproducible Neural Architecture Search | ICML | 423.6k | GitHub |
| NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search | ICLR | 6.5k | GitHub |
| NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and Size | TPAMI | 6.5k + 32.8k | GitHub |
| NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search | arXiv | 10^18 + 60k | GitHub |
| NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search | ICLR | 14.6k | GitHub |
| NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing | IEEE Access | 14k | GitHub |
| TransNAS-Bench-101: Improving Transferability and Generalizability of Cross-Task Neural Architecture Search | CVPR | 7.3k | VEGA |
| NAS-Bench-360: Benchmarking Neural Architecture Search on Diverse Tasks | NeurIPS | 15.6k | GitHub |
| NAS-Bench-Zero: A Large Scale Dataset for Understanding Zero-Shot Neural Architecture Search | - | 14.9k + 10.1k + 9.7k | - |
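Tabular benchmarks such as these make it cheap to evaluate training-free methods end to end: a proxy ranks sampled architectures in seconds, and the benchmark's stored accuracies show how well that ranking correlates with true performance. Below is a minimal sketch of the generic random-search loop over a toy search space; real benchmarks (e.g. NAS-Bench-101/201, NATS-Bench) would supply their own architecture encodings and model-construction APIs in place of the toy `build_network`, and `score_grad_norm` is the illustrative proxy sketched earlier in this README.

```python
import random
import torch
import torch.nn as nn

def build_network(width: int, depth: int) -> nn.Module:
    """Toy search space: plain ConvNets varying only in width and depth."""
    layers, in_ch = [], 3
    for _ in range(depth):
        layers += [nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU()]
        in_ch = width
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(width, 10)]
    return nn.Sequential(*layers)

def random_search(n_samples, score_fn, inputs, targets):
    """Sample architectures, score each without any training, keep the best."""
    best_arch, best_score = None, float("-inf")
    for _ in range(n_samples):
        arch = (random.choice([8, 16, 32]), random.choice([1, 2, 3]))
        score = score_fn(build_network(*arch), inputs, targets)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
print(random_search(20, score_grad_norm, x, y))  # proxy from the earlier sketch
```

The same loop structure underlies most proxy-based strategies in the papers table above; only the sampler (random, evolutionary, RL-driven, differentiable) and the scoring function change.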

## Contributing

Contributions and feedback are welcome! Feel free to open issues or pull requests to improve existing content.

## License

This repository is licensed under the MIT License.

## Support

If you find this repository helpful, consider giving it a star ⭐️.

If you find this repository or the resources it collects helpful in your research or work, you can cite it using the following BibTeX entry:

```bibtex
@misc{Wu2023,
  author       = {Wu, Meng-Ting},
  title        = {Training-Free NAS},
  year         = {2023},
  journal      = {GitHub repository},
  publisher    = {GitHub},
  howpublished = {\url{https://github.com/MarttiWu/Training-Free-NAS}},
}
```