Code used to obtain the results for my master thesis in computer sciences at ULB.
Neural architecture search for deep learning models using neuroevolution with Cultural Algorithms (preview)
A revised version of the source code written for my graduation research experiments. All code was written from scratch (except for cython_wl_kernel.cpp, which is auto-generated).
Neural architecture search framework based on reinforcement learning: "A Novel Approach to Detecting Muscle Fatigue Based on sEMG by Using Neural Architecture Search Framework"
[TCAD'23] TransCODE: Co-design of Transformers and Accelerators for Efficient Training and Inference
A greedy approach for finding optimal architecture for Multi-Task Learning. Deprecated (see https://github.com/hav4ik/Hydra)
B.Tech Thesis/Project
Mutual information for fine-grained network analysis in neural architecture search
Search neural-net structure with differentiable attention
Repository for the "Regularized Meta-Learning for Neural Architecture Search" paper
Using genetic algorithms to optimize machine/deep learning models: feature selection and hyperparameter tuning.
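A minimal sketch of genetic-algorithm feature selection as described above. The search space, `score` function, and all names here are illustrative assumptions, not the repository's actual API; `score` stands in for a cross-validated model accuracy.

```python
import random

def ga_feature_selection(score, n_features, pop_size=10, generations=30, seed=0):
    """Evolve feature bitmasks (1 = keep feature) to maximize `score`."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)  # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_features)] ^= 1  # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

# Toy score (assumption): features 0-2 are informative, the rest add noise.
toy = lambda mask: sum(mask[:3]) - 0.1 * sum(mask[3:])
best = ga_feature_selection(toy, n_features=8)
```

In practice the toy score would be replaced by an evaluation that trains a model on the selected feature subset.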
This project provides a predictor-based NAS method called Siamese-Predictor. The siamese predictor is constructed with the proposed Estimation Code, which encodes prior knowledge about the training procedure.
Applied evolutionary algorithms to automate deep neural network design for image classification. The network architecture was optimized by selecting the best feature extractor, number of hidden layers, number of neurons, and activation function from a predefined search space.
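A hedged sketch of the evolutionary loop described above: architectures are sampled from a predefined search space (feature extractor, number of hidden layers, neurons, activation) and the best candidates are kept and mutated. The search space, the fitness stand-in, and every name below are illustrative assumptions, not the project's actual code.

```python
import random

# Hypothetical search space mirroring the description above.
SEARCH_SPACE = {
    "extractor": ["conv3x3", "conv5x5", "depthwise"],
    "num_layers": [1, 2, 3, 4],
    "neurons": [32, 64, 128, 256],
    "activation": ["relu", "tanh", "elu"],
}

def random_arch(rng):
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(arch, rng):
    """Resample one randomly chosen gene of the architecture."""
    child = dict(arch)
    gene = rng.choice(list(SEARCH_SPACE))
    child[gene] = rng.choice(SEARCH_SPACE[gene])
    return child

def evolve(fitness, generations=20, pop_size=8, seed=0):
    """Keep the best half of the population, refill it with mutants."""
    rng = random.Random(seed)
    pop = [random_arch(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

# Toy fitness stand-in for validation accuracy (assumption): reward capacity.
toy_fitness = lambda a: a["num_layers"] * a["neurons"]
best = evolve(toy_fitness)
```

In a real run, `fitness` would train and validate the decoded network on the image-classification task instead of using a toy proxy.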
Smooth Variational Graph Embeddings for Efficient Neural Architecture Search
[TNNLS 2023] The official repo for the paper "HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel Neural Architecture Search".
AutoML4ETC, a tool to automatically design efficient and high-performing neural architectures for encrypted traffic classification.
SimplifiedTransformer simplifies the transformer block without affecting training: skip connections, projection parameters, sequential sub-blocks, and normalization layers are removed. Experimental results confirm similar training speed and performance.
Differentiable neural architecture search
Neural architecture search with network morphism used for skin lesion analysis
Fast and Practical Neural Architecture Search (ICCV2019)