A lightweight and extensible toolbox for image classification
Official PyTorch Implementation for the "Distilling Datasets Into Less Than One Image" paper.
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc., are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
Trained a multiclass classifier network on the CIFAR-100 dataset
Knowledge Distillation from VGG16 (teacher model) to MobileNet (student model)
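The repository above distills VGG16 into MobileNet; its actual training code is in PyTorch. As a framework-free illustration of the core idea, the sketch below (a hypothetical NumPy implementation, not taken from that repo) computes the soft-target distillation loss: KL divergence between temperature-softened teacher and student outputs, scaled by T².

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T**2
    # so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is combined with the ordinary cross-entropy on the true labels, weighted by a mixing coefficient; the loss is zero exactly when the student matches the teacher's logits.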
Classifying the CIFAR-100 dataset using an MCSVM and a deep ConvNet
sensAI: ConvNets Decomposition via Class Parallelism for Fast Inference on Live Data
PyTorch implementation of 'ViT' (Dosovitskiy et al., 2020) and training it on CIFAR-10 and CIFAR-100
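The ViT repository above follows Dosovitskiy et al. (2020), where an image is split into fixed-size patches that are flattened into tokens before entering the transformer. A minimal NumPy sketch of that patchification step (my illustration, not code from the repo; `image_to_patches` and the patch size are assumptions):

```python
import numpy as np

def image_to_patches(img, patch=4):
    # img: (H, W, C) array; splits it into non-overlapping patch x patch
    # squares and flattens each into one token vector.
    H, W, C = img.shape
    assert H % patch == 0 and W % patch == 0
    tokens = (img.reshape(H // patch, patch, W // patch, patch, C)
                 .transpose(0, 2, 1, 3, 4)   # group the two patch-grid axes
                 .reshape(-1, patch * patch * C))
    return tokens
```

For a 32×32 RGB CIFAR image with 4×4 patches this yields 64 tokens of dimension 48, which a learned linear projection would then map to the transformer's embedding size.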
This project explores diverse approaches to image classification on the CIFAR-100 dataset. Starting from traditional CNNs combined with KNN classifiers, it progresses to ResNet50 with FCNN and culminates in the cutting-edge Vision Transformer (ViT) model.
IJCAI 2024, InfoMatch: Entropy neural estimation for semi-supervised image classification
Models with variable output classes designed for CIFAR-100
Feather is a module that enables effective sparsification of neural networks during training. This repository accompanies the paper "Feather: An Elegant Solution to Effective DNN Sparsification" (BMVC2023).
This repository includes official implementation and model weights of Data-Efficient Multi-Scale Fusion Vision Transformer.
Two case studies: the effects of changing the learning rate on model performance for image classification, and cardiac failure prediction using clinical data
Implementation of BSC-DenseNet-121 in PyTorch from the research paper "Adding Binary Search Connections to Improve DenseNet Performance".
Official PyTorch Code for "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (https://arxiv.org/abs/2305.12954)
A small CNN for classifying the CIFAR-10 dataset.
Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, InceptionV3, InceptionV4, Inception-ResNetV2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNetV2, MobileNet, MobileNetV2, SqueezeNet, NASNet, Residual Attention Network, SENet, WideResNet)
Code for You Only Cut Once: Boosting Data Augmentation with a Single Cut, ICML 2022.