cavaface: A Pytorch Training Framework for Deep Face Recognition


By Yaobin Li and Liying Chi

Introduction

This repo provides a high-performance distributed parallel training framework for face recognition with PyTorch, including various backbones (e.g., ResNet, IR, IR-SE, ResNeXt, AttentionNet-IR-SE, ResNeSt, HRNet), various losses (e.g., Softmax, Focal, SphereFace, CosFace, AmSoftmax, ArcFace, ArcNegFace, CurricularFace, Li-Arcface, QAMFace), various data augmentations (e.g., RandomErasing, Mixup, RandAugment, Cutout, CutMix), and a bag of tricks for improving performance (e.g., FP16 training with Apex, label smoothing, LR warmup).
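To illustrate the kind of margin-based loss listed above, here is a minimal, dependency-free sketch of the ArcFace idea: the target-class logit cos(θ) is replaced by s·cos(θ + m), which penalizes the true class and tightens decision boundaries. The function name and the defaults (m = 0.5, s = 64) are illustrative, not this repo's API.

```python
import math

def arcface_logit(cos_theta: float, margin: float = 0.5, scale: float = 64.0) -> float:
    """Apply the ArcFace additive angular margin to a target-class logit.

    cos_theta: cosine similarity between the embedding and the class weight.
    margin/scale mirror commonly used ArcFace defaults (m=0.5, s=64);
    this is a sketch, not the implementation used in this repo.
    """
    # Clamp to the valid acos domain to guard against float drift.
    theta = math.acos(max(-1.0, min(1.0, cos_theta)))
    # Add the angular margin, then rescale as in the ArcFace formulation.
    return scale * math.cos(theta + margin)

plain = 64.0 * 0.8            # unmodified, scaled logit
adjusted = arcface_logit(0.8)  # margin-penalized logit (strictly smaller)
```

In training, this adjustment is applied only to the ground-truth class column of the logit matrix before the usual cross-entropy loss.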

Features

  • Backbone
    • ResNet(IR-SE)
    • ResNeXt
    • DenseNet
    • MobileFaceNet
    • MobileNetV3
    • EfficientNet
    • ProxylessNas
    • GhostNet
    • AttentionNet-IRSE
    • ResNeSt
    • ReXNet
    • MobileNetV2
    • MobileNeXt
  • Attention Module
    • SE
    • CBAM
    • ECA
    • GCT
  • Loss
    • Softmax
    • SphereFace
    • AMSoftmax
    • CosFace
    • ArcFace
    • Combined Loss
    • AdaCos
    • SV-X-Softmax
    • CurricularFace
    • ArcNegFace
    • Li-Arcface
    • QAMFace
    • Circle Loss
  • Parallel Training
    • DDP
    • Model Parallel
  • Automatic Mixed Precision
    • AMP
  • Optimizer
    • LRScheduler (fairseq, rwightman)
    • Optim (SGD, Adam, RAdam, LookAhead, Ranger, AdamP, SGDP)
    • ZeRO
  • Data Augmentation
    • RandomErasing
    • Mixup
    • RandAugment
    • Cutout
    • CutMix
    • ColorJitter
  • Distillation
    • KnowledgeDistillation
    • Multi Feature KD
  • Bag of Tricks
    • Label smooth
    • LR warmup
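Of the tricks above, label smoothing is simple enough to sketch in a few lines: the one-hot target keeps 1 − ε on the true class and spreads the remaining ε mass uniformly over the other classes. This is one common smoothing scheme, shown here dependency-free; the repo's actual implementation may differ in detail.

```python
def smooth_labels(num_classes: int, target: int, eps: float = 0.1) -> list[float]:
    """Return a label-smoothed target distribution.

    The true class gets 1 - eps; the remaining eps mass is spread
    uniformly over the other classes (one common smoothing scheme,
    used here for illustration only).
    """
    off_value = eps / (num_classes - 1)   # mass given to each wrong class
    dist = [off_value] * num_classes
    dist[target] = 1.0 - eps              # dampened true-class probability
    return dist

# A 5-class target for class 2 with eps=0.1:
# true class gets 0.9, each other class gets 0.025.
dist = smooth_labels(5, 2)
```

Training against this softened distribution instead of a hard one-hot target discourages overconfident logits and typically improves generalization slightly.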

Installation

See INSTALL.md.

Quick start

See GETTING_STARTED.md.

Model Zoo and Benchmark

See MODEL_ZOO.md.

License

cavaface is released under the MIT license.

Acknowledgement

Contact