
Knowledge distillation for masked FER using ResNet-18 in PyTorch.

Knowledge-Distillation-FER

Knowledge Distillation (KD) is used here to learn mask-invariant feature vectors, so that the model focuses on the non-occluded regions of the face. Our approach jointly learns two things: correct expression recognition for both masked and non-masked faces, and pulling the embedding of a masked image closer to the embedding of its corresponding non-masked image. It does this through embedding-level KD: a teacher model produces an embedding from the non-masked image, and the student model is trained to produce a similar embedding from the masked version, which teaches the student to ignore the non-expression-related information introduced by the mask.
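Below is a minimal PyTorch sketch of this embedding-level KD objective. It assumes ResNet-18 backbones (as stated in the repository description), a 7-class expression output, an MSE embedding-matching term, and a weighting factor `lam`; these specifics are illustrative assumptions, not the repository's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18

# Teacher: trained on non-masked faces; frozen during distillation.
# (7 expression classes is an assumption for illustration.)
teacher = resnet18(num_classes=7)
teacher.eval()
for p in teacher.parameters():
    p.requires_grad = False

# Student: learns from masked faces.
student = resnet18(num_classes=7)

def embedding(model, x):
    # Pooled 512-d feature vector from before the final FC layer.
    feats = nn.Sequential(*list(model.children())[:-1])(x)
    return torch.flatten(feats, 1)

def kd_loss(masked, non_masked, labels, lam=0.5):
    # Cross-entropy on the masked image, plus an embedding-level
    # distillation term that pulls the student's masked embedding
    # toward the teacher's non-masked embedding.
    # `lam` is an assumed weighting factor.
    logits = student(masked)
    ce = F.cross_entropy(logits, labels)
    with torch.no_grad():
        t_emb = embedding(teacher, non_masked)
    s_emb = embedding(student, masked)
    distill = F.mse_loss(s_emb, t_emb)
    return ce + lam * distill

# Example usage with dummy tensors.
masked = torch.randn(4, 3, 224, 224)
non_masked = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 7, (4,))
loss = kd_loss(masked, non_masked, labels)
loss.backward()
```

The MSE matching term is one reasonable choice; a cosine or L1 distance between embeddings would serve the same purpose of pushing masked and non-masked embeddings together.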

Methodology

[Methodology overview figure]

Results

[Results screenshot]

GradCAM

[Grad-CAM heatmaps without KD]
[Grad-CAM heatmaps with KD]
