
To-do

  • Add ConvNeXtV1 and ConvNeXtV2 PyTorch reference models
  • Convert the PyTorch models to TensorFlow
  • Weight conversion from PyTorch to TensorFlow

ConvNeXt and ConvNeXtV2

This repository contains a TensorFlow implementation of the research papers "A ConvNet for the 2020s" and "ConvNeXt V2: Co-designing and Scaling ConvNets with Masked Autoencoders".

ConvNeXtV1: ConvNeXt is a pure ConvNet model constructed entirely from standard ConvNet modules. ConvNeXts compete favorably with Transformers in terms of accuracy and scalability, achieving 87.8% ImageNet top-1 accuracy and outperforming Swin Transformers on COCO detection and ADE20K segmentation, while maintaining the simplicity and efficiency of standard ConvNets.
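For reference, below is a minimal TensorFlow/Keras sketch of the ConvNeXtV1 block layout described in the paper (7x7 depthwise conv, LayerNorm, 4x pointwise expansion, GELU, pointwise projection, layer scale, residual). This is not the exact code in this repository; the class name `ConvNeXtBlock` and the particular Keras layer choices are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

class ConvNeXtBlock(layers.Layer):
    """ConvNeXtV1 block: 7x7 depthwise conv -> LayerNorm -> 1x1 expand (4x)
    -> GELU -> 1x1 project -> layer scale -> residual connection."""

    def __init__(self, dim, layer_scale_init=1e-6, **kwargs):
        super().__init__(**kwargs)
        self.dwconv = layers.DepthwiseConv2D(kernel_size=7, padding="same")
        self.norm = layers.LayerNormalization(epsilon=1e-6)
        # Dense applied to a channels-last tensor acts as a 1x1 convolution.
        self.pwconv1 = layers.Dense(4 * dim)
        self.act = layers.Activation("gelu")
        self.pwconv2 = layers.Dense(dim)
        self.gamma = self.add_weight(
            name="layer_scale", shape=(dim,),
            initializer=tf.keras.initializers.Constant(layer_scale_init),
            trainable=True)

    def call(self, x):
        shortcut = x
        x = self.dwconv(x)
        x = self.norm(x)
        x = self.pwconv1(x)
        x = self.act(x)
        x = self.pwconv2(x)
        return shortcut + self.gamma * x

# Example: y = ConvNeXtBlock(dim=96)(tf.random.normal((1, 56, 56, 96)))
```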

ConvNeXtV2: The paper proposes a fully convolutional masked autoencoder framework (FCMAE) and adds a new Global Response Normalization (GRN) layer to the original ConvNeXtV1 architecture to enhance inter-channel feature competition. This co-design of self-supervised learning techniques and architectural improvements results in a new model family called ConvNeXt V2, which significantly improves the performance of pure ConvNets on various recognition benchmarks.
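The GRN layer itself is only a few lines. Below is a sketch in TensorFlow/Keras following the formulation in the V2 paper: Gx is the per-channel L2 norm over the spatial dimensions, Nx divides Gx by its mean across channels, and the output is gamma * (x * Nx) + beta + x. The class name `GRN` and the epsilon value are assumptions, not necessarily what this repository uses.

```python
import tensorflow as tf
from tensorflow.keras import layers

class GRN(layers.Layer):
    """Global Response Normalization layer from the ConvNeXt V2 paper."""

    def __init__(self, dim, epsilon=1e-6, **kwargs):
        super().__init__(**kwargs)
        self.epsilon = epsilon
        self.gamma = self.add_weight(name="gamma", shape=(1, 1, 1, dim),
                                     initializer="zeros", trainable=True)
        self.beta = self.add_weight(name="beta", shape=(1, 1, 1, dim),
                                    initializer="zeros", trainable=True)

    def call(self, x):
        # Gx: global feature aggregation, L2 norm over spatial dims per channel.
        gx = tf.sqrt(tf.reduce_sum(tf.square(x), axis=(1, 2), keepdims=True))
        # Nx: divisive normalization of Gx across channels.
        nx = gx / (tf.reduce_mean(gx, axis=-1, keepdims=True) + self.epsilon)
        # Calibrate the input with the learned affine parameters, plus a residual.
        return self.gamma * (x * nx) + self.beta + x
```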

ConvNeXtV1 and ConvNeXtV2 block design:
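For comparison (again a sketch under the same assumptions, not the repository's code), the V2 block keeps the V1 layout but removes layer scale and inserts GRN after the GELU-activated pointwise expansion:

```python
from tensorflow.keras import layers

class ConvNeXtV2Block(layers.Layer):
    """ConvNeXtV2 block: same layout as V1, but layer scale is removed and the
    GRN layer (defined in the sketch above) follows the GELU activation."""

    def __init__(self, dim, **kwargs):
        super().__init__(**kwargs)
        self.dwconv = layers.DepthwiseConv2D(kernel_size=7, padding="same")
        self.norm = layers.LayerNormalization(epsilon=1e-6)
        self.pwconv1 = layers.Dense(4 * dim)
        self.act = layers.Activation("gelu")
        self.grn = GRN(4 * dim)   # GRN acts on the expanded (4 * dim) channels
        self.pwconv2 = layers.Dense(dim)

    def call(self, x):
        shortcut = x
        x = self.dwconv(x)
        x = self.norm(x)
        x = self.pwconv1(x)
        x = self.act(x)
        x = self.grn(x)           # replaces the layer-scale parameter of V1
        x = self.pwconv2(x)
        return shortcut + x
```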

References

[1] ConvNeXt paper: https://arxiv.org/abs/2201.03545

[2] ConvNeXtV2 paper: https://arxiv.org/abs/2301.00808

[3] Official ConvNeXt code: https://github.com/facebookresearch/ConvNeXt

[4] Official ConvNeXtV2 code: https://github.com/facebookresearch/ConvNeXt-V2
