
How can I add an attention mechanism to Mask R-CNN? Is there a specific code implementation available? #3019

Open
xiaoguaishoubaobao opened this issue Jan 29, 2024 · 1 comment

Comments

@xiaoguaishoubaobao

No description provided.

@nyinyinyanlin

@xiaoguaishoubaobao I think you will have to inject your own attention blocks at the stage you want. Look into model.py and introduce your attention blocks at the point you choose. It would be nice to implement an attention-integrated version of this, but it isn't very generic, especially the placement of the attention module: it could go at the FPN outputs, inside the ResNet backbone, etc. I once extracted the ResNet+FPN part of the model and fed the FPN output features into CBAM in my Siamese network model, so it is possible. A rough sketch of that idea is shown below.
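
A minimal sketch of a CBAM-style attention block that could be wrapped around an FPN output feature map, assuming a TF2/Keras setup. The function name `cbam_block`, the layer names, and the wiring point (`P2`–`P5` in model.py's `build()`) are assumptions for illustration, not part of this repository.

```python
import tensorflow as tf
from tensorflow.keras import layers as KL

def cbam_block(feature_map, ratio=8, name="cbam"):
    """Channel attention followed by spatial attention (CBAM), applied to a 4D feature map."""
    channels = feature_map.shape[-1]

    # --- Channel attention: squeeze spatial dims, re-weight channels ---
    shared_dense_1 = KL.Dense(channels // ratio, activation="relu", name=f"{name}_mlp1")
    shared_dense_2 = KL.Dense(channels, name=f"{name}_mlp2")

    avg_pool = KL.GlobalAveragePooling2D()(feature_map)
    max_pool = KL.GlobalMaxPooling2D()(feature_map)
    channel_att = KL.Activation("sigmoid")(
        KL.Add()([shared_dense_2(shared_dense_1(avg_pool)),
                  shared_dense_2(shared_dense_1(max_pool))]))
    channel_att = KL.Reshape((1, 1, channels))(channel_att)
    x = KL.Multiply()([feature_map, channel_att])

    # --- Spatial attention: pool over channels, convolve to a 2D mask ---
    avg_sp = KL.Lambda(lambda t: tf.reduce_mean(t, axis=-1, keepdims=True))(x)
    max_sp = KL.Lambda(lambda t: tf.reduce_max(t, axis=-1, keepdims=True))(x)
    spatial_att = KL.Conv2D(1, kernel_size=7, padding="same", activation="sigmoid",
                            name=f"{name}_spatial")(KL.Concatenate()([avg_sp, max_sp]))
    return KL.Multiply()([x, spatial_att])

# Hypothetical usage inside model.py, after the FPN outputs are built:
#   P2 = cbam_block(P2, name="cbam_p2")
#   P3 = cbam_block(P3, name="cbam_p3")
#   ... and so on for the levels you want to attend over.
```

Placing the block on the FPN outputs keeps the backbone weights compatible with pretrained checkpoints; putting it inside the ResNet stages would require retraining those layers.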
