@xiaoguaishoubaobao I think you will have to inject your own attention blocks at the stage you want: look into model.py and add them at the point you choose. It would be nice to have an attention-integrated version of this, but I suspect it wouldn't be very generic, especially the placement of the attention module, which could sit on the FPN outputs, inside ResNet, etc. I once extracted the FPN+ResNet part of the model and fed the FPN output features into CBAM in my Siamese Network, so it is possible. A rough sketch of that wiring is below.
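A minimal sketch of that idea, assuming PyTorch and that the FPN outputs arrive as a list of `[B, C, H, W]` tensors with 256 channels (the usual FPN width); the `CBAM` module follows the original paper (Woo et al., 2018), and `attend_fpn` and the channel count are hypothetical illustrations, not part of this repo:

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Convolutional Block Attention Module (Woo et al., 2018):
    channel attention followed by spatial attention."""
    def __init__(self, channels, reduction=16, spatial_kernel=7):
        super().__init__()
        # Channel attention: a shared MLP over avg- and max-pooled descriptors
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )
        # Spatial attention: a conv over channel-wise avg and max maps
        self.spatial = nn.Conv2d(2, 1, spatial_kernel,
                                 padding=spatial_kernel // 2, bias=False)

    def forward(self, x):
        # Channel attention: weight each channel by pooled statistics
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention: weight each location by cross-channel statistics
        avg_map = torch.mean(x, dim=1, keepdim=True)
        max_map = torch.amax(x, dim=1, keepdim=True)
        attn = torch.sigmoid(self.spatial(torch.cat([avg_map, max_map], dim=1)))
        return x * attn

# Hypothetical wiring: one CBAM per FPN level (P2..P6), applied to the
# FPN output features before they are fed to the downstream heads.
cbam_blocks = nn.ModuleList(CBAM(256) for _ in range(5))

def attend_fpn(fpn_features):
    # fpn_features: list of [B, 256, H, W] tensors, one per pyramid level
    return [blk(f) for blk, f in zip(cbam_blocks, fpn_features)]
```

Placing the blocks on the FPN outputs keeps the backbone weights untouched, so pretrained checkpoints still load; putting them inside ResNet instead would force retraining more of the network.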