Has the attention-mechanism interaction code in your MIA module been commented out? This seems inconsistent with the description in your paper. #12
Or is it because the default backbone in the code is resnet50, so convolution is used in place of the attention mechanism? If that's the case, could you please provide the code for the SwinTransformer variant?
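To make the question concrete, here is a minimal sketch of the two code paths I am asking about. All names here (`MIABlock`, `use_attention`) are my own hypothetical illustration assuming a PyTorch implementation, not your actual repo code:

```python
import torch
import torch.nn as nn


class MIABlock(nn.Module):
    """Hypothetical sketch of the two interaction paths in question.

    `use_attention=False` stands for the resnet50 default, where a
    convolution appears to replace the attention interaction; the
    attention branch is what I expected the SwinTransformer variant
    (and the paper's description) to use.
    """

    def __init__(self, dim: int = 256, num_heads: int = 8,
                 use_attention: bool = False):
        super().__init__()
        self.use_attention = use_attention
        if use_attention:
            # Attention-based interaction, as described in the paper.
            self.attn = nn.MultiheadAttention(dim, num_heads,
                                              batch_first=True)
        else:
            # Convolutional substitute -- what the default resnet50
            # configuration seems to run instead.
            self.conv = nn.Conv1d(dim, dim, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        if self.use_attention:
            out, _ = self.attn(x, x, x)
            return out
        # Conv1d expects (batch, dim, seq_len), so transpose around it.
        return self.conv(x.transpose(1, 2)).transpose(1, 2)


if __name__ == "__main__":
    x = torch.randn(2, 49, 256)
    print(MIABlock(use_attention=False)(x).shape)  # conv path
    print(MIABlock(use_attention=True)(x).shape)   # attention path
```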