
Get the number of output features from the last layer and remove the last layer? #347

Open
srevandro opened this issue Apr 30, 2024 · 0 comments


Hello,

I am trying to add a LoRA layer to the EfficientNet-B0 through B7 models. However, I have not succeeded in getting the number of output features or in removing the last fully connected layer. Can you help me?

I tried the following code, but the pre-trained model has no `._fc` attribute.

```python
import torch.nn as nn
from efficientnet_pytorch import EfficientNet  # the ._fc attribute is specific to this package

# Define EfficientNet-B0 with LoRA
class EfficientNetB0LoRA(nn.Module):
    def __init__(self, num_classes, lora_rank):
        super(EfficientNetB0LoRA, self).__init__()
        # Load pre-trained EfficientNet-B0 model
        self.efficientnet_b0 = EfficientNet.from_pretrained('efficientnet-b0')
        # Get number of input features for the LoRALayer
        num_features = self.efficientnet_b0._fc.in_features
        # Replace the classifier with an identity layer
        self.efficientnet_b0._fc = nn.Identity()
        # Add LoRA layer
        self.lora = LoRALayer(num_features, num_classes, lora_rank)

    def forward(self, x):
        x = self.efficientnet_b0(x)
        x = self.lora(x)
        return x
```
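The `LoRALayer` referenced above is not defined in the snippet; a minimal sketch of one possible implementation (a frozen base linear layer plus a trainable low-rank update — the class name and `alpha` parameter are assumptions, not part of the original code):

```python
import torch
import torch.nn as nn

class LoRALayer(nn.Module):
    """Linear head with a trainable low-rank (LoRA-style) update.

    The base weight is frozen; only the rank-r factors A and B train.
    """
    def __init__(self, in_features, out_features, rank, alpha=1.0):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        # Low-rank factors: B @ A has shape (out_features, in_features).
        # B starts at zero so the update is initially a no-op.
        self.A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.t() @ self.B.t())

# Usage: 1280 features from the B0 backbone, 10 classes, rank 4
layer = LoRALayer(1280, 10, rank=4)
out = layer(torch.randn(2, 1280))
print(out.shape)
```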