
Make IPAdapter compatible to torch.compile #7988

Closed
wants to merge 2 commits

Conversation

rootonchair
Contributor

What does this PR do?

Fixes #7985

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

Collaborator

@yiyixuxu yiyixuxu left a comment


should we also add a test?

comps = []
for i in l:  # iterating over the ModuleList is supported by torch.compile
    comps.append(i)
[ln0, ln1, attn, ff] = comps  # unpack the plain Python list, not the ModuleList
Collaborator


OK, but do we know why it has to be done this way? Is there any documentation, etc.?
E.g., would this work?

for layer in self.layers:
    [ln0, ln1, attn, ff] = layer

Contributor Author

If you try to unpack a ModuleList as above, or as in the original implementation, the following error occurs:

torch._dynamo.exc.Unsupported: UNPACK_SEQUENCE NNModuleVariable()

So I assume that unpacking a ModuleList is not supported by torch.compile, and only iterating over it with a for loop is. Hence, I loop over the ModuleList to avoid breaking the graph and unpack it via a plain Python list.
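For reference, the workaround can be reproduced in a minimal standalone sketch (the names `block`, `comps`, etc. are illustrative, not the diffusers code; the direct unpacking only fails while torch.compile is tracing, so the snippet below runs fine eagerly):

```python
import torch.nn as nn

# A stand-in for the inner ModuleList holding the four submodules.
block = nn.ModuleList(
    [nn.LayerNorm(4), nn.LayerNorm(4), nn.Identity(), nn.Identity()]
)

# Direct unpacking breaks graph tracing under torch.compile:
#   ln0, ln1, attn, ff = block
#   -> torch._dynamo.exc.Unsupported: UNPACK_SEQUENCE NNModuleVariable()
# Workaround: iterate (which dynamo supports) into a plain Python list,
# then unpack that list instead.
comps = []
for m in block:
    comps.append(m)
ln0, ln1, attn, ff = comps
```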

Contributor Author

I also tried moving ln0, ln1, attn, ff into a class, but it would affect weight loading.

Contributor

> I also tried moving ln0, ln1, attn, ff into a class, but it would affect weight loading.

Hi @rootonchair, you can use the IPAdapterPlusImageProjectionBlock class, but you have to update the loading function too.
I think the final result would be really good, as it would improve code consistency in the library.
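The suggested refactor can be sketched roughly as follows. `ProjectionBlock` and `Projection` here are hypothetical stand-ins, not the actual `IPAdapterPlusImageProjectionBlock`/`IPAdapterPlusImageProjection` implementations (the real signatures and math differ); the point is only that grouping ln0/ln1/attn/ff into one block class lets the forward pass loop over a ModuleList of blocks, which torch.compile traces without any unpacking:

```python
import torch
import torch.nn as nn


class ProjectionBlock(nn.Module):
    """Hypothetical block bundling the four submodules that were
    previously stored as a bare inner ModuleList."""

    def __init__(self, dim):
        super().__init__()
        self.ln0 = nn.LayerNorm(dim)
        self.ln1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
        self.ff = nn.Linear(dim, dim)

    def forward(self, x, latents):
        # Attend from (normalized) latents to (normalized) image features,
        # then apply the feed-forward with a residual connection.
        q = self.ln1(latents)
        kv = self.ln0(x)
        out, _ = self.attn(q, kv, kv)
        return latents + self.ff(out)


class Projection(nn.Module):
    def __init__(self, dim, depth):
        super().__init__()
        self.blocks = nn.ModuleList(ProjectionBlock(dim) for _ in range(depth))

    def forward(self, x, latents):
        # Plain iteration over a ModuleList is compile-friendly;
        # no unpacking of submodule containers is needed.
        for block in self.blocks:
            latents = block(x, latents)
        return latents


proj = Projection(dim=8, depth=2)
out = proj(torch.randn(1, 4, 8), torch.randn(1, 2, 8))
```

As the thread notes, the catch is that renaming the submodule hierarchy this way changes the state-dict keys, so the checkpoint-loading function has to be updated in the same change.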

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@yiyixuxu yiyixuxu requested a review from sayakpaul May 20, 2024 15:46
@yiyixuxu
Collaborator

cc @fabiorigano too

@fabiorigano
Contributor

fabiorigano commented May 20, 2024

Hi @yiyixuxu, thanks for adding me here.
IPAdapterPlusImageProjection was added some time ago; I am not sure why it was implemented this way.

I think it would be nice to have a forward function that uses the IPAdapterPlusImageProjectionBlock class that we introduced with the Face ID PR #7186.

@yiyixuxu
Collaborator

thanks @fabiorigano!
cc @rootonchair, does face id currently work with torch.compile?

@rootonchair
Contributor Author

@yiyixuxu yes, it does. As @fabiorigano suggested, we could utilize IPAdapterPlusImageProjectionBlock; however, weight loading would also need to be updated.

@sayakpaul
Member

This works for me. Thank you, @rootonchair!

Perhaps the use of IPAdapterPlusImageProjectionBlock could live in a different PR? @yiyixuxu WDYT?

@fabiorigano
Contributor

I can open a new PR to use IPAdapterPlusImageProjectionBlock, if @rootonchair agrees

@sayakpaul
Member

That would be great, thank you!

@rootonchair
Contributor Author

Sure @fabiorigano, thank you

@sayakpaul
Member

@rootonchair does #7994 work for you?

@rootonchair
Contributor Author

@sayakpaul yes, it does. Fantastic work!

@sayakpaul
Member

Fantastic. Would you mind closing the PR then? I am sorry about the inconvenience here.

@rootonchair
Contributor Author

Ah, no problem. I assume we should close #7985 too?

Successfully merging this pull request may close these issues.

IPAdapter not compatible to torch.compile
5 participants