
Can't use 'explain' method in class 'AlternatingLeastSquares' #646

Open
singsinghai opened this issue Feb 14, 2023 · 4 comments

Comments

@singsinghai

singsinghai commented Feb 14, 2023

I found this code in implicit/cpu/als.py:
def explain(self, userid, user_items, itemid, user_weights=None, N=10)
But when I try to use it I get the error below:
AttributeError: 'AlternatingLeastSquares' object has no attribute 'explain'
I have pip-installed the latest version of implicit, but it still doesn't work. Can you help me clarify whether I'm missing something?

@benfred
Owner

benfred commented Feb 24, 2023

Are you using the GPU model (i.e. does model.__class__ show implicit.gpu.als.AlternatingLeastSquares)? The GPU code doesn't have this method implemented, but you can convert to a CPU model with model.to_cpu() and then call explain on that.

@singsinghai
Author

> Are you using the GPU model (i.e. does model.__class__ show implicit.gpu.als.AlternatingLeastSquares)? The GPU code doesn't have this method implemented, but you can convert to a CPU model with model.to_cpu() and then call explain on that.

Dear benfred,

Thanks for the explanation, it works for me. May I ask a further question: top_contributions is "a list of the top N (itemid, score) contributions for this user/item pair", but what score is it based on? Is it the initial event_strength of the sparse matrix we passed in for training, or the matrix after it has been filled in with co-similarity scores?
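For background, the explanation approach implicit's ALS is built on comes from Hu, Koren & Volinsky (2008), where the predicted score for a user/item pair decomposes into one contribution per item the user has already interacted with. The following toy numpy sketch is my own illustration of that idea (not implicit's actual code); binary confidences are used so the algebra stays simple:

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, n_factors, reg = 5, 3, 0.01

Y = rng.normal(size=(n_items, n_factors))   # item factors
liked = [0, 2]                              # items the user interacted with
confidence = np.zeros(n_items)
confidence[liked] = 1.0                     # binary confidences for simplicity

# W = (Y^T C_u Y + reg*I)^-1, the per-user weighting matrix
Cu = np.diag(confidence)
W = np.linalg.inv(Y.T @ Cu @ Y + reg * np.eye(n_factors))

target = 4  # item being recommended
# Contribution of each liked item j: y_target^T W y_j * c_uj
contributions = {j: float(Y[target] @ W @ Y[j] * confidence[j]) for j in liked}
total = sum(contributions.values())

# The contributions sum to the score of the closed-form user vector
# x_u = W Y^T C_u p(u), taking the preference vector p(u) = confidence here
xu = W @ Y.T @ Cu @ confidence
assert np.isclose(total, float(Y[target] @ xu))
print(sorted(contributions.items(), key=lambda kv: -kv[1]))
```

The point of the sketch: the scores in top_contributions are pieces of the model's predicted score, derived from the trained factors and the confidences you passed in, not raw entries of either matrix.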

@dminovski0

Can the explain method be used to explain similar users, i.e. why certain users were recommended to a specific user?

@essefi-ahlem

essefi-ahlem commented Apr 20, 2023

I have another question @benfred @ita9naiwa :
When recommending items for user id=1, I get a score of 1.35 for item 8708.

[screenshot of recommend output]

When I want to explain why item 8708 was recommended for user 1, the first return value of explain is supposed to be "The total predicted score for this user/item pair". I expected it to equal the 1.35 I got from recommend, but instead it is a different score, 0.56.

[screenshot of explain output]

So my question is: what is the difference between the score given by recommend and the score given by explain?
