Potential error in the calculation of precision@k #713
Comments
I have also found that the calculation of precision@k is incorrect. While trying to figure out what was wrong, I noticed that the p@k results were identical to the recall@k values I computed myself, so I suspect the implicit library's p@k may actually be returning recall@k instead of precision.
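A minimal sketch (not the library's actual code) of why a `min(k, |likes|)` denominator makes precision@k coincide with recall@k whenever a user likes fewer than `k` items. The function names here are hypothetical, chosen only to illustrate the arithmetic:

```python
def truncated_precision(hits, k, num_liked):
    # Denominator mirrors the reported `pr_div += fmin(K, likes.size())` logic.
    return hits / min(k, num_liked)

def recall_at_k(hits, num_liked):
    # Standard recall@k: hits over the user's total liked items.
    return hits / num_liked

def standard_precision(hits, k):
    # Standard precision@k: hits over the cutoff k.
    return hits / k

# Example: a user likes 3 items; we recommend k=10 and hit 2 of them.
# min(10, 3) = 3, so the truncated "precision" is exactly the recall.
hits, k, num_liked = 2, 10, 3
print(truncated_precision(hits, k, num_liked))  # 2/3, same as recall@k
print(recall_at_k(hits, num_liked))             # 2/3
print(standard_precision(hits, k))              # 0.2
```

This would explain the observation above: for any user with `|likes| <= k`, the truncated formula and recall@k are identical by construction.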
It's because of
I want to point out a strange behavior in how the implicit package calculates evaluation metrics. In particular, precision@k should decrease as `k` increases. However, for the dataset we tested, precision@k calculated via `precision_at_k` increases with `k`. Upon checking the code, the reason for this behavior is line 444 (`pr_div += fmin(K, likes.size())`) in `ranking_metrics_at_k`. If the number of items the user likes is smaller than `k`, the code effectively truncates the denominator to the number of liked items, while the numerator, the number of relevant/true recommended items, can still grow as `k` increases.

I don't understand why precision@k is calculated in this package in such a manner; I have not found any other reference for this formula. Other packages tested with our dataset produced precision@k values that decrease with `k`. If there is a reason or reference for this approach, please share.
If this is indeed an error, there are other occurrences of this truncation in here and here which should be fixed too, as they introduce error in the calculation of ndcg@k.
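The "precision@k increases with k" symptom described above can be reproduced with a toy example. This is a hedged numeric sketch under assumed hit counts, not output from the implicit library itself:

```python
def truncated_precision(hits, k, num_liked):
    # Reproduces the reported min(k, |likes|) denominator.
    return hits / min(k, num_liked)

def standard_precision(hits, k):
    # Conventional precision@k.
    return hits / k

num_liked = 3            # the user likes 3 items in total
hits_at = {5: 1, 20: 2}  # hypothetical hit counts at each cutoff

for k in (5, 20):
    h = hits_at[k]
    print(k, truncated_precision(h, k, num_liked), standard_precision(h, k))
# truncated: 1/3 at k=5  -> 2/3 at k=20  (increases with k)
# standard:  1/5 at k=5  -> 1/10 at k=20 (decreases with k)
```

Because the denominator is capped at the number of liked items while the numerator keeps accumulating hits, the truncated metric is monotonically non-decreasing in `k` for such users, which matches the behavior reported in this issue.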