Guarantee `weights_below` to be finite in MOTPE #5435
Conversation
Probably, what we should do is as follows:

```python
def _calculate_leave_one_out_hypervolume_contributions(
    loss_vals: np.ndarray, ref_point: np.ndarray
) -> np.ndarray:
    n_below = len(loss_vals)
    on_front = _is_pareto_front(loss_vals, assume_unique_lexsorted=False)
    pareto_sols = loss_vals[on_front]
    contribs = np.zeros(n_below, dtype=float)
    hv = WFG().compute(pareto_sols, ref_point)
    if np.isfinite(hv):
        leave_one_out_masks = ~np.eye(pareto_sols.shape[0]).astype(bool)
        hvs_leave_one_out = np.asarray(
            [WFG().compute(pareto_sols[loo], ref_point) for loo in leave_one_out_masks]
        )
        contribs[on_front] = hv - hvs_leave_one_out
    else:
        _, inv, pareto_sol_counts = np.unique(
            pareto_sols, axis=0, return_counts=True, return_inverse=True
        )
        on_front_and_not_duplicated = np.zeros(n_below, dtype=bool)
        # pareto_sol_counts[inv][i] is how many times pareto_sols[i] appears in pareto_sols,
        # so a solution is not duplicated exactly when its count is 1.
        on_front_and_not_duplicated[on_front] = pareto_sol_counts[inv] == 1
        contribs[on_front_and_not_duplicated] = 1.0
    return contribs
```

In the code above, we separate the process for the finite and non-finite hypervolume cases.
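As a sketch of the finite branch, consider a minimal 2-objective (minimization) example; `hypervolume_2d` here is a hypothetical stand-in for Optuna's internal `WFG().compute`:

```python
import numpy as np

def hypervolume_2d(points: np.ndarray, ref_point: np.ndarray) -> float:
    # Exact 2-D hypervolume for minimization: sort by the first objective
    # and accumulate the dominated rectangles up to the reference point.
    pts = points[np.argsort(points[:, 0])]
    hv, prev_y = 0.0, ref_point[1]
    for x, y in pts:
        if y < prev_y:
            hv += (ref_point[0] - x) * (prev_y - y)
            prev_y = y
    return hv

pareto_sols = np.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]])
ref_point = np.array([4.0, 4.0])

hv = hypervolume_2d(pareto_sols, ref_point)  # total hypervolume: 6.0
leave_one_out_masks = ~np.eye(len(pareto_sols), dtype=bool)
contribs = hv - np.asarray(
    [hypervolume_2d(pareto_sols[m], ref_point) for m in leave_one_out_masks]
)
# Each solution contributes its own 1x1 square here: [1.0, 1.0, 1.0]
```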
The second point follows the original concept of the leave-one-out contribution, meaning that if a solution is duplicated, its hypervolume contribution becomes zero. Then the corresponding lines become:

```python
if n_below == 0:
    weights_below = np.asarray([])
elif n_below == 1:
    weights_below = np.asarray([1.0])
else:
    worst_point = np.max(loss_vals, axis=0)
    ref_point = np.maximum(1.1 * worst_point, 0.9 * worst_point)
    ref_point[ref_point == 0] = EPS
    contribs = _calculate_leave_one_out_hypervolume_contributions(loss_vals, ref_point) + EPS
    weights_below = np.clip(contribs / np.max(contribs), 0, 1)
```

We can separate the two cases this way.
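For the non-finite branch, a small standalone sketch (illustrative only, not Optuna code) of how duplicated Pareto solutions end up with zero contribution:

```python
import numpy as np

# Two duplicated Pareto solutions and one unique one; with an infinite
# objective value, the total hypervolume would not be finite.
pareto_sols = np.array([[0.0, np.inf], [0.0, np.inf], [1.0, 2.0]])

_, inv, counts = np.unique(pareto_sols, axis=0, return_inverse=True, return_counts=True)
inv = np.asarray(inv).reshape(-1)  # keep 1-D across NumPy versions

# counts[inv][i] is how many times pareto_sols[i] appears, so only the
# non-duplicated solution keeps a (unit) contribution.
not_duplicated = counts[inv] == 1
contribs = np.where(not_duplicated, 1.0, 0.0)
```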
Thank you for your detailed advice! However, some of them are beyond the scope of this PR because it is just intended to fix the issue.
Co-authored-by: Shuhei Watanabe <[email protected]>
LGTM
Co-authored-by: Shuhei Watanabe <[email protected]>
@not522 Could you review this PR?
This pull request has not seen any recent activity.
@y0z Could you review this PR? (I am aware that you are very busy with another higher priority project; this re-assignment could be a formality.)
LGTM
Motivation
Resolve #5374
Description of the changes
I added code to guarantee `weights_below` to be finite in MOTPE. The weights fall back to uniform if they contain inf, nan, or values large enough to overflow to inf when summed.