Balanced Meta-Softmax mixed with CosFace #96
There are some things I've thought about:
Ok, I have some trouble with convergence. What would you suggest to make it work well, since some datasets are very long-tailed?
I don't have much experience handling long-tailed classification either.

```python
class CosFaceLoss(ArcfaceLossSimple):
    def __init__(self, margin=0.35, scale=64.0, from_logits=True, label_smoothing=0, sample_per_class=None, **kwargs):
        super(CosFaceLoss, self).__init__(margin, scale, from_logits, label_smoothing, **kwargs)
        self.sample_per_class = sample_per_class

    def call(self, y_true, norm_logits):
        if self.batch_labels_back_up is not None:
            self.batch_labels_back_up.assign(tf.argmax(y_true, axis=-1))
        pick_cond = tf.cast(y_true, dtype=tf.bool)
        logits = tf.where(pick_cond, norm_logits - self.margin, norm_logits)
        if self.sample_per_class is not None:
            logits *= self.sample_per_class  # * or +
        logits *= self.scale
        return tf.keras.losses.categorical_crossentropy(y_true, logits, from_logits=self.from_logits, label_smoothing=self.label_smoothing)
```

Anyway, I haven't run any tests on this...
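On the `# * or +` question: in the Balanced Meta-Softmax repository linked above, the per-class prior is applied additively in log space, i.e. `log(n_c)` is added to each class logit before the softmax, which is equivalent to multiplying the softmax numerator by the class count. A minimal NumPy sketch of that additive adjustment (the function name `balanced_softmax_logits` is hypothetical, not from either codebase):

```python
import numpy as np

def balanced_softmax_logits(logits, sample_per_class):
    # Balanced Softmax: shift each class logit by log(n_c), so that
    # softmax(logits + log(n)) is proportional to n_c * exp(logit_c).
    # Frequent classes then need a larger raw logit to dominate,
    # compensating for the long-tailed label prior during training.
    return logits + np.log(np.asarray(sample_per_class, dtype=np.float64))

# Toy example: three classes with very different frequencies but
# identical raw logits.
logits = np.array([2.0, 2.0, 2.0])
counts = [1000, 100, 1]
adjusted = balanced_softmax_logits(logits, counts)
probs = np.exp(adjusted) / np.exp(adjusted).sum()
# With equal logits, the adjusted probabilities reduce to the class
# frequencies counts / sum(counts), so the rare class gets a much
# smaller predicted probability and therefore a larger training loss.
```

Note this is applied before the `scale` multiplication and only during training; at inference the plain logits are used.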
Hello @leondgarse, I've implemented Balanced Meta-Softmax (https://github.com/jiawei-ren/BalancedMetaSoftmax) with the CosFace loss function, but I have problems with convergence: test metrics like AgeDB drop. Could you implement this in your great framework?