
How to predict each class probability #28

Open
farhantandia opened this issue Apr 27, 2021 · 2 comments

farhantandia commented Apr 27, 2021

How can I predict the class probability? When I set the output to the ArcFace output (softmax) w.r.t. the number of classes, I get an error when running model = Model(inputs=model.input[0], outputs=model.layers[-1].output), i.e. with -1 instead of -3:
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input> in <module>
----> 1 model = Model(inputs=model.input[0], outputs=model.layers[-1].output)

~\anaconda3\envs\kaggle\lib\site-packages\tensorflow\python\keras\engine\training.py in __new__(cls, *args, **kwargs)
    240       # Functional model
    241       from tensorflow.python.keras.engine import functional  # pylint: disable=g-import-not-at-top
--> 242       return functional.Functional(*args, **kwargs)
    243     else:
    244       return super(Model, cls).__new__(cls, *args, **kwargs)

~\anaconda3\envs\kaggle\lib\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\anaconda3\envs\kaggle\lib\site-packages\tensorflow\python\keras\engine\functional.py in __init__(self, inputs, outputs, name, trainable)
    113     #     'arguments during initialization. Got an unexpected argument:')
    114     super(Functional, self).__init__(name=name, trainable=trainable)
--> 115     self._init_graph_network(inputs, outputs)
    116
    117   @trackable.no_automatic_dependency_tracking

~\anaconda3\envs\kaggle\lib\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\anaconda3\envs\kaggle\lib\site-packages\tensorflow\python\keras\engine\functional.py in _init_graph_network(self, inputs, outputs)
    189     # Keep track of the network's nodes and layers.
    190     nodes, nodes_by_depth, layers, _ = _map_graph_network(
--> 191         self.inputs, self.outputs)
    192     self._network_nodes = nodes
    193     self._nodes_by_depth = nodes_by_depth

~\anaconda3\envs\kaggle\lib\site-packages\tensorflow\python\keras\engine\functional.py in _map_graph_network(inputs, outputs)
    929                              'The following previous layers '
    930                              'were accessed without issue: ' +
--> 931                              str(layers_with_complete_input))
    932         for x in nest.flatten(node.outputs):
    933           computable_tensors.add(id(x))

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_37:0", shape=(None, 51), dtype=float32) at layer "softmax_output". The following previous layers were accessed without issue:

model.layers[-1].output is <tf.Tensor 'class_output/Softmax_3:0' shape=(None, 10) dtype=float32>, while model.layers[-3].output returns the feature vectors, which are suitable for visualization.
Thanks.
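
For context, here is a minimal sketch of how such a "Graph disconnected" error arises, assuming a two-input training graph of the kind typically used for ArcFace training. The layer names, shapes, and the Concatenate stand-in for the ArcFace head are illustrative, not taken from this repository.

import tensorflow as tf
from tensorflow.keras import layers, Model

n_classes = 10
feature_input = layers.Input(shape=(64,), name="feature_input")
label_input = layers.Input(shape=(n_classes,), name="label_input")  # one-hot labels

x = layers.Dense(32, activation="relu")(feature_input)
embedding = layers.Dense(10, name="embedding")(x)

# Stand-in for the ArcFace/softmax head: like ArcFace, it consumes BOTH the
# embedding and the label input, so the last layer depends on label_input.
head = layers.Concatenate(name="softmax_output")([embedding, label_input])

train_model = Model([feature_input, label_input], head)

# Reproduces the error: the last layer still needs label_input, but only the
# first input is supplied.
# Model(inputs=train_model.input[0], outputs=train_model.layers[-1].output)

# Cutting at a layer that depends only on feature_input works, which is why an
# earlier layer (the embedding) can be extracted while the final head cannot.
feature_model = Model(inputs=train_model.input[0],
                      outputs=train_model.get_layer("embedding").output)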

@ozora-ogino

Hi @farhantandia,
Here is my DNN implementation with ASoftmax:

import tensorflow as tf
from tensorflow.keras import regularizers
# ArcFace is the layer defined in this repository; adjust the import path to your project layout.

class DNN(tf.keras.models.Model):
    def __init__(self, num_classes=10):
        super(DNN, self).__init__()
        weight_decay = 1e-4
        self.layer_1 = tf.keras.layers.Dense(32, activation='relu')
        self.layer_2 = tf.keras.layers.Dense(10)
        self.out = ArcFace(n_classes=num_classes, regularizer=regularizers.l2(weight_decay))

    def call(self, x, training=False):
        if training:
            # During training, x is a pair of (features, one-hot labels).
            x, y = x[0], x[1]
        x = self.layer_1(x)
        x = self.layer_2(x)
        if training:
            out = self.out([x, y])
        else:
            # Prediction: softmax over the logits against the ArcFace weight matrix.
            # Thanks to this, you don't need to pass labels to the model when you predict.
            out = tf.nn.softmax(x @ self.out.W)
        return out

Sample usage follows:

model = DNN()
optimizer = tf.keras.optimizers.Adam()
loss = tf.keras.losses.categorical_crossentropy

model.compile(loss=loss, optimizer=optimizer, metrics=['acc'])
# The training input is [features, one-hot labels]; the one-hot labels are also the target.
model.fit([x_train, tf.keras.utils.to_categorical(y_train, 10)],
          tf.keras.utils.to_categorical(y_train, 10),
          batch_size=512, epochs=10)

# At inference time only the features are needed.
pred = model.predict(x_test)
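
Because the inference branch applies tf.nn.softmax, pred is already a per-class probability distribution. A minimal sketch of reading it out (the variable names are illustrative):

import numpy as np

# pred has shape (num_samples, num_classes); each row sums to 1.
probs_first_sample = pred[0]                  # class probabilities for the first test sample
predicted_classes = np.argmax(pred, axis=1)   # most likely class for each sample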

I hope it works in your situation.

@ozora-ogino

I implemented my own ArcFace.
https://github.com/ozora-ogino/asoftmax-tf

This works correctly, and I believe it can solve your problem.
