
Does MorphNet support conv3d? #61

Open
notabigfish opened this issue May 12, 2019 · 3 comments
Labels
enhancement New feature or request

Comments

@notabigfish

Hi,

I'm applying MorphNet to a model built with tf.contrib.layers.conv3d(), but it seems that no supported ops are found:
INFO:tensorflow:OpRegularizerManager starting analysis from: [<tf.Operation 'fc/Relu' type=Relu>].
INFO:tensorflow:OpRegularizerManager found 46 ops and 0 sources.
INFO:tensorflow:OpRegularizerManager regularizing 0 groups.
WARNING:tensorflow:No supported ops found.

Here is my code:
import tensorflow as tf
# MorphNet import for the regularizer used below (path per the morph_net package).
from morph_net.network_regularizers import model_size_regularizer

imgs = tf.placeholder(tf.float32, (4, 8, 24, 24, 3))
net = tf.contrib.layers.conv3d(imgs, num_outputs=64, kernel_size=7, scope='conv1')
net = tf.nn.max_pool3d(net, [1, 2, 2, 2, 1], [1, 2, 2, 2, 1], padding='VALID', name='maxpool1')
net = tf.contrib.layers.batch_norm(net, center=True, scale=True, is_training=True, scope='bn1')
logits = tf.contrib.layers.fully_connected(net, 10, scope='fc')
network_regularizer = model_size_regularizer.GammaModelSizeRegularizer([logits.op], gamma_threshold=1e-3)
regularization_strength = ....
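
For context, the remaining setup would follow the usual MorphNet pattern of scaling the regularization term and adding it to the training loss; a minimal sketch (the task loss and optimizer below are placeholders, not part of my snippet):

# Hypothetical continuation: scale the MorphNet term and add it to the task loss.
regularizer_loss = network_regularizer.get_regularization_term() * regularization_strength
model_cost = network_regularizer.get_cost()  # current estimated model size, handy for monitoring
total_loss = model_loss + regularizer_loss   # model_loss is the usual task loss (placeholder)
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(total_loss)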

Does MorphNet support 3d convolution? Thanks!

@notabigfish
Author

Sorry, I found the mistake in my code.
For the fully connected layer, the network_regularizer should be GroupLassoFlopsRegularizer.
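
In case it helps others, the swap looks roughly like this (a sketch; I'm assuming the flop_regularizer module path and that the constructor takes a threshold argument analogous to gamma_threshold, so please check the actual signature):

from morph_net.network_regularizers import flop_regularizer

# Group-Lasso-based regularizer: it does not rely on batch-norm gammas,
# so it also covers layers such as fully_connected.
network_regularizer = flop_regularizer.GroupLassoFlopsRegularizer(
    [logits.op], threshold=1e-3)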

@eladeban
Contributor

Dear not a big fish,

I am sorry to say that for now MorphNet does NOT support conv3d.

What you have observed is the op_regularizer_manager looking for an op called FusedBatchNorm; unfortunately, for 3D convolutions this op does not exist, so it appears there is nothing to regularize. Moreover, even without batch norm, the cost calculator would need to change in order to handle this case correctly.
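
(One way to see this is to list the op types in the graph; a quick sketch using standard TF 1.x graph inspection, nothing MorphNet-specific: for the conv3d model above it shows a Conv3D op but no FusedBatchNorm op.)

# Print the unique op types present in the default graph.
print(sorted({op.type for op in tf.get_default_graph().get_operations()}))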

As for the fully_connected layer: it is true that fully_connected layers are regularized via GroupLassoFlopsRegularizer, but again this is not what you want, as the logits have a fixed dimension (the number of labels, 10 in your example), and therefore there is nothing to learn there. This is always true for the last layer of the network, regardless of whether it is a fully connected or a conv layer.

We are planning to support Conv3D in the near future, but I don't have an exact date. If you would like to contribute and try to code it, we will be happy to share the way we see this being implemented.

Stay tuned,

Elad

@eladeban eladeban added the enhancement New feature or request label May 14, 2019
@notabigfish
Author

@eladeban
Dear Elad,

Thanks for the detailed explanation. I appreciate this opportunity and would like to try to contribute. Honestly, I don't really know what the first move should be, and I'd love to hear your ideas.

Best regards,
notabigfish
