
Helper functions for making symmetrical autoencoders with tied params (WIP) #392

Open · wants to merge 1 commit into base: master
Conversation

@neosatrapahereje (Contributor)
Hi!

I added a couple of helper methods for making symmetrical autoencoders using the InverseLayer. Everything is now in lasagne/layers/autoenc.py (including moving the InverseLayer there).
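Conceptually, the helpers mirror an encoder into a decoder that reuses the encoder's parameters. A minimal NumPy sketch of the tied-weights idea (illustrative only; this is not Lasagne code and not the PR's actual functions):

```python
import numpy as np

# Sketch of a symmetric autoencoder with tied parameters: the decoder
# reuses the encoder's weight matrix transposed instead of owning its own.
rng = np.random.default_rng(0)
n_in, n_hid = 8, 3
W = rng.standard_normal((n_in, n_hid))  # encoder weights (shared)
b_enc = np.zeros(n_hid)
b_dec = np.zeros(n_in)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode(x):
    return sigmoid(x @ W + b_enc)

def decode(h):
    # tied weights: the decoder uses W.T rather than a separate matrix
    return sigmoid(h @ W.T + b_dec)

x = rng.standard_normal((5, n_in))
recon = decode(encode(x))
assert recon.shape == x.shape
```

In the PR itself this mirroring is done at the layer-graph level via InverseLayer rather than by hand.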

Have a nice weekend!

@f0k (Member) commented Aug 24, 2015

Looks like a good start!

There are several lines not covered in the tests: https://coveralls.io/builds/3368850/source?filename=lasagne%2Flayers%2Fautoenc.py
When running the tests locally, make sure you get 100% coverage for all the files you've changed.

I'll review and add some more specific comments!

@@ -18,6 +18,7 @@
layers/merge
layers/embedding
layers/special
layers/autoenc

I'd prefer special to be the last one in the list, before the ones that need to be imported explicitly: special is the bin for everything that falls through the previous categories. So please add autoenc before it!

@alexjc commented Nov 20, 2015

@neosatrapahereje Hi, nice work! Very interested in using this. Do you have any plans for this PR?

@benanne (Member) commented Nov 20, 2015

It looks like this PR was abandoned, but note that you can use arbitrary Theano expressions as layer parameters in Lasagne since last month: http://benanne.github.io/2015/11/10/arbitrary-expressions-as-params.html (see "Autoencoders with tied weights")

I don't know what you have in mind of course, but perhaps this suffices :)
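The essence of the linked approach is that the decoder's weight is an expression over the encoder's weight, so there is only one parameter to learn. A tiny NumPy sketch of that sharing semantics (NumPy's `.T` is a view of the same memory, so it behaves like a shared parameter; this is an analogy, not Lasagne/Theano code):

```python
import numpy as np

# One underlying weight, two "views": W for encoding, W_dec for decoding.
W = np.arange(6, dtype=float).reshape(2, 3)  # encoder weight
W_dec = W.T                                  # decoder weight: a transposed view, not a copy

W += 1.0  # an in-place "gradient step" on the shared parameter

# Both directions see the update, because they share storage.
assert np.shares_memory(W, W_dec)
assert W_dec[1, 0] == W[0, 1]
```

With the feature benanne mentions, the same effect is obtained in Lasagne by passing a Theano expression built from one layer's weight as another layer's `W` parameter.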

@alexjc commented Nov 20, 2015

Thanks @benanne, I will look into it. I'll be adding Lasagne-based autoencoders to scikit-neuralnetwork and I think that will be a good starting point!

…parameters and moved InverseLayer to autoenc.py (included the respective tests in test_autoenc.py)
@neosatrapahereje (Contributor, Author)

@alexjc Thanks for your interest in these functions. @f0k helped me get this code to its current state and will hopefully review this PR. Otherwise, you can use the methods directly from here:

https://gist.github.com/neosatrapahereje/69562642b92996fba408

@f0k (Member) commented Nov 23, 2015

We've squashed and rebased this PR, so it's mergeable again. We just need to decide:

  1. Do we want to move InverseLayer to lasagne.layers.autoenc along with the helper functions, or should we add the helper functions somewhere else?
  2. Is the code okay? Personally, I find the n_idx and b_idx a bit confusing, and the code will do funny things if the input network is not just a "tower". Can we protect against that?
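One way to protect against non-"tower" inputs would be to walk the graph and reject any layer with more than one parent before mirroring it. A hypothetical sketch (the toy classes below only mimic Lasagne's convention that plain layers expose `input_layer` while merge layers expose `input_layers`; `is_tower` is an illustrative name, not part of the PR):

```python
# Toy stand-ins for Lasagne's layer classes, just enough for the check.
class Layer:
    def __init__(self, incoming=None):
        self.input_layer = incoming  # single parent (None for the input layer)

class MergeLayer:
    def __init__(self, incomings):
        self.input_layers = list(incomings)  # multiple parents

def is_tower(layer):
    """True iff the graph ending at `layer` is a single chain of layers."""
    while layer is not None:
        if hasattr(layer, "input_layers"):  # more than one parent: not a tower
            return False
        layer = getattr(layer, "input_layer", None)
    return True

chain = Layer(Layer(Layer()))            # input -> hidden -> output
forked = MergeLayer([Layer(), Layer()])  # two branches merged
assert is_tower(chain) and not is_tower(forked)
```

The helper functions could raise a `ValueError` when this check fails instead of silently producing a wrong mirror.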

@rmanor commented Jun 11, 2016

This PR seems very useful.
Are there any plans to commit this?
Thanks.

@f0k (Member) commented Jun 14, 2016

> This PR seems very useful.
> Are there any plans to commit this?

In hindsight I think this may be too specific / not general enough to be included: it only supports simple architectures consisting of a single stack of layers. We could turn it into a Lasagne Recipe, though, for easier copy/pasting. For now, the easiest option is to copy/paste the functions from https://github.com/neosatrapahereje/Lasagne/blob/a61947c/lasagne/layers/autoenc.py into your code.
