backpropagation for (some) user-defined activation functions #58

Open

wants to merge 1 commit into master
Conversation

@pjvm742 commented Sep 14, 2023

This simple change would partially address issue #31. It generalises the training procedure used for the sigmoid to other invertible activation functions.
It doesn't work for most non-invertible activation functions.[a] Those could be handled by a different training procedure, one which would require the weighted-sum input to a node rather than just the resulting activation value, and which would use the derivative proper instead of a "differential expression" in terms of the function value (illustrated in the sketch below). I might implement this at some point, but it would be more of a hassle.
[a] One exception is ReLU, which is non-invertible since it takes the same value for all negative inputs; it just so happens that its derivative also takes the same value on all of those inputs.
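
To make the "differential expression" idea concrete, here is a minimal sketch (the function names and the NumPy style are my own illustration, not this repository's API): each activation is paired with a function that computes its derivative purely from the stored output value.

```python
import numpy as np

# Illustrative sketch only; names are hypothetical, not this repo's API.
# A "differential expression" gives an activation's derivative purely
# in terms of the activation's own output y = f(x).

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_diff_expr(y):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)) = y * (1 - y)
    return y * (1.0 - y)

def relu(x):
    return np.maximum(x, 0.0)

def relu_diff_expr(y):
    # ReLU is not invertible, but its derivative is still recoverable
    # from the output: 1 where y > 0, and 0 where y == 0 (i.e. x <= 0).
    return (y > 0).astype(y.dtype)

# Backprop then needs only the stored activations, never the pre-sums:
# delta = upstream_grad * diff_expr(activation_output)
```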

I am unhappy with the naming: "differential expression" was the best I could come up with; hopefully someone knows a better term for this.

If there are any adjustments I need to make, please let me know.

Note: requires the user to specify a "differential expression" for the
activation function, by which I mean its derivative in terms of its
function value.
Thus limited to strictly increasing, differentiable functions.
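
For concreteness (the tanh instance is my own example, not part of the commit): because backpropagation here only sees the output `y = f(x)`, the derivative must be expressible as a function of `y` alone, which is what invertibility guarantees. For the sigmoid, `f'(x) = f(x) * (1 - f(x))`, so the differential expression is `y * (1 - y)`; for tanh, `f'(x) = 1 - f(x)^2`, giving `1 - y^2`.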
@pjvm742 changed the title from "backpropagation for user-defined activation functions" to "backpropagation for (some) user-defined activation functions" on Sep 14, 2023