[v2 FEATURE]: Increase customizability of activation functions #878
I don't like the design pattern of specifying both the initializer function (or an alias) and the parameters as inputs to a separate class. At that point, you're using composition without actually designing around it, so it's an all-around loss. For example, compare the following two design patterns:

```python
from abc import ABC

# (1) one class whose initializer takes the function plus its parameters
class Bar(ABC):
    def __init__(self, *args, **kwargs):
        ...

# (2) a subclass per variant, plus a factory keyed by name
class BarA(Bar): ...
class BarB(Bar): ...

def get_bar(bar_type: str) -> type[Bar]: ...
```
I think (2) is significantly clearer about what's going on, as it doesn't require users to cross-reference any functions or initializers to see what's happening. All of this is to say that I'm fine with moving away from the usage of …
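To make the comparison concrete, here is a runnable sketch of pattern (2). The `Bar`/`BarA`/`BarB` names come from the snippet above; the registry contents and lookup keys are illustrative placeholders:

```python
from abc import ABC


class Bar(ABC):
    """Hypothetical base class from the example above."""


class BarA(Bar):
    pass


class BarB(Bar):
    pass


def get_bar(bar_type: str) -> type[Bar]:
    # Simple name-to-class lookup; the caller sees exactly which class
    # comes back, with no hidden initializer indirection to cross-reference.
    registry: dict[str, type[Bar]] = {"a": BarA, "b": BarB}
    return registry[bar_type]


bar_cls = get_bar("a")
bar = bar_cls()  # instantiate with whatever arguments the subclass defines
```

The point of the factory is that the string is resolved in exactly one place, so adding a new variant means adding one subclass and one registry entry.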
Thanks for sharing your thoughts. I agree that explicitly passing the activation function into the MessagePassing and Predictor blocks is a clear way to improve both customizability and code readability. Do you disagree with this idea because you think passing a callable function into a neural-network builder as an attribute is not good practice?
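As a sketch of what "explicitly passing the activation function" could look like, here is a framework-free toy version (the `MessagePassingBlock` name and its constructor signature are hypothetical; a real implementation would be a `torch.nn.Module` taking an `nn.Module` activation):

```python
from typing import Callable, Optional


class MessagePassingBlock:
    """Hypothetical block whose activation is injected by the caller
    instead of being looked up from a fixed internal registry."""

    def __init__(self, activation: Optional[Callable[[float], float]] = None):
        # Default to element-wise ReLU when the caller supplies nothing.
        self.activation = (
            activation if activation is not None else (lambda x: max(0.0, x))
        )

    def forward(self, values: list[float]) -> list[float]:
        # Apply the injected activation element-wise (stand-in for the
        # real message-passing computation).
        return [self.activation(v) for v in values]


# Caller injects a leaky-ReLU-style callable; any callable would do.
block = MessagePassingBlock(activation=lambda x: x * 0.1 if x < 0 else x)
out = block.forward([-1.0, 2.0])
```

Dependency injection like this keeps the block agnostic to where the activation came from, which is exactly what makes it customizable.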
Is your feature request related to a problem? Please describe.
Currently, only the activation functions defined in nn/utils.py can be used, which limits the customizability of activation functions.
Discussion
We can provide the flexibility to accept both `Activation` and `nn.Module` objects as inputs for activation functions (following the format like this). Additionally, we can add an activation attribute to the neural network, similar to what PyTorch Geometric does (link). I am open to any suggestions.
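A minimal sketch of the proposed dual-input behavior, assuming a resolver in the spirit of PyTorch Geometric's (the `resolve_activation` name and the registry contents are hypothetical; a real version would map onto the aliases in nn/utils.py and `torch.nn` modules):

```python
from typing import Callable

# Hypothetical alias registry standing in for the one in nn/utils.py.
_ACTIVATIONS: dict[str, Callable[[float], float]] = {
    "relu": lambda x: max(0.0, x),
    "identity": lambda x: x,
}


def resolve_activation(
    activation: "str | Callable[[float], float]",
) -> Callable[[float], float]:
    """Accept either a registered alias or a callable directly."""
    if callable(activation):
        # User-supplied function/module: pass it through untouched.
        return activation
    try:
        return _ACTIVATIONS[activation.lower()]
    except KeyError:
        raise ValueError(f"Unknown activation alias: {activation!r}")


relu = resolve_activation("relu")          # resolved from the registry
custom = resolve_activation(lambda x: 2 * x)  # passed through as-is
```

Accepting both forms keeps the convenient string aliases while letting users drop in any custom activation without registering it first.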