Example of MLP architecture #93

Open
pplonski opened this issue Dec 15, 2023 · 0 comments
Thank you for this package. I'm looking for an example of how to implement a simple MLP (multi-layer perceptron) with this package. Any code snippets or tutorials are welcome.

Below is some code that I glued together, but I have no idea how to do backpropagation; I would like to have a fit() method implemented.

Thank you!

from collections import OrderedDict

from numpy_ml.neural_nets.losses import CrossEntropy, SquaredError
from numpy_ml.neural_nets.utils import minibatch
from numpy_ml.neural_nets.activations import ReLU, Sigmoid
from numpy_ml.neural_nets.layers import FullyConnected
from numpy_ml.neural_nets.optimizers.optimizers import SGD

optimizer = SGD()
loss = SquaredError()

class MLP:

    def __init__(self):
        # two fully connected layers: 10 ReLU hidden units, 1 sigmoid output
        # (a separate optimizer instance per layer may be preferable once the
        # optimizer carries per-parameter state, e.g. SGD with momentum)
        self.nn = OrderedDict()
        self.nn["L1"] = FullyConnected(
            10, act_fn="ReLU", optimizer=optimizer
        )
        self.nn["L2"] = FullyConnected(
            1, act_fn="Sigmoid", optimizer=optimizer
        )

    def forward(self, X, retain_derived=True):
        # run X through the layers in order, recording each layer's input
        Xs = {}
        out, rd = X, retain_derived
        for k, v in self.nn.items():
            Xs[k] = out
            out = v.forward(out, retain_derived=rd)
        return out, Xs

    def backward(self, grad, retain_grads=True):
        # push the loss gradient back through the layers in reverse order
        dXs = {}
        out, rg = grad, retain_grads
        for k, v in reversed(list(self.nn.items())):
            dXs[k] = out
            out = v.backward(out, retain_grads=rg)
        return out, dXs
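
Here is a minimal sketch of how fit() could look for this class, under a few assumptions: minibatch(X, batchsize, shuffle=True) returns an index generator together with the number of batches (the way numpy_ml's bundled models use it); SquaredError computes 0.5 * ||y_pred - y||^2, so its gradient with respect to the prediction is simply y_pred - y (the layers' backward() already applies their own activation derivatives); and each layer's update() applies its optimizer to the accumulated gradients and then flushes them, as numpy_ml's layer base class does. The loss object is the module-level SquaredError() defined above; n_epochs, batchsize, and the toy data are illustrative only.

    # add this method to the MLP class above
    def fit(self, X, y, n_epochs=100, batchsize=32):
        for epoch in range(n_epochs):
            batch_generator, n_batches = minibatch(X, batchsize, shuffle=True)
            epoch_loss = 0.0
            for batch_ixs in batch_generator:
                X_batch, y_batch = X[batch_ixs], y[batch_ixs]

                # forward pass; layers cache their inputs for backprop
                y_pred, _ = self.forward(X_batch, retain_derived=True)
                batch_loss = loss(y_batch, y_pred)  # 0.5 * ||y_pred - y||^2

                # gradient of the squared error w.r.t. y_pred; do NOT multiply
                # by the sigmoid derivative here -- FullyConnected.backward
                # applies each layer's activation gradient itself
                dLdy = y_pred - y_batch
                self.backward(dLdy, retain_grads=True)

                # optimizer step on every layer's accumulated gradients,
                # which are then flushed before the next batch
                for layer in self.nn.values():
                    layer.update(batch_loss)

                epoch_loss += batch_loss
            print(f"epoch {epoch + 1:3d} | avg batch loss {epoch_loss / n_batches:.4f}")

Toy usage, with hypothetical data just to show the call:

import numpy as np

mlp = MLP()
X = np.random.rand(200, 4)
y = (X.sum(axis=1, keepdims=True) > 2.0).astype(float)
mlp.fit(X, y, n_epochs=50, batchsize=16)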