Cross Product vs Element by Element Multiplication #126

Open
pavelbrn opened this issue Sep 20, 2021 · 2 comments
Comments

@pavelbrn
Contributor

The following comment really confused me:

# Use matrix cross product (*) to simultaneously
# calculate the derivative for each weight
d_w1 = -x1*(targets - predictions)
...

https://ml-cheatsheet.readthedocs.io/en/latest/linear_regression.html#id4

In NumPy the cross product would be np.cross(), not *, and the two give different results. Which version is correct?
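For reference, a quick sanity check (not part of the original thread) confirming that for 3-vectors the element-wise `*` and `np.cross()` really do produce different results:

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5, 6])

# Element-wise product: multiplies matching components.
print(u * v)

# Cross product: returns a vector perpendicular to both u and v.
print(np.cross(u, v))
```

The element-wise product gives [4, 10, 18], while the cross product gives [-3, 6, -3], so the comment in the cheatsheet cannot mean both at once.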

@ivanistheone
Contributor

Definitely no cross product involved... the cross product is a "hack" for getting perpendicular vectors and only applies to 3D vectors.

I'm guessing what is meant is just the "matrix product":

  • think of x1 as a row vector
  • and (targets - predictions) as a column vector
  • then the Python matrix-multiply operator @ corresponds to the row-times-column matrix product, which is what we want
    (equivalent to computing the dot product, .dot)

Intuitively, the derivative for each weight has a contribution from each data point (n=200), so the dot product, by its summing nature, is the convenient tool for doing this.

see

import numpy as np

u = np.array([1, 3, 3])
v = np.array([2, 2, 3])

print("The dot product between u and v can be computed as...")
print("The sum of the element-wise products:", np.sum(u * v))
print("The matrix product:", u @ v)
print("Or by calling .dot on one of the vectors:", u.dot(v), v.dot(u))

@pavelbrn If you have time to fix this, perhaps you can open a PR changing:

# Use matrix cross product (*) to simultaneously
# calculate the derivative for each weight
d_w1 = -x1*(targets - predictions)

to

# Use dot product to calculate the derivative for each weight
d_w1 = -x1.dot(targets - predictions)

(I removed the whole "simultaneous" part because it doesn't apply here. The simultaneous way would be to compute the derivative as a 3D vector, where a matrix-vector product would be useful, but the code shows the cleaner coefficient-by-coefficient approach, so no matrix product is involved.)
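A sketch of the distinction (the variable names x1, x2, x3 and the random data are assumed for illustration, not taken from the cheatsheet): the coefficient-by-coefficient dot products and the "simultaneous" matrix-vector product compute the same gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1, x2, x3 = rng.normal(size=(3, n))
targets = rng.normal(size=n)
predictions = rng.normal(size=n)
error = targets - predictions

# Coefficient-by-coefficient, as in the proposed fix: one dot product per weight.
d_w1 = -x1.dot(error)
d_w2 = -x2.dot(error)
d_w3 = -x3.dot(error)

# "Simultaneous" alternative: stack the features into an (n, 3) matrix
# and compute the whole gradient with one matrix-vector product.
X = np.stack([x1, x2, x3], axis=1)
d_w = -X.T @ error  # shape (3,)

assert np.allclose(d_w, [d_w1, d_w2, d_w3])
```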

@pavelbrn
Contributor Author

pavelbrn commented Sep 21, 2021

@ivanistheone Thank you for the good explanation! I opened up a new PR with the changes you proposed.
