Cost function in backpropagation section is confusing and possibly incorrect. #144
Wikipedia defines the mean squared error (MSE) cost function as follows:

$$\text{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(Y_i - \hat{Y}_i\right)^2$$

Yet the ML cheatsheet uses the following cost function in the backpropagation section, with a factor of $\frac{1}{2}$ and no averaging over the instances:

$$C = \frac{1}{2}(\hat{y} - y)^2$$

This is particularly confusing since the Linear Regression and Gradient Descent sections define MSE correctly, with the $\frac{1}{n}$ averaging term included.
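A quick sketch of how the two formulations relate numerically (the values below are made up purely for illustration):

```python
import numpy as np

# Hypothetical predictions and targets, just to compare the formulations.
y_hat = np.array([2.5, 0.0, 2.1, 7.8])
y     = np.array([3.0, -0.5, 2.0, 7.5])

# Wikipedia-style MSE: squared errors averaged over all n instances.
mse = np.mean((y - y_hat) ** 2)

# Per-instance cost with the 1/2 factor: one value per training example,
# no averaging over the dataset.
per_instance_cost = 0.5 * (y_hat - y) ** 2

# Averaging the per-instance costs recovers the MSE up to the constant 1/2.
print(mse)                           # 0.15
print(per_instance_cost.mean() * 2)  # 0.15
```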
Comments

The factor of $\frac{1}{2}$ is often included for mathematical convenience: when you take the derivative of the cost function to perform gradient descent (a common optimization algorithm), the $\frac{1}{2}$ cancels against the exponent, simplifying the expressions and computations. Also, I believe this cost function is written for a single instance, and the mean has not been taken yet in this case.

Squared error for one instance, with $\frac{1}{2}$ multiplied in for convenience:

$$C = \frac{1}{2}(\hat{y} - y)^2$$

And then after taking the derivative with respect to $\hat{y}$:

$$\frac{\partial C}{\partial \hat{y}} = \hat{y} - y$$

PS: If I am wrong, please correct me. Thanks.
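A minimal check of the cancellation described above, comparing the analytic derivative with a finite-difference estimate (toy numbers, plain Python):

```python
def cost(y_hat, y):
    # Per-instance cost with the 1/2 convenience factor.
    return 0.5 * (y_hat - y) ** 2

def grad(y_hat, y):
    # Analytic derivative dC/dy_hat: the 1/2 cancels against the exponent 2.
    return y_hat - y

# Central finite-difference check at an arbitrary point.
y_hat, y, eps = 1.7, 1.0, 1e-6
numeric = (cost(y_hat + eps, y) - cost(y_hat - eps, y)) / (2 * eps)
print(numeric, grad(y_hat, y))  # both approximately 0.7
```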