
Linear regression model

Assumptions

  1. Homoscedastic and independent errors (stated formally in the sketch after this list).
  2. Connection between the observations, the covariates and the error term (see the same sketch).
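
A sketch of how these two assumptions are usually written, assuming $y \in \mathbb{R}^n$ collects the observations, $X \in \mathbb{R}^{n \times k}$ the covariates, $\beta \in \mathbb{R}^k$ the coefficients and $\varepsilon$ the error term:

$$
\varepsilon \sim \mathcal{N}\!\left(0,\ \sigma^2 I_n\right), \qquad y = X\beta + \varepsilon.
$$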

Likelihood calculations

We get the following likelihood for the observations, conditional on the variance and the coefficients.
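
A sketch of this likelihood under the assumptions above, first in its raw Gaussian form and then rearranged around $\hat\beta$ and $s^2$ (notation assumed as above):

$$
\begin{aligned}
p(y \mid \beta, \sigma^2, X) &= (2\pi\sigma^2)^{-n/2} \exp\!\left(-\frac{1}{2\sigma^2}\,(y - X\beta)^\top (y - X\beta)\right) \\
&\propto \sigma^{-n} \exp\!\left(-\frac{1}{2\sigma^2}\left[(n-k)\,s^2 + (\beta - \hat\beta)^\top X^\top X\,(\beta - \hat\beta)\right]\right).
\end{aligned}
$$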

Here $\hat\beta$ is the least-squares estimate and $s^2$ the residual variance; a sketch of the two standard definitions follows.
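
Assuming $X$ has full column rank $k$ so that $X^\top X$ is invertible:

$$
\hat\beta = (X^\top X)^{-1} X^\top y, \qquad s^2 = \frac{1}{n-k}\,(y - X\hat\beta)^\top (y - X\hat\beta).
$$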

The important part is the step from row 3 to row 4. By choosing $\hat\beta$ this way and adding and subtracting $X\hat\beta$ inside the parentheses of $(y - X\beta)$, it becomes $\big((y - X\hat\beta) + (X\hat\beta - X\beta)\big)$. When this is expanded, the cross term vanishes, which leaves exactly the two terms of row 4 in the exponent; see the sketch below.
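
A sketch of that expansion; the cross term drops out because the residual is orthogonal to the column space of $X$ (normal equations, $X^\top (y - X\hat\beta) = 0$):

$$
\begin{aligned}
(y - X\beta)^\top (y - X\beta) &= \big((y - X\hat\beta) + X(\hat\beta - \beta)\big)^\top \big((y - X\hat\beta) + X(\hat\beta - \beta)\big) \\
&= (y - X\hat\beta)^\top (y - X\hat\beta) + (\hat\beta - \beta)^\top X^\top X\,(\hat\beta - \beta) \\
&= (n-k)\,s^2 + (\hat\beta - \beta)^\top X^\top X\,(\hat\beta - \beta).
\end{aligned}
$$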

Posterior

Choose a non-informative prior (other choices are possible but more complicated):
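
A sketch of the usual non-informative choice, flat in $\beta$ and in $\log \sigma^2$:

$$
p(\beta, \sigma^2) \propto \frac{1}{\sigma^2}.
$$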

Then the posterior distribution is:
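
Under this prior the joint posterior factors into a conditional normal for the coefficients and a scaled inverse-$\chi^2$ for the variance. A sketch, using the same $\hat\beta$ and $s^2$ as above:

$$
\beta \mid \sigma^2, y \sim \mathcal{N}\!\left(\hat\beta,\ \sigma^2 (X^\top X)^{-1}\right), \qquad
\sigma^2 \mid y \sim \text{Scaled-Inv-}\chi^2\!\left(n - k,\ s^2\right).
$$

And a minimal sketch of drawing from this posterior with NumPy on simulated data (the data and all variable names here are hypothetical, just to illustrate the two-step draw):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n observations, k covariates (including an intercept column).
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.7, size=n)

# Posterior quantities under the non-informative prior, proportional to 1/sigma^2.
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y          # least-squares estimate
resid = y - X @ beta_hat
s2 = resid @ resid / (n - k)          # residual variance

# Two-step draw from the joint posterior:
#   sigma^2 | y        ~ Scaled-Inv-chi^2(n - k, s2)
#   beta | sigma^2, y  ~ N(beta_hat, sigma^2 (X'X)^{-1})
n_draws = 2000
sigma2 = (n - k) * s2 / rng.chisquare(n - k, size=n_draws)
beta = np.array([rng.multivariate_normal(beta_hat, s * XtX_inv) for s in sigma2])

print("posterior mean of beta:", beta.mean(axis=0))
print("posterior mean of sigma^2:", sigma2.mean())
```

The two-step draw mirrors the factorisation above: sample the variance first, then the coefficients conditional on it.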

Co…
