The prior in Ch.4, `prior = Pmf(1, hypos)`, is initialised with 1, so it is not a proper probability distribution.
It would be better to initialise it as `prior2 = Pmf(Fraction(1, len(hypos)), hypos)`, which is a uniform distribution whose probabilities sum to 1.
The results and the final plot are identical either way: `np.allclose(posterior, posterior2)` returns `True`, because normalisation absorbs any constant scale in the prior.
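To illustrate the equivalence, here is a minimal numpy sketch (it does not use the book's `Pmf` class; the `update` helper and the single-flip likelihood are illustrative assumptions):

```python
import numpy as np

hypos = np.linspace(0, 1, 101)  # candidate values of theta

# Prior as in the book: every hypothesis gets weight 1 (unnormalised)
prior = np.ones(len(hypos))

# Proposed alternative: explicit uniform prior that sums to 1
prior2 = np.full(len(hypos), 1 / len(hypos))

# Likelihood of observing a single 'H' under each theta
likelihood = hypos

def update(p, like):
    """Multiply prior by likelihood, then normalise."""
    p = p * like
    return p / p.sum()  # normalisation removes any constant scale

posterior = update(prior, likelihood)
posterior2 = update(prior2, likelihood)

# The two posteriors agree elementwise
print(np.allclose(posterior, posterior2))
```

Both posteriors come out the same because dividing by the total cancels the constant factor that distinguishes the two priors.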
Also, for better understanding, the loop inside the update function deserves a more detailed explanation.
For example:
The reason we can multiply the likelihoods in a loop is that each coin flip is independent of the others. Here $\theta$ is the probability of the coin landing heads, $P(\theta)$ is its prior in Bayes's theorem, and $P(D_x|\theta)$ is the likelihood of the $x$-th flip, where $D_x$ can be 'H' or 'T'. The loop carries out $P(D_1|\theta) \times P(D_2|\theta) \times \cdots \times P(D_n|\theta) = P(D|\theta)$, which holds precisely because the flips are independent.
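The factorisation can be checked numerically with a short numpy sketch (again not using the book's `Pmf`; the dataset string is made up for illustration):

```python
import numpy as np

hypos = np.linspace(0, 1, 101)           # candidate values of theta
prior = np.full(len(hypos), 1 / len(hypos))  # uniform prior

dataset = 'HHTH'  # hypothetical sequence of flips

# Sequential update: multiply in one likelihood per flip
probs = prior.copy()
for outcome in dataset:
    probs *= hypos if outcome == 'H' else (1 - hypos)

# Joint update: P(D|theta) = product of the independent terms
heads = dataset.count('H')
tails = len(dataset) - heads
joint = prior * hypos**heads * (1 - hypos)**tails

# The loop and the closed-form product agree
print(np.allclose(probs, joint))
```

Multiplying one flip at a time and multiplying the joint likelihood in one step give the same (unnormalised) posterior, which is exactly the independence argument above.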
Cheers,
Yifan
iamyifan changed the title from "Proper Prior Probability Initialisation in Ch.4" to "Chapter 4 - Proper Prior Probability Initialisation" on Feb 27, 2024.