
Chapter 4 - Proper Prior Probability Initialisation #73

Open
iamyifan opened this issue Feb 27, 2024 · 0 comments
Hi developer,

The original prior in Ch. 4 is initialised with the constant 1: prior = Pmf(1, hypos). Its probabilities therefore sum to len(hypos) rather than 1.

It would be better to initialise it as prior2 = Pmf(Fraction(1, len(hypos)), hypos), a proper uniform distribution whose probabilities sum to 1.

The resulting posterior and the final plot are unchanged: np.allclose(posterior, posterior2) returns True.
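For reference, here is a minimal sketch of the comparison. It assumes the empiricaldist Pmf and the Euro-problem setup from the chapter (140 heads and 110 tails); the astype(float) cast is only a guard in case the Fractions keep the Pmf in object dtype.

```python
from fractions import Fraction

import numpy as np
from empiricaldist import Pmf

hypos = np.linspace(0, 1, 101)             # candidate values of theta
likelihood = {'H': hypos, 'T': 1 - hypos}  # per-flip likelihoods
dataset = 'H' * 140 + 'T' * 110            # the Euro data: 140 heads, 110 tails

def update_euro(pmf, dataset):
    """Multiply in the likelihood of each flip, then renormalise."""
    for data in dataset:
        pmf *= likelihood[data]
    pmf.normalize()

prior = Pmf(1, hypos)                         # unnormalised: sums to 101
prior2 = Pmf(Fraction(1, len(hypos)), hypos)  # uniform: sums to exactly 1

posterior = prior.copy()
update_euro(posterior, dataset)

posterior2 = prior2.copy()
update_euro(posterior2, dataset)

# Because update_euro renormalises, the constant prior factor cancels out,
# so both initialisations yield the same posterior.
print(np.allclose(posterior.astype(float), posterior2.astype(float)))  # True
```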

Also, for better understanding, the loop inside the update function deserves a more detailed explanation.

For example:
The reason we can multiply the likelihoods in a loop is that each coin flip is independent of the others. Here $\theta$ is the probability that the coin lands heads, and $P(D_x \mid \theta)$ is the likelihood of the $x$-th flip, where $D_x$ is either 'H' or 'T'. Because the flips are independent, the loop computes $P(D_1 \mid \theta) \times P(D_2 \mid \theta) \times \dots \times P(D_n \mid \theta) = P(D \mid \theta)$.
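A quick numerical check of this product rule, as a sketch assuming the same hypos grid: updating one flip at a time gives the same posterior as multiplying in the joint likelihood $\theta^h (1-\theta)^t$ once.

```python
import numpy as np
from empiricaldist import Pmf

hypos = np.linspace(0, 1, 101)
dataset = 'HTHHT'
heads = dataset.count('H')
tails = dataset.count('T')

# Sequential: one Bayesian update per independent flip.
seq = Pmf(1, hypos)
for data in dataset:
    seq *= hypos if data == 'H' else 1 - hypos
seq.normalize()

# Batch: the joint likelihood of the whole dataset at once.
batch = Pmf(1, hypos)
batch *= hypos**heads * (1 - hypos)**tails
batch.normalize()

print(np.allclose(seq, batch))  # True: the loop computes the same product
```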

Cheers,
Yifan
