
Nice! There are many books that cover this, and even the docs for Pyro and other libraries are useful; it just depends on your background and your preference for how material is presented.

Bishop's "Pattern Recognition and Machine Learning" has a chapter on PGM's that's free online: https://www.microsoft.com/en-us/research/wp-content/uploads/...

Murphy's "Machine Learning: A Probabilistic Perspective" is another behemoth that covers this stuff, but it's really just your preference.

I say "aspire" because (1) depending on your background, it will likely be something that takes awhile to internalize and really understand, and you will probably realize many times over that you thought you understood something that you actually didn't (2) by learning PGM's, you learn a lot of Bayesian statistics as a side effect, hence why even learning a little bit about them is rewarding.

Once you learn a bit, I would use Pyro or another library and try to actually build PGMs for toy problems (or non-toy problems too) because (1) it will force you to admit to yourself when you don't understand something, (2) the documentation for a lot of these libraries is itself useful learning material, and (3) you will see that these libraries make it fairly easy to do things that would be astoundingly complex if you tried to do them by hand.

You can express most standard ML algorithms as a PGM, so you can, for example, implement logistic regression as a PGM and compare the results to scikit-learn.
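As a rough illustration of that last point, here is a minimal sketch of Bayesian logistic regression in Pyro, fit with stochastic variational inference and compared against scikit-learn's LogisticRegression on synthetic data. The priors, learning rate, and step count are arbitrary choices for illustration; with a standard Normal prior on the weights, the posterior medians should land reasonably close to scikit-learn's L2-regularized point estimate.

    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.infer import SVI, Trace_ELBO
    from pyro.infer.autoguide import AutoDiagonalNormal
    from pyro.optim import Adam
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    # Synthetic binary classification data (made up for illustration).
    X_np, y_np = make_classification(n_samples=500, n_features=2,
                                     n_informative=2, n_redundant=0,
                                     random_state=0)
    X = torch.tensor(X_np, dtype=torch.float)
    y = torch.tensor(y_np, dtype=torch.float)

    # Logistic regression as a PGM: Normal priors on the weights and bias,
    # Bernoulli likelihood on the observed labels.
    def model(X, y=None):
        w = pyro.sample("w", dist.Normal(torch.zeros(X.shape[1]), 1.0).to_event(1))
        b = pyro.sample("b", dist.Normal(0.0, 1.0))
        logits = X @ w + b
        with pyro.plate("data", X.shape[0]):
            pyro.sample("obs", dist.Bernoulli(logits=logits), obs=y)

    # Variational inference with a mean-field Normal guide.
    pyro.clear_param_store()
    guide = AutoDiagonalNormal(model)
    svi = SVI(model, guide, Adam({"lr": 0.05}), loss=Trace_ELBO())
    for step in range(2000):
        svi.step(X, y)

    # Posterior medians vs. scikit-learn's point estimate.
    post = guide.median()
    print("Pyro    w:", post["w"].detach().numpy(), "b:", float(post["b"]))

    clf = LogisticRegression().fit(X_np, y_np)
    print("sklearn w:", clf.coef_[0], "b:", clf.intercept_[0])

The point of the exercise isn't to beat scikit-learn; it's that once the model is written as a PGM, swapping priors, adding hierarchy, or getting posterior uncertainty on the weights is a few lines of changes rather than a new derivation.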



