
Go download David MacKay's Information Theory, Inference, and Learning Algorithms (free book). Go through the part on Bayes and the part on Neural Nets (and the info. theory part if you want to, which is fascinating but not as directly relevant), which is a total of roughly 20-30 chapters, some very short. Do as many exercises as you can (i.e. try them all; fail and come back later if necessary), and try implementing those algorithms. That will get you boned up on this stuff generally.
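If you want a taste of the Bayes chapters before committing, here's a minimal sketch in the spirit of MacKay's bent-coin examples (the toy numbers are mine, not from the book): a grid-approximated posterior over a coin's bias with a uniform prior.

```python
# Posterior over a coin's bias p after observing heads/tails,
# computed on a discrete grid. Uniform prior, Binomial likelihood
# (the binomial coefficient cancels on normalization).

def posterior(heads, tails, grid_size=101):
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    # Unnormalized posterior: uniform prior * likelihood
    unnorm = [p ** heads * (1 - p) ** tails for p in grid]
    z = sum(unnorm)
    return grid, [w / z for w in unnorm]

grid, post = posterior(heads=7, tails=3)
# Posterior mean should land near (7+1)/(10+2) = 2/3 (Laplace's rule)
mean = sum(p * w for p, w in zip(grid, post))
print(round(mean, 3))
```

Doing this by hand once makes the conjugate Beta-Binomial result in the exercises feel obvious rather than magical.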

From there:

Standard references are The Elements of Statistical Learning by Hastie and Tibshirani (which you already have), Pattern Classification by Duda, Hart, and Stork, and PRML by Chris Bishop (though I found it boring - too many unmotivated equations). All of Statistics and especially All of Nonparametric Statistics by Wasserman are both excellent books which will fairly rapidly introduce you to large swaths of statistical models. Papoulis (1993) is quite a good reference on statistics in general, and Cover & Thomas is the usual reference of choice for information theory (which is very relevant to what you're interested in), but neither of those is much fun to actually read.

You seem less interested in classification/ML problems and more interested in straight-up stats and/or timeseries stuff. So some slightly deeper references:

- Given your interests you might absolutely love Kevin Murphy's PhD thesis on Dynamic Bayes Nets, which are excellent for describing phenomena in all three fields you mentioned.

- Check out Geoff Hinton's work, especially on deep belief nets (there's a Google tech talk and a lot of papers).

- Ghahramani and Hinton have a tutorial called "Parameter Estimation for Linear Dynamical Systems", which could be directly applicable to the models you're talking about.

- If you're interested in these dynamic, causal models you'll want to learn about EM (which you should know already since you know HMMs), and its generalization Variational Bayes. MacKay has a terse chapter on variational inference; http://www.variational-bayes.org/vbpapers.html has more. One of those is an introductory paper by Ghahramani and some others, which is nice.

- Pretty much everything on http://videolectures.net will excite you.
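Since EM comes up above and is the gateway to the VB material: a minimal sketch of EM for a two-component 1-D Gaussian mixture (function name and toy data are mine, not from any of the references), just the E-step/M-step loop with nothing clever:

```python
import math
import random

def em_gmm_1d(xs, iters=50):
    """Toy EM for a 2-component 1-D Gaussian mixture."""
    # Crude initialization from the data spread
    mu = [min(xs), max(xs)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities r[i][k] = P(component k | x_i)
        r = []
        for x in xs:
            w = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = sum(w)
            r.append([wk / s for wk in w])
        # M-step: re-estimate mixing weights, means, variances
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            pi[k] = nk / len(xs)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, xs)) / nk
            var[k] = sum(ri[k] * (x - mu[k]) ** 2
                         for ri, x in zip(r, xs)) / nk + 1e-6

    return pi, mu, var

random.seed(0)
xs = [random.gauss(-2, 0.5) for _ in range(200)] + \
     [random.gauss(3, 0.5) for _ in range(200)]
pi, mu, var = em_gmm_1d(xs)
print(sorted(round(m, 1) for m in mu))  # means recovered near -2 and 3
```

Variational Bayes then generalizes this: instead of a point estimate in the M-step, you keep a distribution over the parameters too.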

Some of those references (esp. the VB stuff) can get slightly hairy in terms of the maths level required (depending on your background). Bayesian Computation with R (by Jim Albert), or Crawley's The R Book (for a more frequentist approach), can get you started with R, which saves you from implementing all this stuff yourself, since much of it is already available. This might be your fastest route to writing code that does cool stuff - understand what the algorithm is, use somebody else's implementation, apply it to your own problem.
