Bayesian Methods for the Physical Sciences
A.A. 2010/2011
by Stefano Andreon
You need in advance: a) to be able to draw plots and perform simple operations on numbers and vectors, and b) a working JAGS installation on your computer. The JAGS user manual is here. See my notes here. Files to test your reading of JAGS output are here: CODAchain1.txt and CODAindex.txt.
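To check that you can read JAGS output without relying on R's coda package, a minimal pure-Python sketch of a CODA reader follows. It assumes the standard two-file CODA text layout (an index file with "variable first last" rows, and a chain file with "iteration value" rows); the toy strings below stand in for the linked CODAindex.txt / CODAchain1.txt.

```python
# Minimal sketch of reading JAGS CODA output in pure Python.
# Assumed layout: the index file lists, for each monitored variable, the first
# and last row of the chain file belonging to it; each chain row is
# "iteration value".

def read_coda(index_lines, chain_lines):
    """Return {variable_name: [float samples]} from CODA-formatted text lines."""
    # parse chain rows: keep only the value column
    values = [float(line.split()[1]) for line in chain_lines if line.strip()]
    chains = {}
    for line in index_lines:
        if not line.strip():
            continue
        name, first, last = line.split()
        chains[name] = values[int(first) - 1:int(last)]  # CODA rows are 1-based
    return chains

# Toy example mimicking the CODAindex.txt / CODAchain1.txt layout:
index = ["mass 1 3", "sigma 4 6"]
chain = ["1 0.10", "2 0.12", "3 0.11", "1 2.0", "2 2.1", "3 1.9"]
samples = read_coda(index, chain)
print(samples["mass"])  # the three samples of "mass"
```

For the real files, replace the toy lists with `open("CODAindex.txt").readlines()` and `open("CODAchain1.txt").readlines()`.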
Preliminary program, based on the 2009/2010 program.
Lecture 1)
Probability axioms (including marginalization, and the harm that may follow from using other rules). Carla Bruni & the neutrino mass. Neutrino mass upper limit. Introduction to JAGS.
Student work: re-compute p(T) of the most distant cluster known, JKCS041 (alias marginalization; p(T,nh) is here);
compute the neutrino mass upper limit, with an informative prior and with a
sloppy mass prior. Note the small role of the prior where the likelihood is very
near zero.
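The two student tasks can be sketched numerically on a grid (with made-up numbers, not the JKCS041 or neutrino data): (a) marginalizing a gridded p(T,nh) over the nuisance parameter nh to get p(T), and (b) extracting a 95% upper limit from a posterior that a physical prior truncates at zero.

```python
import numpy as np

# (a) a toy separable 2-D posterior on a (T, nh) grid;
# marginalization = summing over the nuisance axis
T = np.linspace(1, 20, 200)          # temperature grid (toy units)
nh = np.linspace(0, 5, 100)          # nuisance-parameter grid
TT, NN = np.meshgrid(T, nh, indexing="ij")
post2d = np.exp(-0.5 * ((TT - 7) / 2) ** 2 - 0.5 * ((NN - 2) / 1) ** 2)
pT = post2d.sum(axis=1)              # marginal over nh
pT /= pT.sum() * (T[1] - T[0])       # normalize to unit area

# (b) toy mass posterior: Gaussian likelihood centred near zero,
# flat prior on m >= 0 (the prior removes the unphysical m < 0 region)
m = np.linspace(0, 10, 2000)
dm = m[1] - m[0]
like = np.exp(-0.5 * ((m - 0.5) / 1.0) ** 2)
post = like / (like.sum() * dm)
cdf = np.cumsum(post) * dm
upper95 = m[np.searchsorted(cdf, 0.95)]   # 95% upper limit
```

Note how the upper limit (about 2.3 here) sits well above the likelihood peak at 0.5: where the likelihood is very near zero, the prior shape hardly matters.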
Lecture 2)
Poisson (count) model. Combining different measurements. Binomial
model. More on the prior: its precise shape does not matter, but its
overall shape does, alias the Malmquist/Eddington bias (i.e. Bayes'
theorem re-discovered two hundred years later). The harm of forgetting to
account for the existence of boundaries.
Student work: use JAGS to compute the posterior for a Poisson and a Binomial model (dry merger rate).
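Before running JAGS, the two posteriors can be checked on a grid with flat priors (toy counts of my choosing, not the lecture's data): n observed counts give a Gamma(n+1, 1) posterior for a Poisson rate, and k successes out of N give a Beta(k+1, N-k+1) posterior for a Binomial probability (e.g. a dry-merger fraction).

```python
import numpy as np

# Poisson rate: n_obs counts, flat prior on lam >= 0
n_obs = 7
lam = np.linspace(1e-3, 30, 3000)
dlam = lam[1] - lam[0]
post_lam = np.exp(n_obs * np.log(lam) - lam)   # likelihood times flat prior
post_lam /= post_lam.sum() * dlam
mean_lam = (post_lam * lam).sum() * dlam       # Gamma(n+1,1) mean: n_obs + 1

# Binomial probability: k "dry mergers" out of N galaxies, flat prior on [0,1]
k, N = 3, 20
p = np.linspace(0, 1, 2001)
dp = p[1] - p[0]
post_p = p**k * (1 - p)**(N - k)
post_p /= post_p.sum() * dp
mean_p = (post_p * p).sum() * dp               # Beta(k+1,N-k+1) mean: (k+1)/(N+2)
```

The grid means reproduce the conjugate-prior results, which is a useful sanity check on any JAGS run of the same models.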
Lecture 3)
Multi-parameter models. Source plus background (both Poisson).
Combining different measurements of a source flux with different
background levels. Measuring the intrinsic scatter from data with
heteroscedastic (different from point to point) errors. Comparison with
the state-of-the-art standard, Robust. Accounting for a contaminating population. Overdispersed version of the source plus
background model, alias "cosmic variance".
Student work: use JAGS to compute the posterior for the source plus background model, for
combining different measurements, and for computing the cluster velocity
dispersion. Data for the last exercise. How to compute (and plot) the error on the model (shading).
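The source-plus-background posterior can also be sketched on a grid (toy counts, not the exercise data): n_on counts in the source aperture, modelled as Poisson(s + b), and n_off counts in a background region C times larger, modelled as Poisson(C b); the background b is then marginalized away.

```python
import numpy as np

# Toy counts: n_on in the source aperture, n_off in a region C times larger
n_on, n_off, C = 12, 40, 10.0

s = np.linspace(0, 30, 300)[:, None]      # source-intensity grid
b = np.linspace(1e-3, 15, 400)[None, :]   # background-intensity grid
ds = 30 / 299

# joint log-likelihood of the two Poisson measurements, flat priors
logL = (n_on * np.log(s + b) - (s + b)) + (n_off * np.log(C * b) - C * b)
post = np.exp(logL - logL.max())          # unnormalized joint posterior
ps = post.sum(axis=1)                     # marginalize over the background b
ps /= ps.sum() * ds
mean_s = (ps * s[:, 0]).sum() * ds        # posterior mean of the source
```

With these numbers the naive estimate is s = n_on - n_off/C = 8; the posterior mean comes out slightly higher because the prior forbids negative source intensities.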
Lecture 4)
Regression. Remember selection effects! Do you regress to
predict, to infer the parameters, or to establish whether a
trend is there?
Starting easy: no error on the predictor variable, heteroscedastic errors on y, and intrinsic scatter.
Student work: test the teacher's statements by computing yourself E(x|y) and E(y|x) from this sample, generated with JAGS. Its CODAindex is here. Has the fine-structure constant changed? Data courtesy of Molaro et al. (2010, in preparation).
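The E(x|y) versus E(y|x) point can be checked with a simulated sample of my own (not the linked one): when there is scatter, the two regression lines have different slopes, and the x-on-y slope is not the reciprocal of the y-on-x slope.

```python
import numpy as np

# Toy data: true slope 0.5, unit-variance x, intrinsic scatter 0.5 in y
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 5000)
y = 0.5 * x + rng.normal(0.0, 0.5, 5000)

cov = np.cov(x, y)
slope_y_on_x = cov[0, 1] / cov[0, 0]   # E(y|x) slope, near the true 0.5
slope_x_on_y = cov[0, 1] / cov[1, 1]   # E(x|y) slope, near 1.0 here
```

Here var(y) = 0.25 + 0.25 = 0.5, so the x-on-y slope is about 0.5/0.5 = 1.0, not 1/0.5 = 2: regressing "the other way round" and inverting gives a biased slope whenever scatter is present.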
Lecture 5)
Regression, the full problem: heteroscedastic errors on x and y, and
intrinsic scatter (x is a difference of two Poisson variates!). Mixture
of regressions (i.e. regression in the presence of a contaminating
population).
Student work: compute how the mass of clusters depends on richness, using data from Andreon & Hurn
(2010, MNRAS, 404, 1922). Compute the predicted mass (posterior
predictive) of a cluster without a known mass. Finally, perform the
analysis of my latest paper, Andreon (2010, MNRAS, 407, 263). Do it alone: no hints,
no suggestions, just the data!
Compute the stellar and gas baryon fractions, their dependence on mass,
and the stellar+gas baryon fraction. Intrinsic scatter is there,
heteroscedastic errors, different samples for different measurements,
i.e. the routine astronomical job.
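The posterior-predictive step above can be sketched as follows, with hypothetical posterior samples standing in for the chains a JAGS fit of the mass-richness relation would produce (the numbers are invented for illustration, not from Andreon & Hurn): one predictive draw per posterior sample of the intercept, slope, and intrinsic scatter.

```python
import numpy as np

# Hypothetical posterior samples of a mass-richness relation
# log M = alpha + beta * (log richness, centred) + intrinsic scatter
rng = np.random.default_rng(0)
n = 20000
alpha = rng.normal(14.0, 0.05, n)          # intercept samples (log mass)
beta = rng.normal(1.0, 0.1, n)             # slope samples
sigma = np.abs(rng.normal(0.2, 0.02, n))   # intrinsic-scatter samples

log_richness = 0.3                          # the new cluster's centred log richness
# one predictive draw per posterior sample: regression line + scatter draw
pred = alpha + beta * log_richness + rng.normal(0.0, 1.0, n) * sigma
pred_mean, pred_sd = pred.mean(), pred.std()
```

The spread of `pred` combines the parameter uncertainty with the intrinsic scatter, which is why the predictive posterior is wider than the uncertainty of the mean relation alone.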
Homework.
Have a look at my Bayesian primer for astronomers, my Inference Page, my book, and maybe my homepage.
The lectures will be held in April-May 2011 for PhD students, post-docs and staff of the Universita' degli
Studi di Milano-Bicocca and the Universita' degli Studi di Bologna, and are open to all if registered (mail to Stefano Andreon).