This module handbook describes the contents, learning outcomes, methods, and examination type of each module, and links to the current dates for courses and the module examination in the respective sections.
Module version of SS 2019 (current)
Historic module descriptions exist for this module. A module description is valid until it is replaced by a newer one.
|Available module versions|
|SS 2019||SS 2012||WS 2011/2|
MA3402 is a semester module taught in English at Master's level, offered in the summer semester.
This module is included in the following catalogues within the study programs in physics.
- Catalogue of non-physics elective courses
|Total workload||Contact hours||Credits (ECTS)|
|150 h||45 h||5 CP|
Content, Learning Outcome and Preconditions
After successful completion of the module, students:
- know how discrete and continuous random variables and random vectors are generated using statistical software such as R
- understand Bayesian principles, such as prior and posterior distributions
- understand the theory of MCMC algorithms through selected examples
- are able to construct MCMC algorithms to simulate from posterior distributions and to assess convergence of MCMC simulations
- know how to use bootstrap and jackknife methods to estimate standard errors of estimators
- know how to apply the EM algorithm to missing-data problems
- are able to program statistical algorithms in the statistical software package R
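As an illustration of the resampling methods listed above, the nonparametric bootstrap estimate of a standard error can be sketched in a few lines. This is a minimal sketch (written in Python rather than the course's R, purely for illustration); the function name `bootstrap_se` and the toy sample are my own, not part of the module.

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_boot=2000, seed=0):
    """Nonparametric bootstrap estimate of the standard error of `stat`:
    resample the data with replacement, recompute the statistic for each
    resample, and take the standard deviation of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    replicates = [
        stat([data[rng.randrange(n)] for _ in range(n)])  # one bootstrap resample
        for _ in range(n_boot)
    ]
    return statistics.stdev(replicates)

# Hypothetical toy sample, for illustration only.
sample = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9, 2.2, 3.6]
se_boot = bootstrap_se(sample)
# For the sample mean there is a classical benchmark, s / sqrt(n),
# which the bootstrap estimate should roughly reproduce.
se_classical = statistics.stdev(sample) / len(sample) ** 0.5
```

For statistics without a closed-form standard error (medians, ratios, regression coefficients under model misspecification), the same resampling loop applies unchanged, which is the practical appeal of the method.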
Preconditions
Bachelor 2019: MA0009 Introduction to Probability and Statistics, MA2404 Markov Chains, software knowledge in R
Courses, Learning and Teaching Methods and Literature
Courses and Schedule
|VO||2||Computational Statistics [MA3402]||Ankerst, D.||Wed, 08:15–09:45, Interims II 004|
|UE||1||Computational Statistics (Exercise Session) [MA3402]||Ankerst, D.; Selig, K.||dates in groups|
Literature
Tanner, M. A. (1996): Tools for Statistical Inference, 3rd ed. Springer-Verlag, Berlin.
Efron, B., Tibshirani, R.J. (1993): An introduction to the bootstrap. Chapman & Hall, London.
Chernick, M.R. (1999): Bootstrap methods: a practitioner's guide. Wiley, New York.
Gelman, A., Carlin, J.B., Stern, H.S., Rubin, D.B. (2004): Bayesian Data Analysis. Chapman & Hall, London.
Rizzo, M. (2008): Statistical Computing with R. Chapman & Hall/CRC, New York.
For additional topics: Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning, 2nd ed., electronic edition continually updated at https://web.stanford.edu/~hastie/ElemStatLearn/.
Description of exams and course work
The exam may be repeated at the end of the semester.