
Computational Statistics

Module MA3402

This module is offered by the TUM Department of Mathematics.

This module handbook describes the contents, learning outcomes, methods, and examination type, and links to the current dates for courses and the module examination in the respective sections.

Module version of WS 2020/1 (current)

There are historic module descriptions of this module. A module description is valid until replaced by a newer one.

Whether the module’s courses are offered during a specific semester is listed in the section Courses, Learning and Teaching Methods and Literature below.

Available module versions:
  • WS 2020/1
  • SS 2020
  • SS 2019
  • SS 2012
  • WS 2011/2

Basic Information

MA3402 is a semester module at Master's level, taught in English, and offered in the summer semester.

This module is included in the following catalogues within the study programs in physics:

  • Catalogue of non-physics elective courses
Total workload | Contact hours | Credits (ECTS)
150 h          | 45 h          | 5 CP

Content, Learning Outcome and Preconditions


Content

Computational statistics methods are required when analyzing complex data structures. In this course you will learn the basics of recent computational statistics methods such as Markov chain Monte Carlo (MCMC) methods, the expectation-maximization (EM) algorithm, and the bootstrap. Emphasis is placed on both basic theory and applications. In particular, the following topics will be covered:
  • Random variable generation: discrete, continuous, univariate, multivariate, resampling.
  • Numerical methods for integration, root-finding, and optimization.
  • Bayesian inference: posterior distribution, hierarchical models, Markov chains, stationary and limiting distributions.
  • Markov chain Monte Carlo (MCMC) methods: Gibbs sampling, Metropolis-Hastings algorithm, implementation, convergence diagnostics, software for MCMC, model adequacy and model choice.
  • EM algorithm: theory, EM in the exponential family, computation of standard errors.
  • Bootstrap and jackknife methods: empirical distribution and plug-in, bootstrap estimate of standard errors, the jackknife and its relationship to the bootstrap, confidence intervals based on bootstrap percentiles, permutation tests and extensions.
If time permits, these optional topics will be discussed: association rules and the Apriori algorithm for market basket analysis, and an introduction to unsupervised learning with cluster analysis and principal component analysis.
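The course implements these methods in R; as a minimal illustration of the MCMC idea (sketched here in Python for brevity), a random-walk Metropolis-Hastings sampler for a standard normal target could look as follows. The target density, seed, and proposal scale are illustrative assumptions, not part of the course material:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_iter, proposal_sd=1.0):
    """Random-walk Metropolis sampler for a one-dimensional target.

    log_target: log of the (possibly unnormalized) target density.
    """
    x = x0
    samples = []
    for _ in range(n_iter):
        x_prop = x + random.gauss(0.0, proposal_sd)
        # The proposal is symmetric, so the acceptance probability reduces
        # to min(1, pi(x') / pi(x)); compare on the log scale for stability.
        if math.log(random.random()) < log_target(x_prop) - log_target(x):
            x = x_prop
        samples.append(x)
    return samples

random.seed(0)  # fixed seed so the run is reproducible
# Target: standard normal, whose log density is -x^2/2 up to a constant.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_iter=20000)
```

After discarding a burn-in, the empirical mean and standard deviation of the draws should be close to 0 and 1, which is one simple convergence check of the kind discussed in the course.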

Learning Outcome

Upon completion of the module, students
- know how discrete and continuous random variables/vectors are generated using statistical software such as R
- understand Bayesian principles, such as prior and posterior distributions
- understand the theory of MCMC algorithms from selected examples
- are able to construct MCMC algorithms to simulate from the posterior distributions and to assess convergence of MCMC simulations
- know how to use bootstrap and jackknife methods to estimate standard errors of estimators
- know how to apply the EM algorithm to missing data problems
- are able to program statistical algorithms in the statistical software package R
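To make the bootstrap standard-error outcome concrete, here is a hedged sketch (in Python; the course itself works in R, and the helper name `bootstrap_se` and the simulated data are invented for this example):

```python
import random
import statistics

def bootstrap_se(data, statistic, n_boot=2000):
    """Bootstrap estimate of the standard error of a statistic.

    Resamples the data with replacement n_boot times and returns the
    standard deviation of the statistic across the bootstrap replicates.
    """
    n = len(data)
    replicates = [
        statistic([random.choice(data) for _ in range(n)])
        for _ in range(n_boot)
    ]
    return statistics.stdev(replicates)

random.seed(1)  # reproducible resampling
sample = [random.gauss(0.0, 2.0) for _ in range(100)]
se_mean = bootstrap_se(sample, statistics.fmean)
# For the sample mean, the bootstrap SE should be close to s / sqrt(n),
# the textbook formula, which serves as a sanity check here.
```

The same resampling scheme applies to statistics without a closed-form standard error (e.g. the median), which is where the bootstrap earns its keep.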


Preconditions

MA1401 Introduction to Probability, MA2402 Basic Statistics, MA2404 Markov Chains, software knowledge in R.
Bachelor 2019: MA0009 Introduction to Probability and Statistics, MA2404 Markov Chains, software knowledge in R.

Courses, Learning and Teaching Methods and Literature

Courses and Schedule

Please note that course announcements are generally not completed until the preceding semester.

VO 2 Computational Statistics, Czado, C.
Responsible/Coordination: Kraus, D.
Thu, 10:15–11:45, BC1 2.02.01,
and individual or rescheduled dates
UE 1 Computational Statistics (Exercise Session), Czado, C., Kraus, D., dates in groups

Learning and Teaching Methods

The module is offered as lectures with accompanying practice sessions. In the lectures, the contents are presented in a talk with illustrative examples, as well as through discussion with the students. The lectures should motivate the students to carry out their own analysis of the topics presented and to study the relevant literature independently. Accompanying the lectures, practice sessions are offered in which exercise sheets and solutions are made available. In this way, students can deepen their understanding of the methods and concepts taught in the lectures and independently check their progress.


Media

Blackboard and slides


Literature

Gamerman, D. and Lopes, H.F. (2006): Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference. Chapman & Hall/CRC, New York.
Tanner, M.A. (1996): Tools for Statistical Inference, 3rd ed. Springer-Verlag, Berlin.
Efron, B. and Tibshirani, R.J. (1993): An Introduction to the Bootstrap. Chapman & Hall, London.
Chernick, M.R. (1999): Bootstrap Methods: A Practitioner's Guide. Wiley, New York.
Gelman, A., Carlin, J.B., Stern, H.S. and Rubin, D.B. (2004): Bayesian Data Analysis. Chapman & Hall, London.
Rizzo, M. (2008): Statistical Computing with R. Chapman & Hall/CRC, New York.
For additional topics: Hastie, T., Tibshirani, R. and Friedman, J.: The Elements of Statistical Learning, 2nd ed.; an electronic edition is continually updated online.

Module Exam

Description of exams and course work

The module examination is based on a written exam (60 minutes). In the exam, students are asked to write statistical algorithms to solve specific problems similar to those solved in the homework. They may be asked to interpret R code and output, demonstrating that they have successfully learned how to program and how to interpret the output of R packages. They are asked to recall the definitions of the important algorithms, such as the Gibbs sampler, the Metropolis-Hastings algorithm, the EM algorithm, and the bootstrap.

Exam Repetition

The exam may be repeated at the end of the semester.
