Introduction to Deep Learning
This module handbook describes the contents, learning outcomes, teaching methods, and examination type, and links to the current dates for courses and the module examination in the respective sections.
Module version of SS 2017
There are historical module descriptions of this module. A module description is valid until it is replaced by a newer one.
Whether the module’s courses are offered during a specific semester is listed in the section Courses, Learning and Teaching Methods and Literature below.
| Available module versions |
| ------------------------- |
| SS 2018, SS 2017 |
IN2346 is a semester module in English at Master's level, offered in the summer semester.
This module is included in the following catalogues within the physics study programs.
- Catalogue of non-physics elective courses
| Total workload | Contact hours | Credits (ECTS) |
| -------------- | ------------- | -------------- |
| 180 h | 60 h | 6 CP |
Content, Learning Outcome and Preconditions
- Machine learning basics 1: linear classification, maximum likelihood
- Machine learning basics 2: logistic regression, perceptron
- Introduction to neural networks and their optimization
- Stochastic Gradient Descent (SGD) and Back-propagation
- Training Neural Networks Part 1:
regularization, activation functions, weight initialization, gradient flow, batch normalization, hyperparameter optimization
- Training Neural Networks Part 2: parameter updates, ensembles, dropout
- Convolutional Neural Networks, ConvLayers, Pooling, etc.
- Applications of CNNs: e.g., object detection (from MNIST to ImageNet), visualizing CNNs (e.g., DeepDream)
- Overview and introduction to Recurrent networks and LSTMs
- Recent developments in deep learning in the community
- Overview of research and introduction to advanced deep learning lectures.
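To make the early topics concrete, here is a minimal, self-contained sketch (not part of the handbook) of linear classification trained with gradient descent and a hand-derived gradient, i.e., the one-layer case of back-propagation from the topics above. All data and hyperparameters are illustrative; NumPy is assumed. SGD proper would sample mini-batches instead of using the full batch.

```python
import numpy as np

# Illustrative toy example: logistic regression (linear classification via
# maximum likelihood) trained with gradient descent; the gradient is the
# one-layer special case of back-propagation.
rng = np.random.default_rng(0)

# Toy 2-class data: class 0 clustered near (-1, -1), class 1 near (+1, +1).
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate (a hyperparameter)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(100):
    # Forward pass: predicted probability of class 1.
    p = sigmoid(X @ w + b)
    # Backward pass: gradient of the mean cross-entropy loss w.r.t. the logit.
    grad_z = (p - y) / len(y)
    # Gradient-descent parameter update.
    w -= lr * (X.T @ grad_z)
    b -= lr * grad_z.sum()

accuracy = ((sigmoid(X @ w + b) > 0.5) == y).mean()
```

On this linearly separable toy data the classifier reaches near-perfect training accuracy within a few epochs.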
Preconditions:
- MA0902 Analysis for Informatics
- MA0901 Linear Algebra for Informatics
Courses, Learning and Teaching Methods and Literature
Courses and Schedule
| VI | 4 | Introduction to Deep Learning (IN2346) | Avetisyan, A.; Dai, A.; Dendorfer, P.; Leal-Taixe, L.; Lohr, Q.; … (7 in total) |

Tue, 14:00–16:00, MI HS1
Thu, 10:00–12:00, online
plus single or rescheduled dates
Learning and Teaching Methods
The practical sessions are key: students become familiar with deep learning through hours of training and testing, and with frameworks such as PyTorch, so that by the end of the course they are able to solve practical real-world problems with deep learning.
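As a rough illustration of the kind of training loop practised in these sessions, here is a minimal PyTorch sketch (not taken from the course materials); the architecture, toy data, and hyperparameters are all assumptions for illustration only.

```python
import torch
import torch.nn as nn

# Illustrative sketch: a small fully-connected network trained with SGD
# and back-propagation on toy two-class data.
torch.manual_seed(0)

model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Toy data: two Gaussian blobs as a stand-in for a real dataset.
X = torch.cat([torch.randn(50, 2) - 2.0, torch.randn(50, 2) + 2.0])
y = torch.cat([torch.zeros(50, dtype=torch.long),
               torch.ones(50, dtype=torch.long)])

for step in range(200):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(X), y)    # forward pass and loss
    loss.backward()                # back-propagation
    optimizer.step()               # SGD parameter update

accuracy = (model(X).argmax(dim=1) == y).float().mean().item()
```

The same structure (forward pass, loss, `backward()`, optimizer step) carries over to the CNN and RNN exercises; only the model and data loading change.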
Description of exams and course work
- After each practical session, students submit their working code to the teaching assistant for evaluation. Students who successfully complete all practical assignments are awarded a bonus.
The exam takes the form of a written test. The questions assess familiarity with the basic concepts and algorithms of deep learning, in particular how to train neural networks. Students demonstrate the ability to design, train, and optimize neural network architectures, and to apply deep learning frameworks to real-world problems (e.g., in computer vision). An important aspect is that students understand the basic theory behind the training process, which is mainly coupled with optimization strategies involving back-propagation and SGD. Students can use networks to solve classification and regression tasks (partly motivated by visual data).
The exam may be repeated at the end of the semester.