Convex Optimization

Module EI74351

This module is offered by the TUM Department of Electrical and Computer Engineering.

This module handbook describes the contents, learning outcomes, teaching methods and examination type, and links to the current dates for courses and the module examination in the respective sections.

Basic Information

EI74351 is a semester module taught in English at Master's level; it is offered in the winter semester.

This module is included in the following catalogues within the study programs in physics.

  • Catalogue of non-physics elective courses
Total workload: 180 h
Contact hours: 60 h
Credits (ECTS): 6 CP

Content, Learning Outcome and Preconditions

Content

Introduction: basic definitions and fundamentals, problem statement.
Convex analysis: convex sets, convex functions.
Linear programming: extremal points and directions, simplex algorithm.
Optimality conditions: Fritz John conditions, Karush-Kuhn-Tucker conditions, constraint qualifications.
Lagrangian duality: duality theorems.
Algorithms: general concept, unconstrained optimization, constrained optimization.
Solutions for the dual problem: subgradient method, cutting plane algorithm.
Interior-point method: barrier functions, IP algorithm.
Applications: problems from multi-user information theory, resource allocation, parameter optimization in layered and distributed communication systems.
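The subgradient method listed above can be illustrated with a minimal sketch. The objective f(x) = |x - 2|, the starting point, and the diminishing step-size rule below are illustrative choices, not taken from the course material:

```python
# Sketch: subgradient method with diminishing step sizes for the
# nondifferentiable convex function f(x) = |x - 2| (illustrative example).

def subgradient_step(x, k):
    # A subgradient of |x - 2| is sign(x - 2); take 0 at the kink x = 2.
    g = 0.0 if x == 2.0 else (1.0 if x > 2.0 else -1.0)
    alpha = 1.0 / (k + 1)  # diminishing steps: alpha_k -> 0, sum alpha_k = inf
    return x - alpha * g

x = 5.0
for k in range(200):
    x = subgradient_step(x, k)

print(abs(x - 2.0))  # iterates approach the minimizer x* = 2
```

Because f is not differentiable at its minimizer, a fixed step size would keep the iterates oscillating; the diminishing rule is what drives the oscillation amplitude to zero.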

Learning Outcome

After successfully passing the module, the students are able to characterize given mathematical optimization problems in terms of convex analysis on convex sets and convex functions and to derive optimality conditions related to F. John (FJ) and to W. Karush, H. W. Kuhn and A. W. Tucker (KKT) for equality and inequality constraints, including the discussion of constraint qualifications. Furthermore, the students are able to derive and apply the weak and strong duality theorems and the saddle point theorem for the analysis and the appropriate reformulation of given optimization problems into primal and dual optimization problems and their related primal reconstruction algorithms. The students are able to derive and apply gradient- and subgradient-based optimization methods and to take into account the respective algorithms for step-size control. The students are also able to derive and apply the cutting plane method to reformulate convex optimization problems into a series of linear master problems. Finally, the students are able to derive and apply a variety of state-of-the-art algorithms, namely the simplex algorithm for linear programs, the gradient descent and Newton algorithms for convex optimization problems, the Armijo-Goldstein rule for step-size control, and basic versions of interior-point solvers.
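Gradient descent with Armijo-style backtracking, as named in the outcome above, can be sketched as follows. The quadratic objective and the parameters (sigma = 0.1, beta = 0.5) are illustrative assumptions:

```python
# Sketch: gradient descent with Armijo backtracking line search on the
# convex quadratic f(x) = x1^2 + 4*x2^2 (illustrative example).

def f(x):
    return x[0] ** 2 + 4.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 8.0 * x[1]]

def armijo_step(x, sigma=0.1, beta=0.5):
    g = grad(x)
    t = 1.0
    # Backtrack until the sufficient-decrease (Armijo) condition holds:
    #   f(x - t*g) <= f(x) - sigma * t * ||g||^2
    while f([x[0] - t * g[0], x[1] - t * g[1]]) > f(x) - sigma * t * (g[0] ** 2 + g[1] ** 2):
        t *= beta
    return [x[0] - t * g[0], x[1] - t * g[1]]

x = [4.0, -2.0]
for _ in range(50):
    x = armijo_step(x)

print(f(x))  # approaches the minimum value 0 at (0, 0)
```

The backtracking loop guarantees decrease of f at every iteration without requiring knowledge of the gradient's Lipschitz constant, which is the practical appeal of the Armijo-Goldstein rule over a fixed step size.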

Preconditions

Linear Algebra and Calculus.

Courses, Learning and Teaching Methods and Literature

Courses and Schedule

Type: VI (4 SWS)
Title: Convex Optimization
Lecturer(s): Hotz, M.; Utschick, W.
Dates: Wed, 13:15–14:45, N1070; Fri, 11:30–13:00, N1095

Learning and Teaching Methods

Learning method:
In addition to the students' individual study methods, knowledge is consolidated through repeated practice in exercises and tutorials.
Teaching method:
During the lectures, students are instructed in a teacher-centered style. The exercises are held in a student-centered way.

Media

The following kinds of media are used:
- Presentations
- Lecture notes
- Exercises with solutions, available for download

Literature

The following literature is recommended:
- M. S. Bazaraa, H. D. Sherali, and C. M. Shetty. Nonlinear Programming: Theory and Algorithms. Wiley, 2006.
- D. Bertsekas and A. Nedic. Convex Analysis and Optimization. Athena Scientific, 2003.
- S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.

Module Exam

Description of exams and course work

Written examination, 90 min (evaluation of the basic theoretical concepts presented in the lecture and tutorials). Up to 20% of the examination can be conducted in the form of multiple-choice questions. With a maximum of 5 sheets of A4 paper as auxiliary material, the students formulate, classify, and solve convex optimization problems. They answer comprehension questions and discuss the different algorithms used to obtain the solutions.

Exam Repetition

The exam can be retaken in the following semester.
