MA-INF 1213: Randomized Algorithms & Probabilistic Analysis 2017

Oral Exams

Oral exams took place from the 2nd to the 4th of August.

Re-exams take place on the 27th of September. Please schedule the time of your exam with Christiane Andrade in room II.62.


When                 Where                  Start     Lecturer
Monday, 10:15-11:45  LBH / Hörsaal III.03a  April 20  Schmidt (until June), Röglin (from June)
Thurs., 10:15-11:45  LBH / Hörsaal III.03a  April 20  Schmidt (until June), Röglin (from June)

Course Material

The Lecture Notes cover the lecture, except possibly for one new chapter. Part I is largely based on the following two books:

  • [MR95] Rajeev Motwani and Prabhakar Raghavan. Randomized Algorithms. ISBN: 978-0521474658, Cambridge University Press, 1995.
  • [MU05] Michael Mitzenmacher and Eli Upfal. Probability and Computing. ISBN: 978-0521835404, Cambridge University Press, 2005.


Date      Contents
April 20  1 Discrete Event Spaces and Probabilities
          1.1 Discrete Probability Spaces
          1.2 Independent Events & Conditional Probability
April 24  1.2 (continued) Conditional Probability
          1.3 Applications
          1.3.1 Minimum Cut, Karger's Contract Algorithm
April 27  1.3.1 (continued) Karger's Contract Algorithm
          1.3.1 FastCut
May 1     no lecture
May 4     1.3.1 (continued) FastCut
          1.3.1 Reservoir Sampling
May 8     2.1 Random Variables and Expected Values
          2.1.1 Integer Random Variables
May 11    2.1.2 Conditional Expectation
          2.2 Binomial and Geometric Distribution
          2.3 Applications
          2.3.1 Randomized QuickSort
May 15    2.3.2 Randomized Approximation Algorithms
          3 Concentration Bounds: Markov's Inequality
May 18    3.1 Chebyshev's Inequality
          3.3 Applications
          3.3.1 A sublinear algorithm
May 22    3.3.1 A sublinear algorithm
May 25    no lecture
May 29    3.2 Chernoff/Rubin bounds
          3.3.2 Estimating a parameter
          4 Random Walks
          4.1 Useful math
June 1    4.1 (continued) Useful math
          4.2 Applications
          4.2.1 An Algorithm for 2-SAT
June 5    no lecture
June 8    no lecture
June 12   4.2.2 Algorithms for 3-SAT
June 15   no lecture
June 19   6 Knapsack Problem and Multiobjective Optimization
          6.1 Nemhauser-Ullmann Algorithm
June 22   6.2 Number of Pareto-optimal Solutions
          6.2.1 Upper Bound
June 26   6.2.1 Upper Bound (continued)
          6.3 Multiobjective Optimization
June 29   6.2.2 Lower Bound
July 3    6.4 Core Algorithms
July 6    7 Smoothed Complexity of Binary Optimization Problems
July 10   7 Smoothed Complexity of Binary Optimization Problems (continued)
July 13   8 Successive Shortest Path Algorithm
          8.1 The SSP Algorithm
          8.1.1 Elementary Properties
          8.2 Smoothed Analysis
          8.2.1 Terminology and Notation
          8.2.2 Proof of the Upper Bound
July 17   8.2.3 Further Properties of the SSP Algorithm
          8.2.4 Proof of Lemma 8.7
July 20   9 The 2-Opt Algorithm for the TSP
          9.1 Overview of Results
          9.2 Polynomial Bound for φ-Perturbed Graphs
          9.3 Improved Analysis
July 24   9.3 Improved Analysis (continued)
          10 The k-Means Method


When                   Where       Start    Lecturer
Thursday, 13:00-13:45  LBH, II.57  May 4th  Schmidt
Thursday, 14:15-15:00  LBH, II.57  May 4th  Schmidt

Problem Sets


The lecture has two parts. In the first part, we consider the design and analysis of randomized algorithms. Many algorithmic problems can be solved more efficiently when randomized decisions are allowed. Additionally, randomized algorithms are often easier to design and analyze than their (known) deterministic counterparts. For example, we will see an elegant randomized algorithm for the minimum cut problem. Randomized algorithms can also be more robust on average; randomized Quicksort is a classic example.
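
As a small taste of this first part, here is a minimal Python sketch of Karger's contraction algorithm for minimum cut (Section 1.3.1 in the schedule above). The function names, the union-find representation, and the number of repetitions are illustrative choices, not taken from the lecture notes: a single run of the contraction finds a minimum cut only with probability at least 2/(n(n-1)), so it is repeated to boost the success probability.

    import random

    def karger_contract(edges, n):
        """One run of Karger's contraction algorithm on a connected multigraph.

        edges: list of (u, v) pairs over vertices 0, ..., n-1.
        Returns the size of the cut produced by this run.
        """
        parent = list(range(n))          # union-find over contracted super-vertices

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        remaining = n
        edges = list(edges)              # work on a copy
        while remaining > 2:
            u, v = random.choice(edges)  # pick a uniformly random remaining edge
            ru, rv = find(u), find(v)
            if ru == rv:                 # edge became a self-loop: discard it
                edges.remove((u, v))
                continue
            parent[ru] = rv              # contract the chosen edge
            remaining -= 1
        # Edges whose endpoints lie in different super-vertices form the cut.
        return sum(1 for u, v in edges if find(u) != find(v))

    def min_cut(edges, n, runs=100):
        """Repeat the contraction to amplify the success probability."""
        return min(karger_contract(edges, n) for _ in range(runs))

    # Example: a cycle on four vertices has minimum cut size 2.
    print(min_cut([(0, 1), (1, 2), (2, 3), (3, 0)], n=4))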

The analysis of randomized algorithms builds on a set of powerful tools. We will get to know basic tools from probability theory, very useful tail inequalities, and techniques to analyze random walks and Markov chains. We apply these techniques to develop and analyze algorithms for important algorithmic problems like sorting and k-SAT.
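
As an example of the random-walk technique, the 2-SAT algorithm from Section 4.2.1 of the schedule can be sketched in a few lines: start with an arbitrary assignment and, as long as some clause is unsatisfied, flip a uniformly random variable of such a clause; for a satisfiable formula, 2n² flips succeed with probability at least 1/2. The following Python sketch uses an illustrative literal encoding (signed integers) that is not taken from the lecture notes.

    import random

    def random_walk_2sat(clauses, num_vars, max_steps=None):
        """Random-walk algorithm for 2-SAT.

        clauses: list of 2-tuples of literals; literal i > 0 stands for x_i,
                 literal -i for the negation of x_i (variables 1, ..., num_vars).
        Returns a satisfying assignment as a dict, or None if none was found
        within max_steps flips.
        """
        if max_steps is None:
            # 2*n^2 flips find a satisfying assignment of a satisfiable
            # formula with probability at least 1/2.
            max_steps = 2 * num_vars * num_vars
        assignment = {i: False for i in range(1, num_vars + 1)}  # arbitrary start

        def satisfied(lit):
            return assignment[abs(lit)] == (lit > 0)

        for _ in range(max_steps):
            unsat = [c for c in clauses if not (satisfied(c[0]) or satisfied(c[1]))]
            if not unsat:
                return assignment
            # Flip a uniformly random variable of some unsatisfied clause.
            lit = random.choice(random.choice(unsat))
            assignment[abs(lit)] = not assignment[abs(lit)]
        return None

    # Example: (x1 or x2) and (not x1 or x2); x2 = True satisfies both clauses.
    print(random_walk_2sat([(1, 2), (-1, 2)], num_vars=2, max_steps=1000))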

Statements about randomized algorithms are proven to hold either in expectation or with high probability over the random choices. This deviates from classical algorithm analysis but is still a worst-case analysis at its core. In the second part of the lecture, we learn about the probabilistic analysis of algorithms. There are a number of important problems and algorithms for which worst-case analysis does not provide useful or empirically accurate results. One prominent example is the simplex method for linear programming, whose worst-case running time is exponential, while in practice it runs in near-linear time on almost all inputs of interest. Another example is the knapsack problem. While this problem is NP-hard, it is a very easy optimization problem in practice, and even very large instances with millions of items can be solved efficiently. The reason for this discrepancy between worst-case analysis and empirical observations is that, for many algorithms, worst-case instances have an artificial structure and hardly ever occur in practical applications.

In smoothed analysis, one does not study the worst-case behavior of an algorithm but its (expected) behavior on random or randomly perturbed inputs. We will prove, for example, that there are algorithms for the knapsack problem whose expected running time is polynomial if the profits or weights are slightly perturbed at random. This shows that instances on which these algorithms require exponential running time are fragile with respect to random perturbations, and even a small amount of randomness suffices to rule out such instances with high probability. Hence, smoothed analysis can be seen as an explanation for why these algorithms work well in practice. We will also apply smoothed analysis to the simplex method, clustering problems, the traveling salesman problem, etc.
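
The knapsack statement above refers to the Nemhauser-Ullmann algorithm (Section 6.1 in the schedule), which computes the set of Pareto-optimal solutions item by item: adding an item doubles the candidate list, and dominated candidates are pruned, so the running time is polynomial whenever the number of Pareto-optimal solutions is polynomial, which holds in expectation for perturbed inputs. The following Python sketch uses an illustrative representation of solutions as (profit, weight) pairs; it is not taken from the lecture notes.

    def nemhauser_ullmann(items):
        """Nemhauser-Ullmann algorithm: Pareto-optimal knapsack solutions.

        items: list of (profit, weight) pairs with nonnegative entries.
        Returns the Pareto-optimal (profit, weight) pairs over all subsets,
        sorted by weight; the running time is polynomial in the number of
        Pareto-optimal solutions encountered.
        """
        pareto = [(0, 0)]                                    # the empty solution
        for profit, weight in items:
            # Candidates: previous solutions, with and without the new item.
            shifted = [(p + profit, w + weight) for p, w in pareto]
            merged = sorted(pareto + shifted, key=lambda s: (s[1], -s[0]))
            # Sweep by increasing weight, keeping only strictly improving profits.
            pareto, best_profit = [], -1
            for p, w in merged:
                if p > best_profit:
                    pareto.append((p, w))
                    best_profit = p
        return pareto

    # Example with three items: the result lists all undominated trade-offs.
    print(nemhauser_ullmann([(3, 2), (2, 1), (4, 4)]))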
