
Concentration of Measure and Applications to Analysis of Algorithms

A BRICS Mini-Course
September 5, 8, 10, 15, 17 and 19, 1997

Lectures by
Devdatt Dubhashi

Alessandro Panconesi

Course Contents

The study of the tail probabilities of sums of random variables, P[X1 + X2 + ... + Xn > t], is both a classical problem in probability theory and a very important tool for the analysis of probabilistic algorithms. In the latter setting, however, we are often confronted with two special problems: first, one is required to consider functions other than the sum, and second, the variables under consideration are not necessarily independent. In this mini-course we focus on concepts and methods that allow classical results of probability theory, derived for sums under the assumption that the variables involved are independent, to be extended or modified to these settings, providing a tool-kit for the high-probability analysis of randomised algorithms. These tools will be illustrated by applications to problems in diverse areas such as data structures, random structures, distributed computing, computational learning theory and computational biology.
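By way of orientation (a standard derivation, not part of the announcement itself), essentially all of the bounds treated in the course stem from the exponential-moment, or Chernoff, method: apply Markov's inequality to e^{lambda X} and optimise over lambda.

```latex
% The generic Chernoff (exponential-moment) method: for any $\lambda > 0$,
% Markov's inequality applied to $e^{\lambda X}$ gives
\[
  \Pr[X_1 + \cdots + X_n > t]
    \;\le\; e^{-\lambda t}\,
       \mathbb{E}\!\left[e^{\lambda (X_1 + \cdots + X_n)}\right]
    \;=\; e^{-\lambda t}\prod_{i=1}^{n}\mathbb{E}\!\left[e^{\lambda X_i}\right].
\]
% The product step is exactly where independence of the $X_i$ is used;
% optimising over $\lambda$ then yields the various Chernoff-Hoeffding
% bounds. The course's theme is what survives when that step fails.
```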

Course Outline

The first lecture will introduce the themes of the course and outline its contents. This lecture will be non-technical and widely accessible, and will last one academic hour. The remaining lectures will last for two academic hours each. Each such lecture will be split into two units, one developing the necessary tools and the other illustrating the tools with applications.


Friday September 5, 1997, 15:15-16:00 in Auditorium D4


A general survey of results on the tails of sums in classical probability, and of the use of tail-probability bounds in the analysis of randomised algorithms, highlighting how in the latter we are usually required to consider more general functions of random variables than their sums, and how the assumption of independence cannot be made.

Monday September 8, 1997, 15:15-16:00 in Auditorium D4

Chernoff-Hoeffding Bounds

The central tools for the high-probability analysis of algorithms are the Chernoff-Hoeffding bounds. Various useful forms of these bounds will be derived, illustrating the basic technique. Examples will be given to demonstrate their use in the high-probability analysis of algorithms.
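As a small numerical sanity check (a sketch for orientation, not taken from the course notes), one well-known form of these bounds is Hoeffding's: for n independent fair coin flips with sum S, P[S >= n/2 + t] <= exp(-2t^2/n). Comparing the exact binomial tail against the bound shows how sharply the tail decays:

```python
# Compare the exact binomial tail with the Hoeffding form of the
# Chernoff-Hoeffding bound: for n independent fair coin flips X_1..X_n
# with sum S,  P[S >= n/2 + t] <= exp(-2 t^2 / n).
import math

def binom_tail(n: int, k: int, p: float = 0.5) -> float:
    """Exact P[Bin(n, p) >= k], by summing the binomial pmf."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(k, n + 1))

def hoeffding_bound(n: int, t: float) -> float:
    """Hoeffding's bound on P[S >= n/2 + t] for n fair coin flips."""
    return math.exp(-2 * t * t / n)

n = 100
for t in (5, 10, 20):
    exact = binom_tail(n, n // 2 + t)
    bound = hoeffding_bound(n, t)
    assert exact <= bound
    print(f"t={t:2d}: exact tail {exact:.3e} <= bound {bound:.3e}")
```

Both the exact tail and the bound fall off exponentially in t^2/n, which is what makes these inequalities so effective for "with high probability" statements.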

Wednesday September 10, 1997, 15:15-17:00 in Auditorium D4

Chernoff-Hoeffding Bounds II

More applications of the CH bounds.

Monday September 15, 1997, 15:15-17:00 in Auditorium D4

Negative Dependence

A commonly occurring situation of dependence in the analysis of algorithms, where intuitively the CH bounds can be expected to hold, is that of negative dependence: when one set of variables is "high", a different set is "low". A small general theory of such dependence will be developed, and it will be shown that the CH bounds can be applied unchanged. Examples of applications will be given.
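A tiny concrete instance of this intuition (an illustration of the general idea, not an example from the lectures) is sampling without replacement: if we draw from a population of r ones and b zeros, the indicator that one draw is a "one" is negatively correlated with the indicator for any other draw. The sketch below verifies this by exact enumeration over all orderings:

```python
# Negative dependence in miniature: drawing without replacement from a
# population of r "ones" and b "zeros", the indicators
# X_i = (i-th draw is a one) are negatively correlated -- one draw being
# "high" makes the others "low". Verified by exhaustive enumeration.
from itertools import permutations

def draw_covariance(r: int, b: int) -> float:
    """Exact Cov(X_1, X_2) for the first two draws without replacement
    from a population of r ones and b zeros."""
    population = [1] * r + [0] * b
    orders = list(permutations(population))
    n = len(orders)
    e1 = sum(o[0] for o in orders) / n          # E[X_1]
    e2 = sum(o[1] for o in orders) / n          # E[X_2]
    e12 = sum(o[0] * o[1] for o in orders) / n  # E[X_1 * X_2]
    return e12 - e1 * e2

cov = draw_covariance(3, 2)
print(f"Cov(X_1, X_2) = {cov:.4f}")
assert cov < 0  # drawing a one first makes a one second less likely
```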

Wednesday September 17, 1997, 15:15-17:00 in Auditorium D4

Martingales and The Method of Bounded Differences

A classical body of concepts and techniques of probability theory, invoked for dependence such as that arising in "casino gambling" situations, is the theory of martingales. An elementary introduction to this theory will be given, illustrated with applications to the analysis of random structures and algorithms.
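The flavour of the martingale tail bounds can be seen in the simplest example (a numerical sketch for orientation, not taken from the course material): a symmetric +-1 random walk S_n is a martingale with bounded differences |S_i - S_(i-1)| <= 1, and the Azuma-Hoeffding inequality gives P[S_n >= t] <= exp(-t^2/(2n)). The code compares this bound with the exact tail:

```python
# Azuma-Hoeffding for the simplest martingale: a symmetric +-1 random
# walk S_n with bounded differences |S_i - S_{i-1}| <= 1 satisfies
#     P[S_n >= t] <= exp(-t^2 / (2n)).
# Exact tail via S_n = 2*Bin(n, 1/2) - n.
import math

def walk_tail(n: int, t: int) -> float:
    """Exact P[S_n >= t] for the +-1 walk."""
    k = math.ceil((n + t) / 2)
    return sum(math.comb(n, j) for j in range(k, n + 1)) / 2**n

def azuma_bound(n: int, t: int) -> float:
    return math.exp(-t * t / (2 * n))

n = 100
for t in (10, 20, 30):
    assert walk_tail(n, t) <= azuma_bound(n, t)
    print(f"t={t}: exact {walk_tail(n, t):.3e} <= Azuma {azuma_bound(n, t):.3e}")
```

The point of the method of bounded differences is that the same exp(-t^2 / (2 sum c_i^2)) shape holds for any function of the underlying variables that changes by at most c_i when the i-th variable changes, with no independence structure beyond the martingale property.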

Friday September 19, 1997, 15:15-17:00 in Auditorium D4

Martingales and The Method of Bounded Differences, II

Further development of martingale inequalities and applications.

CH Bounds for Limited Independence (if time permits)

CH-like bounds can be obtained under weaker assumptions of independence, for instance when any subset of variables of some bounded size is independent (even though the entire set may not be).
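A classical construction showing that such limited independence is genuinely weaker than full independence (offered here as an illustration; it is not stated in the announcement): from k independent fair bits, the XORs over all nonempty subsets yield 2^k - 1 bits that are pairwise independent but clearly not mutually independent. The sketch below checks pairwise independence exhaustively:

```python
# From k independent fair bits, the XORs over all nonempty subsets give
# 2^k - 1 derived bits that are pairwise independent (every pair is
# uniform on {0,1}^2) but not fully independent. Verified by exhaustive
# enumeration over all 2^k seeds.
from itertools import product

def subset_xor_bits(seed):
    """All XORs of nonempty subsets of the seed bits, indexed by mask."""
    k = len(seed)
    out = []
    for mask in range(1, 2**k):
        x = 0
        for i in range(k):
            if mask >> i & 1:
                x ^= seed[i]
        out.append(x)
    return out

def pairwise_independent(k: int) -> bool:
    """Check every pair (Y_a, Y_b), a != b, is uniform on {0,1}^2."""
    samples = [subset_xor_bits(seed) for seed in product((0, 1), repeat=k)]
    m = len(samples[0])  # 2^k - 1 derived bits
    for a in range(m):
        for b in range(a + 1, m):
            counts = {}
            for s in samples:
                key = (s[a], s[b])
                counts[key] = counts.get(key, 0) + 1
            # each of the four outcomes must occur equally often
            if any(counts.get(v, 0) != len(samples) // 4
                   for v in [(0, 0), (0, 1), (1, 0), (1, 1)]):
                return False
    return True

assert pairwise_independent(3)
print("2^3 - 1 = 7 derived bits are pairwise independent")
```

Such constructions need only k truly random bits to produce exponentially many pairwise-independent ones, which is why CH-like bounds for limited independence are useful for derandomisation.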


The course will be of interest to computer scientists interested in the analysis of randomised algorithms and randomised complexity, and to probabilists interested in applications to discrete algorithms. The first lecture is intended to be accessible to as wide an audience as possible; the remaining lectures will become progressively more technically involved. However, the only prerequisite for the course is a general familiarity with the elementary concepts of discrete probability (random variables, expectations) and randomised algorithms.