Randomness and Computation

A BRICS Mini-Course
May 2-5, 8 and 9, 1995

Lectures by
Aravind Srinivasan
National University of Singapore


Course Contents

The main course will consist of Lectures 1, 2 and 3 as listed in the programme below; the contents of the lectures are outlined there. Lecture 1 in particular is intended to be accessible to as wide an audience as possible. In addition, there will be further sessions (Lectures 4-6 below), covering the remaining topics, intended for people interested in pursuing specialised research in the area. Special announcements will appear for these sessions.

About the Lecturer

I was born in India, received my undergraduate degree from the Indian Institute of Technology (Madras), and my PhD from Cornell University, USA, in 1993. I spent a year and a half as a joint postdoc at DIMACS and the Institute for Advanced Study (USA), and am now on a sequence of visiting positions at the Weizmann Institute, the University of Warwick, and the Max-Planck Institute. I will be taking up a lectureship at the National University of Singapore in June '95.

My hobbies and interests are music, swimming, teaching, trying to study mathematics-related subjects, and dabbling with philosophy, of which I don't know much.

In terms of research, I am interested in theoretical computer science in general, and in all aspects of randomized computation in particular. I am especially intrigued by two aspects of the power that randomness and ``probabilistic thinking'' seem to add to computing:

(i) The area of de-randomization: coming up with general techniques to ``automatically remove the randomness'' from a class of randomized algorithms. As is well known, this has led, for quite a few natural problems, to the best-known deterministic algorithm being obtained by ``randomization plus de-randomization''.

(ii) The power that random choices seem to add to distributed computation under the constraint of locality: how local random choices seem, with high probability, to combine into a consistent global picture.

My most recent work has involved a mixture of results in randomized computation: running randomized algorithms using very weak sources of randomness, randomized distributed algorithms, de-randomization and explicit constructions, approximation algorithms via randomization, and the computational applications of tools from probabilistic combinatorics. I have also worked on graph algorithms (sequential, parallel and distributed) and on scheduling theory.

I am looking forward to a pleasant interaction with all of you.

Programme

Tuesday May 2, 1995, 09:15-11:00 in Colloquium B2

Lecture 1.

We will give a quick overview of the role of randomness in computation, and introduce some simple required tools (linearity of expectation, large-deviation bounds, etc.) via the color-coding method of Alon, Yuster and Zwick and the randomized rounding approach of Raghavan and Thompson. This lecture will be introductory in nature.
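
For readers who like to experiment beforehand, here is a small illustrative Python sketch of the color-coding idea for finding a simple path on k vertices. The graph representation, parameters and function names are hypothetical choices made for this example, not material from the lecture.

    import random

    def colorful_kpath_trial(adj, k, rng):
        # One color-coding trial: color each vertex uniformly at random with
        # one of k colors, then use dynamic programming over color subsets
        # (bitmasks) to look for a path on k vertices with all-distinct colors.
        n = len(adj)
        color = [rng.randrange(k) for _ in range(n)]
        # dp[v] = set of color-bitmasks S such that some path ending at v
        # uses exactly the colors in S (hence |S| distinct vertices).
        dp = [{1 << color[v]} for v in range(n)]
        for _ in range(k - 1):                      # grow paths one vertex at a time
            new_dp = [set() for _ in range(n)]
            for v in range(n):
                bit_v = 1 << color[v]
                for u in adj[v]:
                    for S in dp[u]:
                        if not (S & bit_v):         # v's color is still unused
                            new_dp[v].add(S | bit_v)
            dp = new_dp
        full = (1 << k) - 1
        return any(full in dp[v] for v in range(n))

    def has_k_vertex_path(adj, k, trials=200, seed=0):
        # A single trial succeeds with probability at least k!/k^k (about e^-k)
        # whenever a k-vertex simple path exists, so repeating roughly
        # e^k * log(1/delta) times finds one with probability at least 1 - delta.
        rng = random.Random(seed)
        return any(colorful_kpath_trial(adj, k, rng) for _ in range(trials))

    # Toy usage: a 5-cycle certainly contains a simple path on 4 vertices.
    adj = [[1, 4], [0, 2], [1, 3], [2, 4], [3, 0]]
    print(has_k_vertex_path(adj, 4))   # expected: True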

Wednesday May 3, 1995, 14:15-16:00 in Colloquium D

Lecture 2.

One of the major applications of randomization to computation is in symmetry-breaking, especially in parallel and distributed computation. There are beautiful theoretical problems here, and given the clear role for networks and distributed computation in the future, all of this is also bound to have an impact on practice. We will illustrate distributed symmetry-breaking through a type of contention-resolution problem that arises in multiple-access channels (e.g., ALOHA, Ethernet), and more recently in PRAM emulation and in optical computing. We will then present and analyze some recent results on this problem.
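
To make the contention-resolution setting concrete, here is a toy Python simulation in the spirit of slotted ALOHA. The parameters and names are hypothetical; the results discussed in the lecture concern more refined protocols.

    import random

    def slotted_aloha(n_stations=50, p=None, slots=10_000, seed=0):
        # Symmetric contention resolution on a shared channel: in each time
        # slot every backlogged station transmits independently with
        # probability p, and the slot is useful only if exactly one station
        # transmits.  With p = 1/n the per-slot success rate approaches
        # (1 - 1/n)^(n-1), roughly 1/e.
        rng = random.Random(seed)
        p = p if p is not None else 1.0 / n_stations
        successes = 0
        for _ in range(slots):
            transmitters = sum(rng.random() < p for _ in range(n_stations))
            if transmitters == 1:
                successes += 1
        return successes / slots

    print(slotted_aloha())   # roughly 0.37 for p = 1/n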

Thursday May 4, 1995, 15:15-17:00 in Colloquium B4

TALK: Improved Approximations of Packing and Covering Problems

This will be a talk for people interested in algorithms, specifically randomised approximation algorithms.

Several important NP-hard combinatorial optimization problems can be posed as packing/covering integer programs; the randomized rounding technique of Raghavan and Thompson is a powerful tool for approximating them well. We present one elementary unifying property of all these integer programs, and use the FKG correlation inequality to derive an improved analysis of randomized rounding on them. This also yields a pessimistic estimator, and hence deterministic polynomial-time algorithms for these problems with approximation guarantees significantly better than those previously known.

The talk will be self-contained and will not assume any prior knowledge of this area.
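
As a rough illustration of randomized rounding for a covering problem, here is a hypothetical Python sketch for set cover. It assumes a feasible fractional LP solution is already available, and uses only the standard logarithmic boosting, not the improved FKG-based analysis of the talk; all names and parameters are invented for this example.

    import math
    import random

    def round_cover(sets, frac_x, n_elements, seed=0):
        # frac_x is assumed to be a feasible fractional LP solution computed
        # elsewhere; pick each set independently with probability
        # min(1, ln(n) * x_j), then patch up any element left uncovered.
        rng = random.Random(seed)
        boost = math.log(max(n_elements, 2))
        chosen = [j for j, xj in enumerate(frac_x)
                  if rng.random() < min(1.0, boost * xj)]
        covered = set().union(*(sets[j] for j in chosen)) if chosen else set()
        for e in range(n_elements):
            if e not in covered:
                # fall back on the set containing e with the largest LP value
                jj = max(range(len(sets)),
                         key=lambda j: frac_x[j] if e in sets[j] else -1.0)
                chosen.append(jj)
                covered |= sets[jj]
        return chosen

    # Toy usage: three elements, three sets, and a made-up fractional solution.
    sets = [{0, 1}, {1, 2}, {0, 2}]
    print(round_cover(sets, frac_x=[0.5, 0.5, 0.5], n_elements=3))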

Friday May 5, 1995, 09:15-11:00 in G 3.3

Lecture 3.

One of the surprising outcomes of research in randomized computation is the area of de-randomization: coming up with general techniques to ``automatically remove the randomness'' from a class of randomized algorithms. This has led, for quite a few natural problems, to the best-known deterministic algorithm being obtained by ``randomization plus de-randomization''. We shall present some of these tools, such as limited independence, small-bias probability spaces and the method of conditional probabilities, and discuss several of their applications.
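
To give a flavour of one of these tools, here is a hypothetical Python sketch of the method of conditional probabilities (expectations) applied to the textbook randomized MAX-CUT algorithm. It illustrates the technique only, and is not an example taken from the lecture.

    def derandomized_maxcut(n, edges):
        # Placing each vertex on a uniformly random side cuts half of the
        # edges in expectation.  Fixing the vertices one by one so that the
        # conditional expectation never decreases de-randomizes this, and
        # deterministically produces a cut with at least |E|/2 edges.
        side = {}
        for v in range(n):
            # Compare the two placements of v against already-placed
            # neighbours; unplaced neighbours contribute 1/2 either way.
            gain0 = sum(1 for (a, b) in edges
                        if (a == v and b in side and side[b] == 1)
                        or (b == v and a in side and side[a] == 1))
            gain1 = sum(1 for (a, b) in edges
                        if (a == v and b in side and side[b] == 0)
                        or (b == v and a in side and side[a] == 0))
            side[v] = 0 if gain0 >= gain1 else 1
        cut = sum(1 for (a, b) in edges if side[a] != side[b])
        return side, cut

    # Toy usage: a triangle; the best cut contains 2 of its 3 edges.
    print(derandomized_maxcut(3, [(0, 1), (1, 2), (0, 2)]))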

Friday May 5, 1995, 13:15-15:00 in Colloquium B4

Lecture 4.

This lecture will focus on a powerful statistical tool: random sampling. After presenting some basic applications of random sampling, we will discuss the recent striking results of Karger and his co-authors on the structure of cuts and related problems in graphs, and applications to the minimum spanning tree problem.
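
For concreteness, here is a hypothetical Python sketch of the random-contraction idea underlying this line of work on minimum cuts. The edge list, trial count and names are illustrative choices, not taken from the lecture.

    import random

    def karger_min_cut(n, edges, trials=None, seed=0):
        # Karger's random-contraction algorithm for the global minimum cut:
        # repeatedly contract a uniformly random remaining edge until only
        # two super-vertices are left.  A single trial preserves any fixed
        # minimum cut with probability at least 2/(n*(n-1)), so repeating
        # on the order of n^2 log n times succeeds with high probability.
        rng = random.Random(seed)
        trials = trials or n * n
        best = len(edges) + 1
        for _ in range(trials):
            parent = list(range(n))

            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x

            remaining = n
            pool = list(edges)
            while remaining > 2 and pool:
                a, b = pool.pop(rng.randrange(len(pool)))
                ra, rb = find(a), find(b)
                if ra != rb:          # skip edges that became self-loops
                    parent[ra] = rb   # contract the chosen edge
                    remaining -= 1
            cut = sum(1 for (a, b) in edges if find(a) != find(b))
            best = min(best, cut)
        return best

    # Toy usage: two triangles joined by one bridge edge; the minimum cut is 1.
    edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
    print(karger_min_cut(6, edges))   # prints 1 with overwhelming probability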

Monday May 8, 1995, 09:15-11:00 in Colloquium B4

Lecture 5.

Tools from probabilistic combinatorics have played a key role in randomized computation. We will present one such powerful tool, the Lovász local lemma, and present its elegant application to message routing, due to Leighton, Maggs and Rao. If time permits, we will briefly describe the recent extension of the local lemma due to Goldberg, the speaker, and Sweedyk, and mention its applications.
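
As a tiny illustration of the statement of the lemma (not of its algorithmic applications), here is a hypothetical Python sketch that checks the symmetric local lemma condition for a k-SAT instance; the names and numbers are invented for this example.

    import math

    def symmetric_lll_applies(k, d):
        # Symmetric Lovász local lemma, specialised to k-SAT: each clause is
        # violated by a uniformly random assignment with probability p = 2^-k,
        # and shares variables with at most d other clauses.  If
        # e * p * (d + 1) <= 1, then some assignment satisfies every clause.
        p = 2.0 ** (-k)
        return math.e * p * (d + 1) <= 1.0

    # Toy usage: 3-SAT in which every clause overlaps at most one other clause.
    print(symmetric_lll_applies(k=3, d=1))   # True, since e * (1/8) * 2 <= 1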

Tuesday May 9, 1995, 15:15-17:00 in Colloquium B4

Lecture 6.

In practice, programs get their ``random'' bits by using pseudo-random number generators. Empirically, randomized algorithms perform well in general, but there are reports of algorithms giving quite different results under different pseudo-random generators, e.g., in Monte Carlo simulations and in some graph algorithms. De-randomization techniques are a step toward addressing this problem, but are not universally applicable. Researchers have thus studied the question of running randomized algorithms using random sources that have very weak randomness properties. We will discuss some of the beautiful work of Zuckerman and co-authors in this area. This line of work represents, in the speaker's opinion, some of the deepest results in randomized computation, and has also had a surprising impact on areas as diverse as explicit constructions of expanding graphs, the hardness of approximation, randomized space-bounded computation, time-space tradeoffs, and data structures.
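
As a much simpler classical warm-up on the theme of imperfect randomness, here is a hypothetical Python sketch of von Neumann's trick for a biased coin. The weak sources studied in the lecture are far more general than independent biased flips, so this is only a gentle entry point.

    import random

    def von_neumann_extract(bits):
        # Turn independent coin flips of unknown (fixed) bias into perfectly
        # unbiased bits: read the flips in pairs, map 01 -> 0 and 10 -> 1,
        # and discard the pairs 00 and 11.
        out = []
        for i in range(0, len(bits) - 1, 2):
            a, b = bits[i], bits[i + 1]
            if a != b:
                out.append(a)
        return out

    # Toy usage with a heavily biased source.
    rng = random.Random(1)
    biased = [1 if rng.random() < 0.8 else 0 for _ in range(10_000)]
    unbiased = von_neumann_extract(biased)
    print(sum(unbiased) / len(unbiased))   # close to 0.5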

Lecture Notes

BRICS-NS-95-6