Communications and Signal Processing Seminar

A Common Perspective Over Randomized Estimation and FDR Control in Multiple Testing

Dr. Gilles Blanchard

The union bound is perhaps the most elementary probabilistic tool available in statistical learning to control estimation error uniformly over a (discrete) space of models; it is also used in multiple testing to control the family-wise error rate (FWER) of a collection of tests, where it is known as the Bonferroni correction.
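As a concrete illustration of the union-bound idea mentioned above (a standard textbook sketch, not material from the talk itself): testing each of m hypotheses at level alpha/m guarantees, by the union bound, that the probability of any false rejection is at most alpha.

```python
def bonferroni_reject(p_values, alpha=0.05):
    """Bonferroni correction: reject H0_i iff p_i <= alpha / m.

    By the union bound, P(any false rejection) <= sum of per-test
    levels = m * (alpha / m) = alpha, so the FWER is controlled.
    """
    m = len(p_values)
    return [p <= alpha / m for p in p_values]

# Example: 4 p-values at overall level 0.05 (per-test threshold 0.0125).
print(bonferroni_reject([0.001, 0.02, 0.03, 0.5]))
# -> [True, False, False, False]
```

Note how conservative the per-test threshold becomes as m grows; this conservativeness is part of what motivates FDR-based alternatives.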
The goal of this talk is to introduce a new elementary probabilistic lemma that can be seen as a kind of randomized union bound. It allows us to cast a common light on some recent approaches in statistical learning and multiple testing: respectively, randomized estimation and false discovery rate (FDR) control (in the distribution-free setting).
This lets us show a link between these seemingly disjoint areas and draw a clear parallel with the classical use of the union bound. Besides recovering previous results from both areas under a common roof, this point of view allows us to develop some interesting extensions.
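For contrast with the Bonferroni correction, the best-known FDR-controlling procedure is the Benjamini-Hochberg step-up rule; the sketch below (a standard reference implementation, not the talk's new lemma) shows how it rejects more liberally than a per-test alpha/m threshold.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up procedure for FDR control.

    Sort the p-values, find the largest rank k (1-based) with
    p_(k) <= k * alpha / m, and reject the k smallest p-values.
    Returns the sorted indices of the rejected hypotheses.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * alpha / m:
            k_max = rank
    return sorted(order[:k_max])

# Same p-values as the Bonferroni example: BH rejects three hypotheses
# where Bonferroni rejects only one.
print(benjamini_hochberg([0.001, 0.02, 0.03, 0.5], alpha=0.05))
# -> [0, 1, 2]
```

Under independence, this procedure keeps the expected proportion of false discoveries among the rejections at most alpha, a weaker but far less conservative guarantee than FWER control.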

This is joint work with François Fleuret.

Dr. Blanchard's research area is statistics and machine learning. He received his PhD in applied mathematics in 2001 from the Université Paris Nord and joined the French national research institute CNRS the same year. In 2002 he moved to the IDA machine learning group at the Fraunhofer Institute in Berlin, where he has worked since. He held an invited faculty position in the University of Chicago Department of Statistics for the first semester of 2007.

Sponsored by

Prof. Clayton Scott