Robust Inference using Max Mixtures
Many robot systems are sensitive to sensing errors and outliers: even small amounts of poor data can wreak havoc on the quality of their output. Robot mapping is a familiar example: even a single erroneous "loop closure" (caused by two different locations being incorrectly associated with each other) can cause a mapping system to fail.
I will describe our recent work on "max" mixtures, a probabilistic mixture-model formulation that allows more realistic error models to be incorporated into an inference problem. With these more flexible representations, the need for explicit filtering and outlier rejection is reduced or even eliminated. Unlike the more conventional "sum" mixtures, the "max" mixture formulation permits very fast inference. We will present results from the mapping domain, showing how max mixtures can be used to overcome perceptual aliasing from within a principled Bayesian framework.
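To make the idea concrete, the following is a minimal illustrative sketch (not the authors' implementation) of why max mixtures admit fast inference: replacing the sum over mixture components with a max means the negative log-likelihood is simply the minimum over the per-component costs, each of which stays quadratic. The two-component "inlier plus broad outlier" model below is an assumed example; the weights and sigmas are made up for illustration.

```python
import math

def component_nll(r, sigma, weight):
    """Negative log of weight * N(r; 0, sigma^2) for a scalar residual r."""
    return (0.5 * (r / sigma) ** 2
            + math.log(sigma * math.sqrt(2.0 * math.pi))
            - math.log(weight))

def max_mixture_nll(r, components):
    """Cost under a max mixture: the max over weighted densities
    is the min over the per-component negative log-likelihoods,
    so no log-of-sum appears and each candidate cost is quadratic."""
    return min(component_nll(r, sigma, w) for (w, sigma) in components)

# Hypothetical robust error model: 90% inlier (sigma=1), 10% outlier (sigma=10).
components = [(0.9, 1.0), (0.1, 10.0)]

# A small residual is explained by the tight inlier component;
# a large residual switches to the broad outlier component,
# so its influence on the cost grows only slowly.
small = max_mixture_nll(0.5, components)
large = max_mixture_nll(20.0, components)
```

For the large residual, a single Gaussian with sigma=1 would assign a cost of roughly 200, while the max mixture switches to the outlier component and assigns a cost under 10, which is the sense in which a bad loop closure stops dominating the optimization.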