A Model Selection criterion for the Mixture Reduction problem based on the Kullback-Leibler Divergence
In order to be properly addressed, many practical problems require an accurate stochastic characterization of the involved uncertainties. In this regard, a common approach is the use of mixtures of parametric densities, which in general make it possible to approximate complex distributions arbitrarily well by a sum of simpler elements. Nonetheless, in contexts like target tracking in clutter, where mixtures of densities are commonly used to approximate the posterior distribution, the optimal Bayesian recursion leads to a combinatorial explosion in the number of mixture components. For this reason, many mixture reduction algorithms have been proposed in the literature to keep the number of hypotheses limited, but very few of them address the problem of finding a suitable model order for the resulting approximation. The approach commonly followed in those algorithms is to reduce the mixture to a fixed number of components, disregarding features of the mixture that may vary over time. In general, finding an optimal number of mixture components is a very difficult task: once a meaningful optimality criterion has been identified, potentially burdensome computational procedures must be devised to reach the optimum. In this work, by exploiting optimal transport theory, an efficient and intuitive model selection criterion for the mixture reduction problem is proposed.
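To fix notation for the setting the abstract describes, the following is a minimal sketch of the generic mixture reduction problem; the symbols (weights w_i, components p_i, orders N and M, and the use of the Kullback-Leibler divergence as the dissimilarity measure) are illustrative conventions, not notation taken from the paper itself.

% Illustrative sketch (assumed notation): an original mixture of N parametric
% densities is approximated by a reduced mixture of M < N components, and the
% quality of the reduction is measured, e.g., by the Kullback-Leibler divergence.
\begin{align}
  p(x) &= \sum_{i=1}^{N} w_i\, p_i(x), \qquad w_i \ge 0, \quad \sum_{i=1}^{N} w_i = 1, \\
  \hat{p}(x) &= \sum_{j=1}^{M} \hat{w}_j\, \hat{p}_j(x), \qquad M < N, \\
  D_{\mathrm{KL}}\!\left(p \,\middle\|\, \hat{p}\right) &= \int p(x)\,\log\frac{p(x)}{\hat{p}(x)}\,\mathrm{d}x .
\end{align}

In this setting, the model selection question addressed by the paper is how to choose the reduced order M, rather than fixing it in advance.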