Minimax Optimal Procedures for Joint Detection and Estimation

2026-04-24

Information Theory
AI summary

The authors study a problem in which one must decide between two composite hypotheses about the data and, depending on the decision, estimate a hidden random parameter, all without knowing the exact data distribution. Two formulations are considered: a Bayesian one and a Neyman-Pearson-like one. The main finding is that the optimal strategy maximizes a measure called f-similarity in order to identify the least favorable distributions. The authors also specialize the results to a band-type uncertainty model and modify existing algorithms to improve convergence speed while preserving numerical stability, illustrating both formulations with numerical examples.
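The f-similarity mentioned above is related to f-divergences between distributions: roughly, the smaller the divergence between the distributions under the two hypotheses, the harder they are to tell apart, so the least favorable distributions are the most similar admissible ones. As an illustrative sketch only (not the paper's definition), an f-divergence between discrete pmfs can be evaluated as follows; the function names and the two example generators `kl` and `tv` are choices made here for illustration:

```python
import numpy as np

def f_divergence(p, q, f):
    """Evaluate D_f(p || q) = sum_i q_i * f(p_i / q_i) for discrete pmfs.

    Assumes q_i > 0 everywhere; for the KL generator below, also p_i > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Two standard convex generators (illustrative instances, not from the paper):
kl = lambda t: t * np.log(t)        # yields the Kullback-Leibler divergence
tv = lambda t: 0.5 * np.abs(t - 1)  # yields the total variation distance
```

For example, `f_divergence([0.5, 0.5], [0.25, 0.75], tv)` equals half the L1 distance between the two pmfs, i.e. 0.25. Maximizing a similarity measure over an uncertainty set then amounts to minimizing such a divergence over the admissible pairs of distributions.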

composite hypotheses, parameter estimation, distributional uncertainty, Bayesian formulation, Neyman-Pearson lemma, f-similarity, minimax procedure, band-type uncertainty model, numerical stability, convergence speed
Authors
Dominik Reinhard, Michael Fauß, Abdelhak M. Zoubir
Abstract
We investigate the problem of jointly testing a pair of composite hypotheses and, depending on the test result, estimating a random parameter under distributional uncertainties. Specifically, it is assumed that the distribution of the data given the parameter of interest is subject to uncertainty. Both a Bayesian formulation and a Neyman-Pearson-like formulation are considered. It is shown that the optimal policy induces an $f$-similarity that must be maximized to identify the least favorable distributions. Beyond the general results, the implementation is investigated using a band-type uncertainty model. For designing the minimax procedures, existing algorithms are modified to increase convergence speed while maintaining numerical stability. The proposed theory is supplemented by numerical results for both formulations.
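In a band-type uncertainty model, each admissible pmf (or density) must lie pointwise between a lower and an upper band. A classical Huber-style construction for obtaining a valid distribution inside such a band is to scale a reference pmf and clip it to the band, choosing the scale so that the result sums to one. The sketch below illustrates this construction on a finite alphabet; it is a minimal illustration under these assumptions, not the algorithm proposed in the paper:

```python
import numpy as np

def fit_to_band(q, lo, hi, tol=1e-12):
    """Return clip(c * q, lo, hi) with c chosen by bisection so the
    result sums to 1. Requires lo <= hi elementwise, q > 0, and the
    band to be feasible, i.e. sum(lo) <= 1 <= sum(hi)."""
    q, lo, hi = (np.asarray(a, dtype=float) for a in (q, lo, hi))
    c_lo, c_hi = 0.0, 1.0
    # Grow the upper bracket until the clipped mass reaches at least 1.
    while np.clip(c_hi * q, lo, hi).sum() < 1.0:
        c_hi *= 2.0
    # Bisect on the scale factor; the clipped mass is nondecreasing in c.
    while c_hi - c_lo > tol:
        c = 0.5 * (c_lo + c_hi)
        if np.clip(c * q, lo, hi).sum() < 1.0:
            c_lo = c
        else:
            c_hi = c
    return np.clip(0.5 * (c_lo + c_hi) * q, lo, hi)

# Example band around a nominal pmf p0, with a uniform reference pmf q:
p0 = np.array([0.5, 0.3, 0.2])
lfd = fit_to_band(np.ones(3) / 3, 0.8 * p0, 1.2 * p0)
```

Here the bands are taken as 80% and 120% of a nominal pmf, a common way to parameterize band models; the resulting `lfd` is a valid pmf that stays within the band while being as close to uniform as the band allows.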