Information acquisition problems form a class of stochastic decision problems in which a decision maker, by carefully controlling a sequence of actions with uncertain outcomes, dynamically refines its belief about (Markovian) parameters of interest. Examples arise in patient care, computer vision, spectrum utilization, and joint source–channel coding.
In this talk, as a special case of information acquisition, we consider the problem of (two-dimensional) search in an active hypothesis testing framework. We first provide a brief survey of the corresponding literature on active hypothesis testing and the design of experiments. In particular, we review the dynamic programming interpretation of information utility introduced by DeGroot as well as the notion of asymptotic optimality due to Chernoff. We connect the stochastic control theoretic notion of information utility to test reliability in statistics and information theory. We then underline the main drawback of Chernoff's asymptotic optimality notion: it neglects the complementary role of the number of hypotheses, i.e., the resolution of the search in our case.
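As a rough illustration of Chernoff's notion (a sketch in assumed notation, not the talk's own): with $M$ hypotheses and experiments $u$ whose observation law under hypothesis $i$ is $p^i_u$, any test attaining error probability at most $\epsilon$ under hypothesis $i$ asymptotically satisfies

\[
  \mathbb{E}_i[N] \;\ge\; \bigl(1-o(1)\bigr)\,\frac{\log(1/\epsilon)}{D^*(i)},
  \qquad
  D^*(i) \;=\; \max_{\lambda}\,\min_{j\neq i}\,\sum_{u}\lambda(u)\,
  D\!\left(p^i_u \,\middle\|\, p^j_u\right),
\]

where $\lambda$ ranges over mixtures of experiments and $D(\cdot\,\|\,\cdot)$ is the Kullback–Leibler divergence. Note that the number of hypotheses $M$ enters only through the error criterion, not the exponent, which is the drawback highlighted above.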
To address this shortcoming, we connect DeGroot's information utility framework with the Shannon theoretic concept of uncertainty reduction and strengthen Chernoff's lower bound to account for the resolution of the search. As a corollary, this lower bound provides upper bounds on the maximum information acquisition rate and on the optimal reliability as a function of rate. We also introduce the Extrinsic Jensen–Shannon (EJS) divergence as a measure of information, based on which a heuristic acquisition strategy is constructed. This allows us to provide a new (tight) upper bound on performance and a lower bound on the reliability function.
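For reference, one common form of the EJS divergence in the related literature (notation assumed here): given a belief $\rho = (\rho_1,\dots,\rho_M)$ over the hypotheses and the observation kernels $P_i$ induced by the selected action,

\[
  \mathrm{EJS}(\rho)\;=\;\sum_{i=1}^{M}\rho_i\,
  D\!\left(P_i \,\middle\|\, \sum_{j\neq i}\frac{\rho_j}{1-\rho_i}\,P_j\right),
\]

i.e., the belief-weighted divergence between each hypothesis's observation law and the mixture of the remaining ones. One natural heuristic, as in the related literature, greedily selects at each step the action maximizing this quantity.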
This is joint work with Mohammad Naghshvar, Kamalika Chaudhuri, Michèle Wigger, and Ofer Shayevitz.
Information Acquisition, Noisy Search, and Active Hypothesis Testing
Event Status
Scheduled
Event Details
Date and Time
Feb. 7, 2014, All Day