Nov 10, 2017
Lopata Hall, Room 101
Toward a New Convergence Rate Theory for Offline Learning
Dr. Ilya O. Ryzhov
Department of Decision, Operations & Information Technologies
University of Maryland
We consider sequential learning problems in which simulation is used to obtain information that improves decisions made over time. The simulation budget is limited, but simulations do not incur any additional economic cost; the goal is to allocate the simulation budget optimally in order to identify the best decisions as efficiently as possible. One of the most enduring and popular methodological approaches to this problem class is a Bayesian method known as "expected improvement" or EI, originally proposed in 1998 within the engineering optimization literature. We present recent and ongoing work that 1) proves the first convergence rate results for EI within the theoretical framework provided by large deviations theory; 2) shows how EI can be made to achieve optimal convergence rates with a minor modification that requires no tunable parameters; 3) builds on insights from this work to propose a completely new class of computational procedures that are guaranteed to achieve optimal rates without tuning; and 4) poses a number of open questions for future work.
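To make the EI criterion concrete, the following is a minimal sketch of how an EI-based budget allocation might look for a finite set of alternatives with Gaussian posteriors. This is an illustrative assumption about the setup, not the speaker's implementation: `posterior_means` and `posterior_stds` are assumed posterior parameters, and each round simulates the alternative with the largest expected improvement over the best competing mean.

```python
import math

def normal_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mean, std, best_other_mean):
    """Closed-form EI of one alternative over the best competing mean,
    assuming a Gaussian posterior N(mean, std^2)."""
    if std <= 0.0:
        return max(mean - best_other_mean, 0.0)
    z = (mean - best_other_mean) / std
    return (mean - best_other_mean) * normal_cdf(z) + std * normal_pdf(z)

def ei_allocation(posterior_means, posterior_stds):
    """Return the index of the alternative EI would simulate next:
    each alternative is scored against the best of the others."""
    scores = []
    for i, (m, s) in enumerate(zip(posterior_means, posterior_stds)):
        best_other = max(mm for j, mm in enumerate(posterior_means) if j != i)
        scores.append(expected_improvement(m, s, best_other))
    return max(range(len(scores)), key=lambda i: scores[i])
```

The EI score is always nonnegative, and alternatives with either a high posterior mean or high posterior uncertainty are favored, which is the exploration/exploitation trade-off the allocation is meant to balance.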
Ilya O. Ryzhov is an Associate Professor of Operations Management and Management Science in the Department of Decision, Operations & Information Technologies at the Robert H. Smith School of Business, University of Maryland. He received a Ph.D. in Operations Research and Financial Engineering from Princeton University in 2011. His research focuses primarily on statistical and optimal learning, with applications in business analytics. He is a coauthor of the book Optimal Learning (Wiley, 2012).