Johan Segers, ISBA - UCL

November 24, 2017

14:30

Brussels

ULB Campus de la Plaine, Boulevard du Triomphe, Building NO, 9th floor, Room 2.NO.906 (room in the rotunda)

Accelerating the convergence rate of Monte Carlo integration through ordinary least squares

Abstract:

In numerical integration, control variates are commonly used to reduce the variance of the naive Monte Carlo method. The control functions can be viewed as explanatory variables in a linear regression model with the integrand as the dependent variable. The control functions have a known mean vector and covariance matrix, and whether or not this information is used yields a number of variations of the method. A specific variation arises when the control functions are centered and the integral is estimated as the intercept via the ordinary least squares estimator in the linear regression model. When the number of control functions is kept fixed, all these variations are asymptotically equivalent, with asymptotic variance equal to the variance of the error variable in the regression model. Nevertheless, the ordinary least squares estimator has particular advantages: it is the only one that integrates constant functions and the control functions themselves without error. Moreover, if the number of control functions grows to infinity with the number of Monte Carlo replicates, the ordinary least squares estimator converges at a faster rate than the standard Monte Carlo procedure, the integration error having a Gaussian limit whose variance can be estimated consistently by the residual variance in the regression model. An extensive simulation study confirms the superior performance of the ordinary least squares Monte Carlo method for a variety of univariate and multivariate integrands and control functions.

This is joint work with F. Portier, Télécom ParisTech.
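The abstract describes estimating an integral as the OLS intercept in a regression of the integrand on centered control functions. The following is a minimal sketch of that idea, not the authors' own implementation: the integrand, the cosine control functions, and the sample size are illustrative assumptions chosen so the exact integral is known.

```python
# Sketch of ordinary least squares Monte Carlo integration with
# centered control variates (illustrative example, assumed setup).
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Integrand on [0, 1]; its exact integral is e - 1."""
    return np.exp(x)

def controls(x, m=5):
    """Centered control functions with known mean zero under U(0, 1):
    h_k(x) = cos(2*pi*k*x), k = 1..m (an illustrative choice)."""
    k = np.arange(1, m + 1)
    return np.cos(2 * np.pi * np.outer(x, k))

n = 10_000
x = rng.random(n)          # Monte Carlo replicates from U(0, 1)
y = f(x)                   # integrand evaluations (dependent variable)
H = controls(x)            # centered controls (explanatory variables)

# Regress y on an intercept plus the centered controls; the OLS
# intercept estimates the integral of f over [0, 1].
X = np.column_stack([np.ones(n), H])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
ols_estimate = coef[0]

# The residual variance estimates the asymptotic variance of the error.
residual_var = np.mean((y - X @ coef) ** 2)

print(f"naive MC:  {y.mean():.6f}")
print(f"OLS MC:    {ols_estimate:.6f}")
print(f"truth:     {np.e - 1:.6f}")
print(f"residual variance: {residual_var:.6f}")
```

Because the controls are centered (mean zero under the sampling distribution), the intercept coincides with the classical control-variate estimate, and by construction it reproduces constant functions and the controls themselves exactly.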

 
