Nonparametric calibration of jump-diffusion option pricing models.
Abstract
We present a nonparametric method for calibrating jump-diffusion models to a set of observed option prices. We show that the usual formulations of the inverse problem via nonlinear least squares are ill-posed. In the realistic case where the set of observed prices is discrete and finite, we propose a regularization method based on relative entropy: we reformulate the calibration problem as that of finding a risk-neutral jump-diffusion model that reproduces the observed option prices and has the smallest possible relative entropy with respect to a chosen prior model. We discuss the numerical implementation of our method using gradient-based optimization and show, via simulation tests on various examples, that the entropy penalty resolves the numerical instability of the calibration problem. Finally, we apply our method to data sets of index options and discuss the empirical results obtained.
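For concreteness, the entropy-penalized formulation described above can be sketched as follows; the notation here is illustrative rather than taken from the paper. Given observed prices $C^{\mathrm{obs}}_i$ of options with maturities $T_i$ and strikes $K_i$, positive weights $w_i$, a prior jump-diffusion model $P$, and a regularization parameter $\alpha > 0$, one seeks a risk-neutral jump-diffusion model $Q$ minimizing
$$
J(Q) \;=\; \sum_{i=1}^{N} w_i \left( C^{Q}(T_i, K_i) - C^{\mathrm{obs}}_i \right)^2 \;+\; \alpha\, H(Q \,\vert\, P),
$$
where $C^{Q}(T_i, K_i)$ denotes the option price computed under $Q$ and $H(Q \,\vert\, P)$ is the relative entropy of $Q$ with respect to the prior $P$. The first term is the nonlinear least-squares pricing error that is ill-posed on its own; the entropy term penalizes deviation from the prior and restores stability.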