
Maximum likelihood estimation of the Heston stochastic volatility model using asset and option prices: an application of nonlinear filtering theory

MARIANI, FRANCESCA;
2008-01-01

Abstract

Let us suppose that the dynamics of the stock prices and of their stochastic variance are described by the Heston model, that is, by a system of two stochastic differential equations with a suitable initial condition. The aim of this paper is to estimate the parameters of the Heston model and one component of the initial condition, namely the initial stochastic variance, from the stock and option prices observed at discrete times. The option prices considered are those of a European call on the stock whose price is described by the Heston model. The proposed method constructs a likelihood function by means of a filtering technique and then maximizes it; the estimated parameters and initial-value component are characterized as a maximizer of the likelihood function subject to some constraints. The solution of the filtering problem used to construct the likelihood function is based on an integral representation of the fundamental solution of the Fokker–Planck equation associated with the Heston model, on the wavelet expansions presented by Fatone et al. ("High performance algorithms based on a new wavelet expansion for time dependent acoustic obstacle scattering", Commun. Comput. Phys., 2007; Research Developments in Acoustics, vol. 2, pp. 39–69, Transworld Research Network, Kerala, 2005; "New wavelet bases made of piecewise polynomial functions: approximation theory, quadrature rules and applications to kernel sparsification and image compression", SIAM J. Sci. Comput., submitted) to approximate the integral kernel appearing in the representation formula of the fundamental solution, on a simple truncation procedure that exploits the sparsifying properties of the wavelet expansions, and on the fast Fourier transform (FFT). These techniques yield a very efficient and fully parallelizable numerical procedure for solving the filtering problem, which in turn makes it possible to evaluate the likelihood function and its gradient very efficiently. As a byproduct of the solution of the filtering problem, we have developed a stochastic variance tracking technique that gives very good results in numerical experiments. The maximum likelihood problem used in the estimation procedure is a low-dimensional constrained optimization problem; the computational cost of evaluating the likelihood function and its gradient justifies solving it with ad hoc techniques. We use parallel computing and a variable metric steepest ascent method to solve the maximum likelihood problem. Numerical examples of the estimation problem, using synthetic data and real data relative to an index of the Milan stock exchange (S&PMIB30), obtained with a parallel implementation of the numerical method described above, are presented. Impressive speed-up factors are obtained in these examples thanks to the parallel implementation of the proposed method. The website http://www.econ.univpm.it/pacelli/mariani/finance/w1 contains animations and auxiliary material that aid the understanding of this paper and makes available to interested users the computer programs used to produce the numerical experiments presented.
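For reference, the Heston dynamics referred to in the abstract are conventionally written as follows (the parameter names below are the standard ones and are not necessarily those used in the paper):

\[
dS_t = \mu\, S_t\, dt + \sqrt{v_t}\, S_t\, dW_t^{(1)}, \qquad
dv_t = \kappa\,(\theta - v_t)\, dt + \sigma\,\sqrt{v_t}\, dW_t^{(2)}, \qquad
dW_t^{(1)}\, dW_t^{(2)} = \rho\, dt,
\]

with initial conditions \(S_0 = \tilde{S}_0\) and \(v_0 = \tilde{v}_0\). The initial stochastic variance \(\tilde{v}_0\) is the component of the initial condition that the paper estimates together with the model parameters.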
2008
stochastic volatility model; maximum likelihood; calibration problem
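Purely as an illustration of the outer structure described in the abstract (a likelihood built by a filter and then maximized under constraints), here is a minimal Python sketch. The quasi-likelihood below is a toy Euler/Gaussian stand-in for the paper's wavelet/FFT filtering procedure, SciPy's bound-constrained L-BFGS-B routine is used as a generic variable-metric optimizer in place of the paper's variable metric steepest ascent, and all function names, bounds and starting values are illustrative assumptions, not the paper's implementation.

import numpy as np
from scipy.optimize import minimize

def heston_quasi_loglik(params, log_prices, dt):
    """Toy Gaussian quasi-likelihood of log-returns under an Euler scheme.
    This is a crude stand-in for the paper's filter-based likelihood."""
    mu, kappa, theta, sigma, rho, v0 = params
    returns = np.diff(log_prices)          # observed log-returns
    v = v0                                 # current variance estimate
    loglik = 0.0
    for r in returns:
        mean = (mu - 0.5 * v) * dt         # Euler mean of the log-return
        var = max(v * dt, 1e-12)           # Euler variance of the log-return
        loglik += -0.5 * (np.log(2.0 * np.pi * var) + (r - mean) ** 2 / var)
        # Conditional-mean update of the variance given the observed return:
        # a rough proxy for the paper's stochastic variance tracking.
        v = max(v + kappa * (theta - v) * dt + sigma * rho * (r - mean), 1e-8)
    return loglik

def estimate_heston(log_prices, dt, start):
    """Maximize the quasi-likelihood under box constraints (illustrative bounds)."""
    bounds = [(-1.0, 1.0),      # mu
              (1e-3, 10.0),     # kappa
              (1e-4, 1.0),      # theta (long-run variance)
              (1e-3, 2.0),      # sigma (vol of vol)
              (-0.999, 0.999),  # rho
              (1e-4, 1.0)]      # v0 (initial stochastic variance)
    result = minimize(lambda p: -heston_quasi_loglik(p, log_prices, dt),
                      start, method="L-BFGS-B", bounds=bounds)
    return result.x

# Example usage on a series of observed prices might look like:
# params_hat = estimate_heston(np.log(prices), dt=1.0/252,
#                              start=[0.05, 2.0, 0.04, 0.3, -0.5, 0.04])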
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/373409