Publication Type
Working Paper
Version
submittedVersion
Publication Date
9-2019
Abstract
This paper proposes a two-stage method for estimating parameters in a parametric fractional continuous-time model based on discrete-sampled observations. In the first stage, the Hurst parameter is estimated based on the ratio of two second-order differences of observations from different time scales. In the second stage, the other parameters are estimated by the method of moments. All estimators have closed-form expressions and are easy to obtain. A large sample theory of the proposed estimators is derived under either the in-fill asymptotic scheme or the double asymptotic scheme. Extensive simulations show that the proposed theory performs well in finite samples. Two empirical studies are carried out. The first, based on the daily realized volatility of equities from 2011 to 2017, shows that the Hurst parameter is much lower than 0.5, which suggests that the realized volatility is too rough for continuous-time models driven by standard Brownian motion or fractional Brownian motion with Hurst parameter larger than 0.5. The second empirical study is of the daily realized volatility of exchange rates from 1986 to 1999. The estimate of the Hurst parameter is again much lower than 0.5. Moreover, the proposed fractional continuous-time model performs better than the autoregressive fractionally integrated moving average (ARFIMA) model out-of-sample.
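The first-stage idea described in the abstract, estimating the Hurst parameter from the ratio of second-order differences at two time scales, can be illustrated with a minimal sketch. This is the generic change-of-frequency construction for fractional Brownian motion, where the mean squared second difference at scale \(k\delta\) scales like \((k\delta)^{2H}\); it is an assumption that this matches the paper's exact estimator, and the function name `estimate_hurst` is illustrative.

```python
import numpy as np

def estimate_hurst(x):
    """Change-of-frequency Hurst estimator from second-order differences.

    For a path with fBm-like increments, the mean squared second difference
    at time scale 2*delta is 2**(2H) times that at scale delta, so
    H = 0.5 * log2(ratio of the two sample means).
    """
    d1 = x[2:] - 2 * x[1:-1] + x[:-2]   # second differences at scale delta
    d2 = x[4:] - 2 * x[2:-2] + x[:-4]   # second differences at scale 2*delta
    ratio = np.mean(d2 ** 2) / np.mean(d1 ** 2)
    return 0.5 * np.log2(ratio)

# Sanity check on standard Brownian motion, whose true Hurst parameter is 0.5.
rng = np.random.default_rng(0)
n, delta = 50_000, 1.0 / 50_000
bm = np.cumsum(rng.normal(0.0, np.sqrt(delta), n))
print(estimate_hurst(bm))  # should be close to 0.5
```

Because the ratio cancels the diffusion scale, the estimator is closed-form and needs no optimization, which matches the abstract's emphasis on closed-form, easy-to-obtain estimators.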
Keywords
Rough Volatility, Hurst Parameter, Second-order Difference, Different Time Scales, Method of Moments, ARFIMA
Discipline
Econometrics
Research Areas
Econometrics
First Page
1
Last Page
49
Publisher
SMU Economics and Statistics Working Paper Series, Paper No. 17-2019
City or Country
Singapore
Citation
WANG, Xiaohu; XIAO, Weilin; and YU, Jun.
Estimation and inference of fractional continuous-time model with discrete-sampled data. (2019). 1-49.
Available at: https://ink.library.smu.edu.sg/soe_research/2294
Copyright Owner and License
Authors
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.