Title: Quantifying neural network uncertainty under volatility clustering

Abstract: In this talk, I will discuss our latest work on quantifying neural network uncertainty under volatility clustering. Time series with volatility clustering pose a unique challenge to uncertainty quantification (UQ) for return forecasts. UQ methods such as Deep Evidential Regression offer a simple way of quantifying return forecast uncertainty without the cost of a full Bayesian treatment. However, the Normal Inverse-Gamma (NIG) prior adopted by Deep Evidential Regression is prone to miscalibration, because the NIG prior is assigned to latent mean and variance parameters in a hierarchical structure, and it overparameterizes the marginal data distribution. These limitations may affect the accurate delineation of epistemic (model) and aleatoric (data) uncertainties. We propose a Scale Mixture Distribution as an alternative, which can provide a favourable complexity-accuracy trade-off, and assign a separate subnetwork to each model parameter. To illustrate the performance of the proposed method, we apply it to two sets of financial time series exhibiting volatility clustering, cryptocurrencies and U.S. equities, and test its performance in ablation studies. This is joint work with Steven Wong (Vice President, Associate Portfolio Manager Research at Acadian Asset Management, Australia) and Professor Jennifer Chan (The University of Sydney, Australia).
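As background for the overparameterization point above, here is a minimal sketch of the hierarchical structure in question, following the standard Deep Evidential Regression formulation with an NIG prior (the notation is illustrative and not taken from the talk itself):

\[
\mu \sim \mathcal{N}\!\left(\gamma, \tfrac{\sigma^2}{\nu}\right), \qquad
\sigma^2 \sim \Gamma^{-1}(\alpha, \beta), \qquad
y \mid \mu, \sigma^2 \sim \mathcal{N}(\mu, \sigma^2)
\quad\Longrightarrow\quad
y \sim \mathrm{St}_{2\alpha}\!\left(\gamma,\ \frac{\beta(1+\nu)}{\nu\alpha}\right).
\]

Four learned evidential parameters \((\gamma, \nu, \alpha, \beta)\) thus induce a marginal Student-t predictive distribution with only three degrees of freedom (location, scale, tail weight), which is the sense in which the NIG prior overparameterizes the marginal data distribution.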

Dates

Abstract submission deadline: March 15th, 2025

Paper submission deadline: March 15th, 2025

Accept/Reject notification: April 16th, 2025

Netys Conference: May 21-23, 2025

Proceedings

Revised selected papers will be published as post-proceedings in Springer's Lecture Notes in Computer Science (LNCS) series.

Partners & Sponsors