10 uncertainty quantification methods in nnetsauce forecasting
This article was first published on T. Moudiki's Webpage - Python , and kindly contributed to python-bloggers. (You can report issue about the content on this page here)
Want to share your content on python-bloggers? click here.
This week, I released (Python) version 0.22.4 of nnetsauce. nnetsauce now contains 10 uncertainty quantification methods for time series forecasting:
- gaussian: simple and fast, but assumes stationary Gaussian in-sample residuals, and independence across series in the multivariate case
- kde: based on Kernel Density Estimation of in-sample residuals
- bootstrap: based on independent bootstrap of in-sample residuals
- block-bootstrap: based on moving block bootstrap of in-sample residuals
- scp-kde: split conformal prediction with Kernel Density Estimation of calibrated residuals
- scp-bootstrap: split conformal prediction with independent bootstrap of calibrated residuals
- scp-block-bootstrap: split conformal prediction with moving block bootstrap of calibrated residuals
- scp2-kde: split conformal prediction with Kernel Density Estimation of standardized calibrated residuals
- scp2-bootstrap: split conformal prediction with independent bootstrap of standardized calibrated residuals
- scp2-block-bootstrap: split conformal prediction with moving block bootstrap of standardized calibrated residuals
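To give an intuition for the split-conformal + block-bootstrap family, here is a minimal NumPy sketch of the idea behind scp-block-bootstrap: fit a model on a training split, collect residuals on a held-out calibration split, then simulate forecast paths by adding moving-block-bootstrapped calibrated residuals to the point forecast. The toy data, the linear-trend "model", and the block length are my own illustrative assumptions; this is not nnetsauce's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy univariate series: linear trend + noise (hypothetical data)
y = 0.5 * np.arange(120) + rng.normal(scale=2.0, size=120)

# Split the series: a training set for fitting, a calibration set for residuals
n_train, n_calib, h = 80, 30, 10
y_train, y_calib = y[:n_train], y[n_train:n_train + n_calib]

# A deliberately simple stand-in model: linear trend fitted by least squares
slope, intercept = np.polyfit(np.arange(n_train), y_train, 1)

def predict(t_idx):
    return intercept + slope * t_idx

# Calibrated residuals: errors of the trained model on the held-out split
calib_resid = y_calib - predict(np.arange(n_train, n_train + n_calib))

def moving_block_bootstrap(resid, size, b, rng):
    """Sample overlapping blocks of length b, preserving short-range
    autocorrelation, and concatenate them into a path of length `size`."""
    n_blocks = int(np.ceil(size / b))
    starts = rng.integers(0, len(resid) - b + 1, size=n_blocks)
    return np.concatenate([resid[s:s + b] for s in starts])[:size]

# Simulate future paths: point forecast + bootstrapped residual paths
B = 1000
future_t = np.arange(n_train + n_calib, n_train + n_calib + h)
point = predict(future_t)
paths = np.stack([point + moving_block_bootstrap(calib_resid, h, b=5, rng=rng)
                  for _ in range(B)])

# 95% prediction band taken from the simulated paths
lower = np.quantile(paths, 0.025, axis=0)
upper = np.quantile(paths, 0.975, axis=0)
print(lower.shape, upper.shape)
```

The scp2 variants follow the same recipe but standardize the calibrated residuals before resampling, which helps when residual variance changes over time.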
The release is available on GitHub, Conda and PyPI.
I’ll present nnetsauce and these methods in more detail, with examples, at the 44th International Symposium on Forecasting (ISF) 2024 (on Wednesday). I hope to see you there, with your questions, remarks and suggestions.
Next week, I’ll release a stable (and documented) version of learningmachine.
To leave a comment for the author, please follow the link and comment on their blog: T. Moudiki's Webpage - Python .