Model Training Progress
Model | 5 min | 15 min | Notes
---|---|---|---
ARIMA | Done | Done | NA
Exponential | Done | Done | Performance is not great
EGARCH | Done | Done | NA
GARCH | Done | Done | One extra run exists for the 15 min GARCH (jobId 6600)
HARCH | Done | Done | NA
ARDL (Structural) | Finished (no results) | Done | The 15 min result is not good (only one patient)
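The 5 min and 15 min columns refer to the sampling interval the models are trained on. A minimal, hypothetical sketch of fitting one of the classical models (ARIMA here) at both resolutions, assuming a datetime-indexed pandas CGM series and statsmodels; the order and gap handling are illustrative, not the grid-searched settings:

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def fit_arima_at_resolution(cgm: pd.Series, freq: str, order=(2, 0, 1)):
    """Resample a datetime-indexed CGM series (mg/dL) to `freq` and fit an ARIMA."""
    y = cgm.resample(freq).mean().interpolate(limit=4)  # bridge short sensor gaps
    return ARIMA(y, order=order).fit()

# Overnight (8 h) forecasts at both resolutions:
# fit_arima_at_resolution(cgm, "5min").forecast(steps=96)
# fit_arima_at_resolution(cgm, "15min").forecast(steps=32)
```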
- 05_arch is trained in 6 batches (3 volatility specifications × 2 mean specifications) and the results are concatenated together (see the sketch after these notes).
- 15_arch has 2 result sets: one covering all 3 volatility specifications, plus an extra run with only GARCH and larger lags.
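The 6-batch split could look roughly like the sketch below, assuming the `arch` package. The GARCH/EGARCH/HARCH volatility choices come from the table above; the Constant/AR mean choices, lag settings, and AIC/BIC summary are assumptions for illustration only:

```python
import pandas as pd
from arch import arch_model

VOL_SPECS = ["GARCH", "EGARCH", "HARCH"]   # 3 volatility specifications (from the table)
MEAN_SPECS = ["Constant", "AR"]            # 2 mean specifications (assumed)

def fit_batch(y: pd.Series, vol: str, mean: str) -> dict:
    """Fit one (vol, mean) combination and return a one-row summary."""
    lags = 1 if mean == "AR" else 0
    kwargs = {"p": 1, "q": 1} if vol in ("GARCH", "EGARCH") else {"p": 1}  # HARCH only uses p
    res = arch_model(y, mean=mean, lags=lags, vol=vol, **kwargs).fit(disp="off")
    return {"vol": vol, "mean": mean, "aic": res.aic, "bic": res.bic}

def run_all_batches(y: pd.Series) -> pd.DataFrame:
    """Run the 3 x 2 = 6 batches and concatenate the per-batch results into one table."""
    frames = [pd.DataFrame([fit_batch(y, v, m)]) for v in VOL_SPECS for m in MEAN_SPECS]
    return pd.concat(frames, ignore_index=True)
```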
Deep Learning Models
Grid-searching a deep learning model probably does not make much sense. A better option may be to use pre-trained foundation forecasting models and fine-tune them.
Benchmark: https://github.com/Nixtla/nixtla/tree/main/experiments/foundation-time-series-arena (in its results table, the best model is highlighted in bold and the second best is underlined for ease of reference).
Candidate foundation-model adapters:
- Hugging Face transformers adapter: currently only works for Informer, Autoformer, and TimeSeriesTransformer.
- Chronos: from Amazon, released in March 2024.
- MOIRAI: adapter for using MOIRAI forecasters; a fairly new model, released in May 2024.
- TimesFM: from Google, zero-shot forecasting.
- Tiny Time Mixers (TTM): from IBM, zero-shot and fine-tunable.
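As a rough illustration of the zero-shot route, here is a minimal sketch of probabilistic overnight forecasting with Chronos, assuming the `chronos-forecasting` package; the synthetic 15 min CGM context, horizon, and 70 mg/dL threshold are placeholders, not the project's actual pipeline:

```python
import numpy as np
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Load a small pre-trained Chronos checkpoint (zero-shot, no fine-tuning).
pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",
    device_map="cpu",
    torch_dtype=torch.float32,
)

# Hypothetical stand-in for a day of 15 min CGM readings (mg/dL).
rng = np.random.default_rng(0)
glucose = 120 + np.cumsum(rng.normal(0, 3, size=96))
context = torch.tensor(glucose, dtype=torch.float32)

# Sample the next 8 hours (32 steps at 15 min resolution).
samples = pipeline.predict(context, prediction_length=32, num_samples=50)

# Turn the samples into a probabilistic band and a per-step hypoglycemia probability.
lo, med, hi = np.quantile(samples[0].numpy(), [0.05, 0.5, 0.95], axis=0)
prob_hypo = (samples[0].numpy() < 70).mean(axis=0)  # P(glucose < 70 mg/dL) per step
```

Fine-tuning would then adjust such a checkpoint on the project's CGM data instead of grid-searching deep architectures from scratch.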