Model training progress

| Model set | 5 min | 15 min | Notes |
| --- | --- | --- | --- |
| ARIMA | Done | Done | NA |
| Exponential | Done | Done | Performance is not great |
| EGARCH | Done | Done | NA |
| GARCH | Done | Done | There is one extra run for 15 min GARCH (jobId 6600) |
| HARCH | Done | Done | NA |
| ARDL (Structural) | Finished, but no results | Done | The 15 min result is not good (only one patient) |
- 05_arch is trained in 6 batches (3 volatility options × 2 mean options) and the results are concatenated together (a sketch of this grid is shown below).
- 15_arch has two result sets: one covering all three volatility models, plus an extra run with only GARCH and larger lags.
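
A minimal sketch of the batched volatility/mean grid described above, assuming the `arch` package is used directly; the placeholder series, parameter choices, and recorded metrics are illustrative rather than the exact 05_arch configuration.

```python
# Sketch of the 3 volatility x 2 mean grid (6 batches) with results concatenated.
import itertools

import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(0)
returns = pd.Series(rng.normal(scale=0.5, size=500))  # placeholder glucose-delta series

vol_models = ["GARCH", "EGARCH", "HARCH"]  # the 3 volatility options
mean_models = ["Constant", "AR"]           # the 2 mean options -> 6 batches

results = []
for vol, mean in itertools.product(vol_models, mean_models):
    lags = 1 if mean == "AR" else 0
    fitted = arch_model(returns, vol=vol, mean=mean, lags=lags).fit(disp="off")
    results.append({"vol": vol, "mean": mean, "aic": fitted.aic, "bic": fitted.bic})

# Concatenate the per-batch results into one table, mirroring how the
# 05_arch batches are combined.
summary = pd.DataFrame(results)
print(summary.sort_values("aic"))
```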

Deep Learning Models

Grid-searching a deep learning model probably doesn't make much sense; it may be better to use pre-trained foundation forecasting models and fine-tune them instead.
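
For reference, a grid search over one of the statistical model sets above might look like the sketch below, assuming sktime's `ForecastingGridSearchCV`; the forecaster, splitter settings, and parameter grid are illustrative, not the configuration used in the actual training jobs. For a deep model, every cell of such a grid is a full training run, which is why fine-tuning a pre-trained model looks more practical.

```python
# Illustrative grid search over an exponential smoothing forecaster with sktime.
import numpy as np
import pandas as pd
from sktime.forecasting.exp_smoothing import ExponentialSmoothing
from sktime.forecasting.model_selection import (
    ExpandingWindowSplitter,
    ForecastingGridSearchCV,
)

rng = np.random.default_rng(0)
y = pd.Series(120 + np.cumsum(rng.normal(size=400)))  # placeholder glucose trace

# Expanding-window backtest: forecast the next 12 steps at each cutoff.
cv = ExpandingWindowSplitter(initial_window=200, step_length=24, fh=list(range(1, 13)))
param_grid = {"trend": ["add", "mul"], "damped_trend": [True, False]}

gscv = ForecastingGridSearchCV(
    forecaster=ExponentialSmoothing(),
    cv=cv,
    param_grid=param_grid,
)
gscv.fit(y)
print(gscv.best_params_)
```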

Benchmark: https://github.com/Nixtla/nixtla/tree/main/experiments/foundation-time-series-arena

In the benchmark results image, the best model is highlighted in bold and the second best is underlined for ease of reference.

1. HFTransformersForecaster
   - Currently only works with Informer, Autoformer, and TimeSeriesTransformer.
2. ChronosForecaster
   - From Amazon; released in March 2024 (a zero-shot usage sketch is shown below).
3. MOIRAIForecaster
   - Adapter for MOIRAI forecasters. A fairly new model, released in May 2024.
4. TimesFMForecaster
   - From Google; zero-shot forecasting.
5. TinyTimeMixerForecaster
   - From IBM; zero-shot and fine-tunable.
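
A minimal zero-shot sketch using one of these adapters, assuming sktime's `ChronosForecaster` and the public `amazon/chronos-t5-tiny` checkpoint; the series and horizon are placeholders rather than the project's actual CGM pipeline.

```python
# Zero-shot forecasting with a pre-trained foundation model via sktime.
import numpy as np
import pandas as pd
from sktime.forecasting.chronos import ChronosForecaster

rng = np.random.default_rng(0)
y = pd.Series(120 + np.cumsum(rng.normal(size=288)))  # one day of 5-min CGM-like values

fh = list(range(1, 13))  # next hour at 5-min resolution
forecaster = ChronosForecaster("amazon/chronos-t5-tiny")
forecaster.fit(y, fh=fh)   # zero-shot: the pre-trained weights are not updated
y_pred = forecaster.predict()
print(y_pred)
```

The same `fit`/`predict` interface applies to the other adapters, which is what would make it straightforward to slot a fine-tuned foundation model into the existing evaluation pipeline.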