Google just entered the race of foundation models for time-series forecasting.
In August 2023, the time-series community was disrupted by the release of TimeGPT, Nixtla's first foundation model for time-series forecasting.
Following TimeGPT, several foundation forecasting models were released, but one stood out. Recently, Google unveiled TimesFM [1], a groundbreaking time-series model with phenomenal results.
Time series are ubiquitous, used in many domains like retail, energy demand, economics, healthcare, and more. A foundation TS model can be readily applied to any TS case with great accuracy, like GPT-4 for text.
In this article, we discuss:
- The challenges of building foundation models for time series compared to NLP.
- How TimesFM overcomes these challenges.
- How TimesFM works and why it's a powerful model.
- TimesFM benchmark results.
- Prospects for the future of foundation models in time-series forecasting.
Let's get started.
I've launched AI Horizon Forecast, a newsletter focusing on time-series and innovative AI research. Subscribe here to broaden your horizons!
The promise of foundation models in NLP was already evident with the release of GPT-2 in Language Models are Unsupervised Multitask Learners [2].
But in time series, building a foundation model isn't straightforward. There are several challenges:
- Dataset Scarcity: In NLP, finding text data is easy. However, public time-series datasets are not readily available.
- Unpredictable Format: Language models are built on well-defined grammars and vocabularies. Time-series data may come from domains with very different characteristics, e.g., highly sparse sales or volatile financial data.
- Different Granularities: Each time-series model works at a specific granularity, e.g., hourly, weekly, monthly…
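To make the granularity challenge concrete, here is a minimal sketch (using synthetic data, not any dataset from the paper): the same underlying signal can be observed at hourly, daily, or weekly resolution, and a classical model trained at one resolution will not transfer to another, while a foundation model is expected to handle all of them.

```python
import numpy as np
import pandas as pd

# Synthetic hourly series: 28 days of noisy observations.
rng = np.random.default_rng(0)
hourly = pd.Series(
    rng.normal(100, 10, size=24 * 28),
    index=pd.date_range("2024-01-01", periods=24 * 28, freq="h"),
)

# The same signal at coarser granularities.
daily = hourly.resample("D").sum()   # 28 daily totals
weekly = hourly.resample("W").sum()  # 4 weekly totals

# Three very different series lengths and dynamics from one signal:
print(len(hourly), len(daily), len(weekly))  # → 672 28 4
```

A model fit on the 672-point hourly series sees weekly seasonality as a 168-step cycle; the same pattern in the daily series is a 7-step cycle, and in the weekly series it disappears entirely. This is why granularity-agnostic forecasting is hard.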