Real-world examples of time series forecasting techniques
Before getting into definitions, let's start with situations where people actually use these models. Here are some real examples of time series forecasting techniques in action:
- A hospital system forecasting next week’s emergency room arrivals by hour using exponential smoothing, based on historical patient counts.
- A utility company predicting daily electricity load with SARIMA (seasonal ARIMA), using 5+ years of hourly demand and temperature data.
- A national bank modeling quarterly GDP growth and inflation with ARIMA and VAR models, using public data from the Federal Reserve Economic Data (FRED).
- A grocery chain forecasting weekly demand for perishable items with Prophet-style decomposable models, accounting for holidays and promotions.
- A ride-sharing platform estimating ride demand by city and time of day with gradient boosted trees and LSTM networks.
- A public health agency projecting flu-like illness rates using autoregressive models and external signals, similar to approaches documented by the CDC.
Each of these is an example of how a specific forecasting technique is matched to the structure of the data: trend, seasonality, noise, and the cost of being wrong.
Classic statistical examples of time series forecasting techniques
When people talk about examples of time series forecasting techniques, they usually start with the classics: moving averages, exponential smoothing, and ARIMA-type models. These are still heavily used in 2024–2025 because they’re interpretable, fast, and easy to deploy.
Moving average and naive methods in real operations
At the very simplest end, you have:
- Naive forecast: tomorrow equals today. Surprisingly effective for very short-term financial series and for some inventory items.
- Seasonal naive: next Monday’s value equals last Monday’s value. Often used as a baseline for weekly or yearly seasonal patterns.
- Simple moving average: forecast is the average of the last k observations.
Real example:
A small e‑commerce shop with limited data might forecast next week’s orders using a 4-week moving average. They don’t have enough history for complex models, but they can smooth out random spikes from promotions. This moving-average approach is often the first benchmark used before rolling out more advanced models.
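The 4-week moving average described above takes only a few lines. A minimal sketch, with illustrative order counts (not real data):

```python
# Hypothetical weekly order counts for a small shop (illustrative numbers).
orders = [120, 135, 128, 142, 150, 138, 145, 152]

def moving_average_forecast(series, k=4):
    """Forecast the next value as the mean of the last k observations."""
    window = series[-k:]
    return sum(window) / len(window)

# Next week's forecast is the mean of the last 4 weeks.
next_week = moving_average_forecast(orders, k=4)
```

A seasonal-naive baseline would instead return `orders[-7]` for weekly-seasonal daily data; the point of both is to be a cheap, honest benchmark.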
Exponential smoothing and Holt–Winters
Exponential smoothing methods give more weight to recent data. Variants include:
- Simple Exponential Smoothing (SES): for data with no clear trend or seasonality.
- Holt’s method: adds a trend component.
- Holt–Winters (additive or multiplicative): captures both trend and seasonality.
Real examples include:
- A call center forecasting call volume at 15-minute intervals for tomorrow’s staffing schedule.
- A regional airline predicting weekly passenger counts on specific routes, where there is a clear trend and strong seasonal pattern.
In both cases, exponential smoothing can be implemented quickly and updated automatically as new data comes in. The methods are still taught widely in statistics and operations research programs; you can see related methodology in time series courses from universities such as MIT OpenCourseWare, which often use exponential smoothing as a starting point for forecasting.
AR, MA, and ARIMA models: workhorses of forecasting
Autoregressive (AR), moving average (MA), and ARIMA (AutoRegressive Integrated Moving Average) models use past values and past errors to predict future values.
- AR(p): predicts the current value using a linear combination of the last p values.
- MA(q): predicts using the last q forecast errors.
- ARIMA(p,d,q): combines AR and MA after differencing the series d times to remove trend.
- SARIMA: extends ARIMA with seasonal components.
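The AR(p) idea above can be sketched directly as a least-squares regression on lagged values. This is a simplified illustration, not production ARIMA (no differencing, no MA terms, no diagnostics); the simulated series and its parameters are invented for the demonstration:

```python
import numpy as np

def fit_ar(series, p):
    """Fit AR(p) by ordinary least squares:
    y_t ~ c + phi_1*y_{t-1} + ... + phi_p*y_{t-p}."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    # Column i+1 holds the lag-(i+1) values aligned with y[p:].
    X = np.column_stack([np.ones(n - p)] +
                        [y[p - i - 1:n - i - 1] for i in range(p)])
    coeffs, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coeffs  # [c, phi_1, ..., phi_p]

# Simulate an AR(1) process with known parameters (c=2.0, phi=0.6)
# so we can check that the fit recovers them approximately.
rng = np.random.default_rng(0)
y = [0.0]
for _ in range(300):
    y.append(2.0 + 0.6 * y[-1] + rng.normal(0, 0.5))

coeffs = fit_ar(y, p=1)
one_step = coeffs[0] + coeffs[1] * y[-1]  # one-step-ahead forecast
```

In real work you would use a library implementation (e.g. statsmodels' ARIMA), which also handles differencing, seasonal terms, and standard errors; the sketch just shows that the AR core is ordinary regression on the past.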
Real examples of time series forecasting with ARIMA:
- Macroeconomic forecasting: Central banks and researchers use ARIMA and related models to forecast GDP, unemployment, and inflation. For instance, data from FRED at the Federal Reserve Bank of St. Louis is often modeled using ARIMA as a baseline before exploring more complex structures.
- Influenza surveillance: Epidemiologists may use autoregressive models to forecast short-term flu activity, complementing surveillance data and models described by the NIH and CDC.
- Retail sales: ARIMA models are commonly applied to monthly or weekly sales data for individual products to generate store-level forecasts.
These ARIMA-based approaches remain some of the best examples for teaching the full workflow: stationarity checks, differencing, autocorrelation analysis, and residual diagnostics.
Seasonal and calendar-aware examples of time series forecasting techniques
Many real-world series are dominated by calendar effects: weekends, holidays, and weather. Some of the best examples of time series forecasting techniques explicitly model these patterns.
SARIMA for strong seasonality
When demand repeats in a regular pattern—daily, weekly, yearly—Seasonal ARIMA (SARIMA) is often the first serious tool people try.
Example of SARIMA in practice:
- Electricity load forecasting: Utilities in the U.S. and Europe routinely build SARIMA or related models on hourly load data, with daily and weekly seasonal components. These forecasts drive decisions about generation scheduling and grid reliability.
- Gas consumption forecasting: Natural gas distributors forecast daily demand using SARIMA, with additional regressors for temperature (heating degree days) and holidays.
The pattern is consistent: identify the dominant seasonal periods, difference accordingly, and fit a SARIMA model that explains both short-term and seasonal autocorrelation.
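The two building blocks named above, the seasonal-naive baseline and seasonal differencing, can be sketched in a few lines. The weekly numbers are invented for illustration:

```python
def seasonal_naive(series, m, h=1):
    """Forecast h steps ahead by repeating the value from one full season back."""
    return series[-m + ((h - 1) % m)]

def seasonal_difference(series, m):
    """Remove a stable seasonal pattern: z_t = y_t - y_{t-m}.
    SARIMA applies this step before fitting its ARMA terms."""
    return [series[t] - series[t - m] for t in range(m, len(series))]

# Two weekly cycles with a +2 level shift between them (illustrative numbers).
base = [10, 12, 15, 14, 13, 11, 9]
series = base + [x + 2 for x in base]
deseasonalized = seasonal_difference(series, m=7)  # the repeating shape cancels out
```

For hourly electricity load you would typically use m=24 (daily) and m=168 (weekly), and let a SARIMA implementation model whatever autocorrelation remains after differencing.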
Prophet-style decomposable models
Tools inspired by Facebook’s Prophet library popularized a decomposable view of time series:
- Trend component (linear or logistic)
- Multiple seasonalities (weekly, yearly, sometimes daily)
- Holiday and event effects
Real examples include:
- Web traffic forecasting: Media sites forecast daily page views, capturing weekend dips and holiday spikes.
- Retail demand around holidays: Chains forecast the effect of Black Friday, Christmas, and other retail events, treating them as explicit holiday regressors.
This family of methods is often used by data teams that want interpretable components and quick iteration rather than the most complex deep learning architectures.
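The decomposable idea, trend plus periodic seasonality fitted jointly, can be sketched as a single least-squares regression with Fourier seasonal terms. This is a simplified stand-in for what Prophet-style tools do (no changepoints, no holiday regressors); the page-view series is synthetic:

```python
import numpy as np

def fit_trend_seasonal(y, period, n_harmonics=2):
    """Least-squares fit of a linear trend plus Fourier seasonal terms,
    the core idea behind Prophet-style decomposable models."""
    t = np.arange(len(y))
    cols = [np.ones(len(y)), t.astype(float)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return X, beta  # beta = [intercept, slope, sin/cos coefficients...]

# Synthetic daily page views: upward trend + weekly cycle (illustrative).
t = np.arange(56)
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 7)
X, beta = fit_trend_seasonal(y, period=7)
fitted = X @ beta
```

The fitted coefficients separate cleanly into a trend slope and seasonal amplitudes, which is exactly the interpretability these teams are after; adding holiday indicator columns to `X` would extend the same regression to event effects.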
Machine learning examples of time series forecasting techniques
From 2020 onward, and especially into 2024–2025, machine learning models moved from “interesting experiments” to production workhorses in many organizations. These include tree-based models, gradient boosting, and deep learning.
Tree-based and gradient boosting models
Random forests, XGBoost, LightGBM, and CatBoost aren’t time series models by design, but they become powerful forecasters when you engineer the right features:
- Lagged values (e.g., value at t−1, t−7, t−28)
- Rolling statistics (7-day mean, 30-day max, etc.)
- Calendar features (day of week, month, holiday flags)
- External regressors (price, promotions, weather)
Real examples include:
- Demand forecasting at scale: Large retailers with thousands of SKUs often use gradient boosting models to forecast weekly demand, because these models can handle many predictors and complex interactions.
- Ride-sharing demand: Platforms forecast rides per city and hour, combining historical demand with weather and event features.
In many benchmarking studies, these models are among the best examples of high-accuracy forecasting on tabular time series, especially when you care more about point forecasts than about formal statistical inference.
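The feature-engineering step above is where most of the work happens; the model itself is standard tabular learning. A minimal sketch that builds lag, rolling, and calendar features from a plain list (series values and dates are invented; a real pipeline would feed these rows to XGBoost, LightGBM, or similar):

```python
from datetime import date, timedelta

def make_features(series, dates, lags=(1, 7), roll=7):
    """Build one feature row per forecastable step t:
    lagged values, a rolling mean, and simple calendar flags."""
    rows = []
    start = max(max(lags), roll)  # need enough history for every feature
    for t in range(start, len(series)):
        row = {f"lag_{l}": series[t - l] for l in lags}
        window = series[t - roll:t]
        row["roll_mean"] = sum(window) / roll
        row["day_of_week"] = dates[t].weekday()
        row["is_weekend"] = int(dates[t].weekday() >= 5)
        row["target"] = series[t]
        rows.append(row)
    return rows

# Synthetic daily demand with a weekly pattern (illustrative numbers).
series = [float(100 + (i % 7)) for i in range(21)]
dates = [date(2024, 1, 1) + timedelta(days=i) for i in range(21)]
rows = make_features(series, dates)
```

External regressors such as price or weather would simply become more keys in each row; that ability to absorb arbitrary predictors is the main reason gradient boosting scales well across thousands of SKUs.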
Deep learning: LSTM, Temporal Convolution, and Transformers
Neural networks for time series have matured. Models like LSTMs, GRUs, temporal convolutional networks (TCNs), and Transformers are now common in research and high-volume production systems.
Example of LSTM and related models in practice:
- High-frequency trading: Firms use deep learning models on millisecond-level price and order book data to forecast short-term price movements.
- Power grid stability: Research groups and utilities explore LSTM and TCN models to forecast frequency and load variations in real time.
- Traffic forecasting: City planners and navigation apps use graph-based neural networks and temporal models to predict congestion across road networks.
A 2024 pattern worth noting: many teams combine classical methods and deep learning. For instance, they might use ARIMA or exponential smoothing as a baseline, then apply an LSTM model to the residuals, or blend forecasts in an ensemble.
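The hybrid pattern just described can be sketched with a deliberately simple stand-in: a seasonal-naive baseline plus an AR(1) correction fitted to the baseline's residuals (in a real hybrid that second stage might be the LSTM). All numbers are illustrative:

```python
def hybrid_forecast(series, m):
    """Two-stage sketch: seasonal-naive baseline + AR(1) residual correction.
    The residual model stands in for the LSTM stage a production hybrid
    might use; the structure (baseline, then model the leftovers) is the same."""
    # Stage 1: in-sample residuals of the seasonal-naive baseline.
    r = [series[t] - series[t - m] for t in range(m, len(series))]
    # Stage 2: method-of-moments AR(1) coefficient for the residuals.
    mean = sum(r) / len(r)
    num = sum((r[t] - mean) * (r[t - 1] - mean) for t in range(1, len(r)))
    den = sum((x - mean) ** 2 for x in r)
    phi = num / den if den else 0.0
    next_resid = mean + phi * (r[-1] - mean)
    return series[-m] + next_resid  # baseline + forecast correction

# Two weekly cycles with a constant +2 level shift between them (illustrative).
base = [10, 12, 15, 14, 13, 11, 9]
series = base + [x + 2 for x in base]
forecast = hybrid_forecast(series, m=7)
```

Here the residual model learns the +2 drift the seasonal baseline misses, so the combined forecast anticipates the next level shift; the same division of labor is what makes classical-plus-deep-learning ensembles attractive.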
For readers wanting more technical depth, the time series sections of modern machine learning courses from universities such as Stanford and Harvard often showcase these models on real datasets.
Domain-specific examples of time series forecasting techniques
To make this less abstract, let’s walk through domain-focused examples where different techniques are used side by side.
Health and epidemiology forecasting
Health data is inherently temporal: case counts, hospitalizations, bed occupancy, medication usage.
Real examples include:
- Influenza and respiratory illness trends: Public health agencies track weekly influenza-like illness (ILI) rates. Autoregressive models, exponential smoothing, and sometimes state-space models are used to forecast the next few weeks, supporting planning for vaccine distribution and hospital capacity. The CDC’s FluView data, described on CDC.gov, is a common reference.
- Hospital census forecasting: Hospitals forecast daily inpatients and ICU occupancy using Holt–Winters or ARIMA models, sometimes augmented with regression terms for local case counts or seasonal patterns.
These are some of the best examples of time series forecasting techniques where accuracy directly affects care quality and resource allocation.
Finance and macroeconomics
Financial and macroeconomic data are among the most widely modeled time series.
Examples include:
- Interest rate forecasting: Economists use ARIMA, VAR (Vector Autoregression), and state-space models to forecast short-term interest rates, often using data from the Federal Reserve.
- Stock index forecasting: While long-term forecasting is notoriously difficult, short-term volatility and risk metrics are often modeled with GARCH-type models, which are time series models for conditional variance.
- Credit risk and default rates: Banks forecast monthly default rates using ARIMA with macroeconomic covariates, then feed those projections into stress-testing frameworks.
These financial models appear in research and policy documents from central banks and academic institutions, including many working papers hosted by the Federal Reserve and top universities.
Energy, climate, and environment
Energy and environmental data are naturally time-indexed and often highly seasonal.
Real examples include:
- Short-term load forecasting (STLF): Utilities forecast electricity demand from 15 minutes ahead to a week ahead. SARIMA, exponential smoothing, gradient boosting, and deep learning all appear here, often combined in ensembles.
- Renewable generation forecasting: Solar and wind output forecasts use time series models with weather inputs. For example, wind farms may use LSTMs or TCNs trained on historical power output and meteorological forecasts.
- Climate indices: Researchers model indices such as the El Niño–Southern Oscillation using autoregressive and state-space models, often documented in climate science publications and datasets from agencies like NOAA.
These environmental applications are strong examples of time series forecasting techniques that combine physical understanding with data-driven models.
How to choose among these examples of time series forecasting techniques
Seeing many examples is helpful, but choosing the right method is where modeling becomes an actual problem-solving tool.
A practical way to think about it:
- Short history, noisy data, limited expertise: Moving averages, naive, and simple exponential smoothing. These are good examples for small organizations getting started.
- Clear trend and seasonality, interpretable output needed: Holt–Winters, SARIMA, and Prophet-style models.
- Rich tabular data with many predictors: Gradient boosting and tree-based models, often outperforming classical models when you have strong external regressors.
- High-frequency, complex dependencies, or many related series: LSTMs, TCNs, and Transformers, especially when you can invest in engineering and MLOps.
In practice, many teams evaluate several of these examples of time series forecasting techniques side by side, pick a baseline (often ARIMA or exponential smoothing), and then adopt more complex models only if they deliver a clear improvement in forecast accuracy or decision quality.
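The side-by-side evaluation described above is easy to sketch: roll forward through a holdout period, forecast each point one step ahead from its history, and compare mean absolute error. The series and the two candidate forecasters here are illustrative:

```python
def mae(actual, predicted):
    """Mean absolute error between two equal-length sequences."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rolling_eval(series, forecaster, start):
    """One-step-ahead evaluation: forecast each point from its history only."""
    preds = [forecaster(series[:t]) for t in range(start, len(series))]
    return mae(series[start:], preds)

naive = lambda hist: hist[-1]            # last observed value
sma4 = lambda hist: sum(hist[-4:]) / 4   # 4-period moving average

# Hypothetical trending weekly series (illustrative numbers).
series = [100, 104, 98, 103, 110, 107, 111, 115, 112, 118, 121, 119]
scores = {"naive": rolling_eval(series, naive, start=4),
          "sma4": rolling_eval(series, sma4, start=4)}
```

On this trending series the naive forecast beats the moving average, which lags behind the trend; that is exactly why baselines must be evaluated rather than assumed, before any more complex model earns its place.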
FAQ: common questions about examples of time series forecasting techniques
What are some simple examples of time series forecasting techniques for beginners?
Simple examples include the naive forecast (using the last observed value), moving averages, and simple exponential smoothing. These methods are easy to implement in spreadsheets or basic statistical software and are often used as baseline models in forecasting projects.
Can you give an example of a time series model used in healthcare?
A common example of a time series model in healthcare is using Holt–Winters exponential smoothing to forecast daily hospital admissions or bed occupancy. Hospitals use these forecasts to plan staffing levels and manage ICU capacity, often in coordination with public health surveillance data from agencies like the CDC.
Which examples of time series forecasting techniques work best for strong seasonality?
When your data has strong weekly or yearly patterns, SARIMA models, Holt–Winters exponential smoothing, and Prophet-style decomposable models are often effective. They are designed to capture repeating seasonal behavior and are widely used in energy demand forecasting, retail sales, and web traffic.
Are machine learning models always better than ARIMA?
Not necessarily. While models like gradient boosting and LSTMs can outperform ARIMA on complex datasets with many predictors, ARIMA remains a strong baseline and is easier to interpret and maintain. In many organizations, both types are used, and model selection is based on validation performance and operational constraints.
Where can I find real datasets to practice these techniques?
Public sources include the Federal Reserve Economic Data (FRED) at fred.stlouisfed.org, health and surveillance data from CDC.gov, and various open datasets from universities such as Harvard. These datasets provide realistic time series for practicing the examples of time series forecasting techniques discussed here.