Real-world examples of partial autocorrelation function (PACF)

If you work with time series, you eventually bump into the partial autocorrelation function and wonder what it actually looks like in practice. Abstract definitions are fine, but most people learn faster from concrete, real-world examples of partial autocorrelation function (PACF) behavior. This guide focuses on exactly that: how the PACF behaves in different time series and how to read those patterns. We’ll walk through several realistic examples: financial returns, daily temperatures, hospital admissions, web traffic, and more. Along the way, we’ll connect these patterns to ARIMA modeling decisions, show what a “good” PACF plot looks like for different processes, and point you to external resources if you want to go deeper into the theory. If you already know the definition of PACF and just want to see how it plays out in real data, you’re in the right place.
Written by
Jamie
Examples of partial autocorrelation function (PACF) in real data

Most tutorials start with theory. Let’s flip that. Here are several real examples of partial autocorrelation function (PACF) behavior that you actually see in practice, and what they tell you about the underlying time series.

Each example of PACF below is tied to a common modeling decision: whether to difference, what AR order to try, and when to stop adding terms.


1. Stock index returns: PACF of a near white-noise series

Take daily log returns of a broad stock index like the S&P 500. Once you convert prices to returns, the series is famously close to white noise: today’s return tells you very little about tomorrow’s.

PACF pattern:

  • A spike at lag 1 that is small or barely significant.
  • All other lags fall well within the confidence bands.

This is one of the cleanest examples of partial autocorrelation function (PACF) behavior for a series with little linear predictability. If you fit an ARIMA model to this, the PACF suggests that AR terms beyond, at most, lag 1 are not buying you much.

Modelers often stop at ARIMA(0,0,0) or ARIMA(1,0,0) after seeing this PACF, then shift attention to volatility modeling (e.g., GARCH) rather than mean forecasting.
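You can reproduce this near-white-noise pattern with a quick simulation. The sketch below uses numpy only; `pacf_at` is a hypothetical helper (not a library function) that estimates the PACF at one lag via its regression definition, and the simulated series is a stand-in for real index returns:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

rng = np.random.default_rng(42)
returns = rng.normal(0.0, 0.01, size=2000)   # stand-in for daily log returns

band = 1.96 / np.sqrt(len(returns))          # approximate 95% confidence band
pacs = np.array([pacf_at(returns, k) for k in range(1, 11)])
n_inside = int(np.sum(np.abs(pacs) < band))
print(f"{n_inside}/10 lags inside the ±{band:.3f} band")
```

With real data you would feed in `np.diff(np.log(prices))` instead of the simulated series; a PACF that sits almost entirely inside the band is exactly the visual that justifies stopping at ARIMA(0,0,0) or ARIMA(1,0,0).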

For a solid background on financial time series properties, the Federal Reserve’s education resources on markets and returns are a good starting point: https://www.federalreserve.gov/education.htm


2. Monthly sales with short-term persistence: AR(1)-like PACF

Imagine a retailer’s monthly sales, after removing trend and strong seasonality. Sales this month tend to be similar to last month, but that effect fades quickly.

PACF pattern:

  • A strong, significant spike at lag 1.
  • Lags 2 and beyond are near zero and within the bands.

This is the textbook AR(1) signature. Among the best examples of partial autocorrelation function (PACF) that instructors use in class, this one shows how a single AR term captures short-term momentum.

In practice, you might:

  • Difference once to remove trend.
  • Use the PACF to justify an ARIMA(1,1,0) or SARIMA model if there is also seasonality.

This kind of PACF appears frequently in retail and e‑commerce forecasting, where short-run carryover from marketing campaigns or inventory cycles is strong but doesn’t last long.
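A short simulation makes the AR(1) signature concrete. As before, this is a numpy-only sketch with a hypothetical `pacf_at` helper (the regression definition of the PACF); the series is synthetic, not real sales data:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

# Simulate "detrended monthly sales" as an AR(1): x_t = 0.7 * x_{t-1} + e_t
rng = np.random.default_rng(0)
n = 5000
x = np.zeros(n)
e = rng.normal(size=n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + e[t]

p1, p2 = pacf_at(x, 1), pacf_at(x, 2)
print(f"PACF lag 1 ≈ {p1:.2f}, lag 2 ≈ {p2:.2f}")  # strong spike, then cut-off
```

The lag-1 estimate lands near the true coefficient 0.7, and the lag-2 estimate is statistically indistinguishable from zero, which is the whole AR(1) story in two numbers.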


3. Industrial production with longer memory: AR(2) and beyond

Now consider an industrial production index, such as U.S. manufacturing output. Even after detrending, the series often shows longer memory: shocks take time to die out.

PACF pattern:

  • Significant spikes at lag 1 and lag 2.
  • A noticeable drop after lag 2.
  • Lags 3+ are small and mostly inside the bands.

This is a classic AR(2)-type pattern. Among real examples of partial autocorrelation function (PACF), this is where you start to see the benefit of looking past lag 1.

How you use it:

  • The sharp cut-off after lag 2 suggests trying ARIMA(2,1,0) after differencing for trend.
  • If the ACF decays slowly while the PACF cuts off around lag 2, that strengthens the AR(2) story.
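The cut-off after lag 2 is easy to verify on a simulated AR(2). This numpy-only sketch uses the same hypothetical regression-based `pacf_at` helper; the coefficients 0.5 and 0.3 are arbitrary illustrative choices, and in theory the lag-2 PACF of an AR(2) equals the second AR coefficient:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

# Stationary AR(2): x_t = 0.5 x_{t-1} + 0.3 x_{t-2} + e_t
rng = np.random.default_rng(2)
n = 5000
x = np.zeros(n)
e = rng.normal(size=n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 2] + e[t]

p1, p2, p3 = (pacf_at(x, k) for k in (1, 2, 3))
print(f"PACF lags 1-3: {p1:.2f}, {p2:.2f}, {p3:.2f}")  # cut-off after lag 2
```

The lag-2 estimate recovers the true second coefficient (0.3) while lag 3 collapses toward zero, which is the pattern that points you at ARIMA(2,1,0) after differencing.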

Industrial and macroeconomic time series like this are commonly used in academic and policy modeling. For more background on economic time series, the National Bureau of Economic Research (NBER) and the Federal Reserve Bank of St. Louis (FRED) data portal are widely used references: https://fred.stlouisfed.org


4. Daily temperatures: PACF in seasonal climate data

Daily average temperature in a city like Chicago or New York shows strong seasonality and autocorrelation. Even after seasonally adjusting, the series often retains structure.

Raw series PACF (not seasonally adjusted):

  • Strong spikes at short lags (1–7 days).
  • Spikes around lag 365 and its multiples for yearly seasonality (if using daily data across many years), though these are often noisy.

Seasonally adjusted series PACF:

  • A few significant short lags (e.g., 1–3 days) due to local weather systems.
  • Much weaker long-lag behavior once the annual cycle is removed.

This is a strong example of how the partial autocorrelation function (PACF) helps separate short-term dynamics (weather systems) from long-term seasonality (climate). In practice, you might:

  • Remove the annual seasonal component using STL or regression with seasonal dummies.
  • Use the PACF of the residuals to decide whether an AR(1) or AR(2) structure is still needed.
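Both steps can be sketched with synthetic data: fit a single annual sine/cosine harmonic by least squares (a simple regression alternative to STL), then inspect the PACF of the residuals. `pacf_at` is the same hypothetical regression-based helper, and all the coefficients below are simulated assumptions, not real climate figures:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

rng = np.random.default_rng(1)
t = np.arange(3650)                               # ten years of daily data
annual = 12.0 * np.sin(2 * np.pi * t / 365.25)    # annual temperature cycle
weather = np.zeros(len(t))                        # AR(1) "weather system" noise
e = rng.normal(0, 2, size=len(t))
for i in range(1, len(t)):
    weather[i] = 0.6 * weather[i - 1] + e[i]
temp = 10.0 + annual + weather

# Remove the annual cycle with one harmonic regression
H = np.column_stack([np.ones(len(t)),
                     np.sin(2 * np.pi * t / 365.25),
                     np.cos(2 * np.pi * t / 365.25)])
beta, *_ = np.linalg.lstsq(H, temp, rcond=None)
resid = temp - H @ beta

p1, p5 = pacf_at(resid, 1), pacf_at(resid, 5)
print(f"residual PACF: lag 1 ≈ {p1:.2f}, lag 5 ≈ {p5:.2f}")
```

Once the annual cycle is regressed out, the residual PACF exposes the short-memory weather dynamics (here an AR(1) near 0.6) and little else.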

For climate and environmental time series, agencies like NOAA provide rich datasets and methodological notes: https://www.noaa.gov


5. Hospital admissions: PACF with weekly and yearly patterns

Health systems use time series models to forecast emergency department visits or hospital admissions. These series often show:

  • A weekly cycle (higher on Mondays, lower on weekends).
  • A yearly cycle (flu season spikes in winter).

PACF pattern for daily admissions:

  • Strong spikes at lags 1 and 7 (yesterday and same day last week).
  • Additional spikes at lags 14, 21, etc., reflecting multi-week patterns.
  • After adjusting for seasonality, residual PACF might show 1–2 short lags.

This is one of the best examples of partial autocorrelation function (PACF) behavior in public health operations, because it immediately motivates seasonal ARIMA models, such as SARIMA with weekly seasonality.
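A stylized model shows why the lag-7 spike motivates seasonal AR terms. Here the "admissions" series depends only on the same day last week (x_t = 0.6·x_{t−7} + noise); this is a deliberate simplification, since real admissions would also carry a lag-1 effect, and `pacf_at` is the same hypothetical regression-based helper:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

# Seasonal AR with a weekly period: x_t = 0.6 * x_{t-7} + e_t
rng = np.random.default_rng(7)
n = 5000
e = rng.normal(size=n)
x = np.zeros(n)
x[:7] = e[:7]
for t in range(7, n):
    x[t] = 0.6 * x[t - 7] + e[t]

p1, p7 = pacf_at(x, 1), pacf_at(x, 7)
print(f"PACF lag 1 ≈ {p1:.2f}, lag 7 ≈ {p7:.2f}")  # spike at the weekly lag
```

The PACF is quiet at lag 1 but spikes at lag 7, which is exactly the visual argument for a seasonal AR term at the weekly lag in a SARIMA model.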

Researchers and practitioners often consult resources from the Centers for Disease Control and Prevention (CDC) when working with surveillance and admissions data: https://www.cdc.gov


6. Web traffic: PACF in high-frequency digital data

Website or app traffic measured hourly has strong structure:

  • Daily seasonality (24‑hour cycle).
  • Weekly seasonality.
  • Short-term carryover from recent hours.

PACF pattern for hourly page views:

  • Significant spikes at lags 1–3 hours (short-term persistence).
  • Strong spikes at lags 24, 48, 72 (daily cycle and its multiples).
  • Sometimes weaker spikes at 168 (weekly cycle) and its multiples.

Among real examples of partial autocorrelation function (PACF), this one is particularly instructive for engineers and data scientists working on anomaly detection or capacity planning.

In practice, you might:

  • Difference at lag 24 to remove daily seasonality.
  • Use the PACF of the differenced series to decide the AR order (e.g., AR(1) or AR(2)).

The PACF here provides a visual justification for combining nonseasonal AR terms with seasonal AR terms at lag 24 or 168 in a SARIMA or related model.
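The differencing step looks like this on a synthetic hourly series (a 24-hour sine cycle plus AR(1) carryover; numpy only, with the same hypothetical `pacf_at` helper). The lag-24 correlation collapses after seasonal differencing, while the short-run AR structure survives:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

rng = np.random.default_rng(24)
t = np.arange(24 * 200)                  # ~200 days of hourly data
cycle = 10.0 * np.sin(2 * np.pi * t / 24)
carry = np.zeros(len(t))                 # short-term carryover between hours
e = rng.normal(size=len(t))
for i in range(1, len(t)):
    carry[i] = 0.6 * carry[i - 1] + e[i]
views = cycle + carry

lag24_corr = np.corrcoef(views[24:], views[:-24])[0, 1]  # strong daily cycle
diffed = views[24:] - views[:-24]                        # seasonal difference at 24
p1 = pacf_at(diffed, 1)                                  # surviving AR structure
print(f"lag-24 correlation ≈ {lag24_corr:.2f}; differenced PACF lag 1 ≈ {p1:.2f}")
```

After the lag-24 difference, the remaining lag-1 PACF (near the simulated carryover of 0.6) is what guides the nonseasonal AR order.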


7. Cryptocurrency prices: regime shifts and unstable PACF

Cryptocurrency markets are notorious for regime shifts: long periods of quiet trading punctuated by explosive rallies or crashes.

If you compute the ACF and PACF of raw prices, you typically see:

  • Very slow decay in the ACF, suggesting nonstationarity.
  • A PACF with many significant lags, sometimes erratic.

After differencing to returns, the PACF of returns often looks like:

  • Mostly insignificant lags, similar to stock returns.
  • Occasional spikes during high-volatility periods.

This is a good example of partial autocorrelation function (PACF) behavior that changes over time. Analysts sometimes compute PACF over rolling windows to see how the local autocorrelation structure evolves across regimes.

The key takeaway: a messy PACF can be a signal that the series is nonstationary, has structural breaks, or needs variance modeling rather than more AR terms.
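Rolling-window PACF estimates are easy to sketch. The series below switches from AR(1) persistence to white noise halfway through (a toy regime shift, not real crypto data); the lag-1 PACF computed on early and late windows tells the two regimes apart. `pacf_at` is the same hypothetical regression-based helper:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

rng = np.random.default_rng(11)
n = 4000
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    phi = 0.8 if t < 2000 else 0.0   # persistence disappears halfway through
    x[t] = phi * x[t - 1] + e[t]

early = pacf_at(x[:1500], 1)         # window inside the persistent regime
late = pacf_at(x[2500:], 1)          # window inside the white-noise regime
print(f"lag-1 PACF: early window ≈ {early:.2f}, late window ≈ {late:.2f}")
```

A single full-sample PACF would blur the two regimes together; the rolling estimates make the structural change visible.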


8. Energy demand: PACF with strong daily and weekly structure

Electricity load forecasting is a classic time series application. Hourly load data show:

  • Daily cycles tied to human activity.
  • Weekly cycles (weekdays vs weekends).
  • Weather-driven changes.

PACF pattern:

  • Strong spikes at lag 1 and a few nearby lags (short-run persistence).
  • Pronounced spikes at lag 24 and 168, reflecting daily and weekly cycles.
  • After removing seasonality and weather effects, the residual PACF might show just 1–2 significant lags.

Among the best examples of partial autocorrelation function (PACF) for operations research, this pattern guides modelers toward SARIMA or hybrid models that combine AR terms with regression on temperature and calendar effects.


How to read these examples of partial autocorrelation function (PACF)

Looking across these examples of partial autocorrelation function (PACF), a few recurring patterns stand out:

  • A sharp cut-off in the PACF combined with slow decay in the ACF suggests an AR process, as in the AR(1) retail sales series and the AR(2) industrial production series.
  • Many significant PACF lags can indicate nonstationarity, structural breaks, or an overfitted model, as seen in undifferenced crypto prices.
  • Regular spikes at seasonal lags (7, 24, 168, 365, etc.) point to seasonal components, like hospital admissions or energy demand.
  • Flat PACF with all lags inside the bands is a hallmark of near white noise, like stock index returns.

In modern workflows (2024–2025), PACF is rarely used alone. It’s typically combined with:

  • Information criteria (AIC, BIC) to compare ARIMA orders.
  • Cross-validation or backtesting for forecast performance.
  • Domain knowledge (e.g., knowing that a weekly cycle exists even if data are noisy).

PACF remains valuable because it gives a visual, interpretable check on the structure suggested by more automated methods.


Practical tips when using PACF in 2024–2025

Recent trends in time series modeling lean heavily toward machine learning and deep learning (e.g., gradient-boosted trees, transformers). Still, the partial autocorrelation function has not gone out of style—it has just shifted roles.

Here are a few practical points, framed around the examples of partial autocorrelation function (PACF) above:

Use PACF to sanity-check feature engineering.
If you’re building lag features for a gradient-boosted model on web traffic, the PACF can tell you which lags are likely to matter (e.g., 1–3, 24, 168). That beats blindly throwing in 200 lags.
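That screening step can be a few lines: estimate the PACF across candidate lags and keep the ones that clear a practical threshold. The series below is a synthetic two-lag process, the 0.1 cut-off is an illustrative assumption rather than a standard, and `pacf_at` is a hypothetical regression-based helper:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

# Synthetic series where only the first two lags carry signal
rng = np.random.default_rng(5)
n = 4000
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 2] + e[t]

# Keep candidate lags whose estimated PACF clears a practical threshold
selected = [k for k in range(1, 25) if abs(pacf_at(x, k)) > 0.1]
print("lags worth engineering as features:", selected)
```

For real web traffic you would scan well past lag 24 and 168; the point is that a PACF screen shrinks 200 candidate lags down to the handful that carry direct signal.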

Check stationarity before interpreting PACF.
The messy PACF for undifferenced crypto prices illustrates how nonstationarity can make interpretation misleading. Always inspect the series, consider unit root tests, and difference or detrend as needed.
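A numpy-only illustration of the warning sign (this is a visual check, not a formal unit-root test such as ADF): the lag-1 PACF of a random walk sits essentially at 1, while the differenced series shows almost no structure. `pacf_at` is the same hypothetical regression-based helper:

```python
import numpy as np

def pacf_at(x, k):
    # PACF at lag k: the lag-k coefficient from regressing x_t on x_{t-1}..x_{t-k}
    n = len(x)
    X = np.column_stack([np.ones(n - k)] +
                        [x[k - j:n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, x[k:], rcond=None)
    return beta[-1]

rng = np.random.default_rng(3)
walk = np.cumsum(rng.normal(size=3000))   # nonstationary random-walk "price"

p_raw = pacf_at(walk, 1)                  # near 1: difference before interpreting
p_diff = pacf_at(np.diff(walk), 1)        # near 0 once differenced
print(f"lag-1 PACF: raw ≈ {p_raw:.3f}, differenced ≈ {p_diff:.3f}")
```

A lag-1 PACF hugging 1.0 is the classic tell that you are looking at levels that need differencing, not at a high-order AR process.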

Combine PACF with domain calendars.
For hospital admissions or energy demand, you know weekends and holidays behave differently. Use PACF alongside calendar variables rather than trying to force everything into AR terms.

Use PACF on residuals, not just raw data.
In many of the best examples of partial autocorrelation function (PACF) from practice, analysts first remove trend, seasonality, and known exogenous drivers, then look at the PACF of the residuals. That tells you whether remaining serial dependence is adequately modeled.

For more formal background on PACF and ARIMA modeling, many statistics departments publish open course notes. One widely cited source is Penn State’s online statistics program: https://online.stat.psu.edu


FAQ: examples of PACF and common questions

Q1. Can you give a simple example of PACF indicating AR(1)?
Yes. A detrended monthly sales series where the PACF shows one large spike at lag 1 and no significant spikes afterward is a classic example of PACF indicating an AR(1) process. This pattern suggests that including just one AR term is usually enough.

Q2. What are common real examples of partial autocorrelation function (PACF) with seasonality?
Hospital admissions, hourly web traffic, and electricity demand are good real examples. Their PACF plots often show regular spikes at seasonal lags: 7 days for weekly patterns, 24 hours for daily cycles, or 168 hours for weekly cycles in hourly data.

Q3. How is PACF different from ACF in practice?
The ACF measures total correlation at each lag, mixing direct and indirect effects. The PACF isolates the direct correlation at lag k after controlling for all shorter lags. In practice, the ACF is more informative about MA components, while the PACF is more informative about AR components, especially when you compare patterns across the kinds of examples discussed above.

Q4. Is PACF still useful if I’m using machine learning models instead of ARIMA?
Yes. Even if you’re using random forests, gradient boosting, or deep learning, PACF helps you decide which lagged features are worth including and whether your residuals still have structure. Many practitioners in 2024–2025 use PACF as a diagnostic and feature-selection tool rather than as the sole guide to model order.

Q5. Where can I find datasets to experiment with more examples of PACF?
You can explore:

  • Economic and financial series from FRED (Federal Reserve Bank of St. Louis) at https://fred.stlouisfed.org
  • Climate and environmental data from NOAA at https://www.noaa.gov
  • Public health time series from CDC at https://www.cdc.gov

These sources give you plenty of raw material to generate your own examples of partial autocorrelation function (PACF) and practice interpreting different patterns.
