# Time Series Patterns ⌛

# What Is a Time Series?

A time series is a dataset consisting of observations arranged over time. Examples include stock market data, weather data, etc.

The concepts of stationarity, trend, seasonality, and cycle are crucial for interpreting time series data.

## Stationary

Stationarity refers to the statistical properties of a series remaining constant over time. If the mean, variance, and covariance of a time series remain constant throughout time, the series is considered stationary. When time series data follows a consistent pattern, it becomes more predictable, so a stationary series can be forecast with more confidence. When non-stationarity is observed, the difference of the series is taken: each value is subtracted from the one that follows it, i.e. y(t) − y(t−1), which often removes the trend and makes the series stationary.
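As a minimal sketch of the idea, consider first-order differencing applied to a synthetic trending series (the values are made up for illustration):

```python
# A series with a linear trend: its mean grows over time, so it is
# not stationary.
series = [10.0 + 2.0 * t for t in range(8)]

# First difference: y'(t) = y(t) - y(t-1).
diff = [series[t] - series[t - 1] for t in range(1, len(series))]

print(diff)  # every value is the constant slope 2.0
```

After differencing, the trend is gone and the series is constant, which is the stationary behavior we were after.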

## Trend

Trend is one of the most critical topics in the time series domain, and awareness of the concept is essential in every field. The structure indicating a long-term increase or decrease in a time series is referred to as a trend. If there is a trend, the probability of the series being stationary is very low.

## Seasonality

The condition where a time series exhibits a specific behavior that repeats at regular intervals is referred to as seasonality.

## Cycle

Cyclical patterns are similar to seasonality but distinct from it. Although the distinction is not critically important for forecasting, it is worth knowing. Seasonality is more pronounced, shorter-term, and regularly associated with specific intervals such as days, weeks, or seasons. Cyclical patterns, on the other hand, are longer-term, more uncertain in nature, and do not align with such fixed intervals. They often emerge for structural reasons, such as shifts driven by statements from certain figures in the political arena.

# Understanding the Nature of Time Series Models

Our goal is to predict what will happen on the next day. In a time series, a value is influenced most strongly by the value that immediately precedes it. Building on this assumption, we can predict the next value by taking the average of the preceding 4–5 values. On the other hand, if the series previously experienced significant declines and increases, we can also take the average of the values observed on the same dates in past periods and incorporate that as well. Carrying information about past seasonality while focusing on recent values makes sense.

## Moving Average

The future value of a time series is estimated as the average of its k previous values. The moving average is generally used not for prediction but to capture and observe a trend.

Within the scope of ML, however, we often derive features based on moving averages during feature engineering.
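For illustration, such a feature can be derived with pandas; the toy series and the window size of 3 are arbitrary choices. Shifting by one step keeps the window strictly in the past, so the feature at time t does not leak the value it is meant to predict:

```python
import pandas as pd

s = pd.Series([3.0, 5.0, 4.0, 6.0, 8.0, 7.0])

# Rolling mean of the 3 previous values (shift(1) excludes the
# current observation from its own feature).
ma3 = s.shift(1).rolling(window=3).mean()

print(ma3.tolist())  # first three entries are NaN, then 4.0, 5.0, 6.0
```

The leading NaNs are unavoidable: the first few time steps simply do not have enough history to fill the window.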

## Weighted Average

The weighted average is similar to the moving average, but it carries the idea of giving more weight to more recent observations. For example, when taking the weighted average, we can make a balanced prediction by assigning the largest weights to the most recent 4 days and gradually decreasing the weights for older days.

As we have seen, time series data is influenced mostly by its previous values. Therefore, looking back at those values is our focus, and so is *how* we look back, for example with a moving average or a weighted average.
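As a small illustration of the weighted version, with made-up weights that decay for older values and sum to 1:

```python
# Last 4 observations, oldest to most recent (synthetic values).
last4 = [20.0, 22.0, 21.0, 24.0]
weights = [0.1, 0.2, 0.3, 0.4]   # the newest value gets the most weight

# Weighted average as the one-step-ahead prediction.
prediction = sum(w * y for w, y in zip(weights, last4))
print(prediction)
```

With equal weights of 0.25 this would collapse back to the plain moving average, which shows how the two methods relate.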

# Smoothing Methods (Holt-Winters)

## Single Exponential Smoothing

It is successful only for stationary series; there should be no trend or seasonality.

It makes predictions by applying exponential correction.

It assumes that the future is more closely related to the recent past, and it weights the effects of past observations exponentially.

SES = Level
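A minimal hand-rolled sketch of the SES recursion, assuming a smoothing factor alpha = 0.5 and seeding the initial level with the first observation (real libraries estimate both instead of fixing them):

```python
def ses_forecast(series, alpha):
    """One-step-ahead forecast via simple exponential smoothing."""
    level = series[0]                            # initial level
    for y in series[1:]:
        # New level = alpha * observation + (1 - alpha) * old level.
        level = alpha * y + (1 - alpha) * level
    return level

data = [10.0, 12.0, 11.0, 13.0]
print(ses_forecast(data, alpha=0.5))  # prints 12.0
```

Because each update multiplies the old level by (1 − alpha), older observations shrink geometrically, which is exactly the "exponentially weighted past" described above.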

## Double Exponential Smoothing (DES)

It applies exponential correction, taking trend into account.

DES: Level (SES) + Trend

The fundamental approach is the same as SES, but the trend is also considered. It is suitable for univariate time series with trend but without seasonality. Of the two decompositions below, the first is the additive model and the second is the multiplicative model, in which the components combine by multiplication. If the seasonal and residual components are independent of the trend, the series is additive; if they are not independent, it is multiplicative. As a practical check: if the seasonal and residual components are distributed around 0, the series is additive.

y(t) = Level + Trend + Seasonality + Noise

y(t) = Level * Trend * Seasonality * Noise
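A hand-rolled sketch of Holt's linear method (DES), maintaining a level and a trend recursion; the alpha and beta values and the toy series are chosen purely for illustration:

```python
def des_forecast(series, alpha, beta):
    """One-step-ahead forecast via double exponential smoothing."""
    level = series[0]
    trend = series[1] - series[0]                # initial trend guess
    for y in series[1:]:
        prev_level = level
        # Level update: blend the observation with the trend-projected level.
        level = alpha * y + (1 - alpha) * (level + trend)
        # Trend update: blend the observed level change with the old trend.
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend                         # forecast = Level + Trend

data = [10.0, 12.0, 14.0, 16.0]                  # clean linear trend of +2
print(des_forecast(data, alpha=0.5, beta=0.5))   # prints 18.0
```

On a perfectly linear series the recursion recovers the slope exactly, so the forecast continues the line; SES, by contrast, would lag behind such a series.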

## Triple Exponential Smoothing (Holt-Winters)

Triple exponential smoothing is the most advanced smoothing method. This method dynamically evaluates the effects of level, trend and seasonality to make predictions. It can be used for univariate time series with either trend or seasonality or both.

TES = Level (SES) + Trend (DES) + Seasonality
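A hand-rolled sketch of the additive Holt-Winters recursion; the initialization (level and trend from the first two seasons, seasonals from first-season deviations) is one simple choice among several, and all smoothing parameters are illustrative:

```python
def holt_winters_additive(series, m, alpha, beta, gamma):
    """One-step-ahead forecast via additive triple exponential smoothing."""
    first, second = series[:m], series[m:2 * m]
    level = sum(first) / m
    trend = (sum(second) - sum(first)) / (m * m)
    seas = [y - level for y in first]            # one factor per phase

    for t in range(m, len(series)):
        y = series[t]
        prev_level = level
        # Deseasonalize the observation before updating the level.
        level = alpha * (y - seas[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seas[t % m] = gamma * (y - level) + (1 - gamma) * seas[t % m]

    n = len(series)
    return level + trend + seas[n % m]           # Level + Trend + Seasonality

# Trend of +1 per step plus a repeating seasonal pattern of period 4.
data = [t + s for t, s in zip(range(8), [0, 5, 0, -5] * 2)]
forecast = holt_winters_additive(data, m=4, alpha=0.2, beta=0.2, gamma=0.2)
print(forecast)
```

With only two seasons of history the forecast lands near, not on, the true next value; libraries such as statsmodels optimize the smoothing parameters and the initialization instead of fixing them by hand.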

# Statistical Methods

Within the scope of statistical methods, I will cover a few approaches that form the basis of this subject.

The first method we will see is **Autoregression, AR(p)**. Predictions are made with a linear combination of observations from previous time steps. It is suitable for univariate time series without trend or seasonality. Here p is the number of time lags; if p = 1, the model is built with only the previous time step. It resembles SES, but it is framed as a regression.
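A sketch of fitting an AR(2) by ordinary least squares on lagged values; the generating process and its coefficients are synthetic, chosen so the fit is easy to check:

```python
import numpy as np

# Simulate y(t) = 0.6*y(t-1) + 0.3*y(t-2) + noise (a stationary AR(2)).
rng = np.random.default_rng(0)
y = [1.0, 2.0]
for _ in range(500):
    y.append(0.6 * y[-1] + 0.3 * y[-2] + rng.normal(scale=0.1))
y = np.array(y)

# Lag matrix: each row holds (y(t-1), y(t-2)) for a target y(t).
X = np.column_stack([y[1:-1], y[:-2]])
target = y[2:]

coef, *_ = np.linalg.lstsq(X, target, rcond=None)
print(coef)  # roughly [0.6, 0.3]
```

This is the "linear combination of previous time steps" from the definition above, written out as an ordinary regression problem.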

**MA(q): Moving Average**

Prediction is made with a linear combination of errors obtained in previous time steps. It is suitable for univariate time series without trend and seasonality.

q: the number of time lags.

ARMA(p, q) = AR(p) + MA(q)

**The ARMA model** is a close relative of the SES model. It combines the Autoregressive (AR) and Moving Average (MA) methods: prediction is made with a linear combination of past values and past errors. In SES, a single coefficient, the smoothing factor, weighs the effects of both terms. In ARMA, the weight of past actual values (e.g., a1) and the coefficient of past residuals (e.g., m1) are independent of each other. While the terms in the Holt-Winters methods are shaped by a single parameter, in ARMA models each term has its own parameter; in other words, the essence of the data is learned. It is suitable for univariate time series without trend or seasonality. p and q are the numbers of time lags: p for the AR part and q for the MA part.

## ARIMA(p, d, q): (Autoregressive Integrated Moving Average)

ARIMA models can handle trend (seasonality requires the seasonal extension, SARIMA, below). Prediction is made with a linear combination of differenced observations and errors from previous time steps: the series is differenced before fitting, e.g., by subtracting yesterday's value from today's. It is suitable for univariate data with trend but without seasonality.

p represents the number of real-value lags; p = 2 means y(t−1) and y(t−2) are in the model.

d represents the number of differencing operations.

q is the number of error lags. We will generate candidate p, d, q values with a brute-force search and try to select the combination that predicts best.
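A hand-rolled sketch of that brute-force idea: difference the series d times, fit an AR(p) by least squares as a stand-in for a full ARIMA fit, and rank the (p, d) pairs by AIC. The series, the grid, and the simplified AIC formula are all illustrative; in practice a library ARIMA implementation and its reported AIC would be used.

```python
import numpy as np

rng = np.random.default_rng(1)
y = np.cumsum(2.0 + rng.normal(size=300))        # synthetic trending series

def aic_ar(y, p, d):
    """AIC of an AR(p) least-squares fit on the d-times differenced series."""
    z = np.diff(y, n=d) if d > 0 else y
    # Design matrix: intercept plus columns z(t-1) .. z(t-p).
    X = np.column_stack(
        [np.ones(len(z) - p)]
        + [z[p - 1 - j:len(z) - 1 - j] for j in range(p)]
    )
    target = z[p:]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    rss = float(np.sum((target - X @ beta) ** 2))
    k = p + 1                                     # number of fitted parameters
    return len(target) * np.log(rss / len(target)) + 2 * k

# Try every (p, d) combination and keep the one with the lowest AIC.
grid = [(p, d) for p in (1, 2, 3) for d in (0, 1)]
best_p, best_d = min(grid, key=lambda pd: aic_ar(y, *pd))
print(best_p, best_d)
```

The same loop structure extends to q (and to the seasonal P, D, Q, m below); the search space grows multiplicatively, which is why the grids are usually kept small.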

## SARIMA(p, d, q): (Seasonal Autoregressive Integrated Moving-Average)

It can be used for univariate time series with trend and/or seasonality.

p, d, q are the parameters coming from ARIMA, and they model the trend component.

p: number of real-value lags (autoregressive degree)

d: number of differencing operations

q: number of error lags (moving average degree)

P, D, Q are the seasonal counterparts of p, d, q; they model the seasonal component.

m is the number of time steps in a single seasonal period, representing the structure of the recurrence.

I explained everything in more detail, with code and explanations, in my Kaggle notebook;

To Be Continued…