*Modelling & Forecasting Time-Series data has been one of the cornerstones of Predictive Analytics in the era of Big Data. There is a plethora of forecasting techniques available today, and their context can be a pain to understand; as we know, in the war on noise, context serves as crucial ammunition. To that end, we, Hemanth Sindhanuru & Srinidhi K from LatentView Analytics, are presenting this series of articles discussing a structured methodology to understand, analyse & forecast time-series data.*

Until now, we have explored time-series modelling in a generic sense. We have discussed the conceptual bases on which some of the popular approaches for modelling time-series are built, and the attributes which capture the underlying structural characteristics of a time-series. We now move on to the modelling techniques themselves. We start with Exponential Smoothing, one of the fundamental time-series modelling techniques, popular for its simplicity & low computational cost.

This approach is built on the fundamental assumption that any particular observation of a time-series is related to (in other words, a function of) its past observations, i.e. the observations preceding it. The models under the exponential smoothing framework define an observation in the time-series as a weighted average of its past observations, with the weights decaying exponentially as the observations get older; hence the name exponential smoothing.

As we see in a generic mathematical representation of the idea, a particular observation at time T+1 is a weighted average of all the preceding observations at times T, T-1, T-2, …, 2, 1 in the series, and the rate at which the weights decay exponentially is controlled by the smoothing parameter α, which varies between 0 & 1.
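The representation referred to appeared as an image in the original; in standard notation, the exponentially weighted average reads:

```latex
\hat{y}_{T+1} = \alpha\, y_T + \alpha(1-\alpha)\, y_{T-1} + \alpha(1-\alpha)^2\, y_{T-2} + \cdots
```

Each coefficient is a factor of (1 − α) smaller than the previous one, which is exactly the exponential decay described above: a large α weights recent observations heavily, a small α spreads the weight over the distant past.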

SIMPLE EXPONENTIAL SMOOTHING

We start by looking at the simplest model in the framework, which builds on the above assumption of exponentially decreasing weights. This model doesn't take into account any seasonal or trend effects and is hence suitable for time-series with very weak Seasonality & Trend. *( For a quantitative measure of these attributes, please refer to the previous article in the series. )*

Defining the Modelling Equation

Drawing from the aforementioned idea, a generic observation of a time-series can be represented as follows:

The notation can be simplified further in the following way:
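The simplified equation is not reproduced in this copy; in the standard recursive form, the weighted average collapses into a single level-update equation, with ℓ_t denoting the smoothed level at time t:

```latex
\ell_t = \alpha\, y_t + (1-\alpha)\,\ell_{t-1}, \qquad \hat{y}_{t+1\mid t} = \ell_t
```

Unrolling the recursion recovers the full weighted average above, but this form needs only the latest observation and the previous level, which is what makes the method so cheap computationally.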

Computing the model parameters

To close the recursion, every model under the ETS framework requires an initial value tied to the first observation of the series.
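The assignment referred to below appeared as an equation image; the typical choice is to set the initial level to the first observation:

```latex
\ell_1 = y_1
```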

The aforementioned assignment is not the only option for initializing the model. We can also set the value to the mean of the series, among other alternatives.

Once initialized, the next step involves the calculation of the smoothing parameter, α. To choose the optimum value for the parameter, we follow an approach similar to that of Multiple Linear Regression: we select a loss function like the Sum of Squared Errors (SSE) and pick the value of α which minimizes it.
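The fit-by-minimizing-SSE idea can be sketched in a few lines. This is an illustrative Python sketch (not the document's R workflow): the level is initialised to the first observation as above, one-step-ahead errors are accumulated, and α is chosen by a crude grid search.

```python
def ses_sse(y, alpha):
    """Sum of squared one-step-ahead errors for simple exponential
    smoothing, with the level initialised to the first observation."""
    level = y[0]
    sse = 0.0
    for obs in y[1:]:
        error = obs - level                         # one-step-ahead forecast error
        sse += error * error
        level = alpha * obs + (1 - alpha) * level   # update the smoothed level
    return sse

# crude grid search for the alpha that minimises the SSE
series = [10.0, 12.0, 11.0, 13.0, 12.0, 14.0, 13.0, 15.0]
grid = [i / 100 for i in range(1, 100)]
best_alpha = min(grid, key=lambda a: ses_sse(series, a))
```

In practice a proper numerical optimizer replaces the grid search, but the objective being minimized is exactly this SSE.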

EXTENDING THE SIMPLE ETS MODEL TO ACCOUNT FOR SEASONALITY & TREND

Though the use-cases of the Simple ETS model are limited, since most series under consideration contain a Seasonal or Trend component, understanding the steps in its computation provides us a stepping stone to the more complex models in the framework. To expand the scope of the Simple ETS model to time-series with Seasonality & Trend attributes, these components are characterized explicitly through separate equations with their own independent parameters. Consider the following example:
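The example equations appeared as an image in the original; as a reference point, the additive Holt-Winters form is a standard instance of this idea, with ℓ the level, b the trend, s the seasonal component and m the seasonal period:

```latex
\begin{aligned}
\ell_t &= \alpha\,(y_t - s_{t-m}) + (1-\alpha)\,(\ell_{t-1} + b_{t-1}) \\
b_t &= \beta\,(\ell_t - \ell_{t-1}) + (1-\beta)\,b_{t-1} \\
s_t &= \gamma\,(y_t - \ell_{t-1} - b_{t-1}) + (1-\gamma)\,s_{t-m} \\
\hat{y}_{t+h\mid t} &= \ell_t + h\,b_t + s_{t+h-m}
\end{aligned}
```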

As we see, the seasonal & trend components are represented by explicit equations, characterized by the parameters β & γ. The computation of the model parameters involves an identical procedure for each of the explicit equations: initializing, selecting a loss function & optimizing the parameters with respect to the loss function.

TAXONOMY OF THE ETS MODELS

The flexibility of the ETS framework lies in its ability to model Seasonality & Trend components of different natures in a generic way. Let's examine the multiplicative seasonality version of the Holt-Winters model shown below:
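The model equations appeared as an image in the original; the standard multiplicative-seasonality Holt-Winters form (same notation as before, with the seasonal component entering as a ratio) is:

```latex
\begin{aligned}
\ell_t &= \alpha\,\frac{y_t}{s_{t-m}} + (1-\alpha)\,(\ell_{t-1} + b_{t-1}) \\
b_t &= \beta\,(\ell_t - \ell_{t-1}) + (1-\beta)\,b_{t-1} \\
s_t &= \gamma\,\frac{y_t}{\ell_{t-1} + b_{t-1}} + (1-\gamma)\,s_{t-m} \\
\hat{y}_{t+h\mid t} &= (\ell_t + h\,b_t)\, s_{t+h-m}
\end{aligned}
```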

Considering the following variants of each of the time-series components: 2 for the Remainder (Additive, Multiplicative), 5 for the Trend (None, Additive, Additive damped, Multiplicative, Multiplicative damped) & 3 for the Seasonality (None, Additive, Multiplicative),

we have a framework of 30 ETS models ( 2 x 5 x 3 combinations )
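Under the standard ETS taxonomy (additive/multiplicative errors; none, additive, additive-damped, multiplicative, multiplicative-damped trends; none, additive, multiplicative seasonality), the 30 combinations can be enumerated mechanically. A small Python sketch, for illustration only:

```python
from itertools import product

remainder = ["A", "M"]                      # Additive, Multiplicative
trend = ["N", "A", "Ad", "M", "Md"]         # incl. damped variants
seasonality = ["N", "A", "M"]

# every ETS model is one choice per component, e.g. "ANN", "MAdM"
models = ["".join(combo) for combo in product(remainder, trend, seasonality)]
print(len(models))  # 30
```

Note that in the R `ets()` call below, the three-character `model` string does not encode damping; the damped trend variants are selected via the separate `damped` argument.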

```
IMPLEMENTATION IN R
require(datasets)
require(forecast)
?ets
# The ets() function in R has numerous input arguments owing to its flexibility.
# Please refer to the official documentation for the function before running it.
dataset <- UKgas
plot(dataset)
# UKgas is an inbuilt object of class "ts" in R containing the ...
# ... Quarterly UK gas consumption from 1960-Q1 to 1986-Q4
ETS_model <- ets( y = dataset, model = "ZZZ" )
# ets() requires 2 main arguments as input from the user...
# ---> y = a numeric vector or an object of class ts
# ---> model = a three character string identifying the nature of Time-Series components, e.g. "ZZZ"
# |
# |----> FIRST CHARACTER: Nature of Remainder..... ('A','M','Z')
# |----> SECOND CHARACTER: Nature of Trend ....... ('N','A','M','Z')
# |----> THIRD CHARACTER: Nature of Seasonality... ('N','A','M','Z')
#
# 'N' = None, 'M' = Multiplicative, 'A' = Additive
# 'Z' = Automatic Selection of the Best Variant
plot( ETS_model )
```

```
OUTPUT
```

FORECASTING THE FUTURE HORIZONS

Once we represent the time-series under consideration with a mathematical model, we move on to forecasting the time-series values for future horizons. In general, there are two approaches to forecasting for multiple horizons:

RECURSIVE APPROACH: We calculate the one-step-ahead forecasts recursively for each of the future horizons. This can be better understood as shown below.
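Schematically (the original pictorial representation is not reproduced here; f denotes the fitted one-step-ahead model), each forecast feeds back in as a pseudo-observation for the next step:

```latex
\begin{aligned}
\hat{y}_{T+1\mid T} &= f(y_T,\, y_{T-1},\, \dots) \\
\hat{y}_{T+2\mid T} &= f(\hat{y}_{T+1\mid T},\, y_T,\, y_{T-1},\, \dots) \\
\hat{y}_{T+3\mid T} &= f(\hat{y}_{T+2\mid T},\, \hat{y}_{T+1\mid T},\, y_T,\, \dots)
\end{aligned}
```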

If we take a closer look, as the forecasting horizon grows, the number of estimates on the RHS (please refer to the notation in the previous pictorial representations) increases. This leads to a growing bias component of the error (which we will be discussing in an upcoming topic, the Bias-Variance Trade-Off) as the forecasting horizon grows.

DIRECT APPROACH: An alternative approach is to extrapolate the modelling equation directly for the future horizons, as shown.
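As an illustration (the original figure is not reproduced; Holt's linear trend model is assumed here), the direct approach evaluates the h-step forecast in a single expression by extrapolating the fitted components:

```latex
\hat{y}_{T+h\mid T} = \ell_T + h\, b_T
```

Only quantities already estimated at time T appear on the right-hand side, so no intermediate forecasts are chained together.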

```
IMPLEMENTATION IN R
ETS_forecast <- forecast( object = ETS_model, h = 20 )
plot( ETS_forecast )
# forecast() is a generic function for forecasting time-series models.
# The method invoked depends on the class of the first argument.
# object ---> an object of class "ets", the output of the ets() function
# h ---> the number of future horizons to be forecasted
```

```
OUTPUT
```

Keep watching this space for more!

BACK TO THE FUTURE – A Beginner’s Guide to Forecasting

1. A Primer on Time-Series Forecasting

2. Structural Time-Series Models

3. Periodicity of a Seasonal Time-Series

4. Defining Time-Series Attributes

5. Exponential Smoothing Framework

6. ARIMA modelling in a nutshell