We now describe the basic frameworks of the two modeling approaches, ARIMA and ANFIS.

### 3.1 Autoregressive integrated moving average

In the framework of regression models, the present output is computed as a linear combination of a pre-specified number of past outputs and a moving average of random white Gaussian noise [3].

Let us denote Ω as the lag operator such that Ω*X*(*t*) = *X*(*t* - 1); in general, Ω^{τ}*X*(*t*) = *X*(*t* - *τ*). Also let us denote Δ as the difference operator, so that Δ*X*(*t*) = *X*(*t*) - *X*(*t* - 1). It can be observed that Δ^{τ}*X*(*t*) = (1 - Ω)^{τ}*X*(*t*). Let us also define two polynomial functions *ϕ*(Ω) = (1 - *ϕ*_{1}Ω - ... - *ϕ*_{m}Ω^{m}) and *θ*(Ω) = (1 - *θ*_{1}Ω - ... - *θ*_{n}Ω^{n}), where *ϕ*_{1}, *ϕ*_{2}, ..., *ϕ*_{m} and *θ*_{1}, *θ*_{2}, ..., *θ*_{n} are coefficients of the lag operator Ω, and *m* and *n* are the degrees of the respective polynomials.
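The operator identity Δ^{τ}*X*(*t*) = (1 - Ω)^{τ}*X*(*t*) can be checked numerically. Below is a minimal Python sketch; the helper names (`lag`, `diff`, `diff_n`, `binom_diff`) are our own, introduced purely for illustration:

```python
import math

def lag(x, tau=1):
    # Omega^tau X(t) = X(t - tau); entries without a defined lag are dropped.
    return x[:-tau] if tau > 0 else x[:]

def diff(x):
    # Delta X(t) = X(t) - X(t - 1)
    return [x[t] - x[t - 1] for t in range(1, len(x))]

def diff_n(x, tau):
    # Apply the difference operator Delta tau times.
    for _ in range(tau):
        x = diff(x)
    return x

def binom_diff(x, tau):
    # Expand (1 - Omega)^tau X(t) with the binomial theorem.
    return [sum((-1) ** k * math.comb(tau, k) * x[t - k] for k in range(tau + 1))
            for t in range(tau, len(x))]

x = [float(t * t) for t in range(10)]    # X(t) = t^2
assert diff_n(x, 2) == binom_diff(x, 2)  # Delta^2 X = (1 - Omega)^2 X
```

For *X*(*t*) = *t*², both sides give the constant sequence 2, as expected for a second difference of a quadratic.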

Given these notations, the definitions of the regressive models follow.

An autoregressive model of order *m*, generally denoted by AR(*m*) [3], has the form

\varphi \left(\mathrm{\Omega}\right)X\left(t\right)=\epsilon \left(t\right)

(1)

where *ε*(*t*) is random white Gaussian noise.
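Equation (1) can be simulated directly. The following Python sketch (coefficient values are illustrative assumptions, not estimates from data) generates an AR(2) series and recovers the injected noise by applying *ϕ*(Ω) to it:

```python
import random

random.seed(0)
phi = [0.5, -0.3]  # assumed illustrative AR(2) coefficients
eps = [random.gauss(0, 1) for _ in range(200)]

# X(t) = phi_1 X(t-1) + phi_2 X(t-2) + eps(t), with zero initial conditions
x = [0.0, 0.0]
for t in range(2, 200):
    x.append(phi[0] * x[t - 1] + phi[1] * x[t - 2] + eps[t])

# Applying phi(Omega) to X(t) recovers the white noise: phi(Omega) X(t) = eps(t)
resid = [x[t] - phi[0] * x[t - 1] - phi[1] * x[t - 2] for t in range(2, 200)]
assert all(abs(r - e) < 1e-12 for r, e in zip(resid, eps[2:]))
```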

An autoregressive moving average model of order (*m*, *n*), generally denoted by ARMA(*m*, *n*) [3], has the form

\varphi \left(\mathrm{\Omega}\right)X\left(t\right)=\theta \left(\mathrm{\Omega}\right)\epsilon \left(t\right).

(2)

An autoregressive integrated moving average model of order (*m*, *τ*, *n*), generally denoted by ARIMA(*m*, *τ*, *n*) [3], has the form

\varphi \left(\mathrm{\Omega}\right)\phantom{\rule{0.12em}{0ex}}{\text{\Delta}}^{\tau}X\left(t\right)=\theta \left(\mathrm{\Omega}\right)\epsilon \left(t\right).

(3)

It can be seen that ARIMA is the most general of the three regressive models discussed above. Although other, more generalized regressive models are also available, ARIMA will be the focus of our study in this paper.
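As a sanity check on Equation (3), the Python sketch below (coefficient values are illustrative assumptions, not fitted values) builds an ARIMA(1, 1, 1) path by integrating an ARMA(1, 1) series once, then verifies that *ϕ*(Ω)Δ*X*(*t*) = *θ*(Ω)*ε*(*t*) holds on the differenced series:

```python
import random

random.seed(1)
phi1, theta1 = 0.6, 0.4  # assumed illustrative coefficients
eps = [random.gauss(0, 1) for _ in range(300)]

# ARMA(1,1) on the differenced series: w(t) = phi1 w(t-1) + eps(t) - theta1 eps(t-1)
w = [eps[0]]
for t in range(1, 300):
    w.append(phi1 * w[t - 1] + eps[t] - theta1 * eps[t - 1])

# Integrate once to obtain an ARIMA(1,1,1) path: X(t) = X(t-1) + w(t)
x = [w[0]]
for t in range(1, 300):
    x.append(x[t - 1] + w[t])

# Check eq. (3) with tau = 1: phi(Omega) Delta X(t) = theta(Omega) eps(t)
d = [x[t] - x[t - 1] for t in range(1, 300)]   # Delta X, equal to w[1:]
lhs = [d[t] - phi1 * d[t - 1] for t in range(1, len(d))]
rhs = [eps[t] - theta1 * eps[t - 1] for t in range(2, 300)]
assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
```

Note that, with the sign convention of *θ*(Ω) defined above, the moving-average term enters as ε(*t*) - *θ*_{1}ε(*t* - 1).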

### 3.2 Adaptive neuro fuzzy inference system

A fuzzy inference system (FIS) is a framework for computation based on the concepts of fuzzy set theory, fuzzy if-then rules, and fuzzy reasoning [13, 19]. As shown in Figure 1, a FIS has three main conceptual components, *viz*., a rule base, a database, and a reasoning mechanism. The rule base is a collection of fuzzy if-then rules that decide the system’s behavior and response under different possible situations. The database contains the information about the membership functions in terms of their type and shape. Finally, the reasoning or decision-making mechanism is used to infer and derive the output of the system. It may be noted that a FIS may need a fuzzification interface to convert crisp input values to fuzzy values suitable for processing; however, when the inputs themselves are fuzzy, this may not be required. Similarly, at the output side, a defuzzification interface is used because almost all real-world applications require a crisp output value.

A neural network that implements the above framework of a FIS results in ANFIS. For introductory purposes, a first-order *Sugeno-type FIS* (see [19]) is shown in Figure 2 and the equivalent ANFIS architecture in Figure 3. The following common rule set for a first-order Sugeno fuzzy model can easily be verified:

Rule 1. If *x* is *P*_{1} and *y* is *Q*_{1}, then *d*_{1} = *a*_{1}*x* + *b*_{1}*y* + *c*_{1}.

Rule 2. If *x* is *P*_{2} and *y* is *Q*_{2}, then *d*_{2} = *a*_{2}*x* + *b*_{2}*y* + *c*_{2}.

In the ANFIS architecture shown in Figure 3, all nodes in the same layer have similar functions. Here we denote the output of the *i*th node in layer *l* by {O}_{i}^{l}.

In layer 1, a linguistic label is associated with each input in terms of its membership grade. This membership grade can be defined by suitable membership functions {\mu}_{{P}_{i}}\left(x\right) and {\mu}_{{Q}_{i}}\left(y\right) with appropriate parameters. The parameters associated with these membership functions are called premise or nonlinear parameters.

In layer 2, a suitable T-norm operator (most commonly multiplication) is used to perform the fuzzy AND operation on the input signals to get the output:

{O}_{i}^{2}={w}_{i}={\mu}_{{P}_{i}}\left(x\right){\mu}_{{Q}_{i}}\left(y\right);\phantom{\rule{0.5em}{0ex}}i=1,2.

(4)

The output of this layer is often called the firing strength of the corresponding rule.

The ratio of a rule’s firing strength to the sum of the firing strengths of all the rules is calculated in layer 3. This operation is also called normalization of firing strengths:

{O}_{i}^{3}=\overline{{w}_{i}}=\frac{{w}_{i}}{{w}_{1}+{w}_{2}};\phantom{\rule{0.5em}{0ex}}i=1,2.

(5)

The output of layer 4 is given by

{O}_{i}^{4}=\overline{{w}_{i}}{d}_{i}=\overline{{w}_{i}}\left({a}_{i}x+{b}_{i}y+{c}_{i}\right);\phantom{\rule{0.5em}{0ex}}i=1,2.

(6)

Here, *a*_{i}, *b*_{i}, and *c*_{i}; *i* = 1, 2 are called the consequent or linear parameters of ANFIS. The total number of parameters of ANFIS is the sum of the numbers of premise and consequent parameters.

Lastly, in layer 5, the incoming signals are summed to obtain the overall output of the ANFIS:

{O}_{i}^{5}={\displaystyle \sum _{i}\overline{{w}_{i}}{d}_{i}}=\frac{{\displaystyle \sum _{i}{w}_{i}{d}_{i}}}{{\displaystyle \sum _{i}{w}_{i}}}.

(7)
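The five layers above can be traced end to end in a few lines of code. The following Python sketch (the Gaussian membership functions and all parameter values are our own illustrative assumptions) implements the two-rule architecture and Equation (7):

```python
import math

def gauss(x, c, s):
    # Gaussian membership function; (c, s) are premise (nonlinear) parameters.
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

# Premise parameters for P1, P2 (on x) and Q1, Q2 (on y) -- assumed values
P = [(0.0, 1.0), (2.0, 1.0)]
Q = [(0.0, 1.0), (2.0, 1.0)]
# Consequent (linear) parameters (a_i, b_i, c_i) of the two rules -- assumed
C = [(1.0, 0.5, 0.1), (-0.5, 1.5, 0.2)]

def anfis(x, y):
    # Layer 1: membership grades; Layer 2: firing strengths (product T-norm)
    w = [gauss(x, *P[i]) * gauss(y, *Q[i]) for i in range(2)]
    # Layer 3: normalized firing strengths, eq. (5)
    wbar = [wi / sum(w) for wi in w]
    # Layer 4: weighted rule consequents d_i = a_i x + b_i y + c_i, eq. (6)
    d = [a * x + b * y + c for (a, b, c) in C]
    # Layer 5: overall output, eq. (7)
    return sum(wb * di for wb, di in zip(wbar, d))

out = anfis(1.0, 1.0)
# At x = y = 1 both rules fire equally, so the output is the plain
# average of d1 = 1.6 and d2 = 1.2, i.e., 1.4.
assert abs(out - 1.4) < 1e-12
```

With two Gaussian membership functions per input, this network has 2 × 4 = 8 premise parameters and 3 × 2 = 6 consequent parameters, 14 in total.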

It must be noted that the ANFIS structure explained above is not unique; in fact, any arbitrary but meaningful assignment of node functions and configurations is possible.

A Sugeno-type FIS as in MATLAB Fuzzy Logic Toolbox is shown in Figure 4.