Journal of King Saud University - Computer and Information Sciences


Journal of King Saud University – Computer and Information Sciences xxx (xxxx) xxx


Deterministic weather forecasting models based on intelligent predictors: A survey

K.U. Jaseena, Binsu C. Kovoor *
Division of Information Technology, School of Engineering, Cochin University of Science and Technology, Kochi, Kerala, India

Article history: Received 13 April 2020; Revised 5 September 2020; Accepted 17 September 2020; Available online xxxx

Keywords: Weather forecasting; Artificial neural networks; Deep learning; Autoencoders; Recurrent neural networks

Abstract
Weather forecasting is the practice of predicting the state of the atmosphere for a given location based on different weather parameters. Weather forecasts are made by gathering data about the current state of the atmosphere. Accurate weather forecasting has proven to be a challenging task for meteorologists and researchers. Weather information is essential in every facet of life like agriculture, tourism, airport systems, the mining industry, and power generation. Weather forecasting has now entered the era of Big Data due to the advancement of climate observing systems like satellite meteorological observation and also because of the fast boom in the volume of weather data. So, the traditional computational intelligence models are not adequate to predict the weather accurately. Hence, deep learning-based techniques are employed to process massive datasets that can learn and make predictions more effectively based on past data. The effective implementation of deep learning in various domains has motivated its use in weather forecasting and is a significant development for the weather industry. This paper provides a thorough review of different weather forecasting approaches, along with some publicly available datasets. This paper delivers a precise classification of weather forecasting models and discusses potential future research directions in this area.
© 2020 The Authors. Production and hosting by Elsevier B.V. on behalf of King Saud University. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Contents

1. Introduction
2. Related surveys
3. Motivation
4. Weather forecasting framework
5. Classification of weather forecasting models
   5.1. Classification based on the methodology used
      5.1.1. Statistical models
      5.1.2. Artificial intelligence models
      5.1.3. Hybrid models
   5.2. Classification based on parameter to be predicted
   5.3. Discussion

* Corresponding author.
E-mail addresses: [email protected] (K.U. Jaseena), [email protected] (B.C. Kovoor).
Peer review under responsibility of King Saud University.


https://doi.org/10.1016/j.jksuci.2020.09.009
1319-1578/© 2020 The Authors. Production and hosting by Elsevier B.V. on behalf of King Saud University.
This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).


6. Results and analysis
   6.1. Open weather datasets
   6.2. Evaluation criteria
   6.3. Analysis of forecasting models
7. Challenges and future directions
8. Conclusion
Declaration of Competing Interest
References

Nomenclature (list of abbreviations)

ACO: Ant colony optimization
AI: Artificial intelligence
ANFIS: Adaptive neuro-fuzzy inference system
ANN: Artificial neural networks
API: Application programming interface
ARIMA: Autoregressive integrated moving average
ARMA: Autoregressive moving average
BPNN: Backpropagation neural network
CNN: Convolutional neural networks
CRBM: Conditional restricted Boltzmann machine
DBN: Deep belief network
DBNPF: Deep belief network for precipitation forecast
DNN: Deep neural network
ELM: Extreme learning machine
EMD: Empirical mode decomposition
ENN: Elman neural network
f-ARIMA: Fractional ARIMA
GARCH: Generalized autoregressive conditional heteroskedasticity
GLM: Generalized linear model
GRNN: General regression neural network
GRU: Gated recurrent unit
GSOD: Global surface summary of the day
GSR: Global solar radiation
HDT: Hybrid decomposition technique
HMD: Hybrid mode decomposition
IPSO: Improved PSO
ISH: Integrated surface hourly
KF: Kalman filter
LSSVM: Least square SVM
LSTM: Long short term memory
MAE: Mean absolute error
MAPE: Mean absolute percentage error
ML: Machine learning
MLP: Multi-layer perceptron
MMO: Multi-objective optimization
MMODA: Modified multi-objective dragonfly algorithm
MPR: Multi-variable polynomial regression
MSE: Mean squared error
NARX: Nonlinear autoregressive exogenous
NMAE: Normalized mean absolute error
NMSE: Normalized mean squared error
NOAA: National Oceanic and Atmospheric Administration
OSORELM: Online sequential outlier robust ELM
PSO: Particle swarm optimization
RBF: Radial basis function
RF: Random forests
RMSE: Root mean squared error
RNN: Recurrent neural networks
SAE: Stacked autoencoders
SDAE: Stacked denoising autoencoders
SSA: Singular spectrum analysis
SVM: Support vector machine
SVR: Support vector regression
USAF: United States Air Force
VAR: Vector autoregression
WPD: Wavelet packet decomposition

1. Introduction

Weather forecasting is the scientific prediction of the state of atmospheric conditions such as temperature, humidity, dew point, rainfall, and wind speed based on reliable data. Barometers, radar, and thermometers are used to collect data for weather prediction. External factors such as current weather conditions, data of previous weather patterns, tracking the motion of air and clouds in the sky, and finding and verifying changes in air pressure are essential in forecasting weather. Accurate weather forecasting information is crucial in keeping people and property safe from threats of tornadoes, floods, and major storms. Various sectors like business, tourism, sports, agriculture, mining, power, the food industry, airports, and naval systems depend heavily on accurate weather forecasting. In agriculture, prior information about the weather helps farmers to take the necessary decisions to improve their crop yields. Airport or naval systems require continuous weather data to know if there is a sudden change in climatic conditions. Accurate wind speed prediction is critical for wind farms to control the working of wind turbines during wind power generation. Mining industries require precise weather information to monitor the Earth's crust continually.

Weather forecasting on a daily, weekly, monthly, or yearly basis becomes essential as it can better reflect the changing trend of climate and also provide timely and efficient environmental information for decisions at the micro-management level. Moreover, in specific cases, accurate weather prediction helps to prevent the occurrence of floods or droughts (Salman et al., 2015).

The rapid development of technologies like the Internet of Things, Wireless Sensor Networks, and Cloud Computing has helped weather forecasting enter the era of Big Data. Big data technologies help to predict future climate states more precisely. Moreover, with the advancement of deep learning techniques and appropriate data visualization methods, weather forecasting and climate prediction can be made more effective and accurate. Hence it is reasonable to use deep learning approaches to mine valuable information from weather data. Deep learning techniques use layers of neural networks to identify and extract meaningful patterns from the datasets. A neural network with deep architectures can extract high-level abstract features of Big Data accurately (Gheisari et al., 2017). In order to predict the weather in a very effective way, several weather forecasting models using deep learning have been proposed. The foremost intention of this paper is to provide an extensive review of weather forecasting models and the various techniques and methodologies currently used by researchers for weather forecasting.


The main highlights of the paper are presented below.

(1) A comprehensive review of various weather forecasting models
(2) Analysis of the essential hyper-parameters used by the forecasting models
(3) A precise classification of various weather forecasting models
(4) Exploration of the effectiveness of the various models using statistical error indicators
(5) A guide for beginners who are passionate about research on weather forecasting and wish to acquire knowledge of different techniques and available open datasets
(6) A discussion of potential future research directions in this area

This paper is organized as follows. Section 2 gives an overview of related surveys, and section 3 explains the motivation behind the proposed work. Section 4 describes the proposed framework, where the general weather forecasting framework is presented, followed by the classification of weather forecasting models in section 5. Section 6 investigates the results, and section 7 gives a brief note on challenges and future directions. Finally, Section 8 concludes the paper.

2. Related surveys

In this section, some of the surveys related to weather forecasting are reviewed. Gheisari et al. (2017) performed a study on big data and its research directions and analyzed various deep learning architectures suitable for big data processing, its challenges, and future trends. Saima et al. (2011) surveyed various weather forecasting models, including statistical, artificial intelligence, and hybrid models, and analyzed their merits and demerits. Nagaraja et al. (2016) reviewed forecasting models for wind energy and attempted to find reliable forecasting models to explore the behavior of wind. Liu et al. (2019) focused on wind speed forecasting models using machine learning predictors, deep learning predictors, and hybrid models. The main machine learning predictors used for weather forecasting are Artificial Neural Networks, Support Vector Machines, and Extreme Learning Machines. Deep learning predictors consist of Autoencoders, Recurrent Neural Networks, and Long Short Term Memory Networks. Time series forecasting becomes possible using these methods.

Kunjumon et al. (2018) performed a study on various forecasting models based on artificial neural networks, support vector machines, and decision trees. Naveen and Mohan (2019) discussed various application domains of weather forecasting and also investigated weather prediction models using machine learning techniques and the challenges related to weather forecasting. Reddy and Babu (2017) analyzed various big data weather prediction models using MapReduce and machine learning models. The authors also addressed the limitations and issues related to big data weather forecasting, especially rainfall forecasting. Tran-Anh et al. (2019) recommended an improved rainfall prediction model based on wavelet transform and seasonal ANN. The authors also investigated different methods for predicting monthly rainfall. Tarade and Katti (2011) offered a comprehensive study of wind speed forecasting systems based on ARIMA, ANN, and polynomial curve fitting. Kulkarni et al. (2008) investigated the effectiveness of various statistical approaches suitable for wind speed forecasting. Experimental results verified that extrapolation using periodic curve fitting and ANN are better models for wind speed prediction than ARIMA and produced reliable results. Medina et al. (2019) made a comparative study of demand forecasting models using ANN and observed that the neural network produced more accurate results compared to other methods. Rasp and Lerch (2018) studied the impact of employing neural networks for post-processing of ensemble weather prediction models to determine the temperature in Germany. Siami-Namini et al. (2018) conducted an extensive survey on various time series prediction models using LSTM and ARIMA. Hong et al. (2020) performed a comprehensive review of energy forecasting models and summarized current research trends along with different publicly available datasets for energy forecasting research. The proposed framework provides a precise classification of various weather forecasting models. The survey also delivers the state of the art models for weather forecasting, its challenges, and future directions.

The proposed framework is a different approach for surveying the existing weather forecasting models, where weather forecasting models are classified mainly based on the methodology employed. The survey also classifies the models based on the weather parameter to be predicted and provides the details of the open datasets available for experiments. The following section details the motivation behind the survey.

3. Motivation

The weather forecasting systems can be classified into three categories based on the model or methodology employed for forecasting, namely statistical models, Artificial Intelligence models, and hybrid models. Among the three categories, statistical models deal with linear datasets. The commonly applied statistical models include ARMA, ARIMA, and their variants. Artificial Intelligence models are further classified into machine learning predictors and deep learning predictors. Machine learning and deep learning models are better for handling nonlinear datasets. Artificial Neural Networks, Support Vector Machines, Extreme Learning Machines, Autoencoders, Convolutional Neural Networks, Recurrent Neural Networks, and Long Short Term Memory Networks are some of the popular AI models employed for weather forecasting tasks. Hybrid models are combinations of two or more models that aim to enhance the performance of the forecasting models further. The main aim of weather forecasting models is to predict temperature, humidity, dew point, rainfall, and wind speed based on past data. Analysis of the literature reveals that several surveys related to wind speed forecasting have been conducted. However, reliable and comprehensive surveys related to forecasting weather parameters such as temperature, rainfall, dew point, and humidity are much scarcer. Hence, an extensive survey on various weather forecasting models and their classifications is presented in this paper.

4. Weather forecasting framework

The general framework of a weather prediction model is described in Fig. 1. The various steps of the weather prediction framework include data acquisition, data pre-processing, model selection and training, model evaluation, and visualization of results. Due to the rapid development of novel technologies, such as the Internet of Things, Wireless Sensor Networks, and Cloud Computing, weather data are made available in various formats and huge volumes (Chavan and Momin, 2017). These data consist of enormous amounts of both useful and useless information and are often unstructured. Once the data is gathered, the subsequent step is to pre-process the data to remove irrelevant and missing values and to obtain cleaned data.


Fig. 1. The layout of the weather forecasting model.

The primary forms of data pre-processing include data cleaning, data integration, data reduction, and data transformation. Data pre-processing techniques aim to improve the quality of the input data. Quality data will always yield quality output during the training process. Therefore, at this stage, data is prepared for training. Data cleaning methods remove noise, missing values, and inconsistencies present in the data. Real-world data often contain missing values, which can be caused by issues such as values that are not documented and data corruption. Most forecasting algorithms do not support data with missing values. So, missing value imputation is a necessity, and the missing values are in most cases imputed using mean values.

The data integration process merges data from different sources into an intelligible data store. Data reduction can reduce the size of the data by aggregating, eliminating redundant features, or clustering. In the data transformation step, the data are normalized using the normalization or standardization technique and converted to a form suitable for processing. Data transformation techniques not only improve the accuracy and efficiency of mining algorithms but also reduce the training time of the models. These pre-processing techniques are not mutually exclusive, and hence, they may function together. Once the data is pre-processed, a suitable forecasting model is selected, trained, and tested using the datasets. Appropriate algorithms can be used for effective forecasting of weather information.
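As a concrete illustration of the cleaning and transformation steps described above, the following is a minimal sketch, assuming a hypothetical pandas DataFrame of weather observations and the scikit-learn utilities for mean imputation and standardization; the column names and values are invented for the example.

```python
# Minimal pre-processing sketch (assumed data, not from the survey): mean imputation
# of missing values followed by standardization of each feature.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

# Hypothetical hourly weather observations with gaps.
raw = pd.DataFrame({
    "temperature": [24.1, None, 25.3, 26.0],
    "humidity":    [81.0, 79.5, None, 77.2],
    "wind_speed":  [3.2, 4.1, 3.8, None],
})

# Data cleaning: fill missing values with the column mean, as described above.
imputer = SimpleImputer(strategy="mean")
filled = imputer.fit_transform(raw)

# Data transformation: standardize each feature to zero mean and unit variance.
scaler = StandardScaler()
prepared = scaler.fit_transform(filled)
print(prepared.shape)  # (4, 3) cleaned, scaled samples ready for training
```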
Model selection and training is an essential step in any forecasting system. Knowledge about the various types of forecasting models will help researchers in choosing a suitable forecasting model. A forecasting model matching the application domain is selected and trained using the training datasets. Upon completion of the training, the performance of the model is evaluated using statistical quantitative error indicators such as MAE, RMSE, MAPE, and R2. Lastly, the results are visualized using suitable plots. Scatter plots, line plots, and semilog plots can be employed to visualize the results. These plots can be utilized to analyze graphically the difference between actual and predicted values. The variation of predicted values from actual values can be visualized more precisely using semilog plots.
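For reference, the four error indicators named above can be computed directly from the actual and predicted series. The short NumPy sketch below uses invented values and is only illustrative.

```python
# Hedged sketch of the error indicators MAE, RMSE, MAPE, and R2.
import numpy as np

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred):
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

actual = np.array([21.4, 22.0, 23.1, 24.7])    # observed temperatures (illustrative)
forecast = np.array([21.0, 22.5, 22.8, 25.0])  # model output (illustrative)
print(mae(actual, forecast), rmse(actual, forecast),
      mape(actual, forecast), r2(actual, forecast))
```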
5. Classification of weather forecasting models

The weather forecasting systems are classified into various categories based on multiple factors. The factors comprise the number of variables involved, the number of time steps predicted, the methodology used, the prediction horizon, and the parameter to be predicted. Fig. 2 provides the complete classification of the forecasting models based on the various factors mentioned above. The forecasting models can be univariate or multivariate depending upon the number of variables involved in forecasting. For instance, if the temperature is predicted based on many environmental factors, then such a type of forecasting can be termed multivariate. A univariate model depends on only one variable, whereas a multivariate model predicts the output based on several factors. Apart from this, forecasting models can be single-step or multistep, depending on the number of time steps to be predicted. Single-step forecasting models predict a single observation in the future, whereas multistep forecasting models predict multiple time steps into the future. Weather forecasting approaches come in four different scales based on the time horizon: very short-term, short-term, medium-term, and long-term. Short-range and very-short-range forecasts are found to be more accurate than medium or long ranges.
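To make the single-step versus multistep distinction concrete, the sketch below frames a univariate series as a supervised learning problem with a sliding window; the window length and the synthetic series are assumptions for illustration only.

```python
# Illustrative framing of a univariate series: `lags` past values predict `steps` future values.
import numpy as np

def make_windows(series, lags=24, steps=1):
    """Return X with `lags` past observations per row and y with `steps` targets."""
    X, y = [], []
    for t in range(lags, len(series) - steps + 1):
        X.append(series[t - lags:t])
        y.append(series[t:t + steps])
    return np.array(X), np.array(y)

wind_speed = np.sin(np.linspace(0, 20, 200))                      # stand-in for an hourly series
X_single, y_single = make_windows(wind_speed, lags=24, steps=1)   # single-step forecasting
X_multi, y_multi = make_windows(wind_speed, lags=24, steps=6)     # multistep forecasting
print(X_single.shape, y_single.shape, X_multi.shape, y_multi.shape)
```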
Furthermore, the forecasting systems can be of two basic types, namely deterministic and probabilistic, based on the methodology employed for forecasting. Deterministic methods provide point values of weather forecasts for a specific location, while probabilistic methods suggest probabilities of weather events. This survey basically focuses on deterministic forecasting models. The deterministic forecasting models can be further classified as statistical models, Artificial Intelligence models, and hybrid models. Depending on the parameter to be predicted, forecasting models can be classified as temperature prediction models, wind speed prediction models, rainfall prediction models, dew point prediction models, etc., as illustrated in Fig. 2.

Fig. 2. Classification of forecasting models.

With the influence of weather on social and economic activities, adverse events can be prevented by predicting weather conditions accurately. In the preceding years, several methods have been suggested to deal with this goal. The next section elucidates the different categories of weather forecasting models.

5.1. Classification based on the methodology used

Based on the methodology used for forecasting, deterministic models can be classified as statistical models, Artificial Intelligence models, and hybrid models.

5.1.1. Statistical models

Statistical forecasting models are linear models and can be employed for very short-term weather forecasting. Statistical forecasting allows weather predictions using past weather data.

Linear regression is a statistical model used for finding the linear relationship between a study variable and one or more explanatory or independent variables. Linear regression can be of two types, namely simple and multiple. Simple linear regression is convenient for finding the statistical relationship between two continuous variables, and the aim is to obtain a line that best fits the data, for which the total prediction error is as small as possible. A gradient descent optimization algorithm is employed to minimize this error. Multiple linear regression differs from simple linear regression in that the study variable depends on more than one independent variable. ARMA, ARIMA, multiple regression, and VAR are some of the commonly used statistical models. ARIMA is a statistical model that uses time-series data to predict future observations. Vector autoregression is also a statistical model, used for forecasting a vector of time series. The VAR model is the generalized multivariate form of the univariate autoregression model. This model is beneficial when predicting a collection of related variables.
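Before turning to individual studies, a minimal ARIMA sketch is given below. It assumes the statsmodels library, a synthetic series, and an arbitrarily chosen model order, and only illustrates the fit-then-forecast workflow of such statistical models.

```python
# Hedged ARIMA usage sketch: synthetic data and an assumed order (2, 1, 1).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic hourly wind speed stand-in: mild trend plus noise.
series = 5.0 + 0.01 * np.arange(300) + rng.normal(0, 0.5, 300)

model = ARIMA(series, order=(2, 1, 1))   # (p, d, q): AR terms, differencing, MA terms
fitted = model.fit()
forecast = fitted.forecast(steps=24)     # predict the next 24 observations
print(forecast[:3])
```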
Tarade and Katti (2011) and Kulkarni et al. (2008) studied wind speed forecasting systems based on various statistical approaches. Khashei and Bijari (2011) proposed a model combining the linear ARIMA model and the nonlinear ANN model and found it more effective than traditional individual models. It can be employed as a hybrid model in the time series forecasting field, especially when higher forecasting accuracy is desired. Zaw and Naing (2009) suggested a rainfall prediction model using multi-variable polynomial regression (MPR) and compared the performance of the model with a multiple linear regression model. Kavasseri and Seetharaman (2009) presented a wind speed and power prediction model using the fractional ARIMA for North Dakota, USA, and compared the performance of the forecasting model with the persistence model. Erdem and Shi (2011) proposed wind speed prediction models based on ARMA, decomposed ARMA, VAR, and restricted VAR and investigated the performance of these models on two datasets collected from wind farms in North Dakota.

Liu et al. (2011) proposed a predictive model for wind speed using the ARMA-GARCH method with 7-year hourly data collected from Colorado, USA. Liu et al. (2012) suggested wind speed forecasting models by combining ARIMA with ANN and ARIMA with the Kalman filter separately and analyzed the effectiveness of these models in multi-step prediction. Chen et al. (2009) modeled a system to forecast wind power using ARIMA. A forecasting system using ANFIS and ARIMA was proposed by Suhartono et al. (2012) for predicting the monthly rainfall of Indonesia. Cadenas et al. (2016) recommended a wind speed forecasting system with the help of a univariate ARIMA and a multivariate NARX model. Table 1 provides the parameters and the metrics used by the statistical models discussed in the literature.

Table 1
Summary of the parameters used by statistical models. For each model the original table marks, with check marks, which weather parameters (temperature, wind speed, wind direction, air pressure, humidity, precipitation, rainfall, dew point, ozone) and which error metrics (R2, MSE, RMSE, MAE, MAPE) were used. Models covered: ARIMA (Tarade and Katti, 2011), ARIMA (Kulkarni et al., 2008), MPR (Zaw and Naing, 2009), f-ARIMA (Kavasseri and Seetharaman, 2009), ARMA and VAR (Erdem and Shi, 2011), ARMA-GARCH (Liu et al., 2011), ARIMA-KF (Liu et al., 2012), ARIMA (Chen et al., 2009), ARIMA and ANFIS (Suhartono et al., 2012), and ARIMA with NARX (Cadenas et al., 2016).

5.1.2. Artificial intelligence models

The development of Artificial Intelligence has inspired the growth of intelligent forecasting models. These models have proved to be robust and effective compared to statistical models. AI models are capable of dealing with non-linear datasets effectively and exhibit better forecasting performance. These models are further categorized into machine learning predictors and deep learning predictors, as shown in Fig. 2.

5.1.2.1. Machine learning predictors. Machine learning and deep learning models are better models for handling nonlinear datasets. ANN, SVM, ELM, and Random Forests are some of the popular machine learning predictors used for weather forecasting. In this section, the various weather prediction systems using machine learning techniques are presented. They include ANN-based models, SVM-based models, and other ML models.

5.1.2.1.1. ANN-based models. Neural network based models, which are supervised and predictive, are considered to be one of the most popular techniques for weather prediction because they can capture non-linear relationships between past weather trends and future weather conditions. Neural networks function as nonlinear regression models with rapid information processing capability and develop a mapping between the input and output variables.

An Artificial Neural Network comprises a network of neurons, termed nodes, connected to each other. Within each node, a transfer function is employed to convert the inputs to the output. Fig. 3 shows the architecture of a single hidden layer neural network. In general, a neural network system comprises input, output, and hidden layers. The nodes within the input layer take in information, transform this information using activation functions, and pass it to the next node. Each node aggregates the values it obtains and then transforms the result depending upon its activation function. These transformed values flow from the input layer through the hidden layers until the output nodes are reached. The association between the input X and the output Y of the neural network at layer i can be represented as in equation (1).

Y_i = g\left( \sum_{j=1}^{n} W_{ij} X_j + b_i \right)    (1)

where W_{ij} represents the weight associated with the connection between neurons, b_i signifies the bias of neuron i, and g denotes the activation function. The difference between the predicted value and the actual value is calculated, and this error is propagated backward by adjusting the weights to minimize the error using optimization algorithms like Gradient Descent. Optimization algorithms are employed as they are essential in building efficient neural networks.

Fig. 3. Structure of a neural network.
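The forward pass in equation (1) together with a linear output node can be written out directly. The NumPy sketch below uses assumed layer sizes and a sigmoid activation purely for illustration; training by backpropagation with an optimizer is left to a framework, as described above.

```python
# Small NumPy sketch of equation (1): Y_i = g(sum_j W_ij X_j + b_i). Shapes are assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
x = rng.normal(size=4)            # four input features, e.g. temperature, humidity, ...
W1 = rng.normal(size=(8, 4))      # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(1, 8))      # hidden -> output weights
b2 = np.zeros(1)

hidden = sigmoid(W1 @ x + b1)     # equation (1) applied at the hidden layer
output = W2 @ hidden + b2         # linear output node for a regression target
print(output)

# In practice the weights would be adjusted by backpropagation with an optimizer
# such as gradient descent, as noted above.
```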
In this subsection, the various weather prediction models developed using Artificial Neural Networks proposed by different authors are presented. Nayak et al. (2012) proposed a neural network model for weather forecasting and compared the results with SVM and Fuzzy Logic. Liu et al. (2018) presented a temperature forecasting model based on the neural network approach and studied the importance of neural networks in intrusion detection. Hung et al. (2009) proposed an ANN-based real-time rainfall forecasting model for managing flood in Bangkok, Thailand, and the proposed model showed better forecasting accuracy than the persistence model. Kashiwao et al. (2017) observed that ANN gives more accurate results for predicting the probability of rainfall.

Abhishek et al. (2012) proposed a valid and reliable nonlinear predictive model using Artificial Neural Networks for daily weather forecasting and compared the performance of the proposed model using different activation functions, hidden layers, and neurons to predict maximum temperature. Tests were conducted by varying the number of neurons and hidden layers, and the study concludes that performance increases with the number of hidden layers. A two hidden layer Back Propagation Neural Network algorithm for accurate rainfall prediction in Tenggarong, East Kalimantan, Indonesia is proposed by Mislan et al. (2015). The metric mean square error is employed to measure the performance of the prediction task. Khajure and Mohod (2016) proposed a future weather prediction model using neural networks combined with a Fuzzy Inference System. The neural network is trained using different combinations of weather parameters such as humidity, temperature, pressure, wind speed, dew point, and visibility, and the predicted values are then applied to the Fuzzy Inference System. The combination of a neuro-fuzzy system is used to enhance the accuracy of prediction.



Suksri and Kimpan (2016) suggested a weather forecasting method using Artificial Neural Networks to predict daily mean temperature, where the Fireworks Algorithm is employed to optimize the parameters of the ANN. The Fireworks Algorithm is a novel Swarm Intelligence algorithm for optimization. Narvekar and Fargose (2015) presented a weather forecasting model using ANN with a backpropagation algorithm, where the different input parameters taken into consideration are temperature, relative humidity, air pressure, wind speed and direction, cloud amount, and rainfall. The efficiency of the model is assessed by determining the mean squared error. Akbar et al. (2018) suggested an ANN-based model for predicting oil yield in turmeric, and the results confirmed that the proposed neural network model has better performance. Wollsen and Jorgensen (2015) presented a novel method to forecast future weather using NARX (Nonlinear Autoregressive Exogenous) and ANN models. Filik and Filik (2017) described an ANN-based model to predict wind speed using weather parameters such as wind speed, temperature, and pressure values and investigated the effect of these parameters. Sobrevilla (2016) developed an ANN-based model for daily weather prediction and also analyzed the importance of missing value imputation in the datasets. It has been perceived that the proposed model produced reasonable accuracy.

5.1.2.1.2. SVM-based models. SVM is a supervised machine learning algorithm that can solve both classification and regression tasks (Awad and Khanna, 2015; Kavitha et al., 2016). SVM is formally defined by a hyperplane that has the highest minimum distance to the training data samples. An optimal hyperplane is a hyperplane that has the maximum margin over the training data. The training samples nearest to the hyperplane are termed Support Vectors. Support Vector Machines can find nonlinear solutions efficiently using the "kernel trick," and they also work well with high-dimensional datasets. Kernels are mathematical functions that convert inputs into the necessary format. The various kernel functions employed by SVM algorithms can be linear, nonlinear, polynomial, radial basis function, and sigmoid.

Let (x_j, y_j) be the training dataset, where j varies from 1 to N and N indicates the number of samples. The SVR algorithm computes the output y from the input data x using equation (2).

y = f(x) = w^T \phi(x) + b    (2)

where \phi(\cdot) indicates the nonlinear mapping function, which maps input data to a high dimensional feature space. The variables w and b are the weight vector and bias, respectively. The aim of \varepsilon-SVR is to find a function f(x) that has at most \varepsilon deviation from the targets y_j for all the training data. This problem can be defined as a convex optimization problem as in equation (3).

minimize \frac{1}{2}\|w\|^2   subject to   y_j - w^T \phi(x_j) - b \le \varepsilon,   w^T \phi(x_j) + b - y_j \le \varepsilon    (3)

The above-defined convex optimization problem is feasible when the function f(\cdot) approximates all training pairs (x_j, y_j) with \varepsilon precision. However, to deal with infeasible constraints, two slack variables \xi_j and \xi_j^* are introduced. Thus, equation (3) can be extended by including the slack variables as shown in equation (4).

minimize \frac{1}{2}\|w\|^2 + C \sum_{j=1}^{N} (\xi_j + \xi_j^*)   subject to   y_j - w^T \phi(x_j) - b \le \varepsilon + \xi_j,   w^T \phi(x_j) + b - y_j \le \varepsilon + \xi_j^*,   \xi_j, \xi_j^* \ge 0    (4)

The variables C and \varepsilon are predefined parameters and should be greater than 0. C is the regularization parameter. The above regression formulation uses the \varepsilon-insensitive loss function |\xi|_\varepsilon, which is defined in equation (5).

|\xi|_\varepsilon = \begin{cases} 0 & \text{if } |\xi| \le \varepsilon \\ |\xi| - \varepsilon & \text{otherwise} \end{cases}    (5)

The dual design permits the extension of SVR to nonlinear functions. The optimization problem can be further extended by using the Lagrangian form as in equation (6).

L = \frac{1}{2}\|w\|^2 + C \sum_{j=1}^{N} (\xi_j + \xi_j^*) - \sum_{j=1}^{N} \alpha_j (\varepsilon + \xi_j - y_j + w^T \phi(x_j) + b) - \sum_{j=1}^{N} \alpha_j^* (\varepsilon + \xi_j^* + y_j - w^T \phi(x_j) - b) - \sum_{j=1}^{N} (\eta_j \xi_j + \eta_j^* \xi_j^*)    (6)

The variables \alpha_j, \alpha_j^* are Lagrange multipliers. It is also assumed that \alpha_j, \alpha_j^*, \eta_j, \eta_j^* \ge 0. Equation (6) can be reduced by taking the partial derivatives of L with respect to the variables (b, w, \xi_j, \xi_j^*) and equating them to zero based on the Karush-Kuhn-Tucker (KKT) conditions (Smola and Schölkopf, 2004), as in equations (7)-(10).

\partial L / \partial b = \sum_{j=1}^{N} (\alpha_j^* - \alpha_j) = 0    (7)

\partial L / \partial w = w - \sum_{j=1}^{N} (\alpha_j - \alpha_j^*) \phi(x_j) = 0    (8)

\partial L / \partial \xi_j = C - \alpha_j - \eta_j = 0    (9)

\partial L / \partial \xi_j^* = C - \alpha_j^* - \eta_j^* = 0    (10)

Equation (6) can be rewritten by substituting the values of equations (7)-(10) as given below.

maximize \; -\frac{1}{2} \sum_{i,j=1}^{N} (\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*) \phi(x_i)^T \phi(x_j) - \varepsilon \sum_{i=1}^{N} (\alpha_i + \alpha_i^*) + \sum_{i=1}^{N} y_i (\alpha_i - \alpha_i^*)   subject to   \sum_{i=1}^{N} (\alpha_i - \alpha_i^*) = 0,   \alpha_i, \alpha_i^* \in [0, C]    (11)

The dual variables \eta_j, \eta_j^* are eliminated from equation (11) using the conditions stated in equations (9) and (10). From equation (8), the parameter w can be written as a combination of the mapped training samples, as shown in equation (12).

w = \sum_{i=1}^{N} (\alpha_i - \alpha_i^*) \phi(x_i)    (12)

The target function f(x) can then be defined using equation (13).

f(x) = \sum_{i=1}^{N} (\alpha_i - \alpha_i^*) \phi(x_i)^T \phi(x) + b    (13)

The kernel function k(x_i, x) can be defined as the dot product of the nonlinear mappings, as in equation (14).

k(x_i, x) = \phi(x_i)^T \phi(x)    (14)

Hence, the target function f(x) given in equation (13) can be rewritten as in equation (15).

f(x) = \sum_{i=1}^{N} (\alpha_i - \alpha_i^*) k(x_i, x) + b    (15)
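In practice this formulation is available off the shelf. The sketch below fits an \varepsilon-SVR with an RBF kernel in scikit-learn on lagged values of a synthetic series, with C and epsilon playing the roles of the parameters in equations (4)-(5); the data and hyperparameter values are assumptions, not settings from any cited study.

```python
# Hedged epsilon-SVR example on lag features of a synthetic series.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
series = np.sin(np.linspace(0, 30, 400)) + rng.normal(0, 0.1, 400)

# Frame the series with 12 lagged inputs per sample and a one-step-ahead target.
lags = 12
X = np.array([series[t - lags:t] for t in range(lags, len(series))])
y = series[lags:]

model = make_pipeline(
    StandardScaler(),
    SVR(kernel="rbf", C=10.0, epsilon=0.01),  # C and epsilon as in equations (4)-(5)
)
model.fit(X[:-50], y[:-50])
print(model.predict(X[-50:])[:3])             # held-out one-step forecasts
```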
In this subsection, the various weather prediction models developed using SVM are presented. SVM exhibits satisfactory accuracy under a limited number of training samples. So, Du et al. (2017) recommended an SVM based precipitation forecasting model, where the Particle Swarm Optimization (PSO) algorithm is employed to optimize the hyperparameters of the SVM model.

Yu et al. (2016) suggested a novel temperature forecasting model based on Least Square SVM (LSSVM), where the IPSO algorithm optimizes the parameters of LSSVM. The authors compared the performance of IPSO-LSSVM with SVM and ANN models. Rasel et al. (2017) recommended a weather forecasting model using SVM and ANN to predict the weather of Chittagong, Bangladesh, and the results showed that SVM outperformed ANN in rainfall prediction. Lu and Wang (2011) proposed a model for rainfall prediction using a support vector machine to forecast monthly rainfall in China. According to the authors, the accuracy of the model proved to be very promising and encouraging. Shabariram et al. (2016) developed a tool for predicting rainstorms from a large amount of rainfall data in an efficient manner using a Support Vector Machine.

5.1.2.1.3. Other ML models. Random forest is one of the reliable ensemble classifiers for high dimensional data and has proved to be better for weather prediction. Singh et al. (2019) offered a weather prediction system using a random forest algorithm. A model for forecasting the likelihood of rain, average temperature, and maximum wind for a given day is developed by Ahmed (2015). The authors employed classifiers such as Naïve Bayes, Simple Random Forests, J48, and IB1 for the experiments. Hasan et al. (2016) proposed a forecasting model where the C4.5 learning algorithm is utilized for predicting weather events. Stern (2008) offered a medium-range weather forecasting model for Melbourne, Australia. A dew point temperature prediction model based on an extreme learning machine was presented by Mohammadi et al. (2015). The effectiveness of the suggested model is compared with SVM and ANN models. The simulation results verified that the ELM based proposed model outpaced both SVM and ANN models. The error metrics used to measure the accuracy are Mean Absolute Bias Error, Root Mean Square Error, and the correlation coefficient R. Kurniawan et al. (2017) recommended a weather prediction system based on the fuzzy logic algorithm. A novel method for plant watering based on weather prediction is also introduced in that paper. The proposed weather calculation system will help the General Farming Automation Control System to work automatically. Juneja and Das (2019) proposed methods for enhancing the quality of datasets by applying pre-processing techniques.
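As a hedged illustration of the ensemble approach used in several of these studies, the following sketch trains a scikit-learn RandomForestRegressor on invented daily predictors; the features, target, and settings are assumptions rather than details of any cited model.

```python
# Illustrative random forest regression for a next-day temperature target (invented data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(4)
# Hypothetical daily predictors: today's temperature, humidity, pressure, wind speed.
X = rng.normal(size=(500, 4))
y = 0.6 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(0, 0.1, 500)   # stand-in target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
forest = RandomForestRegressor(n_estimators=200, random_state=0)
forest.fit(X_tr, y_tr)
print(mean_absolute_error(y_te, forest.predict(X_te)))        # held-out MAE
```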
Weather prediction models based on big data technologies are described below. In the big data era, traditional methods of data processing have been replaced by big data technologies. Ismail et al. (2016) proposed an analytical big data framework for temperature forecasting based on the MapReduce algorithm. Radhika et al. (2017) presented an outline of a few strategies for supporting big data administration and investigation in the geoscience domain for climate studies. The paper also discussed an overview of HBase for storing and managing essential geoscience data across distributed machinery. There are various techniques to manage big data from the geospatial domain, and these techniques include data fusion, crowdsourcing, cluster analysis, and machine learning. Jayanthi and Sumathi (2017) and Adam et al. (2017) developed big data frameworks for weather forecasting using the MapReduce algorithm and Spark, respectively. Suryanarayana et al. (2019) built a weather forecasting system using the big data technology Hadoop MapReduce. Pandey et al. (2017) investigated the application of big data in the field of weather prediction, and a Hadoop based weather forecasting model is proposed for efficient processing and prediction of weather data. The word count algorithm is used to pre-process the weather data. Two data mining tools, fuzzy logic and ANFIS methods, are applied for accurate prediction of weather data. Experimental results showed that the ANFIS method gives more accurate results compared to other methods. Various application areas that could benefit from weather forecasting using big data analytics are described by Himanshi and Raksha (2017), and the challenges faced while forecasting weather are also explained. Navadia et al. (2017) proposed a predictive analysis model using Apache PIG on Hadoop to forecast the chances of rainfall. The proposed system serves as a tool that takes in rainfall data as input and efficiently predicts future rainfall. Table 2 summarizes the parameters and the metrics used by the machine learning predictors.

Table 2
Summary of the parameters used by Machine learning models. For each model the original table marks, with check marks, which weather parameters (temperature, wind speed, wind direction, air pressure, humidity, precipitation, rainfall, dew point, ozone) and which error metrics (R2, MSE, RMSE, MAE, MAPE) were used. ANN based models: ANN (Tran Anh et al., 2019; Tarade and Katti, 2011; Kulkarni et al., 2008; Nayak et al., 2012; Liu et al., 2018; Hung et al., 2009; Kashiwao et al., 2017; Abhishek et al., 2012; Mislan et al., 2015; Khajure and Mohod, 2016; Narvekar and Fargose, 2015; Filik and Filik, 2017) and Fireworks-ANN (Suksri and Kimpan, 2016). SVM based models: PSO-SVM (Du et al., 2017) and SVM (Yu et al., 2016; Rasel et al., 2017; Lu and Wang, 2011; Shabariram et al., 2016). Other ML models: RF (Singh et al., 2019; Ahmed, 2015), C4.5 (Hasan et al., 2016), ELM (Mohammadi et al., 2015), MapReduce (Ismail et al., 2016; Pandey et al., 2017), MapReduce and Spark (Suryanarayana et al., 2019), and Apache PIG (Navadia et al., 2017).

5.1.2.2. Deep learning predictors. Deep Learning is a subclass of machine learning which uses an artificial neural network approach for extracting intelligence from massive datasets, and it mainly makes use of supervised or unsupervised strategies in deep architectures to learn hierarchical representations automatically. Big data can be more accurately processed using deep learning techniques. Deep learning models have been successfully implemented in many areas such as object detection, speech recognition, visual object recognition, genomics, and time series prediction. Deep learning techniques are more reliable for time series prediction because of their ability to learn the temporal dependence present in time series data. Deep learning uses neural networks with deep architectures, which are composed of many layers of non-linear processing stages. The output of each layer is given as input to its next higher layer. Fig. 4 shows the architecture of a deep neural network with four hidden layers. Feed-forward neural networks or MLPs with many hidden layers are indeed an excellent example of models with a deep architecture. Deep learning is a vigorous area of research today, and Microsoft, Google, IBM, and Facebook are doing active research in this area (Deng, 2014). Autoencoders and recurrent neural networks are the popular deep learning architectures employed for time series data.

Fig. 4. Structure of a deep neural network.

5.1.2.2.1. Autoencoders. An autoencoder is a neural network-based unsupervised machine learning algorithm that reconstructs output values the same as its inputs. An autoencoder has three layers: the input layer, the hidden layers, and the output layer. The hidden layer encodes the inputs to obtain a hidden representation, and the output layer decodes the hidden representation. The network is trained in such a way as to reconstruct its input at its output by learning abstract representations of the inputs. Since autoencoders reconstruct their input, the dimensionality of the input and output layers must be equal.

Autoencoders are a form of feed-forward neural network and have two components, namely the encoder and the decoder. The encoder part encodes the input x_i to a hidden pattern h as given in equation (16). The decoder part decodes the input from the hidden pattern as in equation (17). The loss function used by autoencoders is the squared error loss function defined in equation (18). The weights, bias, and reconstructed input are denoted by W, b, and \hat{x}_i, respectively. The activations or transfer functions are indicated using g and f. The backpropagation algorithm is employed to train the networks. The main applications of autoencoders include dimensionality reduction and denoising of data (Goodfellow et al., 2016).

h = g(W x_i + b)    (16)

\hat{x}_i = f(W h + c)    (17)

J(\Theta) = \min \frac{1}{m} \sum_{i=1}^{n} \sum_{j=1}^{n} \left( \hat{x}_{ij} - x_{ij} \right)^2    (18)

Autoencoders can be of two types: undercomplete and overcomplete autoencoders. Undercomplete autoencoders have fewer neurons in the hidden layer than in the input layer and are used for feature extraction and dimensionality reduction tasks, whereas overcomplete autoencoders have a higher number of neurons in the hidden layer than in the input layer. The architecture of a two-layer stacked undercomplete autoencoder is shown in Fig. 5. The figure shows two parts: the encoder part encodes inputs to a hidden pattern, and the decoder part reconstructs the input from the hidden pattern. Autoencoders can represent both linear and nonlinear transformations.

Fig. 5. The architecture of a stacked autoencoder.

Denoising Autoencoders, Stacked Autoencoders, Stacked Denoising Autoencoders, and Variational Autoencoders are some of the forms of autoencoders. The main goal of autoencoders is to set the output the same as the input. However, sometimes the reconstruction never ensures perfect results. This may be because of information loss during extraction. Denoising autoencoders reproduce pure input from a corrupted version of the input. Stacked autoencoders are produced by stacking multiple autoencoders in which the outputs of each layer are linked to the inputs of the succeeding layer. The training of stacked autoencoders is performed using a greedy layer-wise strategy. Stacked Denoising Autoencoders are a collection of denoising autoencoders stacked together. After pre-training all the layers, fine-tuning of the entire network is performed. Unlike ordinary autoencoders, variational autoencoders are generative models that take random numbers as inputs and generate images as output.
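A minimal undercomplete autoencoder can be expressed in a few lines of Keras. The sketch below follows the encoder/decoder structure of equations (16)-(17) with the squared error loss of equation (18), but the layer sizes, data, and training settings are assumptions for illustration only.

```python
# Hedged undercomplete autoencoder sketch in Keras (assumed sizes and data).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features = 16                      # e.g. several weather variables over a few hours
inputs = keras.Input(shape=(n_features,))
encoded = layers.Dense(8, activation="relu")(inputs)   # encoder: h = g(Wx + b)
encoded = layers.Dense(4, activation="relu")(encoded)  # bottleneck representation
decoded = layers.Dense(8, activation="relu")(encoded)
decoded = layers.Dense(n_features, activation="linear")(decoded)  # reconstruction

autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="mse")      # squared error, equation (18)

X = np.random.default_rng(3).normal(size=(1000, n_features)).astype("float32")
autoencoder.fit(X, X, epochs=5, batch_size=32, verbose=0)   # reconstruct the input
encoder = keras.Model(inputs, encoded)                      # reusable feature extractor
print(encoder.predict(X[:2], verbose=0).shape)              # (2, 4) compressed features
```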
5.1.2.2.2. Recurrent neural networks. Deep learning architectures like RNN and LSTM are well-established models for time series prediction. Recurrent Neural Networks are not appropriate for learning the long term dependencies present in time series data and can be used to learn only short term dependencies precisely. This is because Recurrent Neural Networks suffer from vanishing and exploding gradients (Nielsen, 2015). This unstable gradient problem may occur when the gradient of the transfer function becomes too small or too large. It makes RNNs hard to train and also prevents them from capturing long term dependencies. These shortcomings can be rectified using LSTM networks, a variant of Recurrent Neural Networks. LSTM based architectures are capable of capturing long term dependencies more precisely.

LSTM networks were introduced by Hochreiter and Schmidhuber in 1997 (Hochreiter and Schmidhuber, 1997). Besides the input and output layers, LSTM networks comprise one or more hidden layers. Each LSTM cell in the hidden layer consists of three gates: the input gate, the output gate, and the forget gate. The layout of an LSTM cell is depicted in Fig. 6. The forget gate attempts to increase the efficiency of the network by eliminating less important information from the cell state. The input gate transfers new information to the cell state, while the output gate passes valuable information from the memory cell to the output. The mathematical representation of an LSTM network is given in equations (19) to (24).

Fig. 6. Layout of LSTM cell.


o_t = \sigma(W_o h_{t-1} + U_o x_t + b_o)    (19)

i_t = \sigma(W_i h_{t-1} + U_i x_t + b_i)    (20)

f_t = \sigma(W_f h_{t-1} + U_f x_t + b_f)    (21)

\tilde{s}_t = \tanh(W h_{t-1} + U x_t + b)    (22)

s_t = f_t \circ s_{t-1} + i_t \circ \tilde{s}_t    (23)

h_t = o_t \circ \tanh(s_t)    (24)

where i_t, o_t, and f_t represent the input gate, output gate, and forget gate, respectively. s_t, x_t, and h_t specify the state of the network, the input, and the output, respectively, at time step t. \tilde{s}_t signifies a temporary state. U, V, and W denote the weights, and b is the bias. The operator \circ symbolizes element-wise multiplication.

Fig. 7. The layout of an LSTM network.

Fig. 7 illustrates the structure of a single hidden layer LSTM network. Stacked or Deep LSTM consists of multiple layers, where the output of the previous layer is the input to the following layer. The notion of Stacked LSTM was first suggested by Graves et al. (2013). Stacked LSTM cells can store more information through hidden layers. Each LSTM layer processes a portion of the job and transfers the output to the next layer. Lastly, the results are taken from the output layer. The stacking of hidden layers deepens the recurrent model and can help learn features more accurately. Stacked LSTMs are a powerful technique for sequence prediction tasks. GRU is another form of RNN which can also be used for solving time series prediction problems.
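A stacked LSTM of the kind described above can be assembled in Keras as follows; the two-layer depth, window length, and training settings are assumptions, and the synthetic series only stands in for real weather data.

```python
# Hedged sketch of a two-layer (stacked) LSTM forecaster.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

lags, n_features = 24, 1
model = keras.Sequential([
    keras.Input(shape=(lags, n_features)),
    layers.LSTM(64, return_sequences=True),  # first LSTM layer feeds the next one
    layers.LSTM(32),                         # second LSTM layer returns the final state
    layers.Dense(1),                         # one-step-ahead regression output
])
model.compile(optimizer="adam", loss="mse")

# Toy data: sliding windows over a synthetic hourly series.
series = np.sin(np.linspace(0, 60, 800)).astype("float32")
X = np.stack([series[t - lags:t] for t in range(lags, len(series))])[..., None]
y = series[lags:]
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
print(model.predict(X[-1:], verbose=0))      # forecast for the next time step
```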
In this subsection, the various weather prediction models developed using deep learning architectures are discussed. Salman et al. (2015) proposed deep learning-based weather forecasting models using rich hierarchical weather representations, and the models were tested using large volumes of weather data. In that paper, mainly three deep learning models for weather forecasting, namely RNN, CRBM, and CNN, are described. Each of these models is trained and tested using a predetermined weather dataset. A parameter learning algorithm for each model, such as Gradient Descent for the CRBM, is implemented to bring the testing error below a predetermined threshold value. Model performance is evaluated using a k-fold cross-validation technique in which the weather dataset is divided randomly into k folds. Kaba et al. (2018) developed a deep neural network-based daily GSR prediction model for various stations in Turkey. Flores-Vergara et al. (2019) proposed a deep learning architecture for predicting ozone pollution at Santiago in South America.

A new deep network layer model called the "Dynamic Convolutional Layer" for short-range weather prediction is described by Klein et al. (2015). Three different datasets validate the effectiveness of the model. The dynamic convolution layer has two inputs, namely feature maps and filters. The whole system is trained using the back-propagation algorithm. The open-source Caffe library is used to implement the Dynamic Convolutional Layer. All the weights in the deep neural network are initialized according to the Xavier initialization. Li et al. (2016) proposed a novel spatio-temporal deep learning-based air quality prediction method using a stacked autoencoder to extract inherent air quality features. The authors then compared the performance of this model with artificial neural network, autoregressive moving average, and support vector regression models. The empirical outcomes illustrate that the proposed method for performing air quality predictions has a superior performance. The efficiency of the suggested model is evaluated using the error indicators RMSE, MAE, and MAPE. A deep learning-based architecture for the prediction of the accumulated daily precipitation for the next day is suggested by Hernandez et al. (2016). The model utilized an autoencoder for reducing and capturing non-linear relationships between attributes and an MLP for the prediction task.

Ghoneim and Manjunatha (2017) presented a new deep learning-based ozone level prediction model. Different deep architectures were experimented with, and an architecture with ten hidden layers, 120 hidden neurons in each hidden layer, and 500 epochs was found to be the best configuration. The proposed model predicted the ozone level more accurately. The error values MSE, MAE, and RMSE of the proposed model are determined to be minimal compared to conventional machine learning models like SVM, ANN, and GLM. Gensler et al. (2016) presented a deep learning-based architecture for solar power forecasting. The proposed architecture utilized a stacked autoencoder and LSTM for the forecasting of solar power. Hossain et al. (2015) proposed a deep neural network model with Stacked Denoising Autoencoders for predicting air temperature. Experimental results revealed that the stacked denoising autoencoders performed better than the neural network in temperature prediction as they produced high accuracy.

A novel approach for precipitation forecasting based on deep belief nets, called the Deep Belief Network for Precipitation Forecast, is proposed by Zhang et al. (2017), and the characteristics hidden in hydrological time series data are easily extracted. The proposed deep learning-based model is compared with other traditional machine learning approaches. The experimental results show that the DBNPF model is more efficient and effective compared to benchmark models such as the RBF neural network, SVM, ARIMA, ELM, and the sparse autoencoder. A wind direction prediction system is proposed by Hou et al. (2019) using Z-score normalization and LSTM networks. Liang et al. (2018) recommended a multivariate wind speed forecasting system using a stacked LSTM network for a wind station in West Texas, USA. Zaytar and Amrani (2016) proposed a weather forecasting model using the Stacked LSTM network, and the performance of the model is estimated using weather data collected from nine cities. The studies verify that deep learning architectures like RNN and LSTM have strong self-learning capability and are better models for time series prediction. Table 3 provides the parameters and the metrics used by the deep learning predictors.

5.1.3. Hybrid models

Hybrid models are devised by combining two or more models, where linear or nonlinear models can be combined. These are the most recent models and yield better performance. Analysis of studies confirms that hybrid models deliver better forecasting accuracy than individual models. Khashei and Bijari (2011) proposed a hybrid system combining the linear ARIMA model and the nonlinear ANN model and found it more effective as it produced higher forecasting accuracy. Khodayar and Teshnehlab (2015) proposed a short-term wind speed forecasting model using a stacked autoencoder for Colorado, USA. A rough regression layer is trained on top of the SAE to forecast wind speed efficiently. The rough regression layer is employed to handle the uncertainties present in wind speed time series. Su et al. (2018) developed an ultra-short-term wind power forecasting model based on data decomposition techniques and the Elman neural network.
temporal deep learning-based air quality prediction method using Yu et al. (2017) proposed a prediction system to forecast wind
stacked autoencoder to extract inherent air quality features. The speed using a wavelet transform and Elman neural network, where
authors then compared the performance of this model with artifi- the wavelet transform is employed to the input time series to
cial neural networks, autoregression moving average, and support denoise the signal. Saba et al. (2017) recommended a hybrid model
vector regression models. The empirical outcomes illustrate that using MLP and RBF to enhance the efficiency of weather prediction.
the proposed method for performing air quality predictions has a The accuracy is assessed using the Correlation coefficient, RMSE,
superior performance. The efficiency of the suggested model is and scatter index. The proposed hybrid model has better forecast
evaluated using the error indicators RMSE, MAE, and MAPE. A deep accuracy compared to single MLP and RBF networks. Liu et al.
learning-based architecture for the prediction of the accumulated (2013) proposed a wind speed forecasting system by combining
daily precipitation for the next day is suggested by Hernandez Wavelet, Wavelet Packet, Time series, and ANN. The proposed
et al. (2016). The model utilized an autoencoder for reducing and model is compared with neuro-fuzzy, ANFIS, wavelet packet-RBF,
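As a concrete illustration of the decomposition-plus-network pattern behind several of these hybrids, the sketch below denoises a wind speed series with a wavelet transform (PyWavelets) and then trains a small neural network on lagged values of the denoised signal. The wavelet family, threshold, lag window, and synthetic series are arbitrary choices made for illustration, not the settings of any surveyed model.

import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
t = np.arange(2000)
wind = 8 + 2 * np.sin(2 * np.pi * t / 144) + rng.normal(scale=0.8, size=t.size)

# Step 1: wavelet decomposition and soft-thresholding of the detail coefficients.
coeffs = pywt.wavedec(wind, "db4", level=3)
threshold = 0.5 * np.std(coeffs[-1])
coeffs[1:] = [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: wind.size]

# Step 2: build lagged inputs from the denoised series and fit a neural network.
LAGS = 12
X = np.array([denoised[i : i + LAGS] for i in range(len(denoised) - LAGS)])
y = wind[LAGS:]

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
model.fit(X[:-200], y[:-200])                 # hold out the last 200 steps
rmse = np.sqrt(np.mean((model.predict(X[-200:]) - y[-200:]) ** 2))
print("hold-out RMSE:", round(float(rmse), 3))

The same skeleton applies when the denoising step is replaced by EMD, WPD, or another decomposition, and the MLP by an Elman network, ELM, or LSTM.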

Table 3
Summary of the parameters used by Deep learning models. Parameter columns: Temperature, Wind speed, Wind direction, Air pressure, Humidity, Precipitation, Rainfall, Dew point, Ozone. Metric columns: R2, MSE, RMSE, MAE, MAPE.

Algorithm       Reference                         Parameters and metrics used
RNN             (Salman et al., 2015)             ✓ ✓ ✓ ✓
RNN             (Flores-Vergara et al., 2019)     ✓ ✓
DNN             (Hernández et al., 2016)          ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
DNN             (Ghoneim and Manjunatha, 2017)    ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
SDAE            (Hossain et al., 2015)            ✓ ✓ ✓ ✓ ✓ ✓
DBNPF           (Zhang et al., 2017)              ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
LSTM            (Hou et al., 2019)                ✓ ✓ ✓
Stacked LSTM    (Liang et al., 2018)              ✓ ✓ ✓ ✓ ✓
Stacked LSTM    (Zaytar and El Amrani, 2016)      ✓

Table 4
Summary of the parameters used by Hybrid models. Parameter columns: Temperature, Wind speed, Wind direction, Air pressure, Humidity, Precipitation, Rainfall, Dew point, Ozone. Metric columns: R2, MSE, RMSE, MAE, MAPE.

Algorithm             Reference                          Parameters and metrics used
SAE-RoughR            (Khodayar and Teshnehlab, 2015)    ✓ ✓
EMD-ENN               (Su et al., 2018)                  ✓ ✓ ✓ ✓
Wavelet-ENN           (Yu et al., 2017)                  ✓ ✓ ✓ ✓
MLP-RBF               (Saba et al., 2017)                ✓ ✓ ✓ ✓ ✓ ✓ ✓
Wavelet Packet-ANN    (Liu et al., 2013)                 ✓ ✓ ✓ ✓
SAE-DNN-SVR           (Liu et al., 2015)                 ✓ ✓ ✓ ✓ ✓
SAE-DBN               (Sergio and Ludermir, 2015)        ✓ ✓ ✓ ✓
SAE-LSTM              (Wu et al., 2019)                  ✓ ✓ ✓ ✓
Wavelet-ELM           (Mi et al., 2017)                  ✓ ✓ ✓ ✓
HDT-BPNN              (Qu et al., 2019)                  ✓ ✓ ✓ ✓
ACO-ELM               (Carrillo et al., 2017)            ✓ ✓
Wavelet-SVR           (Dhiman et al., 2019)              ✓ ✓
HMD-OSORELM           (Zhang et al., 2019)               ✓ ✓ ✓ ✓
SSA-EMD-SVM           (Mi et al., 2019)                  ✓ ✓ ✓ ✓
GREY-ELM              (Qolipour et al., 2019)            ✓ ✓ ✓ ✓
ELM-ENN-LSTM          (Chen et al., 2019)                ✓ ✓ ✓
MMODA                 (Liu et al., 2020)                 ✓ ✓ ✓ ✓

Yu et al. (2017) proposed a prediction system to forecast wind speed using a wavelet transform and an Elman neural network, where the wavelet transform is applied to the input time series to denoise the signal. Saba et al. (2017) recommended a hybrid model using MLP and RBF networks to enhance the efficiency of weather prediction. The accuracy is assessed using the correlation coefficient, RMSE, and scatter index, and the proposed hybrid model has better forecast accuracy compared to single MLP and RBF networks. Liu et al. (2013) proposed a wind speed forecasting system by combining wavelet, wavelet packet, time series, and ANN techniques. The proposed model is compared with neuro-fuzzy, ANFIS, wavelet packet-RBF, and persistent models, and the results demonstrated that the wavelet packet ANN model outperformed all other models. Liu et al. (2015) developed a model using deep neural networks to predict the weather, especially temperature and wind speed, accurately for the next 24 h using big weather datasets. The weather parameter is predicted using a stacked autoencoder DNN architecture with SVR trained on top of it. Sergio and Ludermir (2015) suggested a deep learning-based model to forecast mean wind speed in the North-eastern region of Brazil using SAE and DBN. Wu et al. (2019) developed a DNN-based model for wind speed forecasting using a stacked autoencoder and LSTM, where the stacked autoencoder is selected to extract in-depth features from the weather datasets to further improve forecasting accuracy. NMSE and NMAE are used to evaluate the prediction performance of the proposed model. Mi et al. (2017) defined a new hybrid method for wind speed forecasting using WT, WPD, ELM, and an outlier correction algorithm, where the wavelet transform is applied to denoise the input signal and the WPD algorithm is then employed to partition the input wind speed series.

Qu et al. (2019) built a hybrid system to predict the wind speed based on HDT and a backpropagation neural network optimized using the flower pollination algorithm. Carrillo et al. (2017) proposed a hybrid predictive model based on ACO and ELM to forecast wind power; the datasets collected from two wind farms in Spain are utilized to estimate the efficiency of the model. A novel hybrid weather forecasting model using a multi-task agent combined with supervised and unsupervised learning is proposed by Arunachalam et al. (2015).
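The autoencoder-plus-regressor idea behind several of these hybrids, such as a stacked autoencoder with SVR trained on top, can be sketched as follows. The sketch compresses the raw weather features with a small Keras autoencoder and fits an SVR on the encoded representation; the layer sizes, the bottleneck width, and the synthetic data are illustrative assumptions, not the configuration of any surveyed system.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 8)).astype("float32")               # 8 raw weather features
y = X[:, 0] * 1.5 - X[:, 3] + rng.normal(scale=0.1, size=1000)

# Autoencoder 8 -> 3 -> 8; the 3-unit bottleneck is the learned representation.
inputs = keras.Input(shape=(8,))
code = layers.Dense(3, activation="relu", name="bottleneck")(inputs)
outputs = layers.Dense(8)(code)
autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(X, X, epochs=20, batch_size=32, verbose=0)

# Encode the features and train an SVR regressor on the compressed inputs.
encoder = keras.Model(inputs, code)
Z = encoder.predict(X, verbose=0)
svr = SVR(kernel="rbf", C=10.0).fit(Z[:800], y[:800])
print("hold-out score:", round(svr.score(Z[800:], y[800:]), 3))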


Table 5
Classification of models based on the parameter to be predicted.

Sl No.    Parameter predicted    References
1 Temperature (Rasp and Lerch, 2018; Nayak et al., 2012; Liu et al., 2018, 2015; Abhishek et al., 2012; Khajure and Mohod, 2016; Suksri and Kimpan, 2016;
Narvekar and Fargose, 2015; Yu et al., 2016; Rasel et al., 2017; Singh et al., 2019; Ahmed, 2015; Mohammadi et al., 2015; Ismail et al., 2016;
Jayanthi and Sumathi, 2017; Suryanarayana et al., 2019; Hossain et al., 2015)
2 Wind speed (Tarade and Katti, 2011; Kulkarni et al., 2008; Kavasseri and Seetharaman, 2009; Erdem and Shi, 2011; Liu et al., 2011, 2012, 2013, 2015, 2020;
Chen et al., 2009, 2019; Cadenas et al., 2016; Filik and Filik, 2017; Hou et al., 2019; Liang et al., 2018; Khodayar and Teshnehlab, 2015; Su et al.,
2018; Yu et al., 2017; Sergio and Ludermir, 2015; Wu et al., 2019; Mi et al., 2017, 2019; Qu et al., 2019; Carrillo et al., 2017; Dhiman et al.,
2019; Zhang et al., 2019; Qolipour et al., 2019)
3 Rainfall (Reddy and Babu, 2017; Tran Anh et al., 2019; Zaw and Naing, 2009; Suhartono et al., 2012; Hung et al., 2009; Kashiwao et al., 2017; Mislan
et al., 2015; Narvekar and Fargose, 2015; Rasel et al., 2017; Lu and Wang, 2011; Shabariram et al., 2016; Ahmed, 2015; Navadia et al., 2017;
Hernández et al., 2016; Zhang et al., 2017; Saba et al., 2017)
4 Precipitation (Du et al., 2017; Hernández et al., 2016; Zhang et al., 2017)
5 Pressure (Khajure and Mohod, 2016; Suryanarayana et al., 2019)
6 Dew point (Khajure and Mohod, 2016; Hasan et al., 2016; Mohammadi et al., 2015; Saba et al., 2017)
7 Ozone level (Flores-Vergara et al., 2019; Ghoneim and Manjunatha, 2017)

Dhiman et al. (2019) suggested a wind speed prediction system by combining the wavelet transform and SVR. Zhang et al. (2019) proposed a wind speed prediction model based on the Hybrid Mode Decomposition (HMD) method and OSORELM, where HMD is employed to decompose the input wind series. HMD is designed by combining Variational Mode Decomposition, Sample Entropy, and Wavelet Packet Decomposition, and a crisscross optimization algorithm is applied to optimize the parameters of the OSORELM model. The simulation results demonstrate that the performance of the proposed hybrid model is substantially improved. Mi et al. (2019) described an improved hybrid wind speed forecasting model based on SSA, EMD, and a convolutional SVM. Qolipour et al. (2019) recommended a hybrid wind speed forecasting system using a Grey-ELM model, and the efficiency of the model is assessed using four different datasets. By combining the features of ELM, Elman neural network, and LSTM networks, Chen et al. (2019) suggested a hybrid short-term wind speed forecasting model. Liu et al. (2020) recommended a wind speed forecasting model by combining a data pre-treatment strategy, a modified multi-objective optimization (MMO) algorithm, and various forecasting models. The pre-treatment strategy is applied to capture the significant features of the wind speed data, and the modified MMO algorithm is employed to compute the optimal weights of the combined model. The performance of the model is compared with ARIMA, BPNN, ELM, Elman neural network, and General Regression Neural Network models.

The summary of the parameters and the metrics used by various hybrid weather forecasting models is listed in Table 4.
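The weight-optimization step used by combined models of this kind can be illustrated with a small sketch: given the predictions of several base forecasters on a validation set, it finds non-negative weights summing to one that minimize RMSE. A single-objective SciPy optimizer stands in here for the modified multi-objective algorithm described above, and the three base models and their error levels are hypothetical.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
y_val = rng.normal(loc=7.0, scale=2.0, size=300)   # validation wind speeds
# Predictions of three hypothetical base models (e.g. ARIMA, ELM, LSTM).
preds = np.stack([y_val + rng.normal(scale=s, size=300) for s in (0.6, 0.9, 0.4)])

def rmse_of_weights(w):
    combined = w @ preds
    return np.sqrt(np.mean((combined - y_val) ** 2))

n = preds.shape[0]
result = minimize(
    rmse_of_weights,
    x0=np.full(n, 1.0 / n),
    bounds=[(0.0, 1.0)] * n,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
print("optimal combination weights:", np.round(result.x, 3))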
5.2. Classification based on parameter to be predicted

Forecasting models can be categorized as temperature prediction models, wind speed prediction models, rainfall prediction models, dew point prediction models, and so on, depending upon the parameter to be predicted. Each prediction model forecasts a particular parameter based on previous observations. Temperature forecasting models aim at predicting the minimum, maximum, or average temperature of a location based on various weather parameters. The maximum temperature forecast depends on sunshine during the daytime, whereas the minimum temperature forecast relies on cloud conditions at night.

Temperature prediction plays a vital role in agriculture. Prior information about the weather helps farmers take the necessary decisions to enrich their crop cultivation. Extreme temperatures can cause severe damage to plants and animals and are always a great concern for farmers. The temperature should be optimal for the growth of plants, and low temperatures may cause injury to crops. So, prediction based on temperature is essential in agriculture.

Likewise, wind speed prediction models are intended for forecasting wind speed. Wind speed prediction is an important activity, as wind has applications in various domains such as power generation, agriculture, industry, naval systems, and the marine world. Wind is the movement of air from high pressure to low pressure, and wind speed is determined by the difference between them; as the difference increases, the wind speed also increases.

Quantitative rainfall is the amount of rainfall received over a period of time in a given area. Rain forecasting is essential for the agricultural industry and for flood monitoring systems. Precipitation is frozen water or liquid that forms in the atmosphere and drops to the Earth. This weather parameter helps farmers make decisions on activities like irrigation, spraying, and harvesting.

Atmospheric air pressure is the weight exerted by the air on the surface of the Earth, and it varies with altitude: atmospheric pressure decreases with increasing altitude. The dew point specifies how much water is present in the air, and it takes a high value when there is more water vapor in the air. The dew point is essential for the growth of desert plants and also for plants growing in areas with moderate rainfall. The quantity of moisture contained in the air of the lower atmosphere is termed humidity, which is essential for leaf growth, photosynthesis, pollination, and economic yield. Ozone is one of the hazardous pollutants in the lower atmosphere. The concentration of ozone in the atmosphere discloses its impact on plants, humans, and other organisms, and predicting and controlling ozone concentrations can reduce the effects of tropospheric ozone on human health and the environment.

Table 5 provides details of some of the studies conducted by various researchers for forecasting temperature, wind speed, dew point, rainfall, precipitation, and pressure.

5.3. Discussion

From the analysis of various weather prediction models, it has been observed that hybrid models have proved to be more reliable models for weather forecasting. Most of the wind speed forecasting models discussed in the literature are univariate models, as the forecasting involves only a single variable. Fig. 8 presents the details of the various prediction models reviewed in this study. Fig. 9 illustrates the summary of the quantitative indicators used by the various models presented in the literature. Fig. 10 depicts the percentage of quantitative indicators used by the various models. It can be comprehended from the figures that most of the prediction models are wind speed prediction models, which are followed by temperature prediction models.

Moreover, about 30% of the forecasting models employed RMSE as the quantitative indicator for assessing the performance of the forecasting models, as illustrated in Fig. 10. RMSE is established to be a commonly used metric for validating the efficacy of the models. RMSE is easy to differentiate, which makes it convenient to use in derivative-based methods. Since the errors are squared before they are averaged, RMSE places a relatively high weight on large errors. Moreover, RMSE does not rely on absolute values, and hence gradients and distances can be measured more effectively. This might be the reason for preferring squared-error metrics such as RMSE in prediction models.

Fig. 8. Details of various forecasting models based on parameters.

Fig. 9. Details of quantitative indicators used by various forecasting models.

Fig. 10. Percentage of quantitative indicators used.

6. Results and analysis

6.1. Open weather datasets

Some of the publicly available datasets for experimental analysis are detailed in this section. Some researchers publish their data online along with journals as supplementary files or in public archives such as GitHub so that others can use these datasets for their research. This type of online dataset promotes reproducible research.

1) US Weather Events is an open weather dataset that contains information about five million events. The weather parameters include rain, storm, snow, and freezing conditions for 49 states of the US from 2016 to 2019 (Moosavi et al., 2019).
2) Historical Hourly Weather Data is also an open dataset that contains hourly data of temperature, humidity, and air pressure for five years from 2012 to 2017. The dataset includes data from 30 US and Canadian cities, as well as 6 Israeli cities, and it can be downloaded from the OpenWeatherMap website using the Weather API under the ODbL license.
3) Rain in Australia is a publicly available rain dataset that contains weather observations from weather stations of Australia, which can be used for rainfall prediction. The dataset can be downloaded from http://www.bom.gov.au/climate/data.


4) Daily global surface weather observations from more than 9000 weather stations from 1929 to the present are available from NOAA and include GSOD data acquired from the ISH dataset. The USAF Climatology Center provides the ISH dataset. The weather parameters include temperature, wind speed, dew point, visibility, and pressure.
5) Historical Daily Weather Data includes weather data from 163 countries from 1st January 2020 to 21st April 2020. The countries are chosen based on the Johns Hopkins COVID-19 dataset. The dataset provides daily information on temperature, pressure, humidity, ozone levels, visibility, and precipitation, and the data can be extracted using the Dark Sky API. The goal of this dataset is to determine the impact of weather elements on COVID-19 transmission and to propose forecast models based on these factors.
6) The Weather Istanbul dataset contains parameters such as rainfall, average wind speed, average humidity, and average pressure. The dataset can be downloaded from the website https://www.worldweatheronline.com.
7) Weather Underground provides weather information such as temperature, humidity, wind speed, pressure, and visibility from major cities across the world. The dataset can be downloaded from Weather Underground, available at www.wunderground.com.
8) In-situ weather data provide atmospheric observations from aircraft and surface stations. The collected data may not be of good quality, and hence some quality control and preprocessing steps are to be carried out before processing the data.
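A typical preparation pipeline for such datasets can be sketched with pandas as below. The file name and column names are illustrative, not taken from any specific provider; the sketch assumes a CSV export in the style of the hourly datasets above, with a timestamp column and one column per weather parameter.

import pandas as pd

# Load hourly observations and index them by time.
df = pd.read_csv("hourly_weather.csv", parse_dates=["datetime"])
df = df.set_index("datetime").sort_index()

# Basic quality control: drop duplicate timestamps and interpolate short gaps.
df = df[~df.index.duplicated(keep="first")]
df = df.interpolate(limit=3)

# Resample the hourly observations to daily statistics for model training.
daily = df.resample("D").agg(
    {"temperature": "mean", "humidity": "mean", "pressure": "mean"}
)
print(daily.head())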
6.2. Evaluation criteria

Evaluation plays a vital role in any forecasting model. The accuracy of the forecasting models can be adequately assessed using the metrics MAE (Mean Absolute Error), MSE (Mean Squared Error), RMSE (Root Mean Squared Error), MAPE (Mean Absolute Percentage Error), and R2 (Coefficient of Determination). Various models employ different metrics for their evaluation. MAE is defined as the mean absolute difference between the predicted value (ŷ) and the actual value (y). RMSE is defined as the square root of the average of the squared differences between the predicted and the actual values. MAPE is the relative percentage error between actual and predicted values. The lower the error values, the higher the prediction accuracy of the corresponding model. R2, the Coefficient of Determination, is a statistical indicator that evaluates the goodness of fit of the model and varies from 0 to 100. Higher R2 values represent smaller differences between the actual and the predicted values; R2 = 100 means the prediction model performs exceptionally well on the testing dataset. These error indicators are defined in Eqs. (25) to (29). Among the quantitative indicators employed for assessing the performance of the models, MAE, MSE, RMSE, and MAPE are negative indicators, which means that lower values indicate better prediction accuracy.

MAE = \frac{1}{n}\sum_{j=1}^{n}\left| y_j - \hat{y}_j \right|    (25)

MSE = \frac{1}{n}\sum_{j=1}^{n}\left( y_j - \hat{y}_j \right)^2    (26)

RMSE = \sqrt{\frac{1}{n}\sum_{j=1}^{n}\left( y_j - \hat{y}_j \right)^2}    (27)

MAPE = \frac{1}{n}\sum_{j=1}^{n}\left| \frac{y_j - \hat{y}_j}{y_j} \right|    (28)

R^2 = 1 - \frac{\sum_{j=1}^{n}\left( y_j - \hat{y}_j \right)^2}{\sum_{j=1}^{n}\left( y_j - \bar{y} \right)^2}    (29)
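The five indicators above map directly onto a few lines of array code. The following is a minimal sketch using NumPy; the function and variable names are ours, and R2 is returned on the usual 0 to 1 scale (multiply by 100 to match the percentage convention used in this survey).

import numpy as np

def forecast_metrics(y_true, y_pred):
    """Compute MAE, MSE, RMSE, MAPE and R^2 as defined in Eqs. (25)-(29)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    mape = np.mean(np.abs(err / y_true))          # assumes no zero actual values
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "MAPE": mape, "R2": r2}

# Example: daily temperature forecasts versus observations.
obs = [21.4, 22.0, 19.8, 20.5]
fcst = [21.0, 22.6, 20.1, 20.0]
print(forecast_metrics(obs, fcst))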

Table 6
Error values of Temperature prediction models.

References MSE RMSE MAE MAPE R2


(Nayak et al., 2012) 4.4130
(Liu et al., 2018) 1.0600 1.0296 99.6
(Abhishek et al., 2012) 2.7100 1.6460
(Suksri and Kimpan, 2016) 0.0212 0.1476
(Yu et al., 2016) 0.0281 0.1676 0.1062 0.0279
(Rasel et al., 2017) 10.956 3.3100 2.2900
(Ahmed, 2015) 93.36
(Mohammadi et al., 2015) 0.4501 0.6709 0.5203 98.77
(Hossain et al., 2015) 1.9044 1.3800
(Liu et al., 2015) 91.50

Table 7
Error values of Wind speed prediction models.

References MSE RMSE MAE MAPE R2


(Tarade and Katti, 2011) 1.4328 1.1969
(Kulkarni et al., 2008) 3.21126 1.79200
(Kavasseri and Seetharaman, 2009) 28.6225 5.35
(Liu et al., 2012) 0.2322 0.48187 0.1745 0.87
(Cadenas et al., 2016) 1.51 1.22882 0.91
(Filik and Filik, 2017) 0.42354 0.65080 0.5046
(Liang et al., 2018) 0.00297 0.05448 0.0428 99.917
(Khodayar and Teshnehlab, 2015) 0.00281 0.05300
(Yu et al., 2017) 0.22090 0.47000 0.3593 8.7200
(Liu et al., 2013) 0.54330 0.73709 0.4396 3.5500
(Liu et al., 2015) 87.100
(Sergio and Ludermir, 2015) 0.59920 0.77408 0.5962 0.1120
(Mi et al., 2017) 0.08280 0.28770 0.2353 1.2500
(Qu et al., 2019) 0.00434 0.06590 0.0533 1.0047
(Dhiman et al., 2019) 2.01640 1.42000
(Zhang et al., 2019) 0.00025 0.01590 0.0125 0.1294
(Mi et al., 2019) 0.16000 0.40000 0.3100 5.3500
(Qolipour et al., 2019) 0.00099 0.03142 0.03784 99.376
(Chen et al., 2019) 0.4363 6.7007
(Liu et al., 2020) 0.04260 0.20640 0.1659 3.1539


Table 8
Error values of Rainfall prediction models.

References MSE RMSE MAE R2


(Suhartono et al., 2012) 144.5766 12.024
(Hung et al., 2009) 0.5041 0.710 94.00
(Mislan et al., 2015) 0.7010 0.8373
(Rasel et al., 2017) 413.308 20.330 0.95
(Hernández et al., 2016) 40.110 6.330
(Zhang et al., 2017) 5.09404 2.257 1.632
(Saba et al., 2017) 146.000 12.083 95.00

Table 9
Error values of Precipitation prediction models.

References MSE RMSE MAE

(Du et al., 2017) 0.231 0.480
(Hernández et al., 2016) 40.110 6.330
(Zhang et al., 2017) 0.667 0.817 0.494

6.3. Analysis of forecasting models

A huge amount of research has been performed in weather forecasting, especially in temperature and wind speed forecasting, with drastic improvements in the performance of the models. The best reported error values, namely MAE, MSE, RMSE, MAPE, and R2, of the various models are presented in the tables below. Readers are requested not to deduce a testimonial regarding the performance of weather forecasting models by merely looking at the tables and figures shown below, because these reported error values might be strongly impacted by the forecasting duration, the number of variables involved, and the methodology employed in different methods. Moreover, according to the variations in the data format and the size of the datasets used in the various models, the results obtained may also tend to vary. Hence, it cannot be concluded that some methods are more accurate than other methods and vice versa; the results are plotted in the figures only for analysis purposes.

Tables 6–9 list the best-reported error values of different weather forecasting models. Table 6 illustrates the error measures of temperature prediction models presented in the literature. The error measures of wind speed, rainfall, and precipitation forecasting models are described in Tables 7, 8, and 9, respectively.

Even though various researchers employed MSE, RMSE, MAE, MAPE, and R2 for assessing the effectiveness of their models, RMSE has been selected for comparing the models in this study. Fig. 11 shows the error measures of the various temperature forecasting models discussed in the literature in terms of RMSE.

Fig. 11. Comparison of Temperature Forecasting models based on RMSE.


Fig. 12. Comparison of Wind Speed Forecasting Models based on RMSE.

The models proposed by Suksri and Kimpan (2016) and Yu et al. (2016) reported lower RMSE values and hence exhibit better prediction accuracy in temperature forecasting. It can be understood that ANN- and SVM-based temperature prediction models exhibit better performance.
The prediction accuracy of wind speed forecasting models
based on RMSE is illustrated in Fig. 12. It can be perceived that
the models proposed by Liang et al. (2018), Khodayar and Tesh-
nehlab (2015), Qu et al. (2019), Zhang et al. (2019), and Qolipour
et al. (2019) reported minimum error values in wind speed fore-
casting compared to other models. It has been observed that hybrid
architectures yield better prediction accuracy compared to individ-
ual models. Hybrid models outperform other models, specifically
in wind speed forecasting.
The comparison of rainfall and precipitation forecasting models based on RMSE is provided in Figs. 13 and 14, respectively. It can be observed that both machine learning and deep learning-based models perform well in rainfall and precipitation predictions. Hung et al. (2009), Mislan et al. (2015), and Zhang et al. (2017) reported the best prediction accuracy in rainfall forecasting. The models developed by Du et al. (2017) and Zhang et al. (2017) reported minimum error values in precipitation forecasting.

Fig. 13. Comparison of Rainfall Forecasting Models based on RMSE.

The detailed survey concludes that Artificial Neural Networks and Support Vector Machines are the most commonly
used and reliable machine learning architectures for weather
prediction. Nonlinear problems that are difficult to solve using
traditional techniques can be solved using these architectures.
ANNs can be employed for weather forecasting because of the
ability to find relationships between variables within the data-
sets. Deep learning is valued as a promising technique for
weather forecasting. The studies confirm that deep learning
and hybrid models are reliable for accurate weather forecast-
ing, especially wind speed forecasting. Proper and accurate weather forecasting helps people make better decisions and improves safety across all aspects of human life.

7. Challenges and future directions

From the analysis of the works presented in the literature, the following challenges have been noticed. The efforts taken to overcome these challenges can be considered as potential future research directions.

Fig. 14. Comparison of Precipitation Forecasting Models based on RMSE.


(1) Data collection is an important step in any forecasting problem. The availability and accessibility of real-time, good quality weather datasets is still a challenging task.
(2) It can be noticed that most of the works presented in the literature utilized small datasets for experiments. So, more works on massive datasets using different techniques need to be addressed.
(3) A large number of works have been performed based on various machine learning, deep learning, and hybrid techniques for wind speed forecasting. However, works related to other weather parameters are comparatively fewer. Hence, future research should focus more on rainfall and dew point prediction models using novel techniques.
(4) It is observed that most of the studies presented in the literature make use of structured datasets for storing weather information. Efforts are to be made for developing forecasting models using atmospheric images, which can be considered a prospective future research concern.
(5) Most of the wind speed forecasting models presented in the literature are based on univariate LSTM networks, and correlated features are not included in the input features. Hence, more multivariate models need to be developed in the future by incorporating correlated features such as temperature and wind direction into the input features, and the effectiveness of such models should be explored as well.
(6) It can be observed that the performance indicators applied in the articles reviewed include an assessment of the prediction accuracy only. Measures to evaluate stability, which would ensure that a proposed model can be used to predict more types of data, need to be addressed and can be considered a future research concern.

8. Conclusion

With the advancement of Big Data technologies and deep learning techniques, weather forecasting and climate prediction can be done effectively and accurately. The proposed survey discusses the recent research works related to weather forecasting, along with a detailed analysis of the results. The classification of weather forecasting models is done mainly based on the methodology employed and the weather parameter to be predicted. The main limitation identified in the evaluation of the existing systems is the lack of stability assessment of the weather forecasting models; all the existing models only evaluate the prediction accuracy. The results claimed by the authors are analyzed for assessing the performance of different techniques. ANN and SVM are established to be more reliable machine learning techniques for weather forecasting, and recently, neural networks with deep architectures and hybrid models offer promising results in the field of weather prediction. The survey provides the state-of-the-art models for weather forecasting, their challenges, available open datasets, and future research directions. This detailed literature review will help researchers who intend to explore the field of weather forecasting as a reference guide.

Declaration of Competing Interest

The authors declared that there is no conflict of interest.

References

Abhishek, K., Singh, M.P., Ghosh, S., Anand, A., 2012. Weather forecasting model using artificial neural network. Procedia Technol. 4, 311–318.
Adam, K., Majid, M.A., Fakherldin, M.A.I., Zain, J.M., 2017. A big data prediction framework for weather forecast using MapReduce algorithm. Adv. Sci. Lett. 23 (11), 11138–11143.
Ahmed, B., 2015. Predictive capacity of meteorological data: Will it rain tomorrow? In: 2015 Science and Information Conference (SAI). IEEE, pp. 199–205.
Akbar, A., Kuanar, A., Patnaik, J., Mishra, A., Nayak, S., 2018. Application of artificial neural network modeling for optimization and prediction of essential oil yield in turmeric (Curcuma longa L.). Comput. Electron. Agric. 148, 160–178.
Arunachalam, N., Giles, G., Raghunath, R., Kaviyarasan, V., 2015. A hybrid approach model for weather forecasting using multi-task agent. In: 2015 2nd International Conference on Electronics and Communication Systems (ICECS). IEEE, pp. 1675–1678.
Awad, M., Khanna, R., 2015. Support vector regression. In: Efficient Learning Machines. Apress, Berkeley, CA, pp. 67–80. https://doi.org/10.1007/978-1-4302-5990-9_4.
Cadenas, E., Rivera, W., Campos-Amezcua, R., Heard, C., 2016. Wind speed prediction using a univariate ARIMA model and a multivariate NARX model. Energies 9 (2), 109.
Carrillo, M., Del Ser, J., Bilbao, M.N., Perfecto, C., Camacho, D., 2017. Wind power production forecasting using ant colony optimization and extreme learning machines. In: International Symposium on Intelligent and Distributed Computing. Springer, Cham, pp. 175–184.
Chavan, G., Momin, B., 2017. An integrated approach for weather forecasting over Internet of Things: A brief review. In: 2017 International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud) (I-SMAC). IEEE, pp. 83–88.
Chen, M.R., Zeng, G.Q., Lu, K.D., Weng, J., 2019. A two-layer nonlinear combination method for short-term wind speed prediction based on ELM, ENN and LSTM. IEEE Internet of Things Journal. https://doi.org/10.1109/jiot.2019.2913176.
Chen, P., Pedersen, T., Bak-Jensen, B., Chen, Z., 2009. ARIMA-based time series model of stochastic wind power generation. IEEE Trans. Power Syst. 25 (2), 667–676.
Deng, L., 2014. A tutorial survey of architectures, algorithms, and applications for deep learning. APSIPA Transactions on Signal and Information Processing 3.
Dhiman, H.S., Anand, P., Deb, D., 2019. Wavelet transform and variants of SVR with application in wind forecasting. In: Innovations in Infrastructure. Springer, Singapore, pp. 501–511. https://doi.org/10.1007/978-981-13-1966-2_45.
Du, J., Liu, Y., Yu, Y., Yan, W., 2017. A prediction of precipitation data based on support vector machine and particle swarm optimization (PSO-SVM) algorithms. Algorithms 10 (2), 57.
Erdem, E., Shi, J., 2011. ARMA based approaches for forecasting the tuple of wind speed and direction. Appl. Energy 88 (4), 1405–1414.
Filik, Ü.B., Filik, T., 2017. Wind speed prediction using artificial neural networks based on multiple local measurements in Eskisehir. Energy Procedia 107, 264–269.
Flores-Vergara, D., Ñanculef, R., Valle, C., Osses, M., Jacques, A., Domínguez, M., 2019. Forecasting ozone pollution using recurrent neural nets and multiple quantile regression. In: 2019 IEEE CHILEAN Conference on Electrical, Electronics Engineering, Information and Communication Technologies (CHILECON). IEEE, pp. 1–6.
Gensler, A., Henze, J., Sick, B., Raabe, N., 2016. Deep learning for solar power forecasting—An approach using AutoEncoder and LSTM neural networks. In: 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, pp. 002858–002865.
Gheisari, M., Wang, G., Bhuiyan, M.Z.A., 2017. A survey on deep learning in big data. In: 2017 IEEE International Conference on Computational Science and Engineering (CSE) and IEEE International Conference on Embedded and Ubiquitous Computing (EUC). IEEE, pp. 173–180.
Ghoneim, O.A., Manjunatha, B.R., 2017. Forecasting of ozone concentration in smart city using deep learning. In: 2017 International Conference on Advances in Computing, Communications and Informatics (ICACCI). IEEE, pp. 1320–1326.
Goodfellow, I., Bengio, Y., Courville, A., 2016. Deep Learning. The MIT Press, 800 pp. ISBN: 978-0-262-03561-3, http://www.deeplearningbook.org.
Graves, A., Mohamed, A.R., Hinton, G., 2013. Speech recognition with deep recurrent neural networks. In: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, pp. 6645–6649.
Hasan, N., Uddin, M.T., Chowdhury, N.K., 2016. Automated weather event analysis with machine learning. In: 2016 International Conference on Innovations in Science, Engineering and Technology (ICISET). IEEE, pp. 1–5.
Hernández, E., Sanchez-Anguix, V., Julian, V., Palanca, J., Duque, N., 2016. Rainfall prediction: A deep learning approach. In: International Conference on Hybrid Artificial Intelligence Systems. Springer, Cham, pp. 151–162.
Himanshi, J., Raksha, J., 2017. Big data in weather forecasting: Applications and challenges. In: International Conference on Big Data Analytics.
Hochreiter, S., Schmidhuber, J., 1997. Long short-term memory. Neural Comput. 9 (8), 1735–1780. https://doi.org/10.1162/neco.1997.9.8.1735.
Hong, T., Pinson, P., Wang, Y., Weron, R., Yang, D., Zareipour, H., 2020. Energy forecasting: a review and outlook (No. WORMS/20/08). Department of Operations Research and Business Intelligence, Wroclaw University of Science and Technology.
Hossain, M., Rekabdar, B., Louis, S.J., Dascalu, S., 2015. Forecasting the weather of Nevada: A deep learning approach. In: 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, pp. 1–6.


Hou, C., Han, H., Liu, Z. and Su, M. 2019. A wind direction forecasting method based Mi, X., Liu, H., Li, Y., 2019. Wind speed prediction model using singular spectrum
on z_score normalization and long short term memory. In 2019 IEEE 3rd analysis, empirical mode decomposition and convolutional support vector
International Conference on Green Energy and Applications (ICGEA) (pp. 172- machine. Energy Convers. Manage. 180, 196–205.
176). IEEE. https://doi.org/10.1109/icgea.2019.8880774. Mislan, M., Haviluddin, H., Hardwinarto, S., Sumaryono, S., Aipassa, M., 2015.
Hung, N.Q., Babel, M.S., Weesakul, S., Tripathi, N.K., 2009. An artificial neural Rainfall monthly prediction based on artificial neural network: A case study in
network model for rainfall forecasting in Bangkok, Thailand. Hydrology Earth tenggarong station, east kalimantan-indonesia. The International Conference on
System Sci. 13 (8). Computer Science and Computational Intelligence (ICCSCI 2015)-Procedia
Ismail, K.A., Majid, M.A., Zain, J.M., Bakar, N.A.A., 2016. Big data prediction Computer Science 59.
framework for weather temperature based on MapReduce algorithm. In: 2016 Mohammadi, K., Shamshirband, S., Motamedi, S., Petković, D., Hashim, R., Gocic, M.,
IEEE Conference on Open Systems (ICOS). IEEE, pp. 13–17. 2015. Extreme learning machine based prediction of daily dew point
Jayanthi, D. and Sumathi, G., 2017. Weather data analysis using Spark—An In- temperature. Comput. Electron. Agric. 117, 214–225.
memory computing framework. In 2017 Innovations in Power and Advanced Moosavi, Sobhan, Samavatian, Mohammad Hossein, Nandi, Arnab, Parthasarathy,
Computing Technologies (i-PACT) (pp. 1-5). IEEE.r. Srinivasan, Ramnath, Rajiv, 2019. Short and long-term pattern discovery over
Juneja, A. and Das, N.N., 2019. Big data quality framework: Pre-processing data in large-scale geo-spatiotemporal data. Proceedings of the 25th ACM SIGKDD
weather monitoring application. In 2019 International Conference on Machine International Conference on Knowledge Discovery & Data Mining. ACM.
Learning, Big Data, Cloud and Parallel Computing (COMITCon) (pp. 559-563). Nagaraja, Y., Devaraju, T., Kumar, M.V., Madichetty, S., 2016. A survey on wind
IEEE. energy, load and price forecasting: (Forecasting methods). In: 2016
Kaba, K., Sarıgül, M., Avcı, M., Kandırmaz, H.M., 2018. Estimation of daily global International Conference on Electrical, Electronics, and Optimization
solar radiation using deep learning model. Energy 162, 126–135. Techniques (ICEEOT). IEEE, pp. 783–788.
Kashiwao, T., Nakayama, K., Ando, S., Ikeda, K., Lee, M., Bahadori, A., 2017. A neural Narvekar, M., Fargose, P., 2015. Daily weather forecasting using artificial neural
network-based local rainfall prediction system using meteorological data on the network. Int. J. Computer Appl. 121 (22).
internet: a case study using data from the Japan meteorological agency. Appl. Navadia, S., Yadav, P., Thomas, J. and Shaikh, S., 2017. Weather prediction: a novel
Soft Comput. 56, 317–330. approach for measuring and analyzing weather data. In 2017 International
Kavasseri, R.G., Seetharaman, K., 2009. Day-ahead wind speed forecasting using f- Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud)(I-SMAC) (pp.
ARIMA models. Renewable Energy 34 (5), 1388–1393. 414-417). IEEE.
Kavitha, S., Varuna, S., Ramya, R., 2016. November. A comparative analysis on Naveen, L., Mohan, H.S., 2019. Atmospheric weather prediction using various
linear regression and support vector regression. In: 2016 Online machine learning techniques: A survey. In: 2019 3rd International Conference
International Conference on Green Engineering and Technologies (IC- on Computing Methodologies and Communication (ICCMC). IEEE, pp. 422–428.
GET). IEEE, pp. 1–5. Nayak, R., Patheja, P.S., Waoo, A., 2012. An enhanced approach for weather
Khajure, S., Mohod, S.W., 2016. Future weather forecasting using soft computing forecasting using neural network. In: Proceedings of the International
techniques. Procedia Comput. Sci. 78 (C), 402–407. Conference on Soft Computing for Problem Solving (SocProS 2011) December
Khashei, M. and Bijari, M., 2011. A new hybrid methodology for nonlinear time 20-22, 2011. Springer, New Delhi, pp. 833–839.
series forecasting. Modelling and Simulation in Engineering, 2011. Nielsen, M.A., 2015. Neural networks and deep learning. CA, USA, Determination
Khodayar, M., Teshnehlab, M., 2015. Robust deep neural network for wind speed press publisher, San Francisco.
prediction. In: 4th Iranian Joint Congress on Fuzzy and Intelligent Systems Pandey, A.K., Agrawal, C.P., Agrawal, M., 2017. A hadoop based weather prediction
(CFIS). IEEE, pp. 1–5. model for classification of weather data. In: 2017 Second International
Klein, B., Wolf, L., Afek, Y., 2015. A dynamic convolutional layer for short range Conference on Electrical, Computer and Communication Technologies
weather prediction. Proceedings of the IEEE Conference on Computer Vision and (ICECCT). IEEE, pp. 1–5.
Pattern Recognition, 4840–4848. Qolipour, M., Mostafaeipour, A., Saidi-Mehrabad, M., Arabnia, H.R., 2019. Prediction
Kulkarni, M.A., Patil, S., Rama, G.V., Sen, P.N., 2008. Wind speed prediction using of wind speed using a new Grey-extreme learning machine hybrid algorithm: A
statistical regression and neural network. J. Earth Syst. Sci. 117 (4), 457–463. case study. Energy Environ. 30 (1), 44–62.
Kunjumon, C., Nair, S.S., Suresh, P., Preetha, S.L., 2018. Survey on weather Qu, Z., Mao, W., Zhang, K., Zhang, W., Li, Z., 2019. Multi-step wind speed forecasting
forecasting using data mining. In: 2018 Conference on Emerging Devices and based on a hybrid decomposition technique and an improved back-propagation
Smart Systems (ICEDSS). IEEE, pp. 262–264. neural network. Renewable Energy 133, 919–929.
Kurniawan, A.P., Jati, A.N. and Azmi, F., 2017. Weather prediction based on fuzzy Radhika, T. V., Krushna Chandra Gouda, Kumar, S.S., 2017. Big data research in
logic algorithm for supporting general farming automation system. In 2017 5th climate science”, International Conference on Communication and Electronics
International Conference on Instrumentation, Control, and Automation (ICA) Systems (ICCES), IEEE, 2017.
(pp. 152-157). IEEE. Rasel, R.I., Sultana, N., Meesad, P., 2017. An application of data mining and machine
Li, X., Peng, L., Hu, Y., Shao, J., Chi, T., 2016. Deep learning architecture for air quality learning for weather forecasting. In: International Conference on Computing
predictions. Environ. Sci. Pollut. Res. 23 (22), 22408–22417. and Information Technology. Springer, Cham, pp. 169–178.
Liang, S., Nguyen, L. and Jin, F. 2018. A multi-variable stacked long-short term Rasp, S., Lerch, S., 2018. Neural networks for postprocessing ensemble weather
memory network for wind speed forecasting. In 2018 IEEE International forecasts. Mon. Weather Rev. 146 (11), 3885–3900.
Conference on Big Data (Big Data) (pp. 4561-4564). IEEE. https://doi.org/ Reddy, P.C., Babu, A.S., 2017. Survey on weather prediction using big data analystics.
10.1109/bigdata.2018.8622332. In: 2017 Second International Conference on Electrical, Computer and
Liu, H., Erdem, E., Shi, J., 2011. Comprehensive evaluation of ARMA–GARCH (-M) Communication Technologies (ICECCT). IEEE, pp. 1–6.
approaches for modeling the mean and volatility of wind speed. Appl. Energy 88 Saba, T., Rehman, A., AlGhamdi, J.S., 2017. Weather forecasting based on hybrid
(3), 724–732. neural model. Appl. Water Sci. 7 (7), 3869–3874.
Liu, J.N., Hu, Y., He, Y., Chan, P.W. and Lai, L., 2015. Deep neural network modeling Saima, H., Jaafar, J., Belhaouari, S., Jillani, T.A., 2011. Intelligent methods for weather
for big data weather forecasting. In Information Granularity, Big Data, and forecasting: A review. In: 2011 National Postgraduate Conference. IEEE, pp. 1–6.
Computational Intelligence (pp. 389-408). Springer, Cham. Salman, A.G., Kanigoro, B., Heryadi, Y., 2015. Weather forecasting using deep
Liu, X., Zhang, C., Liu, P., Yan, M., Wang, B., Zhang, J. and Higgs, R., 2018. Application learning techniques. In: Advanced Computer Science and Information Systems
of temperature prediction based on neural network in intrusion detection of (ICACSIS), 2015 International Conference on. IEEE, pp. 281–285.
IoT. Security and Communication Networks, 2018. Sergio, A.T., Ludermir, T.B., 2015. Deep learning for wind speed forecasting in
Liu, H., Chen, C., Lv, X., Wu, X. and Liu, M, 2019. Deterministic wind energy northeastern region of Brazil. In: 2015 Brazilian Conference on Intelligent
forecasting: A review of intelligent predictors and auxiliary methods, Energy Systems (BRACIS). IEEE, pp. 322–327.
Conversion and Management, 195, pp.328-345, (2019). Shabariram, C.P., Kannammal, K.E., Manojpraphakar, T., 2016. Rainfall analysis and
Liu, Z., Jiang, P., Zhang, L., Niu, X., 2020. A combined forecasting model for time rainstorm prediction using MapReduce framework. In: 2016 International
series: application to short-term wind speed forecasting. Appl. Energy 259, Conference on Computer Communication and Informatics (ICCCI). IEEE, pp. 1–4.
114137. https://doi.org/10.1016/j.apenergy.2019.114137. Siami-Namini, S., Tavakoli, N., Namin, A.S., 2018. A comparison of ARIMA and LSTM
Liu, H., Tian, H.Q., Li, Y.F., 2012. Comparison of two new ARIMA-ANN and ARIMA- in forecasting time series. In: 2018 17th IEEE International Conference on
Kalman hybrid methods for wind speed prediction. Appl. Energy 98, 415–424. Machine Learning and Applications (ICMLA). IEEE, pp. 1394–1401.
Liu, H., Tian, H.Q., Pan, D.F., Li, Y.F., 2013. Forecasting models for wind speed using Singh, N., Chaturvedi, S., Akhter, S., 2019. Weather forecasting using machine
wavelet, wavelet packet, time series and Artificial Neural Networks. Appl. learning algorithm. In: 2019 International Conference on Signal Processing and
Energy 107, 191–208. Communication (ICSC). IEEE, pp. 171–174.
Lu, K., Wang, L., 2011. A novel nonlinear combination model based on support Smola, A.J., Schölkopf, B., 2004. A tutorial on support vector regression. Statist.
vector machine for rainfall prediction. In: 2011 Fourth International Joint Computing 14 (3), 199–222. https://doi.org/10.1023/b:
Conference on Computational Sciences and Optimization. IEEE, pp. 1343–1346. stco.0000035301.49549.88.
Medina, D.O.G., dos Santos, R.C., Sousa, T., Junior, J.C.L., 2019. Comparative analysis Sobrevilla, K.L.M.D., Quiñones, A.G., Lopez, K.V.S., Azaña, V.T., 2016. Daily weather
of artificial neural networks and statistical models applied to demand forecast in Tiwi, Albay, Philippines using artificial neural network with missing
forecasting. In: 2019 IEEE PES Innovative Smart Grid Technologies values imputation. In: 2016 IEEE Region 10 Conference (TENCON). IEEE, pp.
Conference-Latin America (ISGT Latin America). IEEE, pp. 1–6. 2981–2985.
Mi, X.W., Liu, H., Li, Y.F., 2017. Wind speed forecasting method using wavelet, Stern, H., 2008. The accuracy of weather forecasts for Melbourne. Australia.
extreme learning machine and outlier correction algorithm. Energy Convers. Meteorological applications 15 (1), 65–71.
Manage. 151, 709–722.


Su, Y., Wang, S., Xiao, Z., Tan, M., Wang, M., 2018. An ultra-short-term wind power Wu, Y.X., Wu, Q.B., Zhu, J.Q., 2019. Data-driven wind speed forecasting using deep
forecasting approach based on wind speed decomposition, wind direction and feature extraction and LSTM. IET Renew. Power Gener. 13 (12), 2062–2069.
elman neural networks. In: 2018 2nd IEEE Conference on Energy Internet and Yu, H., Chen, Y., Hassan, S.G., Li, D., 2016. Prediction of the temperature in a Chinese
Energy System Integration (EI2). IEEE, pp. 1–9. solar greenhouse based on LSSVM optimized by improved PSO. Comput.
Suhartono, Faulina, R., Lusia, D.A., Otok, B.W. and Kuswanto, H., 2012. Ensemble Electron. Agric. 122, 94–102.
method based on anfis-arima for rainfall prediction. In 2012 International Yu, C., Li, Y., Zhang, M., 2017. An improved wavelet transform using singular
Conference on Statistics in Science, Business and Engineering (ICSSBE) (pp. 1-4). spectrum analysis for wind speed forecasting based on Elman neural network.
IEEE. Energy Convers. Manage. 148, 895–904. https://doi.org/10.1016/j.
Suksri, S., Kimpan, W., 2016. Neural network training model for weather forecasting enconman.2017.05.063.
using fireworks algorithm. In: 2016 International Computer Science and Zaw, W.T., Naing, T.T., 2009. Modeling of rainfall prediction over Myanmar using
Engineering Conference (ICSEC). IEEE, pp. 1–7. polynomial regression. In: 2009 International Conference on Computer
Suryanarayana, V., Sathish, B.S., Ranganayakulu, A., Ganesan, P., 2019. Novel Engineering and Technology. IEEE, pp. 316–320.
weather data analysis using Hadoop and MapReduce–A case study. In: 2019 Zaytar, M.A., El Amrani, C., 2016. Sequence to sequence weather forecasting with
5th International Conference on Advanced Computing & Communication long short term memory recurrent neural networks. Int. J. Comput. Appl. 143
Systems (ICACCS). IEEE, pp. 204–207. (11).
Tarade, R.S., Katti, P.K., 2011. A comparative analysis for wind speed prediction. Zhang, D., Peng, X., Pan, K., Liu, Y., 2019. A novel wind speed forecasting based on
International Conference on In Energy, Automation, and Signal (ICEAS). IEEE, pp. hybrid decomposition and online sequential outlier robust extreme learning
1–6. machine. Energy Convers. Manage. 180, 338–357.
Tran Anh, D., Duc Dang, T., Pham Van, S., 2019. Improved rainfall prediction using Zhang, P., Zhang, L., Leung, H., Wang, J., 2017. A deep-learning based precipitation
combined pre-processing methods and feed-forward neural networks. J.— forecasting approach using multiple environmental factors. In: 2017 IEEE
Multidisciplinary Sci. J. 2 (1), 65–83. International Congress on Big Data (BigData Congress). IEEE, pp. 193–200.
Wollsen, M.G., Jørgensen, B.N., 2015. Improved local weather forecasts using
artificial neural networks. In: Distributed Computing and Artificial Intelligence,
12th International Conference. Springer, Cham, pp. 75–86.
