Review

Data Science in Economics: Comprehensive Review of Advanced Machine Learning and Deep Learning Methods

1 Doctoral School of Management and Business Administration, Szent Istvan University, 2100 Godollo, Hungary
2 Environmental Quality, Atmospheric Science and Climate Change Research Group, Ton Duc Thang University, Ho Chi Minh City, Vietnam
3 Faculty of Environment and Labour Safety, Ton Duc Thang University, Ho Chi Minh City, Vietnam
4 College of Electrical and Information Engineering, Hunan University, Changsha 410082, China
5 Helmholtz-Zentrum Dresden-Rossendorf, Helmholtz Institute Freiberg for Resource Technology, D-09599 Freiberg, Germany
6 Department of Mathematics, J. Selye University, 94501 Komarno, Slovakia
7 Institute of Research and Development, Duy Tan University, Da Nang 550000, Vietnam
8 Future Technology Research Center, College of Future, National Yunlin University of Science and Technology, 123 University Road, Section 3, Douliou, Yunlin 64002, Taiwan
9 Faculty of Civil Engineering, Technische Universität Dresden, 01069 Dresden, Germany
10 Laboratory of Artificial Intelligence and Decision Support (LIAAD)-INESC TEC, Campus da FEUP, Rua Roberto Frias, 4200-465 Porto, Portugal
11 Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, NSW 2007, Australia
* Authors to whom correspondence should be addressed.
Mathematics 2020, 8(10), 1799; https://doi.org/10.3390/math8101799
Submission received: 10 September 2020 / Revised: 11 October 2020 / Accepted: 15 October 2020 / Published: 16 October 2020
(This article belongs to the Special Issue Advances in Machine Learning Prediction Models)

Abstract

This paper provides a comprehensive state-of-the-art investigation of recent advances in data science in emerging economic applications. The analysis covers novel data science methods in four classes: deep learning models, hybrid deep learning models, hybrid machine learning models, and ensemble models. The application domains include a broad and diverse range of economics research, from the stock market, marketing, and e-commerce to corporate bankruptcy and cryptocurrency. The PRISMA method, a systematic literature review methodology, is used to ensure the quality of the survey. The findings reveal that the trends follow the advancement of hybrid models, which outperform other learning algorithms. It is further expected that the trends will converge toward the evolution of sophisticated hybrid deep learning models.

1. Introduction

Due to the rapid advancement of databases and information technologies, and the remarkable progress in data analysis methods, the use of data science (DS) in various disciplines, including economics, has been increasing exponentially [1]. Advances in data science technologies for economic applications have been steady, with promising results [2,3]. Several studies suggest that data science applications in economics can be categorized and studied across several popular technologies, such as deep learning, hybrid learning models, and ensemble algorithms [4]. Machine learning (ML) algorithms provide the ability to learn from data and deliver in-depth insight into problems [5]. Researchers use machine learning models to solve various problems associated with economics. Notable applications of data science in economics are presented in Table 1. Deep learning (DL), as an emerging field of machine learning, is currently applied in many aspects of today's society, from self-driving cars to image recognition, hazard prediction, health informatics, and bioinformatics [5,6]. Several comparative studies have evaluated the performance of DL models against standard ML models, e.g., support vector machines (SVM), K-nearest neighbors (KNN), and generalized regression neural networks (GRNN), in economic applications. The evolution of DS methods has progressed at a fast pace, and every day new sectors and disciplines join the users and beneficiaries of DS algorithms. Hybrid machine learning models, in turn, combine two or more single algorithms to increase accuracy beyond that of the individual models [7]. A hybrid model can be formed by combining two predictive machine learning algorithms or by combining a machine learning algorithm with an optimization method that maximizes the prediction function [8]. It has been demonstrated that hybrid machine learning models outperform single algorithms and improve prediction accuracy [9]. Ensemble machine learning algorithms are supervised learning methods that combine multiple learners to improve the learning process and increase predictive accuracy [10]. Ensemble models apply different training algorithms to enhance training and learning from data [11].
Various works survey the state of the art of DS methods in different disciplines, such as image recognition [21], animal behavior [22], and renewable energy forecasting [23]. Hybrid methods have also been investigated in various fields, including financial time series [24], solar radiation forecasting [25], and FOREX rate prediction [26], while ensemble methods have mostly been applied in fields ranging from breast cancer diagnosis [27] and image categorization [28] to electric vehicle user behavior prediction [29] and solar power generation forecasting [30]. Exploring scientific databases such as Thomson Reuters Web-of-Science (WoS) shows an exponential rise in the use of both DL and ML in economics. The results of a query on essential ML and DL in emerging economic applications over the past decade are illustrated in Figure 1. Even though many researchers have applied DS methods to address different problems in the field of economics, these studies are scattered, and no single study provides a comprehensive overview of the contributions of DS in economics-related fields. Therefore, the current study is conducted to bridge this literature gap. In other words, the main objective of this study is to investigate the advancement of DS in three parts, deep learning methods, hybrid deep learning methods, and ensemble machine learning techniques, in economics-related fields. The present work aims to answer the following research questions: (1) What are the emerging economics domains with the involvement of data science technologies? (2) What are the popular data science models and applications in these domains?
The rest of the paper is organized as follows. The method by which the database of this article was formed is described in Section 2. Section 3 presents the findings and discussion, including articles categorized by sector and area of study and models. A taxonomy of the application of data science in economics is presented as well. Section 4 discusses the analytical results and the conclusions.

2. Materials and Methods

The current study utilized PRISMA, a systematic literature review approach, to identify published articles that have applied data science methods to address an issue in a field related to economics. A systematic literature review based on the PRISMA method includes four steps: (1) identification, (2) screening, (3) eligibility, and (4) inclusion [31]. In the identification stage, the documents are identified through an initial search of the mentioned databases. In this study, the review was limited to original peer-reviewed research articles published in Thomson Reuters Web-of-Science (WoS) and Elsevier Scopus. Today, the Scopus database covers nearly all authenticated scientific repositories, including WoS. Therefore, to avoid overlapping queries, Scopus was used as the principal database for the search, and the results were verified using the WoS repository. This review was limited to articles written in English. This step resulted in 217 articles.
Machine learning and deep learning were used as the essential search keywords, and the search covered both economics and computer science journals. The screening step includes two stages. First, duplicate items were eliminated, resulting in 135 unique articles that were moved to the second stage, where the relevance of the articles was examined based on their titles and abstracts. The second stage resulted in 84 articles for further consideration. The third step of the PRISMA model is eligibility, in which the full text of the articles was read by the authors, and 57 were considered eligible for the final review. In fact, at this stage, after reading the abstract and full text, articles that did not develop a model for one of the fields related to economics using a machine learning method were removed. The last step of the PRISMA model is the creation of the database, which is used for qualitative and quantitative analysis. The current study's database comprises 57 articles, all of which were analyzed in this study. Figure 2 illustrates the steps of creating the database of the present study based on the PRISMA method.

3. Findings and Discussion

Figure 2 shows that this study's database consists of 57 articles, which were analyzed and categorized according to two criteria: (1) the research/application area and (2) the method type. Based on the review of articles by application, it was found that these articles address issues in five different applications, namely the stock market (37 articles), marketing (6 articles), e-commerce (8 articles), corporate bankruptcy (3 articles), and cryptocurrency (3 articles) (Table 2, Table 3, Table 4, Table 5, Table 6 and Table 7). In addition, the articles were analyzed by method type, revealing that 42 unique algorithms were employed among the 57 reviewed articles (see Figure 3). These comprise 9 single DL models (Table 8), 18 hybrid deep learning (HDL) models (Table 9), 7 hybrid machine learning models (Table 10), and 8 ensemble models (Table 11). In the following sections, the identified applications and each of these methods are described in detail.

3.1. Applications of Data Science in Economics

3.1.1. Stock Market

Applying deep learning in the stock market has become more common than in other economics areas, considering that most of the research articles reviewed in the present study fall into this category (37 out of 57). Table 2 summarizes the articles that employed predictive models in stock market studies, including the research objectives, data sources, and applied models of each article. Investment in the stock market can be profitable, but the higher the potential profit, the higher the risk. Therefore, investors always try to estimate the stock value before taking any action. The stock value is often influenced by uncontrollable economic and political factors that make it notoriously difficult to identify future stock market trends. Not only is the stock market volatile and complex, but financial time series data are also noisy and nonstationary. Thus, traditional forecasting models are not reliable enough to predict the stock value, and researchers continuously seek new methodologies based on DS algorithms to enhance the accuracy of such predictions. Forecasting the stock price was found to be the objective of 29 out of 37 articles. Other studies aimed at applying DS to sentiment analysis, i.e., the analysis of the context of texts to extract subjective information, to identify future trends in the stock market. In addition, portfolio management, algorithmic trading (i.e., using a pre-programmed automated system for trading), automated stock trading, socially responsible investment portfolios, S&P 500 index trend prediction, and exchange-traded fund (ETF) option price prediction were the objectives of other articles that employed DS methods. Financial time series served as the data source of all these studies, except for the studies aimed at sentiment analysis, which used other data sources, such as social media and financial news.

LSTM

Long short-term memory (LSTM) networks are a special kind of recurrent neural network (RNN) that overcome the main issue of RNNs, i.e., vanishing gradients, by using gates to selectively retain relevant information and discard unrelated details. The structure of an LSTM neural network is shown in Figure 4; it is composed of a memory cell $C_t$, a hidden state $h_t$, and three types of gates, where $t$ indexes the time step. Specifically, at each step $t$, the LSTM receives an input $x_t$ and the previous hidden state $h_{t-1}$, then calculates the activations of the gates. Finally, the memory cell and the hidden state are updated. The computations involved are described below:
$$f_t = \sigma ( W_f x_t + w_f h_{t-1} + b_f )$$
$$i_t = \sigma ( W_i x_t + w_i h_{t-1} + b_i )$$
$$O_t = \sigma ( W_o x_t + w_o h_{t-1} + b_o )$$
$$C_t = f_t \odot C_{t-1} + i_t \odot \sigma_c ( W_c x_t + w_c h_{t-1} + b_c )$$
$$h_t = O_t \odot \tanh ( C_t )$$
where $W_f$, $W_i$, and $W_o$ denote the input weights; $w_f$, $w_i$, and $w_o$ denote the recurrent weights; $b_f$, $b_i$, and $b_o$ denote the biases; the subscripts $f$, $i$, and $o$ indicate the forget, input, and output gates, respectively; and $\odot$ is the element-wise multiplication.
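For illustration, the following is a minimal NumPy sketch of a single LSTM step implementing the gate equations above; the hidden size, input size, random initialization, and use of tanh for the candidate activation $\sigma_c$ are assumptions made purely for demonstration.
```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM time step following the gate equations above."""
    W, w, b = params["W"], params["w"], params["b"]              # input weights, recurrent weights, biases
    f_t = sigmoid(W["f"] @ x_t + w["f"] @ h_prev + b["f"])       # forget gate
    i_t = sigmoid(W["i"] @ x_t + w["i"] @ h_prev + b["i"])       # input gate
    o_t = sigmoid(W["o"] @ x_t + w["o"] @ h_prev + b["o"])       # output gate
    c_tilde = np.tanh(W["c"] @ x_t + w["c"] @ h_prev + b["c"])   # candidate memory (tanh used for sigma_c)
    c_t = f_t * c_prev + i_t * c_tilde                           # element-wise update of the memory cell
    h_t = o_t * np.tanh(c_t)                                     # new hidden state
    return h_t, c_t

# Toy example: hidden size 4, input size 3 (illustrative values only).
rng = np.random.default_rng(0)
n_h, n_x = 4, 3
params = {
    "W": {k: rng.normal(size=(n_h, n_x)) for k in "fioc"},
    "w": {k: rng.normal(size=(n_h, n_h)) for k in "fioc"},
    "b": {k: np.zeros(n_h) for k in "fioc"},
}
h, c = np.zeros(n_h), np.zeros(n_h)
for x_t in rng.normal(size=(5, n_x)):   # a short synthetic sequence of 5 steps
    h, c = lstm_step(x_t, h, c, params)
print(h)
```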
Many researchers have strived to forecast stock values relying on the LSTM algorithm, either as a single long short-term memory (LSTM) network or as a hybrid LSTM model. Adapting the LSTM algorithm, Moon and Kim [32] proposed an algorithm to predict the stock market index and its volatility. Fischer and Krauss [33] extended LSTM networks to forecast out-of-sample directional movements in the stock market. A comparison of their model's performance with random forest (RF), a deep neural network (DNN), and a logistic regression classifier (LOG) illustrates the remarkable outperformance of the LSTM model. Tamura et al. [34] introduced a two-dimensional approach to predict stock values in which the technical financial indices of the Japanese stock market were entered as input data to the LSTM for prediction, after which data on the financial statements of the related companies were retrieved and added to the database. Wang et al. [35] sought the best model for forecasting financial time series in portfolio management in order to optimize portfolio formation. They compared the results of LSTM against SVM, RF, DNN, and the autoregressive integrated moving average model (ARIMA) and found that LSTM is more suitable for financial time series forecasting. Using LSTM, Fister et al. [36] designed a model for automated stock trading. They argue that the performance of LSTM is remarkably better than that of traditional trading strategies, such as passive and rule-based trading strategies. In their case study, the German blue-chip stock BMW in the period between 2010 and 2018 formed the data source.
In addition, there is much evidence in the literature that hybrid LSTM methods outperform single DL methods [37]. In stock market applications, LSTM has been combined with different methods to develop hybrid models. For instance, Tamura et al. [34] used LSTM to predict stock prices and reported that their accuracy-test results outperform other models in the literature. Employing optimal long short-term memory (O-LSTM), Agrawal et al. [38] proposed a model for stock price prediction using a correlation tensor formed from stock technical indicators (STIs) to optimize the deep learning function. As a result, two predictive models were developed, one for price trend prediction and the other for making the buy-sell decision at the end of the day.
Integrating wavelet transforms (WT), stacked autoencoders (SAEs), and LSTM, Bao, Yue, and Rao [39] established a new method to predict stock prices. In the first stage, the WT decomposes the stock price time series to eliminate noise; in the next stage, predictive features for the stock price are created by the SAEs. Finally, the LSTM is applied to predict the next day's closing price based on the previous stage's features. Bao et al. [39] claim that their model outperforms state-of-the-art models in the literature in terms of predictive accuracy and profitability. To cope with the non-linear and non-stationary characteristics of financial time series, Yan and Ouyang [40] integrated wavelet analysis with LSTM (WA-LSTM) to forecast the daily closing price of the Shanghai Composite Index. Results show that their proposed model outperformed the multilayer perceptron (MLP), SVM, and KNN in finding patterns in the financial time series. Vo et al. [41] used a multivariate bidirectional LSTM (MB-LSTM) to develop a deep responsible investment portfolio (DRIP) model for the prediction of stock returns for socially responsible investment portfolios, applying a deep reinforcement learning (DRL) model to retrain the neural networks. Fang, Chen, and Xue [42] developed a methodology to predict exchange-traded fund (ETF) option prices. By integrating LSTM and support vector regression (SVR), they produced two LSTM-SVR models based on seven factors: the final transaction price, buy price, highest price, lowest price, volume, historical volatility, and the implied volatility of the time segment, with promising prediction results. In the second generation of LSTM-SVR, the LSTM hidden state vectors together with the seven factors affecting the option price serve as the SVR's inputs. They also compared the results with LSTM and RF models, and the proposed model outperformed both.
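The general idea behind such an LSTM-SVR hybrid can be sketched as follows: an LSTM encodes each price window into a hidden state vector, which is concatenated with hand-crafted factors and passed to an SVR. This is a minimal illustrative sketch, not the cited authors' implementation; the window length, hidden size, synthetic data, and untrained encoder are assumptions for demonstration only.
```python
import numpy as np
import tensorflow as tf
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_samples, window, n_extra = 200, 20, 7           # assumed sizes for illustration
prices = rng.normal(size=(n_samples, window, 1))  # synthetic price windows
extra = rng.normal(size=(n_samples, n_extra))     # stand-in for volume, volatility, etc.
target = rng.normal(size=n_samples)               # next-period price (synthetic)

# LSTM used purely as a sequence encoder in this sketch;
# in practice it would first be trained on the forecasting task.
encoder = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(16),                     # hidden state vector per window
])
hidden = encoder.predict(prices, verbose=0)

# Concatenate LSTM hidden states with the extra factors and fit an SVR on top.
features = np.hstack([hidden, extra])
svr = SVR(kernel="rbf", C=1.0).fit(features, target)
print(svr.predict(features[:3]))
```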

DNN

A deep neural network (DNN) is composed of multiple levels of nonlinear operations, where each layer receives connections only from the previous layer, as shown in Figure 5, adapted from [49]. Let $X$ be the input data and $w_j$, $j = 1, 2, \ldots, k$, be a filter bank, where $k$ is the number of layers. The multi-layer features of the DNN can be represented as
$$f(X) = f_k(\cdots f_2(f_1(X; w_1); w_2) \cdots ; w_k)$$
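As a minimal illustration of this layer-wise composition, the NumPy sketch below chains k nonlinear layers; the layer sizes, the tanh activation, and the random weights are assumptions for demonstration only.
```python
import numpy as np

def dnn_forward(X, weights, activation=np.tanh):
    """Compose f_k(...f_2(f_1(X; w_1); w_2)...; w_k) layer by layer."""
    out = X
    for W in weights:              # each layer only sees the previous layer's output
        out = activation(out @ W)  # f_j(out; w_j)
    return out

rng = np.random.default_rng(1)
layer_sizes = [10, 32, 16, 1]                     # assumed: 10 inputs, two hidden layers, 1 output
weights = [rng.normal(scale=0.1, size=(a, b))
           for a, b in zip(layer_sizes[:-1], layer_sizes[1:])]

X = rng.normal(size=(5, 10))                      # 5 synthetic samples
print(dnn_forward(X, weights).shape)              # -> (5, 1)
```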
DNNs have been widely applied in the stock market to identify trends and patterns in financial time series data. Go and Hong [46] used the DNN method to predict stock values; they first trained the method on time series data and then tested and confirmed their model's predictability. Song et al. [49] developed a DNN using 715 novel input features to forecast stock price fluctuations and compared the performance of their model with that of models using simple price-based input features. For predicting stock market behavior, Chong, Han, and Park [63] examined the performance of a DNN with high-frequency intraday stock returns as input and analyzed the predictability of principal component analysis (PCA), autoencoders, and RBM. According to their results, the DNN has good predictability with the information it receives from the autoregressive model residuals, although applying the autoregressive model to the network's residuals may not contribute to predictability. In addition, Chong et al. [63] found that applying covariance-based market structure analysis to the predictive network remarkably improves covariance estimation. Das et al. [60] used a DNN to predict future trends of the S&P 500 Index. Their results show that their model forecasts the behavior of the underlying stocks in the S&P 500 index poorly; they attribute this to randomness and non-stationarity, which make this index hard to predict.
In addition, hybrid methods constructed on top of DNNs have been reported to be very accurate on financial time series data. For example, Das and Mishra [45] proposed an advanced model to plan, analyze, and predict stock values, using a multilayer deep neural network (MDNN) optimized by the Adam optimizer (AO) to find patterns among the stock values. Moews et al. [48] proposed a method to predict the stock market's behavior, treating it as a complex system with a massive number of noisy time series; their model integrates DNN and stepwise linear regressions (SLR), with regression slopes serving as trend strength indicators for a given time interval. To predict the Google stock price, Singh and Srivastava [65] compared two integrated models, i.e., 2-directional 2-dimensional principal component analysis-DNN ((2D)2PCA-DNN) and (2D)2PCA-radial basis function neural network (RBFNN). According to their results, the (2D)2PCA-DNN model has higher accuracy in predicting the stock price. They also compared their results with an RNN model and reported that (2D)2PCA-DNN outperforms RNN as well.

CNN

A convolutional neural network (CNN) is one of the most popular methods in deep learning and is widely applied in various fields [70,71,72,73], such as classification, language processing, and object detection. A classical CNN structure is presented in Figure 6, adapted from [71]; it mainly consists of three components, i.e., convolution layers, pooling layers, and fully connected layers. The different layers have different roles in the training process and are discussed in more detail below:
Convolutional layer: This layer is composed of a set of trainable filters, which are used to perform feature extraction. Suppose $X$ is the input data and there are $k$ filters in the convolutional layer; then the output of the convolutional layer can be determined as follows:
$$y_j = \sum_i f( x_i * w_j + b_j ), \quad j = 1, 2, \ldots, k$$
where $w_j$ and $b_j$ are the weight and bias, respectively; $f(\cdot)$ denotes an activation function; and $*$ represents the convolution operation.
Pooling layer: In general, this layer is used to reduce the dimensions of the extracted feature maps and the number of network parameters. Currently, max pooling and average pooling are the most widely used methods. Let $S$ be a $p \times p$ window; then the average pooling operation can be expressed as follows:
$$z = \frac{1}{N} \sum_{(i,j) \in S} x_{ij}, \quad i, j = 1, 2, \ldots, p$$
where $x_{ij}$ indicates the activation value at position $(i, j)$, and $N$ is the total number of elements in $S$.
Fully connected layer: Following the last pooling layer, the fully connected layer is used to reshape the feature maps into a one-dimensional feature vector and map it to the output, which can be expressed as
$$Y = f( w z + b )$$
where $Y$ and $z$ denote the output vector and the input features, and $w$ and $b$ represent the weight and bias of the fully connected layer, respectively.
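A compact NumPy sketch of these three components (valid-mode convolution with a ReLU activation, average pooling, and a fully connected map) is given below; the input size, filter sizes, pooling window, and random weights are illustrative assumptions.
```python
import numpy as np

rng = np.random.default_rng(2)

def conv2d_valid(x, w, b):
    """Single-channel 2-D convolution (valid mode) followed by ReLU."""
    H, W = x.shape
    kh, kw = w.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w) + b
    return np.maximum(out, 0.0)          # activation f(.)

def avg_pool(x, p=2):
    """Average pooling with a non-overlapping p x p window."""
    H, W = x.shape
    x = x[:H - H % p, :W - W % p]
    return x.reshape(H // p, p, W // p, p).mean(axis=(1, 3))

# Toy forward pass: one 8x8 input, two 3x3 filters, 2x2 average pooling, dense output.
x = rng.normal(size=(8, 8))
filters = [rng.normal(size=(3, 3)) for _ in range(2)]
feature_maps = [avg_pool(conv2d_valid(x, w, 0.1)) for w in filters]

z = np.concatenate([fm.ravel() for fm in feature_maps])   # flatten to a 1-D feature vector
w_fc, b_fc = rng.normal(size=z.size), 0.0
Y = np.tanh(w_fc @ z + b_fc)                              # fully connected layer Y = f(wz + b)
print(Y)
```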
Recently, many researchers have applied CNN extensively to predict stock values from financial time series data. Sim et al. [52] developed a model to predict stock prices by adapting CNN; their results reveal that CNN delivered better performance in forecasting stock prices than ANN and SVM. Tashiro et al. [54] first criticized current stock market price prediction models for ignoring the properties of market orders and therefore constructed a CNN architecture integrating order-based features to predict mid-price trends in stock markets; their results show that adding order features to the model increased its accuracy. Dingli and Fournier [64] applied CNN to predict the future movement of stock prices and found that the predictive accuracy of their model was 65% for the following month's price and 60% for the following week's price. Gonçalves et al. [47] compared the predictions of CNN, LSTM, and a deep neural network classifier (DNNC) to find the best model for predicting price trends in exchange markets; their findings reveal that CNN, on average, has the best predictive power. Sohangir et al. [58] compared the performance of several neural network models, including CNN, LSTM, and doc2vec, for sentiment analysis of experts' posts and opinions on StockTwits to predict movements in the stock markets; their results disclose that CNN has the highest accuracy in predicting the experts' sentiment.
To increase CNN's accuracy, some researchers have integrated CNN with other models and proposed new hybrid models. For example, by integrating a gated recurrent unit (GRU) and CNN, Sabeena et al. [44] introduced a hybrid DL model to predict financial fluctuations in the real-time stock market that is able to process real-time data from online financial sites. GRUs, developed from the RNN architecture, offer a simpler alternative to LSTMs for addressing vanishing gradients and learning long-range dependencies. To predict price movements from financial time series samples, Long et al. [50] introduced an end-to-end model called the multi-filters neural network (MFNN) that incorporates a CNN and a recurrent neural network (RNN).

Other Algorithms

In addition to LSTM, DNN, and CNN, other DS methods have been employed for predicting stock values using time series data. For example, Sirignano and Cont [55] developed a large-scale deep learning (LSDL) model to study US market quotes and transactions; their results disclose a universal and stationary relationship between order flow history and price trends. Kim et al. [61] proposed a multi-agent collaborated network (MACN) model to optimize financial time series data, claiming that their model can share and generalize agents' experience in stock trading.
Various other hybrid methods have also been applied to financial time series data. For instance, to predict stock price trends, Lien Minh et al. [59] developed and trained a two-stream GRU (TS-GRU) network and a Stock2Vec model to analyze the sentiment in financial news and its relationship with financial prices, based on their belief that financial news and sentiment dictionaries affect stock prices. Their findings support the outperformance of their model in comparison with current models, and they also claim that Stock2Vec is highly efficient on financial datasets. Lei et al. [43] combined deep learning and reinforcement learning models to develop a time-driven feature-aware (TDFA) jointly deep reinforcement learning model (TFJ-DRL) for financial time series forecasting in algorithmic trading. Preeti et al. [57] introduced an extreme learning machine (ELM)-auto-encoder (AE) model to identify patterns in financial time series. They tested the accuracy of their model on gold price and crude oil price time series data and compared their results with those of generalized autoregressive conditional heteroskedasticity (GARCH), GRNN, MLP, RF, and the group method of data handling (GRDH). The mean square error (MSE) test showed that the performance of their model was better than that of the existing methods.
In addition to the hybrid deep learning models, four articles applied hybrid machine learning models to financial time series data. Shekhar and Varshney [66] integrated a hybrid genetic algorithm-SVM (GV-SVM) model with sentiment analysis to predict the future of the stock market. Using quantitative empirical analysis, they showed that combining sentiment analysis with GV-SVM increased the model's accuracy by 18.6%, and they reported a final model accuracy of about 89.93%. Ahmadi et al. [67] compared the performance of two hybrid machine learning models, imperialist competition algorithm-SVM (ICA-SVM) and SVM-GA, in predicting the timing of stock markets. Their results showed that SVM-ICA performs better than SVM-GA in predicting stock market trends for periods of 1-6 days. To predict stock prices using financial time series data, Ebadati and Mortazavi [68] applied a hybrid GA-ANN model, in which the GA was employed to select ANN features and optimize parameters. Their study suggests that this hybrid machine learning model improves the sum of squared errors (SSE) (i.e., performance accuracy) by 99.99% and the runtime (i.e., speed) by 90.66%. Johari et al. [69] compared the accuracy and performance of GARCH-SVM and GARCH-ANN models on financial time series data for stock price forecasting. They found that GARCH-SVM outperformed GARCH-ANN, SVM, ANN, and GARCH based on the MSE and RMSE accuracy metrics.
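To illustrate how such GA-SVM hybrids are typically structured, the sketch below evolves SVM hyperparameters (C and gamma) with a tiny genetic loop and scores candidates by cross-validation; the population size, mutation scale, and synthetic data are assumptions, and the cited studies use domain-specific features and fitness functions.
```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)  # synthetic stand-in data

def fitness(genome):
    """Cross-validated accuracy of an SVM whose log10(C) and log10(gamma) form the genome."""
    C, gamma = 10.0 ** genome[0], 10.0 ** genome[1]
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

# Tiny GA: random initialization, keep the best half, mutate to refill the population.
pop = rng.uniform(low=[-2, -4], high=[3, 1], size=(10, 2))
for generation in range(10):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[-5:]]                        # selection
    children = elite + rng.normal(scale=0.3, size=elite.shape)  # mutation
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(g) for g in pop])]
print("best log10(C), log10(gamma):", best)
```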
Rajesh et al. [51] predicted future stock trends by applying heat maps and ensemble learning to the financial data of the top 500 companies in the S&P stock exchange. Their results show that the combination of RF, SVM, and K-neighbors classifiers produced the most accurate results, and the accuracy of the proposed model was 23% higher than that of a single-classifier labeling prediction model. Weng et al. [56] aimed to design a financial expert system to forecast short-term stock prices. For data analysis and stock price prediction, they employed four machine learning ensemble methods, namely a neural network regression ensemble (NNRE), a support vector regression ensemble (SVRE), a boosted regression tree, and random forest regression (RFR). Using Citi Group stock ($C) data, they were able to forecast the one-day-ahead price of 19 stocks from different industries. Weng et al. [56] claim that the boosted regression tree (BRT) outperformed the other ensemble models with a considerably better mean absolute percent error (MAPE) than those reported in the literature. Faghihi-Nezhad and Minaei-Bidgoli [62] employed ensemble learning and ANN to develop a two-stage model to predict stock prices: they first predicted the direction of the next price movement and then created a new training dataset to forecast the stock price, using genetic algorithm (GA) optimization and particle swarm optimization (PSO) to optimize the results of each stage. The results ultimately reveal that their model's stock price prediction accuracy exceeded that of other models in the literature.
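A minimal scikit-learn sketch of this kind of classifier combination (RF, SVM, and k-nearest neighbors combined by voting) is shown below; the synthetic data and hyperparameters are placeholders, not the settings used in the cited studies.
```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for labeled up/down stock movements.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="soft",   # average predicted probabilities across the three base learners
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```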
Reviewing the articles in the stock market category reveals that, although their research objectives differ, most utilized financial time series data (35 out of 37 articles). Only two articles used financial news and social media as the data source to determine future trends in the stock market (see Table 3).

3.1.2. Marketing

A study of the articles' purposes discloses that DS algorithms were mostly used to study customer behavior and promotional activities, which is why these articles are classified in the group labeled marketing. As seen in Table 4, two studies applied a single DL method, and three used hybrid DL methods. In addition, these studies used various data sources, such as customer time series data, case studies, and social media. For example, Paolanti et al. [74] employed a deep convolutional neural network (DCNN) to develop a mobile robot, called ROCKy, that analyzes real-time heat maps of retail store shelves to detect shelf-out-of-stock (SOOS) situations and promotional activities during working hours. Dingli, Marmara, and Fournier [75] investigated solutions to identify patterns and features in transactional data to predict customer churn within the retail industry. To do so, they compared the performance of CNN and a restricted Boltzmann machine (RBM), finding that the RBM performed better in customer churn prediction.
On the other hand, RF-DNN, RNN-CNN, and the similarity, neighborhood-based collaborative filtering model (SN-CFM) are hybrid models that researchers have used to study customer behavior. Ładyżyński et al. [76], for instance, employed random forest (RF) and DNN methods together with customers' historical transactional data to propose a hybrid model that can predict customers' willingness to purchase credit products from banks. Ullah et al. [77] used the RF algorithm to predict churning customers and to formulate customer relationship management strategies to prevent churn; they explained that combining churn classification using the RF algorithm with customer profiling using k-means clustering increased their model's performance. Agarwal [78] integrated RNN and CNN to develop a model for sentiment analysis, arguing that sentiment analysis is the best approach for receiving customer feedback. He tested the proposed model on social media data and believes that the sentiment analysis results guide businesses in improving service quality and give startups evidence for improving the customer experience. Shamshirband et al. [79] proposed SN-CFM to predict consumer preferences according to the similarity of features of users and products acquired from the Internet of Things and social networks.
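The churn modeling approach described by Ullah et al. [77] (RF classification followed by k-means profiling of the predicted churners) can be sketched roughly as follows; the feature set, cluster count, and synthetic data are assumptions for illustration only.
```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for customer records labeled churn / no-churn.
X, y = make_classification(n_samples=1000, n_features=12, weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Step 1: classify churners with a random forest.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("churn classification accuracy:", rf.score(X_test, y_test))

# Step 2: profile the predicted churners with k-means to support retention strategies.
churners = X_test[rf.predict(X_test) == 1]
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(churners)
print("churner segment sizes:", [int((segments == s).sum()) for s in range(3)])
```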

3.1.3. E-Commerce

Another category that emerged after reviewing the articles is labeled e-commerce, where the authors employed data science models to address problems in the e-commerce sector; a summary of these studies is presented in Table 5. Based on the GRU model, Lei [80] designed a neural network investment quality evaluation model to support decision-making related to investment in e-commerce. The proposed model is able to evaluate different index data, providing investors with a better picture. Leung et al. [81] argue that the ability to handle orders and logistics management is one of the major challenges in e-commerce. Therefore, using a hybrid autoregressive-adaptive neuro-fuzzy inference system (AR-ANFIS), they developed a prediction model for B2C e-commerce order arrival; according to their results, the proposed hybrid model can successfully forecast e-order arrivals. Cai et al. [82] used deep reinforcement learning to develop an algorithm that addresses the impression allocation problem on e-commerce websites, such as www.taobao.com, www.ebay.com, and www.amazon.com. In this algorithm, buyers are allocated to sellers based on the buyers' impressions and the sellers' strategies in a way that maximizes the platform's income. To do so, they applied a GRU, and their findings show that the GRU outperforms the deep deterministic policy gradient (DDPG). Ha, Pyo, and Kim [83] applied RNN to develop a deep categorization network (Deep CN) for item categorization in e-commerce, which refers to classifying the leaf category of items from their metadata; they used the RNN to generate features from text metadata and categorize the items accordingly. Xu et al. [84] designed an advanced credit risk evaluation system for e-commerce platforms to minimize the transaction risk associated with buyers and sellers. To this end, they employed a hybrid machine learning model combining a decision tree and an ANN (DT-ANN) and found that it has high accuracy and outperforms other hybrid machine learning models, such as DT-logistic regression and DT-dynamic Bayesian network.
Selling products online has unique challenges, for which data science has been able to provide solutions. To increase buyers' trust in product quality and their willingness to buy online, Saravanan and Charanya [85] designed an algorithm in which products are categorized according to several criteria, including the reviews and ratings of other users. By integrating a hybrid feature extraction method, principal component analysis (PCA) with t-distributed stochastic neighbor embedding (t-SNE), with an SVM using a lexicon-based approach, Saravanan and Charanya [85] also proposed a model that picks products out of a large collection of different products based on characteristics, best product ratings, and positive reviews. Wang, Mo, and Tseng [86] used RNN to develop a personalized product recommendation system for e-commerce websites; their results showed that the RNN outperformed K-nearest neighbors (KNN). Wu and Yan [87] point out that current product recommender models for e-commerce websites assume that all historical user data are recorded, while in practice many platforms fail to record such data. Therefore, they devised a list-wise DNN (LWDNN) to model the temporal online behavior of users and provide recommendations for anonymous users.
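A rough scikit-learn sketch of such a feature-reduction-plus-SVM pipeline for review classification is given below; TruncatedSVD stands in for the PCA/t-SNE feature extraction (t-SNE has no out-of-sample transform, and PCA does not accept sparse TF-IDF input), and the toy reviews, labels, and vectorizer settings are assumptions.
```python
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy product reviews with positive (1) / negative (0) labels, for illustration only.
reviews = [
    "excellent product, works perfectly",
    "terrible quality, broke after a day",
    "great value and fast delivery",
    "very disappointed, not as described",
    "love it, highly recommended",
    "awful experience, would not buy again",
]
labels = [1, 0, 1, 0, 1, 0]

# TF-IDF features -> low-dimensional embedding -> SVM classifier.
model = make_pipeline(
    TfidfVectorizer(),
    TruncatedSVD(n_components=2, random_state=0),  # dimensionality reduction step of the hybrid feature extraction
    SVC(kernel="rbf"),
)
model.fit(reviews, labels)
print(model.predict(["fantastic product, very happy", "poor quality, disappointed"]))
```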

3.1.4. Cryptocurrency

The decision-making process for investing in cryptocurrencies is similar to that for the stock market, where the prediction of future value strongly affects investment decisions. Applying machine learning and DL models to predict cryptocurrency price trends is an attractive research problem that is emerging in the literature (see Table 6). For example, Lahmiri and Bekiros [88] applied deep learning methods to predict the prices of cryptocurrencies, including Bitcoin, Digital Cash, and Ripple, and compared the predictive performance of LSTM and GRNN; their findings disclose that the LSTM model performs better than GRNN. Altan, Karasu, and Bekiros [89] claim that integrating LSTM and the empirical wavelet transform (EWT) improves LSTM's performance in forecasting digital currency prices, testing their proposed model on Bitcoin, Ripple, Digital Cash, and Litecoin time series data. Jiang and Liang [90] developed a CNN model to predict the price of Bitcoin as a cryptocurrency example; they trained their model on historical financial asset prices and used the portfolio weights of the asset set as the model output.

3.1.5. Corporate Bankruptcy Prediction

Corporate bankruptcy prediction has become an important tool to evaluate the future financial situation of a company, and machine learning-based methods are widely recommended for addressing bankruptcy prediction problems. For example, Chen, Chen, and Shi [91] utilized bagging and boosting ensemble strategies to develop two models, bagged proportion support vector machines (pSVM) and boosted pSVM. Using datasets from UCI and LibSVM, they tested their models and showed that ensemble learning strategies increased the models' performance in bankruptcy prediction. Lin, Lu, and Tsai [92] believe that finding the best match between feature selection and classification techniques improves the performance of bankruptcy prediction models. Their results reveal that the genetic algorithm as a wrapper-based feature selection method, combined with naïve Bayes and support vector machine classifiers, had remarkable predictive performance. Lahmiri et al. [93], to develop an accurate model for forecasting corporate bankruptcy, compared the performance of different ensemble systems: AdaBoost, LogitBoost, RUSBoost, subspace, and bagging. Their findings reveal that the AdaBoost model is effective in terms of short data processing time, low classification error, and limited complexity. Faris et al. [94] investigated the combination of re-sampling (oversampling) techniques and multiple feature selection methods to improve the accuracy of bankruptcy prediction methods. According to their results, employing the SMOTE oversampling technique and the AdaBoost ensemble method with a reduced error pruning (REP) tree provides reliable and promising results for bankruptcy prediction. A summary of these research articles is presented in Table 7.
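The general shape of such a resampling-plus-boosting pipeline can be sketched with scikit-learn and imbalanced-learn as below; the shallow decision tree stands in for the REP tree base learner (which scikit-learn does not provide), and the class imbalance ratio and synthetic data are assumptions.
```python
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic, highly imbalanced stand-in for bankrupt vs. healthy firms.
X, y = make_classification(n_samples=2000, n_features=15, weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Step 1: oversample the minority (bankrupt) class on the training set only.
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)

# Step 2: boost shallow decision trees on the balanced data.
model = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=2),  # stand-in for the REP tree base learner
    n_estimators=200,
    random_state=0,
)
model.fit(X_res, y_res)
print(classification_report(y_test, model.predict(X_test)))
```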

3.2. Applied Data Science Methods in Economics

3.2.1. Deep Learning Methods

The new generation of deep learning methods is rooted in the work of Hinton et al. [95], who developed, for the first time, a two-step approach in which the deep learning algorithms are first trained and the model is then fine-tuned in a backward process. The advantage of DL models over other ML models is that they can effectively identify high-level features and representations (the outputs) from a large, diverse data sample (the inputs). Employing unsupervised pre-training and a supervised fine-tuning approach, deep learning models extract hierarchical features from the inputs to classify the patterns in the data [96]. The ability of DL algorithms to predict and find patterns in raw data has attracted many researchers from various fields. As explored above, stock price prediction, e.g., [32,33,34], and consumer behavior, e.g., [75], have benefited greatly from a wide range of deep learning algorithms. Among these methods, LSTM, CNN, and DNN are, in that order, the most popular DL algorithms [32,33,34]. The LSTM method has been especially popular for predicting financial time series. The DL applications to portfolio management [35], automated stock trading [36], and cryptocurrency price prediction [88] have also been relatively popular.
Similar to LSTM, the CNN algorithm is applied mainly to financial time series data for stock price prediction [47,52,54,64] and cryptocurrency price prediction [90]; the CNN algorithm is also used for analyzing social media data for sentiment analysis [58]. The DNN algorithm, like the LSTM, is used only to analyze financial time series data, for stock price prediction [46,49,63] and S&P 500 Index trend prediction [60]. The GRU algorithm, another DL model, is applied in the e-commerce sector to analyze financial time series [80] and customer time series [82]. RNN is a DL algorithm applied to primary data for item categorization [83] and product recommendation [86]. Large-scale deep learning (LSDL) and MACN are DL algorithms used to analyze financial time series data for stock price prediction [55,61]. Finally, DCNN and RBM were applied to primary data for promotional activity detection [74] and customer behavior forecasting [75], respectively (see Table 8).

3.2.2. Hybrid Deep Learning Methods

Hybrid deep neural networks are architectures that apply generative and discriminative components at the same time. Hybrid models combine machine learning models, or combine a machine learning model with an optimization model, to improve the predictive power of the deep learning model [9]. The findings of the literature review reveal that hybrid deep learning (HDL) models are widely applied in the field of economics. The role of HDL and other algorithms in the data science taxonomy is represented in Figure 7. Moreover, Figure 8 illustrates that the accuracy of such models is reported to be higher than that of single DL models. The various HDL models used among the papers reviewed in this study are summarized in Table 9.

3.2.3. Hybrid Machine Learning Methods

Hybrid models combine machine learning models, or combine a machine learning model with an optimization model, to improve the predictive power of the machine learning model. Hybrid machine learning models are frequently employed among the reviewed articles. Table 10 shows that the seven hybrid machine learning models were mainly proposed for stock price prediction (four out of seven), while three models were developed to address problems related to e-commerce, namely order arrival prediction, dynamic credit risk evaluation, and product recommendation.

3.2.4. Ensemble Machine Learning Algorithms

Ensemble machine learning algorithms, or ensemble learning (EL) models, use multiple learning algorithms to improve training processes and boost learning from data [6]. There is much evidence in the literature that ensemble classifiers and ensemble learning systems are very effective on financial time series data. For instance, Rajesh et al. [51] combined RF, SVM, and K-neighbors classifiers, Weng et al. [56] used a neural network regression ensemble, and Faghihi-Nezhad and Minaei-Bidgoli [62] applied an ANN-EL model to predict stock values. In addition, Chen et al. [91] applied bagged-pSVM and boosted-pSVM models, Lin et al. [92] combined a genetic algorithm with naïve Bayes and SVM classifiers, Lahmiri et al. [93] compared boosting and bagging ensembles such as AdaBoost, and Faris et al. [94] applied a SMOTE-AdaBoost-REP tree, all to predict corporate bankruptcy. Ullah et al. [77] also took advantage of EL strategies and used an RF algorithm to predict customer churn. Table 11 summarizes the studies that used an ensemble model.

3.3. Discussions on the Taxonomy of Data Science Advancements in Economics

Oláh et al. [97] provided a checklist of items needed to validate the findings of the PRISMA method. They argue that the selection of the articles should be explained in detail, and that each article's characteristics, including the data source, data analysis method, and findings, should be reported and referenced. In the present study, the database formation process was described in detail in the methodology section and illustrated in Figure 2. In addition, the characteristics and results of each reviewed article are presented in detail in Table 2, Table 3, Table 4, Table 5, Table 6, Table 7, Table 8, Table 9, Table 10 and Table 11. One of this study's contributions is the taxonomy of data science methods and applications in economics; the output of the literature review is presented in Figure 7. This study presents the advancement of deep learning models, hybrid deep learning models, hybrid machine learning models, and ensemble machine learning models in economics. Overall, 42 unique algorithms were employed among the 57 reviewed articles, of which 9 were single DL models (see Table 8), 18 hybrid DL models (see Table 9), 7 hybrid ML models (see Table 10), and 8 ensemble models (see Table 11). It was also revealed that these 42 models have been used for different purposes, including algorithmic trading, automated stock trading, bankruptcy prediction, cryptocurrency price prediction, customer behavior, shelf out of stock, the impression allocation problem, investment quality evaluation, item categorization, portfolio management, product recommendation, sentiment analysis, socially responsible investment portfolios, and stock price prediction. Furthermore, data science algorithms were mainly applied to financial time series data to forecast stock prices, with LSTM being the most popular model for analyzing financial time series, followed by CNN and DNN as the next most applied algorithms among the reviewed articles.
Twenty-five of the 42 models presented in the taxonomy are hybrid models, which shows the increasing trend of using hybrid models in economics. As hybrid models often outperform single DL models, the popularity of hybridization has been increasing. Evaluating model performance with metrics such as the root-mean-squared error (RMSE) has been the standard strategy in most cases. Therefore, the RMSE, which is among the most popular accuracy metrics for machine learning models, is summarized in Figure 8; the lower the RMSE, the higher the accuracy of the model [10]. Figure 8 is a three-dimensional graph in which the models are sorted along the x axis, the y axis represents the RMSE values, and the z axis shows the number of times each model is used. According to Figure 8, the RMSE values of the single DL models are higher than those of the hybrid DL models, indicating the higher accuracy of the hybrid DL models. This finding is consistent with the claims made in the literature (e.g., [9,10,11]) and can also explain why the use of hybrid models has become a trend.
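For reference, RMSE is the square root of the mean squared difference between predictions and observations; a minimal snippet with made-up numbers is shown below purely to illustrate how the metric is computed.
```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-squared error: lower values indicate higher predictive accuracy."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Made-up predictions from a hypothetical single model vs. a hypothetical hybrid model.
observed = [10.0, 12.5, 11.2, 13.8]
print(rmse(observed, [9.0, 13.5, 10.0, 15.0]))   # single model (larger errors)
print(rmse(observed, [9.8, 12.7, 11.0, 13.9]))   # hybrid model (smaller errors)
```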
The use of data science to address economics-related issues is increasing dramatically, and data science has made significant contributions to the advancement of novel economic models and data-driven solutions. Presenting the state of the art of data science methods paints a clear picture of data science advancements in economics-related fields. Therefore, this study's main objective was to investigate the advances of data science in economics through a novel taxonomy of applications and methods. To that end, the PRISMA method, a systematic literature review methodology, was applied to survey the publications indexed in the Scopus and Thomson Reuters Web-of-Science databases. The review was limited to published documents that were written in English and used data science techniques to solve problems in one of the economics-related fields; fifty-seven papers were ultimately selected for further consideration. The findings revealed that five economics-related fields, namely the stock market, marketing, e-commerce, corporate bankruptcy, and cryptocurrency, utilized data science solutions. It was further found that deep learning models, hybrid deep learning models, hybrid machine learning models, and ensemble machine learning models were used in each of these applications. The trend is toward hybrid models, as 25 of the 42 models used among the reviewed articles are hybrid models. According to the root-mean-square error accuracy metric, hybrid models had higher prediction accuracy than other algorithms, which can justify their prevalence. The findings of this study are confined to these five applications related to the field of economics and cannot be generalized to other fields or disciplines. This research provides insights into the advancements of data science models and their application in the field of economics, offering guidance for researchers and practitioners in the field.
This research presents the state-of-the-art advancements in data science technologies and methods and their application in economics. This study’s findings guide researchers and practitioners in this field who can select the appropriate model according to their application after familiarizing themselves with the latest developments in data science models.

4. Conclusions

The use of data science techniques in economics and related fields is increasing day by day. The current study presents the latest developments in data science methods and their application in economics. This study's findings show that the advancement of data science algorithms has occurred in deep learning models, ensemble models, hybrid deep learning models, and hybrid machine learning models. Therefore, these four model classes were further investigated based on previous reports in five different areas: (1) the stock market, in which stock price prediction was the main goal of most of the papers; (2) marketing, in which the objectives of the papers were mostly to study customer behavior; (3) corporate bankruptcy; (4) cryptocurrency, a new trend in the economic field where researchers try to predict the price of digital money; and (5) e-commerce, in which DS methods are applied mainly to increase the performance of e-commerce websites (e.g., through item categorization and product recommendation). This study details, for each category, the deep learning algorithms used in economics, the data sources, and the data science methods. The findings reveal that the use of hybrid models has increased due to their higher prediction accuracy compared with single deep learning models. Specifically, LSTM, CNN, and DNN have been the most applied models in the literature for analyzing financial time series data and predicting stock prices. Upon review of the 57 articles in this work, it was found that they employed 42 unique models, 25 of which were hybrid models, illustrating the trend of using such models in economics. The comparative RMSE results disclose that hybrid models show remarkably low error levels and hence higher prediction accuracy. Therefore, it is recommended to apply hybrid models to model and optimize objectives in all five fields studied in this research. Overall, this study provides state-of-the-art data science models in economics and generates insights for researchers and practitioners to determine the most appropriate model for their applications. For future study, a review of machine learning methods for 5G data and their economic potential is proposed; a survey of machine learning applications for modeling the long-term economic effects of COVID-19 (coronavirus) is also encouraged.

Author Contributions

Conceptualization, A.M.; investigation, S.N., and A.M.; writing—original draft preparation, S.N.; methodology, A.M.; writing—review and editing, S.N., A.M., P.D., P.G., F.F., S.S.B., U.R., J.G., and A.H.G.; visualization, S.N.; software, A.M.; project administration, A.M.; resources, A.M.; controlling the results, P.G., U.R., and A.H.G.; supervision, U.R., J.G., and A.H.G.; validation, A.M., U.R., J.G., and A.H.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded in part by the Hungarian-Mexican bilateral Scientific and Technological project (2019-2.1.11-TÉT-2019-00007) and by the EFOP-3.6.2-16-2017-00016 project in the framework of the New Szechenyi Plan. The completion of this project was supported by the European Union and co-financed by the European Social Fund.

Acknowledgments

Support of the Alexander von Humboldt Foundation is acknowledged.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

Acronym: Explanation
(2D)2PCA: 2-Directional 2-Dimensional Principal Component Analysis
AE: Auto-Encoder
ANN: Artificial Neural Network
AO: Adam Optimizer
AR-ANFIS: Autoregressive-Adaptive Neuro-Fuzzy Inference System
ARIMA: Autoregressive Integrated Moving Average Model
BRT: Boosted Regression Tree
CNN: Convolutional Neural Network
DCNN: Deep Convolutional Neural Network
DL: Deep Learning
DNN: Deep Neural Network
DNNC: Deep Neural Network Classifier
DRL: Deep Reinforcement Learning
DS: Data Science
DT-ANN: Decision Tree-Artificial Neural Network
EL: Ensemble Learning
ELM: Extreme Learning Machine
EWT: Empirical Wavelet Transform
GA-ANN: Genetic Algorithm-Artificial Neural Network
GARCH: Generalized Autoregressive Conditional Heteroskedasticity
GRDH: Group Method of Data Handling
GRNN: Generalized Regression Neural Networks
GRU: Gated Recurrent Unit
ICA: Imperialist Competition Algorithm
KNN: K-Nearest Neighbors
LOG: Logistic Regression Classifier
LSDL: Large-Scale Deep Learning
LSTM: Long Short-Term Memory
LWDNN: List-Wise Deep Neural Network
MACN: Multi-Agent Collaborated Network
MB-LSTM: Multivariate Bidirectional LSTM
MDNN: Multilayer Deep Neural Network
MFNN: Multi-Filters Neural Network
ML: Machine Learning
MLP: Multiple Layer Perceptron
NNRE: Neural Network Regression Ensemble
O-LSTM: Optimal Long Short-Term Memory
PCA: Principal Component Analysis
pSVM: Proportion Support Vector Machines
RBFNN: Radial Basis Function Neural Network
RBM: Restricted Boltzmann Machine
REP: Reduced Error Pruning
RF: Random Forest
RFR: Random Forest Regression
RNN: Recurrent Neural Network
SAE: Stacked Autoencoders
SLR: Stepwise Linear Regressions
SN-CFM: Similarity, Neighborhood-Based Collaborative Filtering Model
STI: Stock Technical Indicators
SVM: Support Vector Machine
SVR: Support Vector Regression
SVRE: Support Vector Regression Ensemble
TDFA: Time-Driven Feature-Aware
TS-GRU: Two-Stream GRU
t-SNE: t-Distributed Stochastic Neighbor Embedding
WA: Wavelet Analysis
WT: Wavelet Transforms
K-NN: k-Nearest Neighbors Algorithm

References

  1. Nosratabadi, S.; Mosavi, A.; Keivani, R.; Ardabili, S.; Aram, F. State of the art survey of deep learning and machine learning models for smart cities and urban sustainability. In Proceedings of the International Conference on Global Research and Education; Springer: Cham, Switzerland, 2019; pp. 228–238. [Google Scholar]
  2. Mittal, S.; Stoean, C.; Kajdacsy-Balla, A.; Bhargava, R. Digital Assessment of Stained Breast Tissue Images for Comprehensive Tumor and Microenvironment Analysis. Front. Bioeng. Biotechnol. 2019, 7, 246. [Google Scholar] [CrossRef] [Green Version]
  3. Stoean, R.; Iliescu, D.; Stoean, C. Segmentation of points of interest during fetal cardiac assesment in the first trimester from color Doppler ultrasound. arXiv 2019, arXiv:1909.11903. [Google Scholar]
  4. Stoean, R.; Stoean, C.; Samide, A.; Joya, G. Convolutional Neural Network Learning Versus Traditional Segmentation for the Approximation of the Degree of Defective Surface in Titanium for Implantable Medical Devices. In Proceedings of the International Work-Conference on Artificial Neural Networks; Springer: Cham, Switzerland, 2019; pp. 871–882. [Google Scholar]
  5. Casalino, G.; Castellano, G.; Consiglio, A.; Liguori, M.; Nuzziello, N.; Primiceri, D. A Predictive Model for MicroRNA Expressions in Pediatric Multiple Sclerosis Detection. In Proceedings of the International Conference on Modeling Decisions for Artificial Intelligence; Springer: Cham, Switzerland, 2017; pp. 177–188. [Google Scholar]
  6. Ardabili, S.; Mosavi, A.; Várkonyi-Kóczy, A.R. Advances in machine learning modeling reviewing hybrid and ensemble methods. In Proceedings of the International Conference on Global Research and Education; Springer: Cham, Switzerland, 2019; pp. 215–227. [Google Scholar]
  7. Nosratabadi, S.; Karoly, S.; Beszedes, B.; Felde, I.; Ardabili, S.; Mosavi, A. Comparative Analysis of ANN-ICA and ANN-GWO for Crop Yield Prediction. In Proceedings of the 2020 RIVF International Conference on Computing and Communication Technologies (RIVF), Ho Chi Minh, Vietnam, 14–15 October 2020. [Google Scholar] [CrossRef]
  8. Torabi, M.; Hashemi, S.; Saybani, M.R.; Shamshirband, S.; Mosavi, A. A Hybrid clustering and classification technique for forecasting short-term energy consumption. Environ. Prog. Sustain. Energy 2019, 38, 66–76. [Google Scholar] [CrossRef] [Green Version]
  9. Mosavi, A.; Edalatifar, M. A hybrid neuro-fuzzy algorithm for prediction of reference evapotranspiration. In Proceedings of the International Conference on Global Research and Education; Springer: Cham, Switzerland, 2018; pp. 235–243. [Google Scholar]
  10. Mosavi, A.; Shamshirband, S.; Salwana, E.; Chau, K.-w.; Tah, J.H. Prediction of multi-inputs bubble column reactor using a novel hybrid model of computational fluid dynamics and machine learning. Eng. Appl. Comput. Fluid Mech. 2019, 13, 482–492. [Google Scholar] [CrossRef]
  11. Torabi, M.; Mosavi, A.; Ozturk, P.; Varkonyi-Koczy, A.; Istvan, V. A hybrid machine learning approach for daily prediction of solar radiation. In Proceedings of the International Conference on Global Research and Education; Springer: Cham, Switzerland, 2018; pp. 266–274. [Google Scholar]
  12. Lee, H.; Li, G.; Rai, A.; Chattopadhyay, A. Real-time anomaly detection framework using a support vector regression for the safety monitoring of commercial aircraft. Adv. Eng. Inf. 2020, 44. [Google Scholar] [CrossRef]
  13. Husejinović, A. Credit card fraud detection using naive Bayesian and c4.5 decision tree classifiers. Period. Eng. Nat. Sci. 2020, 8, 1–5. [Google Scholar]
  14. Zhang, Y. Application of improved BP neural network based on e-commerce supply chain network data in the forecast of aquatic product export volume. Cogn. Sys. Res. 2019, 57, 228–235. [Google Scholar] [CrossRef]
  15. Sundar, G.; Satyanarayana, K. Multi layer feed forward neural network knowledge base to future stock market prediction. Int. J. Innov. Technol. Explor. Eng. 2019, 8, 1061–1075. [Google Scholar] [CrossRef]
  16. Hew, J.J.; Leong, L.Y.; Tan, G.W.H.; Ooi, K.B.; Lee, V.H. The age of mobile social commerce: An Artificial Neural Network analysis on its resistances. Technol. Soc. Chang. 2019, 144, 311–324. [Google Scholar] [CrossRef]
  17. Abdillah, Y.; Suharjito. Failure prediction of e-banking application system using Adaptive Neuro Fuzzy Inference System (ANFIS). Int. J. Electr. Comput. Eng. 2019, 9, 667–675. [Google Scholar] [CrossRef]
  18. Sabaitytė, J.; Davidavičienė, V.; Straková, J.; Raudeliūnienė, J. Decision tree modelling of E-consumers’ preferences for internet marketing communication tools during browsing. E M Ekon. Manag. 2019, 22, 206–224. [Google Scholar] [CrossRef]
  19. Zatevakhina, A.; Dedyukhina, N.; Klioutchnikov, O. Recommender Systems-The Foundation of an Intelligent Financial Platform: Prospects of Development. In Proceedings of the 2019 International Conference on Artificial Intelligence: Applications and Innovations (IC-AIAI), Belgrade, Serbia, 30 September–4 October 2019; pp. 104–1046. [Google Scholar]
  20. Benlahbib, A.; Nfaoui, E.H. A hybrid approach for generating reputation based on opinions fusion and sentiment analysis. J. Organ. Comput. Electron. Commer. 2020, 30, 9–27. [Google Scholar] [CrossRef]
  21. Fujiyoshi, H.; Hirakawa, T.; Yamashita, T. Deep learning-based image recognition for autonomous driving. IATSS Res. 2019. [Google Scholar] [CrossRef]
  22. Mathis, M.W.; Mathis, A. Deep learning tools for the measurement of animal behavior in neuroscience. Curr. Opin. Neurobiol. 2020, 60, 1–11. [Google Scholar] [CrossRef] [PubMed]
  23. Wang, H.; Lei, Z.; Zhang, X.; Zhou, B.; Peng, J. A review of deep learning for renewable energy forecasting. Energy Convers. Manag. 2019, 198, 111799. [Google Scholar] [CrossRef]
  24. Durairaj, M.; Krishna Mohan, B.H. A review of two decades of deep learning hybrids for financial time series prediction. Int. J. Emerg. Technol. 2019, 10, 324–331. [Google Scholar]
  25. Guermoui, M.; Melgani, F.; Gairaa, K.; Mekhalfi, M.L. A comprehensive review of hybrid models for solar radiation forecasting. J. Clean. Prod. 2020, 258, 120357. [Google Scholar] [CrossRef]
  26. Pradeepkumar, D.; Ravi, V. Soft computing hybrids for FOREX rate prediction: A comprehensive review. Comput. Oper. Res. 2018, 99, 262–284. [Google Scholar] [CrossRef]
  27. Hosni, M.; Abnane, I.; Idri, A.; de Gea, J.M.C.; Alemán, J.L.F. Reviewing ensemble classification methods in breast cancer. Comput. Methods Programs Biomed. 2019, 177, 89–112. [Google Scholar] [CrossRef]
  28. Song, X.; Jiao, L.; Yang, S.; Zhang, X.; Shang, F. Sparse coding and classifier ensemble based multi-instance learning for image categorization. Signal Process. 2013, 93, 1–11. [Google Scholar] [CrossRef]
  29. Chung, Y.-W.; Khaki, B.; Li, T.; Chu, C.; Gadh, R. Ensemble machine learning-based algorithm for electric vehicle user behavior prediction. Appl. Energy 2019, 254, 113732. [Google Scholar] [CrossRef]
  30. AlKandari, M.; Ahmad, I. Solar Power Generation Forecasting Using Ensemble Approach Based on Deep Learning and Statistical Methods. Appl. Comput. Inform. 2020. [Google Scholar] [CrossRef]
  31. Liberati, A.; Altman, D.G.; Tetzlaff, J.; Mulrow, C.; Gøtzsche, P.C.; Ioannidis, J.P.; Clarke, M.; Devereaux, P.J.; Kleijnen, J.; Moher, D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. J. Clin. Epidemiol. 2009, 62, e1–e34. [Google Scholar] [CrossRef] [Green Version]
  32. Moon, K.S.; Kim, H. Performance of deep learning in prediction of stock market volatility. Econ. Comput. Econ. Cybern. Stud. Res. 2019, 53, 77–92. [Google Scholar] [CrossRef]
  33. Fischer, T.; Krauss, C. Deep learning with long short-term memory networks for financial market predictions. Eur. J. Oper. Res. 2018, 270, 654–669. [Google Scholar] [CrossRef] [Green Version]
  34. Tamura, K.; Uenoyama, K.; Iitsuka, S.; Matsuo, Y. Model for evaluation of stock values by ensemble model using deep learning. Trans. Jpn. Soc. Artif. Intell. 2018, 33. [Google Scholar] [CrossRef] [Green Version]
  35. Wang, W.; Li, W.; Zhang, N.; Liu, K. Portfolio formation with preselection using deep learning from long-term financial data. Expert Syst. Appl. 2020, 143, 113042. [Google Scholar] [CrossRef]
  36. Fister, D.; Mun, J.C.; Jagric, V.; Jagric, T. Deep learning for stock market trading: A superior trading strategy? Neural Netw. World 2019, 29, 151–171. [Google Scholar] [CrossRef] [Green Version]
  37. Stoean, C.; Paja, W.; Stoean, R.; Sandita, A. Deep architectures for long-term stock price prediction with a heuristic-based strategy for trading simulations. PLoS ONE 2019, 14, e0223593. [Google Scholar] [CrossRef] [Green Version]
  38. Agrawal, M.; Khan, A.U.; Shukla, P.K. Stock price prediction using technical indicators: A predictive model using optimal deep learning. Int. J. Recent Technol. Eng. 2019, 8, 2297–2305. [Google Scholar] [CrossRef]
  39. Bao, W.; Yue, J.; Rao, Y. A deep learning framework for financial time series using stacked autoencoders and long-short term memory. PLoS ONE 2017, 12. [Google Scholar] [CrossRef] [Green Version]
  40. Yan, H.; Ouyang, H. Financial Time Series Prediction Based on Deep Learning. Wirel. Pers. Commun. 2018, 102, 683–700. [Google Scholar] [CrossRef]
  41. Vo, N.N.; He, X.; Liu, S.; Xu, G. Deep learning for decision making and the optimization of socially responsible investments and portfolio. Decis. Support Syst. 2019, 124, 113097. [Google Scholar] [CrossRef]
  42. Fang, Y.; Chen, J.; Xue, Z. Research on quantitative investment strategies based on deep learning. Algorithms 2019, 12, 35. [Google Scholar] [CrossRef] [Green Version]
  43. Lei, K.; Zhang, B.; Li, Y.; Yang, M.; Shen, Y. Time-driven feature-aware jointly deep reinforcement learning for financial signal representation and algorithmic trading. Expert Syst. Appl. 2020, 140. [Google Scholar] [CrossRef]
  44. Sabeena, J.; Venkata Subba Reddy, P. A modified deep learning enthused adversarial network model to predict financial fluctuations in stock market. Int. J. Eng. Adv. Technol. 2019, 8, 2996–3000. [Google Scholar] [CrossRef]
  45. Das, S.; Mishra, S. Advanced deep learning framework for stock value prediction. Int. J. Innov. Technol. Explor. Eng. 2019, 8, 2358–2367. [Google Scholar] [CrossRef]
  46. Go, Y.H.; Hong, J.K. Prediction of stock value using pattern matching algorithm based on deep learning. Int. J. Recent Technol. Eng. 2019, 8, 31–35. [Google Scholar] [CrossRef]
  47. Gonçalves, R.; Ribeiro, V.M.; Pereira, F.L.; Rocha, A.P. Deep learning in exchange markets. Inf. Econ. Policy 2019, 47, 38–51. [Google Scholar] [CrossRef]
  48. Moews, B.; Herrmann, J.M.; Ibikunle, G. Lagged correlation-based deep learning for directional trend change prediction in financial time series. Expert Syst. Appl. 2019, 120, 197–206. [Google Scholar] [CrossRef] [Green Version]
  49. Song, Y.; Lee, J.W.; Lee, J. A study on novel filtering and relationship between input-features and target-vectors in a deep learning model for stock price prediction. Appl. Intell. 2019, 49, 897–911. [Google Scholar] [CrossRef]
  50. Long, W.; Lu, Z.; Cui, L. Deep learning-based feature engineering for stock price movement prediction. Knowl. Based Syst. 2019, 164, 163–173. [Google Scholar] [CrossRef]
  51. Rajesh, P.; Srinivas, N.; Vamshikrishna Reddy, K.; VamsiPriya, G.; Vakula Dwija, M.; Himaja, D. Stock trend prediction using Ensemble learning techniques in python. Int. J. Innov. Technol. Explor. Eng. 2019, 8, 150–155. [Google Scholar]
  52. Sim, H.S.; Kim, H.I.; Ahn, J.J. Is Deep Learning for Image Recognition Applicable to Stock Market Prediction? Complexity 2019, 2019. [Google Scholar] [CrossRef]
  53. Agrawal, M.; Khan, A.U.; Shukla, P.K. Stock indices price prediction based on technical indicators using deep learning model. Int. J. Emerg. Technol. 2019, 10, 186–194. [Google Scholar]
  54. Tashiro, D.; Matsushima, H.; Izumi, K.; Sakaji, H. Encoding of high-frequency order information and prediction of short-term stock price by deep learning. Quant. Financ. 2019, 19, 1499–1506. [Google Scholar] [CrossRef]
  55. Sirignano, J.; Cont, R. Universal features of price formation in financial markets: Perspectives from deep learning. Quant. Financ. 2019, 19, 1449–1459. [Google Scholar] [CrossRef]
  56. Weng, B.; Lu, L.; Wang, X.; Megahed, F.M.; Martinez, W. Predicting short-term stock prices using ensemble methods and online data sources. Expert Syst. Appl. 2018, 112, 258–273. [Google Scholar] [CrossRef]
  57. Preeti; Dagar, A.; Bala, R.; Singh, R.P. Financial time series forecasting using deep learning network. In Communications in Computer and Information Science; Springer: Heidelberg, Germany, 2019; Volume 899, pp. 23–33. [Google Scholar]
  58. Sohangir, S.; Wang, D.; Pomeranets, A.; Khoshgoftaar, T.M. Big Data: Deep Learning for financial sentiment analysis. J. Big Data 2018, 5. [Google Scholar] [CrossRef] [Green Version]
  59. Lien Minh, D.; Sadeghi-Niaraki, A.; Huy, H.D.; Min, K.; Moon, H. Deep learning approach for short-term stock trends prediction based on two-stream gated recurrent unit network. IEEE Access 2018, 6, 55392–55404. [Google Scholar] [CrossRef]
  60. Das, S.R.; Mokashi, K.; Culkin, R. Are markets truly efficient? Experiments using deep learning algorithms for market movement prediction. Algorithms 2018, 11, 138. [Google Scholar] [CrossRef] [Green Version]
  61. Kim, J.J.; Cha, S.H.; Cho, K.H.; Ryu, M. Deep reinforcement learning based multi-agent collaborated network for distributed stock trading. Int. J. Grid Distrib. Comput. 2018, 11, 11–20. [Google Scholar] [CrossRef]
  62. Faghihi-Nezhad, M.-T.; Minaei-Bidgoli, B. Prediction of Stock Market Using an Ensemble Learning-based Intelligent Model. Ind. Eng. Manag. Syst. 2018, 17, 479–496. [Google Scholar] [CrossRef]
  63. Chong, E.; Han, C.; Park, F.C. Deep learning networks for stock market analysis and prediction: Methodology, data representations, and case studies. Expert Syst. Appl. 2017, 83, 187–205. [Google Scholar] [CrossRef] [Green Version]
  64. Dingli, A.; Fournier, K.S. Financial time series forecasting: A deep learning approach. Int. J. Mach. Learn. Comput. 2017, 7, 118–122. [Google Scholar] [CrossRef]
  65. Singh, R.; Srivastava, S. Stock prediction using deep learning. Multimed. Tools Appl. 2017, 76, 18569–18584. [Google Scholar] [CrossRef]
  66. Shekhar, S.; Varshney, N. A hybrid GA-SVM and sentiment analysis for forecasting stock market movement direction. Test Eng. Manag. 2020, 82, 64–72. [Google Scholar]
  67. Ahmadi, E.; Jasemi, M.; Monplaisir, L.; Nabavi, M.A.; Mahmoodi, A.; Amini Jam, P. New efficient hybrid candlestick technical analysis model for stock market timing on the basis of the Support Vector Machine and Heuristic Algorithms of Imperialist Competition and Genetic. Expert Syst. Appl. 2018, 94, 21–31. [Google Scholar] [CrossRef]
  68. Ebadati, O.M.E.; Mortazavi, M.T. An efficient hybrid machine learning method for time series stock market forecasting. Neural Netw. World 2018, 28, 41–55. [Google Scholar] [CrossRef] [Green Version]
  69. Johari, S.N.M.; Farid, F.H.M.; Nasrudin, N.A.E.B.; Bistamam, N.S.L.; Shuhaili, N.S.S.M. Predicting stock market index using hybrid intelligence model. Int. J. Eng. Technol. (UAE) 2018, 7, 36–39. [Google Scholar] [CrossRef] [Green Version]
  70. Ghamisi, P.; Höfle, B.; Zhu, X.X. Hyperspectral and LiDAR data fusion using extinction profiles and deep convolutional neural network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 10, 3011–3024. [Google Scholar] [CrossRef]
  71. Chen, Y.; Jiang, H.; Li, C.; Jia, X.; Ghamisi, P. Deep feature extraction and classification of hyperspectral images based on convolutional neural networks. IEEE Trans. Geosci. Remote Sens. 2016, 54, 6232–6251. [Google Scholar] [CrossRef] [Green Version]
  72. Duan, P.; Kang, X.; Li, S.; Ghamisi, P. Multichannel Pulse-Coupled Neural Network-Based Hyperspectral Image Visualization. IEEE Trans. Geosci. Remote Sens. 2019, 58, 2444–2456. [Google Scholar] [CrossRef]
  73. Li, S.; Song, W.; Fang, L.; Chen, Y.; Ghamisi, P.; Benediktsson, J.A. Deep learning for hyperspectral image classification: An overview. IEEE Trans. Geosci. Remote Sens. 2019, 57, 6690–6709. [Google Scholar] [CrossRef] [Green Version]
  74. Paolanti, M.; Romeo, L.; Martini, M.; Mancini, A.; Frontoni, E.; Zingaretti, P. Robotic retail surveying by deep learning visual and textual data. Robot. Auton. Syst. 2019, 118, 179–188. [Google Scholar] [CrossRef]
  75. Dingli, A.; Marmara, V.; Fournier, N.S. Comparison of deep learning algorithms to predict customer churn within a local retail industry. Int. J. Mach. Learn. Comput. 2017, 7, 128–132. [Google Scholar] [CrossRef]
  76. Ładyżyński, P.; Żbikowski, K.; Gawrysiak, P. Direct marketing campaigns in retail banking with the use of deep learning and random forests. Expert Syst. Appl. 2019, 134, 28–35. [Google Scholar] [CrossRef]
  77. Ullah, I.; Raza, B.; Malik, A.K.; Imran, M.; Islam, S.U.; Kim, S.W. A churn prediction model using random forest: Analysis of machine learning techniques for churn prediction and factor identification in telecom sector. IEEE Access 2019, 7, 60134–60149. [Google Scholar] [CrossRef]
  78. Agarwal, S. Deep Learning-based Sentiment Analysis: Establishing Customer Dimension as the Lifeblood of Business Management. Glob. Bus. Rev. 2019. [Google Scholar] [CrossRef]
  79. Shamshirband, S.; Khader, J.; Gani, S. Predicting consumer preferences in electronic market based on IoT and Social Networks using deep learning based collaborative filtering techniques. Electron. Commer. Res. 2019. [Google Scholar] [CrossRef]
  80. Lei, Z. Research and analysis of deep learning algorithms for investment decision support model in electronic commerce. Electron. Commer. Res. 2019. [Google Scholar] [CrossRef]
  81. Leung, K.H.; Choy, K.L.; Ho, G.T.S.; Lee, C.K.M.; Lam, H.Y.; Luk, C.C. Prediction of B2C e-commerce order arrival using hybrid autoregressive-adaptive neuro-fuzzy inference system (AR-ANFIS) for managing fluctuation of throughput in e-fulfilment centres. Expert Syst. Appl. 2019, 134, 304–324. [Google Scholar] [CrossRef]
  82. Cai, Q.; Filos-Ratsikas, A.; Tang, P.; Zhang, Y. Reinforcement Mechanism Design for e-commerce. In Proceedings of the 2018 World Wide Web Conference, Lyon, France, 23–27 April 2018; pp. 1339–1348. [Google Scholar]
  83. Ha, J.-W.; Pyo, H.; Kim, J. Large-scale item categorization in e-commerce using multiple recurrent neural networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13 August 2016; pp. 107–115. [Google Scholar]
  84. Xu, Y.Z.; Zhang, J.L.; Hua, Y.; Wang, L.Y. Dynamic credit risk evaluation method for e-commerce sellers based on a hybrid artificial intelligence model. Sustainability (Switz.) 2019, 11, 5521. [Google Scholar] [CrossRef] [Green Version]
  85. Saravanan, V.; Sathya Charanya, C. E-commerce product classification using lexical based hybrid feature extraction and SVM. Int. J. Innov. Technol. Explor. Eng. 2019, 9, 1885–1891. [Google Scholar] [CrossRef]
  86. Wang, Y.; Mo, D.Y.; Tseng, M.M. Mapping customer needs to design parameters in the front end of product design by applying deep learning. CIRP Ann. 2018, 67, 145–148. [Google Scholar] [CrossRef]
  87. Wu, C.; Yan, M. Session-aware information embedding for e-commerce product recommendation. In Proceedings of the 2017 ACM on Conference on Information and Knowledge Management, Singapore, 19 July 2017; pp. 2379–2382. [Google Scholar]
  88. Lahmiri, S.; Bekiros, S. Cryptocurrency forecasting with deep learning chaotic neural networks. Chaos Solitons Fractals 2019, 118, 35–40. [Google Scholar] [CrossRef]
  89. Altan, A.; Karasu, S.; Bekiros, S. Digital currency forecasting with chaotic meta-heuristic bio-inspired signal processing techniques. Chaos Solitons Fractals 2019, 126, 325–336. [Google Scholar] [CrossRef]
  90. Jiang, Z.; Liang, J. Cryptocurrency portfolio management with deep reinforcement learning. In Proceedings of the 2017 Intelligent Systems Conference (IntelliSys), London, UK, 7–8 September 2017; pp. 905–913. [Google Scholar]
  91. Chen, Z.; Chen, W.; Shi, Y. Ensemble learning with label proportions for bankruptcy prediction. Expert Syst. Appl. 2020, 146, 113155. [Google Scholar] [CrossRef]
  92. Lin, W.C.; Lu, Y.H.; Tsai, C.F. Feature selection in single and ensemble learning-based bankruptcy prediction models. Expert Syst. 2019, 36, e12335. [Google Scholar] [CrossRef] [Green Version]
  93. Lahmiri, S.; Bekiros, S.; Giakoumelou, A.; Bezzina, F. Performance assessment of ensemble learning systems in financial data classification. Intell. Syst. Account. Financ. Manag. 2020, 1–7. [Google Scholar] [CrossRef]
  94. Faris, H.; Abukhurma, R.; Almanaseer, W.; Saadeh, M.; Mora, A.M.; Castillo, P.A.; Aljarah, I. Improving financial bankruptcy prediction in a highly imbalanced class distribution using oversampling and ensemble learning: A case from the Spanish market. Prog. Artif. Intell. 2019, 9, 1–23. [Google Scholar] [CrossRef]
  95. Hinton, G.E.; Salakhutdinov, R.R. Reducing the dimensionality of data with neural networks. Science 2006, 313, 504–507. [Google Scholar] [CrossRef] [Green Version]
  96. Zhang, Q.; Yang, L.T.; Chen, Z.; Li, P. A survey on deep learning for big data. Inf. Fusion 2018, 42, 146–157. [Google Scholar] [CrossRef]
  97. Oláh, J.; Krisán, E.; Kiss, A.; Lakner, Z.; Popp, J. PRISMA Statement for Reporting Literature Searches in Systematic Reviews of the Bioethanol Sector. Energies 2020, 13, 2323. [Google Scholar] [CrossRef]
Figure 1. Rapid rise in the applications of data science in economics.
Figure 2. Diagram of the systematic selection, evaluation, and quality control of the database using the Prisma model.
Figure 3. Notable methods of deep learning and hybrid deep learning models applied in economics-related fields; the size of the rectangle is proportional to the number of publications (source: WoS).
Figure 4. The structure of the long short-term memory (LSTM) network.
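The LSTM structure depicted in Figure 4 can be summarized with a minimal code sketch. The look-back window, layer sizes, and training settings below are illustrative assumptions only and do not reproduce any of the reviewed models; the data arrays are placeholders.

```python
# Minimal sketch of an LSTM regressor for a univariate financial time series (assumed settings).
import numpy as np
import tensorflow as tf

WINDOW = 30                                            # assumed look-back window length
X = np.random.rand(500, WINDOW, 1).astype("float32")   # placeholder sliding windows of past values
y = np.random.rand(500, 1).astype("float32")           # placeholder next-step targets

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(WINDOW, 1)),  # gated memory cells over the window
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                           # predicted next value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```

The gated memory cells retain information across the whole look-back window, which is the property exploited by the time-series studies listed in Table 2.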
Figure 5. Structure of the deep neural network.
Figure 6. Structure of the convolutional neural network (CNN).
Figure 7. Taxonomy of data science methods (machine learning and deep learning) in economics.
Figure 8. Comparison of root-mean-squared error (RMSE) values of hybrid deep learning models and single deep learning models.
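For reference, the root-mean-squared error reported in Figure 8 is defined over n predictions and their observed values as

RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^2}

where \hat{y}_i is the predicted value, y_i the observed value, and lower values indicate more accurate predictions.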
Table 1. Examples of notable classic machine learning methods applied in economics-related fields.
Sources | Machine Learning Models | Objectives
Lee et al. [12] | Support Vector Regression (SVR) | Anomaly Detection
Husejinović [13] | Naive Bayesian and C4.5 Decision Tree Classifiers | Credit Card Fraud Detection
Zhang [14] | Improved BP Neural Network | Aquatic Product Export Volume Prediction
Sundar and Satyanarayana [15] | Multilayer Feed Forward Neural Network | Stock Price Prediction
Hew et al. [16] | Artificial Neural Network (ANN) | Mobile Social Commerce
Abdillah and Suharjito [17] | Adaptive Neuro-Fuzzy Inference System (ANFIS) | E-Banking Failure
Sabaitytė et al. [18] | Decision Tree (DT) | Customer Behavior
Zatevakhina, Dedyukhina, and Klioutchnikov [19] | Deep Neural Network (DNN) | Recommender Systems
Benlahbib and Nfaoui [20] | Naïve Bayes and Linear Support Vector Machine (LSVM) | Sentiment Analysis
Table 2. Notable machine learning and deep learning methods in the stock market.
Source | Modeling Methods | Data Source | Research Objective
Wang et al. [35] | LSTM, comparing with SVM, RF, DNN, and ARIMA | Financial Time Series | Portfolio management
Lei et al. [43] | Time-driven feature-aware DRL (TDFA-DRL) | Financial Time Series | Algorithmic trading
Vo et al. [41] | Multivariate Bidirectional LSTM, comparing with DRL | Financial Time Series | Socially Responsible Investment Portfolios
Sabeena and Venkata Subba Reddy [44] | GRU-CNN | Financial Time Series | Stock Price Prediction
Das and Mishra [45] | Adam optimizer-MDNN | Financial Time Series | Stock Price Prediction
Go and Hong [46] | DNN | Financial Time Series | Stock Price Prediction
Agrawal et al. [38] | O-LSTM-STI | Financial Time Series | Stock Price Prediction
Gonçalves et al. [47] | CNN, comparing with DNNC and LSTM | Financial Time Series | Stock Price Prediction
Moews et al. [48] | DNN-SLR | Financial Time Series | Stock Price Prediction
Song et al. [49] | DNN | Financial Time Series | Stock Price Prediction
Fang et al. [42] | LSTM-SVR, comparing with RF and LSTM | Financial Time Series | Exchange-Traded Fund (ETF) Options Price Prediction
Long et al. [50] | MFNN (CNN and RNN) | Financial Time Series | Stock Price Prediction
Fister et al. [36] | LSTM | Financial Time Series | Automated Stock Trading
Rajesh et al. [51] | RF, SVM, and KNN | Financial Time Series | Stock Price Prediction
Moon and Kim [32] | LSTM | Financial Time Series | Stock Price Prediction
Sim, Kim, and Ahn [52] | CNN, comparing with ANN and SVM | Financial Time Series | Stock Price Prediction
Agrawal et al. [53] | LSTM-STIs | Financial Time Series | Stock Price Prediction
Tashiro et al. [54] | CNN | Financial Time Series | Stock Price Prediction
Sirignano and Cont [55] | LSDL | Financial Time Series | Stock Price Prediction
Weng et al. [56] | BRT, comparing with NNRE, SVRE, and RFR | Financial Time Series | Stock Price Prediction
Preeti et al. [57] | ELM-AE, comparing with GARCH, GRNN, MLP, RF, and GMDH | Financial Time Series | Stock Price Prediction
Sohangir et al. [58] | CNN, comparing with doc2vec and LSTM | Social Media | Sentiment Analysis
Fischer and Krauss [33] | LSTM, comparing with RF, DNN, and LOG | Financial Time Series | Stock Price Prediction
Lien Minh et al. [59] | Two-stream GRU | Financial News | Sentiment Analysis
Das et al. [60] | DNN | Financial Time Series | The S&P 500 Index Trend Prediction
Yan and Ouyang [40] | Wavelet analysis with LSTM, comparing with SVM, KNN, and MLP | Financial Time Series | Stock Price Prediction
Kim et al. [61] | MACN | Financial Time Series | Stock Price Prediction
Faghihi-Nezhad and Minaei-Bidgoli [62] | EL-ANN | Financial Time Series | Stock Price Prediction
Tamura et al. [34] | LSTM | Financial Time Series | Stock Price Prediction
Chong et al. [63] | DNN, comparing with PCA, Autoencoder, and RBM | Financial Time Series | Stock Price Prediction
Dingli and Fournier [64] | CNN | Financial Time Series | Stock Price Prediction
Singh and Srivastava [65] | (2D)2PCA-DNN, comparing with RBFNN | Financial Time Series | Stock Price Prediction
Bao et al. [39] | WT-SAEs-LSTM | Financial Time Series | Stock Price Prediction
Shekhar and Varshney [66] | GA-SVM | Financial Time Series | Stock Price Prediction
Ahmadi et al. [67] | ICA-SVM | Financial Time Series | Stock Price Prediction
Ebadati and Mortazavi [68] | GA-ANN | Financial Time Series | Stock Price Prediction
Johari et al. [69] | GARCH-SVM | Financial Time Series | Stock Price Prediction
Table 3. Classification of articles using data science by research purpose and data source in the stock market section.
Research Objective | Data Source | Number of Documents
Stock Price Prediction | Financial Time Series | 29
Sentiment Analysis | Financial News, Social Media | 2
Portfolio management | Financial Time Series | 1
Algorithmic trading | Financial Time Series | 1
Socially Responsible Investment Portfolios | Financial Time Series | 1
Automated Stock Trading | Financial Time Series | 1
The S&P 500 Index Trend Prediction | Financial Time Series | 1
Exchange-Traded Fund (ETF) Options Price Prediction | Financial Time Series | 1
Table 4. Notable machine learning and deep learning methods in marketing.
Source | Modeling Methods | Data Source | Research Objective
Ładyżyński et al. [76] | RF-DNN | Time Series Data of Customers | Customer Behavior
Ullah et al. [77] | RF | Time Series Data of Customers | Customer Behavior
Paolanti et al. [74] | DCNN | Primary Data | Detection of Shelf Out of Stock (SOOS) and Promotional Activities
Agarwal [78] | RNNs-CNNs | Social Media | Sentiment Analysis
Shamshirband et al. [79] | SN-CFM | Social Media | Customer Behavior
Dingli et al. [75] | RBM | Primary Data | Customer Behavior
Table 5. Notable machine learning and deep learning methods in e-commerce.
Source | Modeling Methods | Data Source | Research Objective
Lei [80] | GRU | Financial Time Series | Investment Quality Evaluation Model
Leung et al. [81] | AR-ANFIS | Primary Data | Order Arrival Prediction
Cai et al. [82] | GRU | Customers Time Series | Impression Allocation Problem
Ha et al. [83] | RNN | Primary Data | Item Categorization
Xu et al. [84] | DT-ANN | Credit Data | Dynamic Credit Risk Evaluation
Saravanan and Charanya [85] | PCA-t-SNE-SVM | Primary Data | Product Recommendation
Wang et al. [86] | RNN | Primary Data | Product Recommendation
Wu and Yan [87] | LWDNN | Customers Time Series | Product Recommendation
Table 6. Notable machine learning and deep learning methods in cryptocurrency.
Source | Modeling Methods | Data Source | Research Objective
Lahmiri and Bekiros [88] | LSTM, comparing with GRNN | Financial Time Series | Cryptocurrency Price Prediction
Altan et al. [89] | LSTM-EWT | Financial Time Series | Cryptocurrency Price Prediction
Jiang and Liang [90] | CNN | Financial Time Series | Cryptocurrency Price Prediction
Table 7. Notable machine learning and deep learning methods in corporate bankruptcy prediction.
Source | Modeling Methods | Data Source | Research Objective
Chen et al. [91] | Bagged-pSVM and Boosted-pSVM | UCI and LibSVM datasets | Bankruptcy Prediction
Lin et al. [92] | Genetic Algorithm with the Naïve Bayes and SVM classifiers | Australian credit, German credit, and Taiwan bankruptcy datasets | Bankruptcy Prediction
Lahmiri et al. [93] | AdaBoost | University of California Irvine (UCI) Machine Learning Repository | Bankruptcy Prediction
Faris et al. [94] | SMOTE-AdaBoost-REP Tree | Infotel database | Bankruptcy Prediction
Table 8. List of single deep learning methods employed in economics-related fields.
Method | Stock Market | Marketing | Cryptocurrency | E-Commerce
LSTM | Moon and Kim [32], Fischer and Krauss [33], Tamura et al. [34], Wang et al. [35], Fister et al. [36] | | Lahmiri and Bekiros [88] |
CNN | Gonçalves et al. [47], Sim, Kim, and Ahn [52], Tashiro et al. [54], Sohangir et al. [58], Dingli and Fournier [64] | | Jiang and Liang [90] |
DNN | Go and Hong [46], Song et al. [49], Das et al. [60], Chong et al. [63] | | |
GRU | | | | Lei [80], Cai et al. [82]
RNN | | | | Ha et al. [83], Wang et al. [86]
LSDL | Sirignano and Cont [55] | | |
MACN | Kim et al. [61] | | |
DCNN | | Paolanti et al. [74] | |
RBM | | Dingli et al. [75] | |
Table 9. List of hybrid deep learning models employed in economics-related fields.
Application | The Hybrid Method | Source
Stock Market | TDFA-DRL | Lei et al. [43]
Stock Market | MB-LSTM | Vo et al. [41]
Stock Market | GRU-CNN | Sabeena and Venkata Subba Reddy [44]
Stock Market | AO-MDNN | Das and Mishra [45]
Stock Market | O-LSTM-STI | Agrawal et al. [38]
Stock Market | DNN-SLR | Moews et al. [48]
Stock Market | LSTM-SVR | Fang et al. [42]
Stock Market | MFNN (CNN and RNN) | Long et al. [50]
Stock Market | LSTM-STIs | Agrawal et al. [53]
Stock Market | ELM-AE | Preeti et al. [57]
Stock Market | TS-GRU | Lien Minh et al. [59]
Stock Market | WA-LSTM | Yan and Ouyang [40]
Stock Market | (2D)2PCA-DNN | Singh and Srivastava [65]
Stock Market | WT-SAEs-LSTM | Bao et al. [39]
Marketing | RNNs-CNNs | Agarwal [78]
Marketing | SN-CFM | Shamshirband et al. [79]
E-commerce | LWDNN | Wu and Yan [87]
Cryptocurrency | LSTM-EWT | Altan et al. [89]
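The hybrid deep learning models in Table 9 typically chain two architectures, for example convolutional feature extraction followed by recurrent sequence modelling, as in the MFNN (CNN and RNN) of Long et al. [50]. The sketch below is a generic Conv1D-LSTM stack under assumed layer sizes and placeholder data; it is an illustration of the hybrid idea, not a reimplementation of any cited model.

```python
# Generic hybrid deep learning sketch: 1-D convolution for local patterns, LSTM for temporal dependence.
import numpy as np
import tensorflow as tf

WINDOW, FEATURES = 60, 5                                    # assumed look-back window and indicator count
X = np.random.rand(400, WINDOW, FEATURES).astype("float32") # placeholder windows of technical indicators
y = (np.random.rand(400) > 0.5).astype("float32")           # placeholder up/down movement labels

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu", input_shape=(WINDOW, FEATURES)),
    tf.keras.layers.MaxPooling1D(2),                        # downsample the convolutional feature maps
    tf.keras.layers.LSTM(32),                               # model the remaining temporal structure
    tf.keras.layers.Dense(1, activation="sigmoid"),         # predicted direction of price movement
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```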
Table 10. List of hybrid machine learning models employed in economics-related fields.
Application | The Hybrid Method | Source
Stock Market | GA-SVM | Shekhar and Varshney [66]
Stock Market | ICA-SVM | Ahmadi et al. [67]
Stock Market | GA-ANN | Ebadati and Mortazavi [68]
Stock Market | GARCH-SVM | Johari et al. [69]
E-commerce | AR-ANFIS | Leung et al. [81]
E-commerce | DT-ANN | Xu et al. [84]
E-commerce | PCA-t-SNE-SVM | Saravanan and Charanya [85]
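Several hybrids in Table 10 pair a metaheuristic optimizer with a classical learner, e.g., GA-SVM and ICA-SVM. The following sketch shows the general pattern with a deliberately small genetic search over the SVM hyperparameters C and gamma; the population size, mutation scale, bounds, and synthetic data are assumptions for illustration and do not reflect the cited implementations.

```python
# Sketch of a metaheuristic-tuned SVM (GA-SVM style): tiny genetic search over (C, gamma).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)  # placeholder data

def fitness(genes):
    C, gamma = 10.0 ** genes[0], 10.0 ** genes[1]          # genes encode log10 of the hyperparameters
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

pop = rng.uniform(low=[-2, -4], high=[3, 1], size=(10, 2)) # initial population of log10(C), log10(gamma)
for generation in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-4:]]                 # keep the fittest individuals
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        child = (a + b) / 2 + rng.normal(scale=0.3, size=2)  # crossover (average) plus Gaussian mutation
        children.append(np.clip(child, [-2, -4], [3, 1]))
    pop = np.vstack([parents] + children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best C=%.3g, gamma=%.3g" % (10.0 ** best[0], 10.0 ** best[1]))
```

In the same spirit, the optimizer can be swapped for ICA or another search strategy without changing the surrounding structure, which is what distinguishes this hybrid class from the ensembles in Table 11.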
Table 11. List of ensemble models applied in the database of the current study.
Application | The Ensemble Method | Source
Stock Market | RF-SVM-K-neighbors | Rajesh et al. [51]
Stock Market | NNRE | Weng et al. [56]
Stock Market | ANN-EL | Faghihi-Nezhad and Minaei-Bidgoli [62]
Corporate Bankruptcy | Bagged-pSVM and Boosted-pSVM | Chen et al. [91]
Corporate Bankruptcy | Genetic Algorithm with the Naïve Bayes and SVM | Lin et al. [92]
Corporate Bankruptcy | AdaBoost | Lahmiri et al. [93]
Corporate Bankruptcy | SMOTE-AdaBoost-REP Tree | Faris et al. [94]
Marketing | RF-DNN | Ładyżyński et al. [76]
Marketing | RF | Ullah et al. [77]
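The ensemble class in Table 11 combines several base learners into one predictor, as in the RF-SVM-K-neighbors combination of Rajesh et al. [51]. A minimal sketch of this idea using soft voting over random forest, SVM, and KNN classifiers is shown below; the dataset and base-learner settings are placeholders, not those of the cited studies.

```python
# Sketch of a voting ensemble in the spirit of the RF-SVM-K-neighbors combination.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=15, random_state=0)  # placeholder features/labels

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="soft",  # average predicted probabilities across the base learners
)
print("CV accuracy: %.3f" % cross_val_score(ensemble, X, y, cv=5).mean())
```

Soft voting is used here so that each learner contributes a probability rather than a hard label; bagging and boosting variants such as those in the corporate bankruptcy rows follow the same ensemble principle with different combination rules.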
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
