Article

A Novel Photovoltaic Power Prediction Method Based on a Long Short-Term Memory Network Optimized by an Improved Sparrow Search Algorithm

College of Information Science and Technology, Donghua University, 2999 North Renmin Road, Shanghai 201620, China
* Author to whom correspondence should be addressed.
Electronics 2024, 13(5), 993; https://doi.org/10.3390/electronics13050993
Submission received: 7 January 2024 / Revised: 2 February 2024 / Accepted: 6 February 2024 / Published: 6 March 2024
(This article belongs to the Section Artificial Intelligence)

Abstract

Photovoltaic (PV) power prediction plays a significant role in supporting the stable operation and resource scheduling of integrated energy systems. However, the randomness and volatility of photovoltaic power generation greatly affect prediction accuracy. Focusing on this issue, a prediction framework is proposed in this research by developing an improved sparrow search algorithm (ISSA) to optimize the hyperparameters of long short-term memory (LSTM) neural networks. The ISSA is specially designed in the following three aspects to deliver a powerful search performance. Firstly, the initial population diversity is enriched by using an enhanced sine chaotic mapping. Secondly, the relative position of neighboring producers is introduced into the producer position-updating strategy to enhance the global search capability. Thirdly, the Cauchy–Gaussian mutation is utilized to help the algorithm escape local optima. Numerical experiments on 20 test functions indicate that ISSA can identify the optimal solution with better precision than the SSA and PSO algorithms. Furthermore, a comparative study of PV power prediction methods is provided. The ISSA-LSTM algorithm developed in this paper and five benchmark models are implemented on a real dataset gathered from the Alice Springs area in Australia. In contrast to the SSA-LSTM model, most MAE, MAPE, and RMSE values of the proposed model are reduced by 20∼60%, demonstrating the superiority of the proposed model under various weather conditions and typical seasons.

1. Introduction

Due to the growing scarcity of conventional energy sources and the deterioration of the worldwide environment, renewable energy sources are becoming increasingly important [1]. The International Renewable Energy Agency (IRENA) estimates that photovoltaic (PV) power will account for 25% of worldwide electricity generation, and that PV will lead the transformation of the global power industry [2]. The grid integration of PV is prone to various problems such as over- and under-voltage, reverse current, and frequency fluctuations [3]. Therefore, it is crucial for the power industry to make accurate predictions of PV output power.
The primary PV power forecast models include physical models, statistical models, and artificial intelligence (AI) models [4]. Physical models typically use intricate modeling techniques and depend on a wide range of environmental parameters [5]. Statistical models have also been developed, such as exponential smoothing, the autoregressive moving average (ARMA) [6], and the autoregressive integrated moving average (ARIMA) [7], but they are not suitable for predicting nonlinear data. Compared with statistical models, AI models have a greater capacity for self-learning and can optimize model parameters to improve prediction accuracy; thus, they are widely used in short-term PV power prediction. In past decades, some early AI techniques have been introduced, including support vector machines (SVMs) [8], multilayer perceptron neural networks (MLPNNs) [9], and so on.
Recently, recurrent neural networks (RNNs) have been widely used to solve time series prediction problems. The vanishing-gradient problem of RNNs can be solved by the long short-term memory (LSTM) variant [10], which can also be used in photovoltaic power prediction. The prediction accuracy depends on the network structure, the hyperparameters of the LSTM, and the input data. As an example of structural optimization for LSTM, the Bi-LSTM neural network was combined with the generative adversarial network in [11]. Data preprocessing is also an effective way to improve a model’s accuracy. For instance, the clearness index was introduced into the raw data in [12], and the wavelet transform was applied to a multi-layered LSTM PV prediction model in [13]. Compared with the studies mentioned above, less attention has been paid to the optimization of the hyperparameters, which are usually selected based on manual adjustment or experience. This severely restricts the optimal accuracy of the model [14].
Recently, some swarm intelligence algorithms have gained popularity as effective search tools for hyperparameter optimization, including marine predators [15], firefly colonies [16], and bird flocks, due to their characteristics of self-organization, parallel operation, flexibility, and robustness [17]. For example, the traditional particle swarm algorithm (PSO) is used to adjust the parameters of BPNN for PV power prediction [18]. In [19], the artificial bee colony (ABC) algorithm was used to optimize the SVM model to predict the load. However, traditional intelligence algorithms exhibit some inherent shortcomings, such as the well-known tendency of PSO to become trapped in local optima and the slow convergence rate of the ABC algorithm [20].
The sparrow search algorithm (SSA) is a novel swarm intelligence optimization algorithm introduced in [21]. Its benefits include fewer parameters to be regulated and better optimization search performance. This has promoted its wide application in many scenarios, such as 3D route planning for unmanned aerial vehicles [22], economic optimization of microgrid clusters [23], and threshold segmentation for image processing [24]. However, the accuracy of its optimization search is still constrained by the SSA’s reliance on the initial population, its tendency to become trapped in local optima, and other flaws.
In this research, an improved sparrow search algorithm (ISSA) incorporates enhancements in initial population diversity, the new producer position updating strategy, and the use of the Cauchy–Gaussian variation strategy to pursue a better global searching performance compared to [21]. Based on ISSA, the hyperparameters of the LSTM neural network are optimized. It is shown that this method of fusing ISSA and LSTM could effectively overcome the blindness and uncertain circumstances of the parameters, obtaining ideal weights and bias of LSTM. This guarantees a higher prediction accuracy of PV power generation compared with other common models. Simulations demonstrate that the suggested model is accurate, dependable, and capable of generalization.
The remainder of this paper is organized as follows. Section 2 presents the improved sparrow search algorithm, including its three major designs and test comparisons with SSA and other searching tools. In Section 3, we systematically introduce the overall framework of ISSA-LSTM and show the coupled relationship between ISSA and LSTM during the training process. In Section 4, tests on an actual dataset are presented to illustrate the efficacy of the suggested ISSA-LSTM prediction method. The work is finally concluded in Section 5.

2. Improved Sparrow Search Algorithm

The SSA possesses the benefits of fast convergence and greater optimization search efficiency. To enhance the performance of PV power prediction, the sparrow search algorithm is further improved and applied to search the hyperparameters of LSTM.
In this section, we will briefly introduce the standard sparrow search algorithm.
The standard SSA separates the population into producers and followers according to each sparrow’s fitness, and the two identities can alternate during each iteration [21]. Throughout the foraging process, if any sparrow individual detects the presence of a predator, it will quickly alert and move.
The producers represent 10∼20% of the total population, and their updating rules for the (t+1)th iteration are categorized into two types: extensive search in a safe environment and random movement in a hazardous environment. $R_2$ represents the warning value and $ST$ denotes the safety threshold; the first line of the formula below applies when the warning value is less than the threshold, and the second line applies otherwise:
$$P_{i,j}^{t+1} = \begin{cases} P_{i,j}^{t} \cdot \exp\left(\dfrac{-i}{\alpha \cdot t_{\max}}\right), & \text{if } R_2 < ST \\ P_{i,j}^{t} + Q \cdot L, & \text{if } R_2 \geq ST \end{cases}$$
where $t_{\max}$ indicates the maximum number of iterations; $\alpha \in (0, 1]$ is a random number; $P_{i,j}$ indicates the producer position of the $i$th individual in the $j$th dimension; $Q$ is a random number satisfying the normal distribution; and $L$ is a $1 \times d$ matrix whose elements are all 1.
The remaining sparrow individuals are all followers, and the updated rule for the t + 1th iteration is divided into two types: individuals with poor fitness value fly elsewhere to forage, and others forage around the location of the optimal producers:
$$F_{i,j}^{t+1} = \begin{cases} Q \cdot \exp\left(\dfrac{X_{worst}^{t} - F_{i,j}^{t}}{i^{2}}\right), & \text{if } i > n/2 \\ X_{best}^{t+1} + \left| F_{i,j}^{t} - X_{best}^{t+1} \right| \cdot A^{+} \cdot L, & \text{otherwise} \end{cases}$$
where $F_{i,j}$ denotes the follower position information; $X_{best}$ is the optimal position; $X_{worst}$ is the current position with the worst global fitness; and $A$ is a $1 \times d$ matrix with each element randomly taken as $+1$ or $-1$, with $A^{+} = A^{T}\left(A A^{T}\right)^{-1}$.
Assuming that sparrow individuals with anti-surveillance mechanisms represent 10∼20% of the total population, whose positions are randomly generated, there are two different moving rules for each iteration. The formula is described as follows:
$$S_{i,j}^{t+1} = \begin{cases} X_{best}^{t} + \beta \cdot \left| S_{i,j}^{t} - X_{best}^{t} \right|, & \text{if } f_i > f_g \\ S_{i,j}^{t} + K \cdot \dfrac{\left| S_{i,j}^{t} - X_{worst}^{t} \right|}{f_i - f_w + \varepsilon}, & \text{if } f_i = f_g \end{cases}$$
where $S_{i,j}$ denotes the sentry position information; $f_i$, $f_g$, and $f_w$ are the fitness values of the current sparrow individual, the global best, and the global worst, respectively; $\beta$ is the step control parameter, which obeys the $N(0,1)$ distribution; a small constant $\varepsilon$ is added to avoid division by zero; and $K$ controls the sparrow’s moving direction.
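As a concrete reference, the three update rules above can be sketched in numpy. This is a minimal sketch, not the paper's implementation: the sphere objective, the 20% producer and 10% sentry proportions, and the minimization convention are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Toy objective to minimize (illustrative; not from the paper)."""
    return float(np.sum(x ** 2))

def ssa_step(X, fit, t_max, ST=0.8, fit_fn=sphere):
    """One iteration of the standard SSA position updates (Eqs. 1-3)."""
    n, d = X.shape
    n_prod = max(1, int(0.2 * n))              # producers: ~20% of the flock
    order = np.argsort(fit)                    # sort so the best come first
    X = X[order].copy()
    best, worst = X[0].copy(), X[-1].copy()
    R2 = rng.random()                          # warning value
    # --- producers, Eq. (1) ---
    for i in range(n_prod):
        if R2 < ST:
            alpha = rng.random() + 1e-12       # alpha in (0, 1]
            X[i] = X[i] * np.exp(-(i + 1) / (alpha * t_max))
        else:
            X[i] = X[i] + rng.normal() * np.ones(d)          # Q * L
    # --- followers, Eq. (2) ---
    for i in range(n_prod, n):
        if i > n // 2:                         # the worse half flies elsewhere
            X[i] = rng.normal() * np.exp((worst - X[i]) / (i + 1) ** 2)
        else:                                  # forage around the best producer
            A = rng.choice([-1.0, 1.0], size=d)
            A_plus = A / d                     # A^T (A A^T)^{-1} for a 1 x d A
            X[i] = X[0] + (np.abs(X[i] - X[0]) @ A_plus) * np.ones(d)
    # --- sentries (anti-predation), Eq. (3) ---
    f_g, f_w = fit_fn(best), fit_fn(worst)
    for i in rng.choice(n, size=max(1, n // 10), replace=False):
        f_i = fit_fn(X[i])
        if f_i > f_g:
            X[i] = best + rng.normal() * np.abs(X[i] - best)
        else:
            K = rng.uniform(-1, 1)
            X[i] = X[i] + K * np.abs(X[i] - worst) / (f_i - f_w + 1e-12)
    return X
```

One such step would be called once per iteration, with fitness values recomputed between steps.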
However, the standard SSA is prone to local optimum and slow convergence. The above problems can be solved by utilizing the enhanced sine chaotic mapping method for the initial generation of sparrow populations, introducing the ABC algorithm to update the positions of the producers, and performing mutation optimization search in the later stages of iteration.

2.1. Improved Sine Chaotic Mapping Initial Population

The principle of chaotic mapping is to map the optimization variables into the value spaces of chaotic variables through chaotic mapping rules, which can produce higher-quality results than pseudo-random numbers. Logistic mapping and sine mapping are both commonly used chaotic mapping methods, and [25] proves that sine chaos exhibits more obvious chaotic properties than logistic chaos. Sine chaotic mapping has a simple structure and high efficiency, but its uneven probability density distribution is a drawback. Since the improved sine chaotic mapping has better space traversal, richer population diversity, and a more uniform distribution in the phase plane, it is chosen to produce a better initial population for the SSA. The improved sine chaotic formula is as follows:
$$\begin{aligned} a_{i+1} &= \sin(k \pi a_i) \\ b_{i+1} &= \sin(k \pi b_i) \\ y_{i+1} &= \left\{ a_{i+1} + b_{i+1} \right\} \end{aligned}$$
where the initial values of $a_i$ and $b_i$ are taken in the range $(0, 1)$; $k$ denotes the control parameter; $y_{i+1}$ is the iterative chaotic sequence value; and $\{a_{i+1} + b_{i+1}\}$ denotes the fractional part of $a_{i+1} + b_{i+1}$. The distribution of the chaotic system is more uniform when the control parameter is a real number larger than 1000, so it is set to 1200.
Figure 1 demonstrates the stochasticity and traversal of the improved sine chaotic mapping: Figure 1a,b show the different chaotic sequences generated from different initial values over the same number of iterations, while Figure 1c makes clear that the system traverses the entire solution interval once the number of iterations reaches a particular value.
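The mapping and the chaotic population initialization it supports can be sketched as follows; the seed values and search bounds are illustrative assumptions:

```python
import numpy as np

def improved_sine_map(n, a0=0.3137, b0=0.6813, k=1200):
    """Generate n chaotic values y in [0, 1) with the improved sine map.

    a0, b0 are initial values in (0, 1) (illustrative choices); k is the
    control parameter, set to 1200 as in the text for a more uniform
    distribution. The modulo takes the fractional part of a + b.
    """
    a, b = a0, b0
    seq = np.empty(n)
    for i in range(n):
        a = np.sin(k * np.pi * a)
        b = np.sin(k * np.pi * b)
        seq[i] = (a + b) % 1.0          # fractional part {a + b}
    return seq

def chaotic_population(pop_size, dim, lb, ub):
    """Scale chaotic sequences into [lb, ub] to initialize the sparrow flock."""
    rng = np.random.default_rng(0)
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        a0, b0 = rng.uniform(0.05, 0.95, size=2)   # random seeds in (0, 1)
        pop[i] = lb + improved_sine_map(dim, a0, b0) * (ub - lb)
    return pop
```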

2.2. Improvement of Producer Position Update Strategy

The SSA is characterized by strong local search capability but insufficient global search capability. The producer position update rule of the standard SSA only makes use of its own position’s information at the last moment and does not fully utilize the information carried by other individuals. This leads to a reduction in the algorithm’s effective exploration area and a reduction in its capacity to explore the global scope.
The artificial bee colony algorithm is a stochastic optimization algorithm with excellent global search capability [26], whose leading bee search strategy involves generating a new nectar source with reference to the positions of the other leading bees, extending the scope of the global search and ensuring convergence.
Therefore, the ABC algorithm is introduced to propose an improved producer position updating rule: when R 2 is lower than the safety threshold ST, the producer sparrow refers to the position of the other producer sparrows for foraging. Then, based on the greedy selection method, check whether the new position’s fitness is better. If so, the new position is retained; otherwise, the previous one is kept.
$$P_{i}^{t+1} = \begin{cases} P_{i}^{t} + \phi \left( P_{k}^{t} - P_{i}^{t} \right), & \text{if } f_{i}^{t+1} \geq f_{i}^{t} \\ P_{i}^{t}, & \text{if } f_{i}^{t+1} < f_{i}^{t} \end{cases}$$
where $\phi$ is a uniform random number in $[-1, 1]$ and $P_{k}^{t}$ represents a random producer individual neighboring the current sparrow at iteration $t$.
The updating formula—when R 2 is above the safety threshold ST—is consistent with the standard algorithm:
$$P_{i}^{t+1} = P_{i}^{t} + Q \cdot L$$
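A minimal sketch of the improved producer update with greedy selection follows; the sphere fitness and the minimization convention ("better" means a lower objective value) are illustrative assumptions, not details fixed by the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    """Toy objective to minimize (illustrative)."""
    return float(np.sum(x ** 2))

def update_producers(P, fitness, R2, ST=0.8, fit_fn=sphere):
    """Improved producer update (Eqs. 5 and 6).

    When R2 < ST, each producer moves relative to a random neighboring
    producer (ABC-style) and keeps the new position only if it improves
    fitness (greedy selection); otherwise the standard rule Eq. (6) applies.
    """
    n, d = P.shape
    P_new = P.copy()
    if R2 < ST:
        for i in range(n):
            k = rng.choice([j for j in range(n) if j != i])  # neighbor k != i
            phi = rng.uniform(-1, 1)
            cand = P[i] + phi * (P[k] - P[i])
            if fit_fn(cand) <= fitness[i]:       # greedy selection
                P_new[i] = cand
    else:
        Q = rng.normal(size=(n, 1))
        P_new = P + Q * np.ones((1, d))          # Q * L
    return P_new
```

By construction the greedy branch never worsens any producer's fitness, which is the point of borrowing the leading-bee strategy.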

2.3. Cauchy–Gaussian Variation Strategy

Rapid assimilation of sparrow individuals frequently causes stagnation at local optima in the later stages of the standard SSA iteration. The Cauchy–Gaussian mutation strategy [27] addresses this issue by mutating the individual with the optimal current fitness, comparing its pre- and post-mutation positions, and carrying the better position into the following iteration:
$$U_{best}^{t} = X_{best}^{t} \left[ 1 + \lambda_1 \, \mathrm{Cauchy}\left(0, \sigma^2\right) + \lambda_2 \, \mathrm{Gauss}\left(0, \sigma^2\right) \right]$$
$$\sigma = \begin{cases} 1, & f\left(X_{best}\right) < f\left(X_i\right) \\ \exp\left( \dfrac{f\left(X_{best}\right) - f\left(X_i\right)}{\left| f\left(X_{best}\right) \right|} \right), & \text{otherwise} \end{cases}$$
where $U_{best}^{t}$ denotes the individual with the optimal current fitness after mutation; $\lambda_1 = 1 - t^2 / T_{\max}^2$ and $\lambda_2 = t^2 / T_{\max}^2$ are dynamic parameters that are adaptively adjusted with the number of iterations.
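The mutation step can be sketched as below, assuming a minimization objective and reading Cauchy(0, σ²) as a standard Cauchy sample scaled by σ (a common convention, stated here as an assumption):

```python
import numpy as np

rng = np.random.default_rng(3)

def cauchy_gaussian_mutation(x_best, f_best, f_i, t, t_max, fit_fn):
    """Mutate the current best individual (Eqs. 7 and 8) and keep the
    better of the pre- and post-mutation positions (minimization)."""
    # Eq. (8): adaptive scale sigma
    if f_best < f_i:
        sigma = 1.0
    else:
        sigma = np.exp((f_best - f_i) / (abs(f_best) + 1e-12))
    lam1 = 1.0 - t ** 2 / t_max ** 2      # weight shifts from Cauchy ...
    lam2 = t ** 2 / t_max ** 2            # ... to Gaussian as t grows
    d = x_best.shape[0]
    cauchy = sigma * rng.standard_cauchy(d)
    gauss = rng.normal(0.0, sigma, d)
    u = x_best * (1.0 + lam1 * cauchy + lam2 * gauss)   # Eq. (7)
    # greedy comparison of pre- and post-mutation positions
    return u if fit_fn(u) < fit_fn(x_best) else x_best
```

Early in the run the heavy-tailed Cauchy term dominates, encouraging long escape jumps; late in the run the Gaussian term dominates, refining the solution locally.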

2.4. Computational Complexity

The computational complexity (O) of the ISSA algorithm is mainly determined by the initialization, the fitness calculation, and the updating process. The initialization costs O(N × D), where N is the sparrow flock size and D is the problem dimension. Each sparrow individual evaluates the objective function in every iteration, so the fitness calculation costs O(M × N × D), where M is the number of iterations. The update process likewise costs O(M × N × D). The overall computational complexity of ISSA is therefore O(N × D × (2M + 1)).

2.5. Performance Analysis

To thoroughly assess the effectiveness and stability of the ISSA algorithm, a series of benchmarking tests on function optimization were conducted using 20 test functions from CEC2005 and CEC2019; Table 1 shows detailed information on test functions.
The parameter settings in the comparative experiment are as follows: the initial population size of both the SSA and ISSA algorithms is 30; the cognitive component ($C_1$), social component ($C_2$), and inertia weight of the PSO are set to 2.0, 2.0, and 0.4, respectively. A sensitivity analysis, reported in Appendix A, was conducted to determine the optimal proportion of producers. The comparative test results for the SSA, PSO, and ISSA algorithms on the 20 test functions are shown in Table 2.
The results include the average (Avg) value, standard deviation (Std) value, p-value, and h-value. The last two indicators imply whether the ISSA algorithm is significantly different from the other algorithms. Convergence curves of the ISSA, PSO, and SSA algorithms for 8 of 20 test functions are shown in Figure 2.
The results presented in Table 2 indicate that, with the exception of functions $f_{11}$, $f_{13}$, $f_{14}$, and $f_{17}$, ISSA consistently exhibits lower or equivalent Avg and Std values compared to the other algorithms. This suggests a superior and more stable performance by ISSA. ISSA also greatly improves the rate of convergence even when the final optimal results are the same, as Figure 2 illustrates.
As a result, the ISSA algorithm can more precisely identify the function’s optimal solution, supporting the subsequent search for the LSTM’s ideal parameters.

3. A Framework of ISSA-LSTM for PV Power Forecasting

The general structure of the PV power prediction in the present work is depicted in Figure 3 below. The entire process is divided into two components: data pre-processing and the ISSA-LSTM algorithm.

3.1. Data Processing

Data processing is designed to obtain high-quality datasets, which is the precondition of an accurate prediction model. The datasets from various sources are merged and subjected to a series of operations: missing value filling, outlier detection, feature selection, and normalization.
When more than 12 data values are missing in a day (counted in 5 min increments), that day's records are deleted. Other individual missing values are filled by the Lagrange interpolation method. For abnormal data, the isolation forest algorithm [28] is employed.
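The filling rules can be sketched as below. The number of interpolation nodes is a hypothetical choice, since the paper does not state it, and the outlier step (scikit-learn's `IsolationForest` in practice) is omitted from the sketch:

```python
import numpy as np

def lagrange_fill(y, window=4):
    """Fill isolated NaNs with Lagrange interpolation over nearby samples.

    Up to `window` valid points on each side of a gap serve as the
    interpolation nodes (the node count is an assumption).
    """
    y = np.asarray(y, dtype=float).copy()
    valid = np.flatnonzero(~np.isnan(y))
    for i in np.flatnonzero(np.isnan(y)):
        left = valid[valid < i][-window:]
        right = valid[valid > i][:window]
        nodes = np.concatenate([left, right])
        est = 0.0
        for j in nodes:                       # Lagrange basis at position i
            basis = 1.0
            for m in nodes:
                if m != j:
                    basis *= (i - m) / (j - m)
            est += y[j] * basis
        y[i] = est
    return y

def drop_bad_days(day_series, max_missing=12):
    """Discard a day's record when it has more than 12 missing 5 min samples."""
    return [d for d in day_series if np.isnan(d).sum() <= max_missing]
```

Because polynomial interpolation through enough nodes reproduces low-degree trends exactly, a single missing sample on a smooth power curve is recovered almost perfectly.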
The inconsistency of the magnitude of each feature in the original dataset will affect the model convergence. Therefore, normalization of the training dataset can improve the training speed and accuracy. The normalization process can be described by Equation (9).
$$P_n = \frac{P - P_{\min}}{P_{\max} - P_{\min}}$$
where $P$ is the input sample matrix; $P_n$ is the data matrix after normalization; and $P_{\max}$ and $P_{\min}$ represent the maximum and minimum values in the historical input data.
To observe the prediction results at their true magnitude, an inverse normalization transforms the normalized training results back into PV power values. The inverse normalization formula is as follows:
$$\hat{T}_n = \left( P_{\max} - P_{\min} \right) T_n + P_{\min}$$
where $T_n$ is the normalized PV power prediction and $\hat{T}_n$ is the PV power prediction at true magnitude.
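Equations (9) and (10) translate directly into code; the example values are illustrative:

```python
import numpy as np

def normalize(P, p_min=None, p_max=None):
    """Min-max normalization, Eq. (9); returns scaled data and the bounds
    so the same bounds can be reused on validation and test data."""
    p_min = np.min(P) if p_min is None else p_min
    p_max = np.max(P) if p_max is None else p_max
    return (P - p_min) / (p_max - p_min), p_min, p_max

def denormalize(T_n, p_min, p_max):
    """Inverse transform, Eq. (10): back to the true PV power magnitude."""
    return (p_max - p_min) * T_n + p_min
```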

3.2. ISSA-LSTM Algorithm

With the pre-processed datasets, an ISSA-LSTM algorithm is developed in this paper to pursue better prediction performance.
LSTM is capable of addressing the difficulty of utilizing long temporal information in RNN [29], and its classical structure is shown in Figure 4. LSTM introduces three gate control units to add or remove information in the state memory units to achieve dynamic updating of important long-term information.
The forgetting gate determines the retention level of the previous moment through the current input, the intermediate output of the last moment, and the state memory unit of the last moment. The input gate determines the current valid information filtering through the joint action of the tanh function and σ . The output gate determines the update of the intermediate output, and the memory update formula of LSTM is shown in Equations (11)–(16), as follows:
$$f_t = \sigma\left( W_f \cdot \left[ h_{t-1}, x_t \right] + b_f \right)$$
$$i_t = \sigma\left( W_i \cdot \left[ h_{t-1}, x_t \right] + b_i \right)$$
$$\tilde{C}_t = \tanh\left( W_c \cdot \left[ h_{t-1}, x_t \right] + b_c \right)$$
$$C_t = f_t \odot C_{t-1} + i_t \odot \tilde{C}_t$$
$$o_t = \sigma\left( W_o \cdot \left[ h_{t-1}, x_t \right] + b_o \right)$$
$$h_t = o_t \odot \tanh\left( C_t \right)$$
where $x_t$ denotes the input; $h_t$ denotes the intermediate output; $C_t$ denotes the state memory unit; $f_t$ denotes the forgetting gate; $i_t$ denotes the input gate; $o_t$ denotes the output gate; and $W$ and $b$ denote the weight matrix and bias matrix, respectively.
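A minimal numpy sketch of one LSTM step following Equations (11)–(16); the weights here are random placeholders rather than trained parameters, and the layer sizes are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step implementing Equations (11)-(16).

    Each weight matrix maps the concatenated [h_{t-1}, x_t] to one gate.
    """
    z = np.concatenate([h_prev, x_t])
    f_t = sigmoid(W["f"] @ z + b["f"])            # forget gate, Eq. (11)
    i_t = sigmoid(W["i"] @ z + b["i"])            # input gate, Eq. (12)
    c_tilde = np.tanh(W["c"] @ z + b["c"])        # candidate state, Eq. (13)
    c_t = f_t * c_prev + i_t * c_tilde            # state update, Eq. (14)
    o_t = sigmoid(W["o"] @ z + b["o"])            # output gate, Eq. (15)
    h_t = o_t * np.tanh(c_t)                      # hidden output, Eq. (16)
    return h_t, c_t

rng = np.random.default_rng(0)
n_in, n_hid = 4, 8                                 # e.g. 4 weather features
W = {g: rng.normal(0, 0.1, (n_hid, n_hid + n_in)) for g in "fico"}
b = {g: np.zeros(n_hid) for g in "fico"}
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x_t in rng.normal(size=(10, n_in)):            # a toy 10-step sequence
    h, c = lstm_step(x_t, h, c, W, b)
```

Note that $h_t = o_t \odot \tanh(C_t)$ bounds every hidden output to $(-1, 1)$, which is why the normalized inputs of Section 3.1 pair naturally with this cell.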
To obtain good prediction accuracy, the LSTM-based prediction method has to choose the proper parameters. Therefore, ISSA is chosen to optimize the LSTM’s hyperparameters.
The ISSA-LSTM power prediction flow chart is depicted in Figure 5, and the steps of the algorithm are presented in Algorithm 1:
Algorithm 1 ISSA-LSTM algorithm
Input: Initialize the population based on improved sine mapping and divide the producers and followers according to the corresponding fitness: $P_{i,j}^{1}$, $F_{i,j}^{1}$, $S_{i,j}^{1}$, $t \leftarrow 1$
1:  record $X_{best}$, $X_{worst}$
2:  repeat
3:      $t \leftarrow t + 1$
4:      update $P_{i,j}^{t+1}$ based on Equations (5) and (6)
5:      update $F_{i,j}^{t+1}$ based on Equation (2)
6:      update $S_{i,j}^{t+1}$ based on Equation (3)
7:      recalculate fitness
8:      while in the later stage of iteration do
9:          use Equations (7) and (8) to help the algorithm jump out of local extremes
10:     end while
11: until $t$ reaches its preset value
Output: the final $X_{best}$ is the optimized hyperparameter set
The ISSA-LSTM model, different from the traditional LSTM, adopts the ISSA to optimize the hyperparameters of LSTM, including the learning rate, the number of iterations, and the numbers of layer 1 and layer 2 neurons, which are usually selected by manual adjustment or experience. Conversely, the ISSA searches for its extremum with a fitness calculated from the prediction output of the LSTM. One may easily see that the searching process of ISSA and the training process of LSTM are deeply coupled: they are bound together to steer the hyperparameter optimization of the LSTM and its training process.
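The coupling can be sketched as a fitness function over hyperparameter vectors. The bounds and the `train_and_score` callback below are illustrative stand-ins for the actual Table 4 search ranges and the TensorFlow training loop:

```python
import numpy as np

# Search boundaries for the four LSTM hyperparameters (illustrative values;
# the actual bounds used in the paper are listed in Table 4).
BOUNDS = {
    "learning_rate": (1e-4, 1e-1),
    "n_iterations":  (10, 100),
    "n_neurons_l1":  (10, 200),
    "n_neurons_l2":  (10, 200),
}

def decode(position):
    """Map a sparrow's position vector in [0, 1]^4 to LSTM hyperparameters."""
    hp = {}
    for p, (name, (lo, hi)) in zip(position, BOUNDS.items()):
        val = lo + p * (hi - lo)
        hp[name] = val if name == "learning_rate" else int(round(val))
    return hp

def fitness(position, train_and_score):
    """Fitness of one sparrow = validation error of the LSTM it encodes.

    `train_and_score` stands in for building, training, and scoring an
    LSTM (e.g. returning validation RMSE); it is supplied by the caller.
    """
    return train_and_score(decode(position))
```

Each ISSA iteration evaluates `fitness` for every sparrow, so every position update triggers a short LSTM training run; this is what binds the search and the training together.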
It is worth noting that such an ISSA-LSTM framework improves the prediction performance by virtue of the searching characteristics of ISSA. To some extent, the search algorithm affects and even determines the prediction performance of ISSA-LSTM.

4. Results Analysis

4.1. Data Sources

The research in the present work uses data from photovoltaic power stations [30] and related meteorological data from 2019 to 2021 in Alice Springs, Australia, with a sampling frequency of five minutes. The power station is situated at 23.76° S, 133.87° E and has an installed capacity of 26.5 kW. Only the data from 07:00 to 19:00 are retained because the PV system does not produce power at night. Of the 156,694 data points collected, the data from 2019 to 2020 are split 80%/20% into training and validation sets, respectively. The test set is made up of data from 2021's typical seasonal and weather conditions. As illustrated in Figure 6, the trend of PV power generation between 2019 and 2021 is essentially the same, and the data are relatively uniform. Because Australia is located in the southern hemisphere, PV power generation is strongly seasonal: generation from April to September is lower than that from October to March of the following year.
PV power is affected by a variety of external factors such as solar radiation, temperature, and precipitation, but involving more characteristic variables in PV power prediction is not necessarily better. Table 3 displays the environmental variables associated with the PV power dataset, with statistics of the average value, standard deviation, and correlation coefficient with PV power for each dimension. In general, two variables are considered unrelated when their correlation coefficient is less than 0.3, so variables with correlation coefficients below 0.3 are discarded as irrelevant.
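The correlation-based screening can be sketched as below; the feature names and synthetic data in the usage example are illustrative, not the actual Table 3 variables:

```python
import numpy as np

def select_features(X, power, names, threshold=0.3):
    """Keep features whose |Pearson correlation| with PV power >= threshold.

    X is (n_samples, n_features); `names` labels the columns.
    """
    keep = []
    for j, name in enumerate(names):
        r = np.corrcoef(X[:, j], power)[0, 1]
        if abs(r) >= threshold:
            keep.append(name)
    return keep
```

For example, a noisy copy of the power signal survives the 0.3 cutoff while pure noise does not.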

4.2. Evaluation Indicators

Model evaluation metrics intuitively reflect how well a model is built. For most predictive tasks, three evaluation metrics are commonly used: the mean absolute percentage error (MAPE), root mean square error (RMSE), and mean absolute error (MAE). The model's accuracy is inversely related to the values of these metrics.
The formulas for the three error metrics are as follows:
$$MAPE = \frac{100\%}{n} \sum_{i=1}^{n} \left| \frac{\hat{y}_i - y_i}{y_i} \right|$$
$$RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( \hat{y}_i - y_i \right)^2}$$
$$MAE = \frac{1}{n} \sum_{i=1}^{n} \left| \hat{y}_i - y_i \right|$$
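The three metrics translate directly into code:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent (zero true values excluded upstream)."""
    return 100.0 * float(np.mean(np.abs((y_pred - y_true) / y_true)))

def rmse(y_true, y_pred):
    """Root mean square error, in the units of the power data."""
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error, in the units of the power data."""
    return float(np.mean(np.abs(y_pred - y_true)))
```

Note that MAPE is undefined where the true power is zero, which is one practical reason the nighttime samples are excluded from the dataset.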

4.3. Benchmark Model Parameterization

Through the analysis of Section 4.1, relevant environmental factors highly correlated with actual PV power have been selected as the multi-input.
To accurately verify the superiority of the ISSA-LSTM algorithm presented in this work, five models, including BP, LSTM, PSO-BP [31], PSO-LSTM [32], and SSA-LSTM, were selected for comparative study. Among them, the iteration number of the BP model is 30, with a network depth of 20 and the same number of hidden layers. LSTM is a double hidden layer structure, the hidden neuron number in both layers is set to 20, the iteration number is 30, the batch size is 32, the loss function is RMSE, and the optimizer is Adam. The initial hyperparameters of all the algorithms to be optimized are the same as those of the above benchmark model. The initial population size is set to 20, and the iteration rounds for PSO, SSA, and ISSA algorithms are set to 30.
The PSO, SSA, and ISSA algorithms all select the optimal values within the searching boundary shown in Table 4. The detailed hyperparameters obtained from the PSO, SSA, and ISSA search optimizations are given in Table 5.
In this work, Python is used as the primary programming language; Numpy and Pandas packages are used for data preprocessing; the TensorFlow toolkit is used for model building and optimization, and Matplotlib is for plotting graphics. Computer resources for this work include an Intel Core i5-8250U running at 1.6 GHz with 8 GB RAM.

4.4. Analysis of Experimental Results under Different Weather Conditions

PV power systems are highly dependent on radiation, so poor radiation conditions on rainy days and high volatility of radiation on cloudy days are great challenges for stable PV power prediction. Several sets of comparative experiments were conducted for a thorough evaluation to confirm the reliability of ISSA-LSTM under various weather conditions.
The proposed model is compared with the benchmark models constructed in the previous subsection for one-day-ahead PV power prediction under typical weather conditions, namely sunny, cloudy, and rainy days. The model performance is assessed using three statistical metrics, MAE, MAPE, and RMSE, recorded in Table 6, with red representing the best values among the six models. The comparison graphs of the prediction results are depicted in Figure 7, Figure 8 and Figure 9, where the top half of each subfigure shows the comparison curves of the different models against the true values, and the bottom half shows a bar graph of the errors between the predicted and true values for three models: ISSA-LSTM, SSA-LSTM, and PSO-LSTM. The loss curve is shown in Figure 10.
As illustrated in the figures, the prediction curves of ISSA-LSTM on sunny, cloudy, and rainy days fit the actual values best, with the smallest errors among all models. As Figure 7 shows, the accuracy of all models is relatively high on sunny days owing to the smooth PV power curve, while cloudy days exhibit the strongest volatility and the largest model errors, as seen in Figure 8. In contrast to the SSA-LSTM model, which achieves the best relative results among the benchmark models, the MAE, MAPE, and RMSE values of the proposed model on sunny days are reduced by 61.3%, 55.3%, and 53.3%, respectively, while the corresponding reductions are 13.2% and 18.2% on cloudy days and 24.6% and 27.8% on rainy days. Additionally, the figures show that the benchmark models struggle to capture the spike fluctuations in rainy weather, whereas the proposed model tracks the spikes more closely. To summarize, the proposed algorithm not only has fairly high accuracy but also fits the spikes well.
Taken together, the suggested model can cope with complex weather changes. By predicting the future fluctuations of PV power in advance, corresponding measures can be taken to optimize grid scheduling and energy management and adjust the power generation plan in time to improve the efficiency of energy utilization as well as prevent the occurrence of grid instability.

4.5. Analysis of Experimental Results in Different Seasons

Seasonal variations also strongly influence PV power prediction. To verify the practicability of the model across seasons, ISSA-LSTM is contrasted with the constructed benchmark models. A random week in a typical month of each season (February, May, August, and November) is displayed in Figure 11, Figure 12, Figure 13 and Figure 14.
Figure 11 illustrates that the suggested model has a high prediction accuracy when the PV power changes rapidly and can respond to large fluctuations in radiation in a timely manner. As seen in Figure 14, power prediction accuracy based on ISSA-LSTM is lower than PSO-BP in the second half but has a better fit in the first half, and the simulation is better at the peak, which indicates its better overall performance.
Table 7 lists the model performances by MAE, MAPE, and RMSE metrics for typical seasons. Based on the red markers in the table, it is clear that the ISSA-LSTM model significantly outperforms the five benchmark models comprehensively and exhibits a more robust performance. The model’s predictions are more consistent and less anomalous than the other models, which further proves the advantages of its accuracy and reliability. In contrast to the SSA-LSTM model, which has the best relative results among the benchmark models, the MAE, MAPE, and RMSE values of ISSA-LSTM in February are reduced by 35.0%, 34.3%, and 25.0%, respectively, the RMSE of the ISSA-LSTM in May is reduced by 4.3%, and the MAE, MAPE, and RMSE values of the ISSA-LSTM in August are reduced by 16.7%, 69.7%, and 12.0%, respectively. The MAE, MAPE, and RMSE values of the ISSA-LSTM in November decreased by 25.7%, 3.6%, and 12.5%, respectively. In summary, ISSA-LSTM provides a more adaptable and efficient method for PV power prediction.

5. Conclusions

To address the standard SSA's tendency to fall into local extrema and its insufficient convergence speed, a series of improvements were made to the initial population generation, the producer position-updating rules, and the later iteration stages via Cauchy–Gaussian mutation. An innovative prediction model was then constructed based on an LSTM neural network optimized by the improved sparrow search algorithm. The proposed method was tested against a series of benchmark models on the real dataset of the PV power plant in the Alice Springs area of Australia. The conclusions are as follows:
1. The improved sparrow search algorithm has a better optimization-finding ability than the standard sparrow search algorithm and can search for better multi-dimensional hyperparameters of neural networks in a shorter time;
2. The ISSA-LSTM prediction model can cope with complex weather conditions and PV power fluctuations, making highly accurate PV power predictions.
Therefore, the algorithm proposed in this research helps the power dispatching department to adjust the scheduling plan in time, which is extremely significant to improve the stability and safety of the photovoltaic grid-connected system and the power system.
Nevertheless, there are still some shortcomings in this study: the dataset contains few environmental indicators, and only the correlation coefficient method is used for feature screening. In follow-up work, we will continue to improve the input data quality. As mentioned earlier, the ISSA-LSTM algorithm stands out not only for its prediction accuracy but also for its ability to fit spikes well; a potential drawback is its relatively longer prediction time. The method could also be applied to other scenarios in future work, such as wind power prediction.

Author Contributions

Conceptualization, Y.C.; methodology, Y.C.; software, Y.C.; validation, X.L. and S.Z.; formal analysis, Y.C.; investigation, Y.C. and X.L.; resources, X.L.; data curation, Y.C.; writing—original draft preparation, Y.C.; writing—review and editing, X.L. and S.Z.; visualization, Y.C.; supervision, X.L.; project administration, X.L.; funding acquisition, X.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Sensitivity Analysis

In the sparrow search algorithm, the producer proportion parameter α is a key factor in adjusting the search strategy. Producers introduce a degree of randomness and diversity, enhancing the algorithm's ability to search the solution space globally, so a sensitivity analysis is needed to find a proper value. Table A1 compares four settings: 0.2, 0.4, 0.6, and 0.8.
Table A1. Sensitivity analysis results of the producer proportion α.
| Function | Metric | α = 0.2 | α = 0.4 | α = 0.6 | α = 0.8 |
|---|---|---|---|---|---|
| f3 | Avg | 2.14 × 10^−08 | 2.17 × 10^−08 | 2.16 × 10^−08 | 2.18 × 10^−08 |
| f3 | Std | 1.17 × 10^−07 | 1.23 × 10^−07 | 1.22 × 10^−07 | 1.25 × 10^−07 |
| f7 | Avg | 4.76 × 10^−32 | 4.91 × 10^−32 | 4.86 × 10^−32 | 4.79 × 10^−32 |
| f7 | Std | 1.48 × 10^−03 | 1.54 × 10^−03 | 1.62 × 10^−03 | 1.58 × 10^−03 |
| f8 | Avg | 7.27 × 10^−30 | 7.28 × 10^−30 | 7.27 × 10^−30 | 7.28 × 10^−30 |
| f8 | Std | 3.97 × 10^−29 | 3.32 × 10^−29 | 4.02 × 10^−29 | 4.22 × 10^−29 |
| f16 | Avg | 2.09 × 10^+01 | 2.12 × 10^+01 | 2.10 × 10^+01 | 2.11 × 10^+01 |
| f16 | Std | 4.23 × 10^+00 | 3.96 × 10^+00 | 4.02 × 10^+00 | 4.31 × 10^+00 |
| f18 | Avg | 3.88 × 10^+02 | 3.95 × 10^+02 | 4.01 × 10^+02 | 3.96 × 10^+02 |
| f18 | Std | 2.65 × 10^+02 | 2.77 × 10^+02 | 2.69 × 10^+02 | 2.43 × 10^+02 |
The results show that the best values were obtained when the parameter is 0.2, indicating that ISSA tends to perform optimally under this setting, leading to lower objective function values.
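The role of α can be sketched as a simple population split. This is an illustrative sketch of the convention used in the standard SSA (producers are the best-performing α-fraction of the swarm under minimization), not the paper's exact code:

```python
import numpy as np

def split_population(fitness, alpha=0.2):
    """Split the swarm into producer and scrounger index sets.

    Producers are the best alpha-fraction (lowest fitness, minimization);
    alpha is the producer proportion studied in the Table A1 sensitivity
    analysis."""
    order = np.argsort(fitness)                    # best individuals first
    n_producers = max(1, int(alpha * len(fitness)))
    return order[:n_producers], order[n_producers:]
```

A larger α yields more producers exploring globally; a smaller α leaves more scroungers exploiting the producers' positions, which matches the observation that α = 0.2 performed best here.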

Figure 1. Schematic diagram of chaotic mapping: (a) a = 0.2, b = 0.4, (b) a = 0.6, b = 0.2, (c) a = 0.2, b = 0.4.
Figure 2. Convergence curves of the ISSA, PSO, and SSA algorithm for 8 of 20 test functions. (a) f3; (b) f7; (c) f8; (d) f10; (e) f16; (f) f18; (g) f19; (h) f20.
Figure 3. The overall framework of PV power prediction.
Figure 4. The network structure of LSTM.
Figure 5. ISSA-LSTM method flow chart.
Figure 6. The maximum power generation for each day from 2019 to 2021 (Unit: kW).
Figure 7. One-day-ahead prediction results and error analysis on a sunny day.
Figure 8. One-day-ahead prediction results and error analysis on a cloudy day.
Figure 9. One-day-ahead prediction results and error analysis on a rainy day.
Figure 10. Loss curve.
Figure 11. PV power prediction results of a random week in February.
Figure 12. PV power prediction results of a random week in May.
Figure 13. PV power prediction results of a random week in August.
Figure 14. PV power prediction results of a random week in November.
Table 1. Test functions.

| Function | Function Expression | Dimension | Range | Optimum Value |
|---|---|---|---|---|
| f1 | $f_1(x)=\sum_{i=1}^{n} x_i^2$ | 10 | [−100,100] | 0 |
| f2 | $f_2(x)=\max_i \lvert x_i\rvert,\ 1\le i\le n$ | 10 | [−100,100] | 0 |
| f3 | $f_3(x)=\sum_{i=1}^{n-1}\left[100\left(x_{i+1}-x_i^2\right)^2+\left(x_i-1\right)^2\right]$ | 10 | [−30,30] | 0 |
| f4 | $f_4(x)=\sum_{i=1}^{n}\left(\lfloor x_i+0.5\rfloor\right)^2$ | 10 | [−100,100] | 0 |
| f5 | $f_5(x)=\sum_{i=1}^{n} i x_i^4+\mathrm{random}[0,1)$ | 10 | [−1.28,1.28] | 0 |
| f6 | $f_6(x)=\sum_{i=1}^{n}-x_i\sin\left(\sqrt{\lvert x_i\rvert}\right)$ | 10 | [−500,500] | −4189.82 |
| f7 | $f_7(x)=\frac{\pi}{n}\left\{10\sin^2(\pi y_1)+\sum_{i=1}^{n-1}(y_i-1)^2\left[1+10\sin^2(\pi y_{i+1})\right]+(y_n-1)^2\right\}+\sum_{i=1}^{n}u(x_i,10,100,4)$, with $y_i=1+\frac{x_i+1}{4}$ and $u(x_i,a,k,m)=\begin{cases}k(x_i-a)^m & x_i>a\\ 0 & -a<x_i<a\\ k(-x_i-a)^m & x_i<-a\end{cases}$ | 10 | [−50,50] | 0 |
| f8 | $f_8(x)=0.1\left\{\sin^2(3\pi x_1)+\sum_{i=1}^{n-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right]+(x_n-1)^2\left[1+\sin^2(2\pi x_n)\right]\right\}+\sum_{i=1}^{n}u(x_i,5,100,4)$ | 10 | [−50,50] | 0 |
| f9 | $f_9(x)=\left[\frac{1}{500}+\sum_{j=1}^{25}\frac{1}{j+\sum_{i=1}^{2}(x_i-a_{ij})^6}\right]^{-1}$ | 2 | [−65,65] | 1 |
| f10 | $f_{10}(x)=\sum_{i=1}^{11}\left[a_i-\frac{x_1\left(b_i^2+b_i x_2\right)}{b_i^2+b_i x_3+x_4}\right]^2$ | 4 | [−5,5] | 0.0003 |
| f11 | $f_{11}(x)=\left[1+(x_1+x_2+1)^2\left(19-14x_1+3x_1^2-14x_2+6x_1x_2+3x_2^2\right)\right]\times\left[30+(2x_1-3x_2)^2\left(18-32x_1+12x_1^2+48x_2-36x_1x_2+27x_2^2\right)\right]$ | 2 | [−2,2] | 3 |
| f12 | $f_{12}(x)=-\sum_{i=1}^{4}c_i\exp\left(-\sum_{j=1}^{3}a_{ij}\left(x_j-p_{ij}\right)^2\right)$ | 3 | [1,3] | −3.86 |
| f13 | $f_{13}(x)=-\sum_{i=1}^{4}c_i\exp\left(-\sum_{j=1}^{6}a_{ij}\left(x_j-p_{ij}\right)^2\right)$ | 6 | [0,1] | −3.32 |
| f14 | $f_{14}(x)=-\sum_{i=1}^{5}\left[(x-a_i)(x-a_i)^T+c_i\right]^{-1}$ | 4 | [0,10] | −10.1532 |
| f15 | $f_{15}(x)=12.7120622568+\sum_{i=1}^{n-1}\sum_{j=i+1}^{n}\left(\frac{1}{d_{ij}^{2}}-\frac{2}{d_{ij}}\right)$, with $d_{ij}=\left[\sum_{k=0}^{2}\left(x_{3i+k-2}-x_{3j+k-2}\right)^2\right]^3$ and $n=D/3$ | 18 | [−4,4] | 1 |
| f16 | $f_{16}(x)=\sum_{i=1}^{D}\left[x_i^2-10\cos(2\pi x_i)+10\right]$ | 10 | [−100,100] | 1 |
| f17 | $f_{17}(x)=\sum_{i=1}^{D}\frac{x_i^2}{4000}-\prod_{i=1}^{D}\cos\left(\frac{x_i}{\sqrt{i}}\right)+1$ | 10 | [−100,100] | 1 |
| f18 | $f_{18}(x)=\sum_{i=1}^{D}\sum_{k=0}^{k_{\max}}a^k\cos\left(2\pi b^k\left(x_i+0.5\right)\right)-D\sum_{k=0}^{k_{\max}}a^k\cos\left(\pi b^k\right)$ | 10 | [−100,100] | 1 |
| f19 | $f_{19}(x)=g(x_1,x_2)+g(x_2,x_3)+\cdots+g(x_{D-1},x_D)+g(x_D,x_1)$, with $g(x,y)=0.5+\frac{\sin^2\left(\sqrt{x^2+y^2}\right)-0.5}{\left[1+0.001\left(x^2+y^2\right)\right]^2}$ | 10 | [−100,100] | 1 |
| f20 | $f_{20}(x)=-20\exp\left(-0.2\sqrt{\frac{1}{D}\sum_{i=1}^{D}x_i^2}\right)-\exp\left(\frac{1}{D}\sum_{i=1}^{D}\cos(2\pi x_i)\right)+20+e$ | 10 | [−100,100] | 1 |
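Two of these benchmarks can be implemented directly. The sketch below uses the classical unshifted forms of f16 (Rastrigin) and f20 (Ackley), whose minimum is 0 at the origin; the optimum of 1 listed in Table 1 suggests the experiments may use biased or shifted variants, which are not reproduced here:

```python
import numpy as np

def rastrigin(x):
    """f16, classical form: sum(x_i^2 - 10*cos(2*pi*x_i) + 10); min 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def ackley(x):
    """f20, classical form; min 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    d = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d)
                 + 20.0 + np.e)
```

Both are highly multimodal, which is why they are standard stress tests for an optimizer's ability to escape local optima.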
Table 2. Results on 20 test functions.

| Function | Metric | ISSA | SSA | PSO |
|---|---|---|---|---|
| f1 | Avg | 1.24 × 10^−156 | 4.20 × 10^−70 | 2.22 × 10^−11 |
| f1 | Std | 2.32 × 10^−157 | 7.90 × 10^−71 | 2.50 × 10^−11 |
| f1 | p-val | − | 5.75 × 10^−11 | 1.21 × 10^−12 |
| f1 | h-val | − | 1 | 1 |
| f2 | Avg | 1.56 × 10^−73 | 2.16 × 10^−32 | 3.06 × 10^−03 |
| f2 | Std | 8.46 × 10^−73 | 8.99 × 10^−30 | 4.40 × 10^−03 |
| f2 | p-val | − | 4.57 × 10^−12 | 1.21 × 10^−12 |
| f2 | h-val | − | 1 | 1 |
| f3 | Avg | 2.14 × 10^−08 | 2.64 × 10^−06 | 1.17 × 10^+01 |
| f3 | Std | 1.17 × 10^−07 | 5.30 × 10^−06 | 3.71 × 10^+01 |
| f3 | p-val | − | 3.82 × 10^−10 | 3.02 × 10^−11 |
| f3 | h-val | − | 1 | 1 |
| f4 | Avg | 1.04 × 10^−30 | 3.18 × 10^−22 | 4.18 × 10^−11 |
| f4 | Std | 2.56 × 10^−30 | 5.29 × 10^−22 | 1.21 × 10^−10 |
| f4 | p-val | − | 1.66 × 10^−08 | 1.21 × 10^−12 |
| f4 | h-val | − | 1 | 1 |
| f5 | Avg | 1.66 × 10^−03 | 1.94 × 10^−02 | 3.73 × 10^−03 |
| f5 | Std | 1.48 × 10^−03 | 1.06 × 10^−02 | 2.37 × 10^−03 |
| f5 | p-val | − | 1.89 × 10^−03 | 4.98 × 10^−11 |
| f5 | h-val | − | 1 | 1 |
| f6 | Avg | −4.19 × 10^+03 | −1.26 × 10^+04 | −1.26 × 10^+04 |
| f6 | Std | 1.72 × 10^−12 | 2.78 × 10^+02 | 2.20 × 10^+02 |
| f6 | p-val | − | 5.17 × 10^−12 | 5.18 × 10^−12 |
| f6 | h-val | − | 1 | 1 |
| f7 | Avg | 4.76 × 10^−32 | 7.00 × 10^−30 | 6.71 × 10^−13 |
| f7 | Std | 8.44 × 10^−34 | 3.24 × 10^−29 | 3.96 × 10^−13 |
| f7 | p-val | − | 1.62 × 10^−09 | 1.61 × 10^−11 |
| f7 | h-val | − | 1 | 1 |
| f8 | Avg | 7.27 × 10^−30 | 1.16 × 10^−29 | 7.53 × 10^−12 |
| f8 | Std | 3.97 × 10^−29 | 4.21 × 10^−29 | 1.69 × 10^−11 |
| f8 | p-val | − | 4.38 × 10^−09 | 5.22 × 10^−12 |
| f8 | h-val | − | 1 | 1 |
| f9 | Avg | 9.9 × 10^−01 | 5.99 × 10^+00 | 9.9 × 10^−01 |
| f9 | Std | 1.82 × 10^−16 | 5.70 × 10^+00 | 5.83 × 10^−17 |
| f9 | p-val | − | 3.68 × 10^−01 | 5.14 × 10^−06 |
| f9 | h-val | − | 0 | 1 |
| f10 | Avg | 3.07 × 10^−03 | 3.07 × 10^−03 | 3.06 × 10^−03 |
| f10 | Std | 1.33 × 10^−10 | 7.93 × 10^−11 | 6.90 × 10^−03 |
| f10 | p-val | − | 9.62 × 10^−02 | 4.49 × 10^−11 |
| f10 | h-val | − | 0 | 1 |
| f11 | Avg | 3.90 × 10^+00 | 3.00 × 10^+00 | 3.00 × 10^+00 |
| f11 | Std | 4.93 × 10^+00 | 1.62 × 10^−15 | 1.55 × 10^−15 |
| f11 | p-val | − | 9.8 × 10^−02 | 4.01 × 10^−01 |
| f11 | h-val | − | 0 | 0 |
| f12 | Avg | −3.86 × 10^+00 | −3.86 × 10^+00 | −3.86 × 10^+00 |
| f12 | Std | 2.27 × 10^−15 | 2.66 × 10^−15 | 2.51 × 10^−15 |
| f12 | p-val | − | 9.62 × 10^−03 | 2.14 × 10^−08 |
| f12 | h-val | − | 1 | 1 |
| f13 | Avg | −3.32 × 10^+00 | −3.29 × 10^+00 | −3.25 × 10^+00 |
| f13 | Std | 1.64 × 10^−15 | 5.34 × 10^−02 | 6.65 × 10^−02 |
| f13 | p-val | − | 3.4 × 10^−01 | 4.3 × 10^−02 |
| f13 | h-val | − | 0 | 1 |
| f14 | Avg | −1.01 × 10^+01 | −9.64 × 10^+00 | −5.72 × 10^+00 |
| f14 | Std | 5.22 × 10^−15 | 1.56 × 10^+00 | 3.32 × 10^+00 |
| f14 | p-val | − | 4.59 × 10^−01 | 9.31 × 10^−07 |
| f14 | h-val | − | 0 | 1 |
| f15 | Avg | 1.38 × 10^+00 | 3.30 × 10^+00 | 6.25 × 10^+00 |
| f15 | Std | 7.22 × 10^−05 | 6.31 × 10^−03 | 8.21 × 10^−04 |
| f15 | p-val | − | 3.02 × 10^−11 | 2.52 × 10^−10 |
| f15 | h-val | − | 1 | 1 |
| f16 | Avg | 2.09 × 10^+01 | 5.05 × 10^+01 | 3.12 × 10^+01 |
| f16 | Std | 4.23 × 10^+00 | 1.11 × 10^+01 | 6.03 × 10^+01 |
| f16 | p-val | − | 4.26 × 10^−12 | 4.22 × 10^−07 |
| f16 | h-val | − | 1 | 1 |
| f17 | Avg | 1.22 × 10^+00 | 1.49 × 10^+00 | 1.20 × 10^+00 |
| f17 | Std | 2.11 × 10^−01 | 2.42 × 10^−01 | 1.31 × 10^−01 |
| f17 | p-val | − | 9.6 × 10^−02 | 2.1 × 10^−01 |
| f17 | h-val | − | 0 | 0 |
| f18 | Avg | 3.88 × 10^+02 | 6.13 × 10^+02 | 1.02 × 10^+03 |
| f18 | Std | 2.65 × 10^+02 | 2.98 × 10^+02 | 4.32 × 10^+02 |
| f18 | p-val | − | 2.3 × 10^−02 | 1.21 × 10^−12 |
| f18 | h-val | − | 1 | 1 |
| f19 | Avg | 3.04 × 10^+00 | 3.75 × 10^+00 | 5.02 × 10^+00 |
| f19 | Std | 3.03 × 10^−01 | 6.75 × 10^−01 | 4.42 × 10^−01 |
| f19 | p-val | − | 4.24 × 10^−02 | 1.03 × 10^−08 |
| f19 | h-val | − | 1 | 1 |
| f20 | Avg | 2.09 × 10^+01 | 2.10 × 10^+01 | 2.13 × 10^+01 |
| f20 | Std | 5.24 × 10^−02 | 6.45 × 10^−02 | 1.86 × 10^−01 |
| f20 | p-val | − | 7.38 × 10^−01 | 1.83 × 10^−04 |
| f20 | h-val | − | 0 | 1 |
Table 3. Correlation coefficient between actual PV power and related environmental factors.

| Variable | Mean | Standard Deviation | Correlation Coefficient |
|---|---|---|---|
| Temperature Celsius | 21.17 | 11.05 | 0.4 |
| Relative Humidity | 32.6 | 21.51 | −0.38 |
| Global Horizontal Radiation | 276 | 374.23 | 0.92 |
| Diffuse Horizontal Radiation | 52.13 | 86.19 | 0.56 |
| Wind Direction | 30.39 | 13.83 | −0.03 |
| Daily Rainfall | 0.22 | 1.8 | −0.02 |
| Active Energy Delivered Received | 305,330.16 | 13,454.13 | 0.003 |
| Current Phase Average | 2.8 | 3.62 | 0.99 |
Table 4. Search space for hyperparameters.

| Hyperparameter | Learning Rate | Epochs | First Hidden Layer Nodes | Second Hidden Layer Nodes |
|---|---|---|---|---|
| Lower bound | 0.001 | 1 | 1 | 1 |
| Upper bound | 0.5 | 100 | 100 | 100 |
Table 5. Parameters obtained by optimization.

| Models | Learning Rate | Epochs | First Hidden Layer Nodes | Second Hidden Layer Nodes |
|---|---|---|---|---|
| PSO-LSTM | 0.1927 | 26 | 13 | 85 |
| SSA-LSTM | 0.0892 | 78 | 20 | 64 |
| ISSA-LSTM | 0.0634 | 92 | 65 | 43 |
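Each sparrow's continuous position vector must be decoded into the four LSTM hyperparameters bounded by Table 4. The following is an illustrative decoding (clip to the box, round the integer-valued dimensions); the paper's exact encoding is not shown, so the function name and scheme are assumptions:

```python
import numpy as np

# Bounds from Table 4: learning rate, epochs, first/second hidden-layer nodes.
LOWER = np.array([0.001, 1.0, 1.0, 1.0])
UPPER = np.array([0.5, 100.0, 100.0, 100.0])

def decode_position(pos):
    """Map a sparrow's continuous position to LSTM hyperparameters.

    Clips into the Table 4 search box, keeps the learning rate continuous,
    and rounds epochs and hidden-layer sizes to integers."""
    pos = np.clip(np.asarray(pos, dtype=float), LOWER, UPPER)
    epochs, h1, h2 = (int(round(v)) for v in pos[1:])
    return {"learning_rate": float(pos[0]), "epochs": epochs,
            "hidden1": h1, "hidden2": h2}
```

For example, decoding a position near the reported ISSA-LSTM optimum would yield a learning rate of 0.0634 with 92 epochs and hidden layers of 65 and 43 nodes.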
Table 6. Comparison of experimental results under various weather conditions.

| Model | Metric | Sunny Day | Windy Day | Rainy Day |
|---|---|---|---|---|
| BP | MAE | 0.641 | 1.116 | 0.749 |
| BP | MAPE | 0.128 | 0.527 | 0.679 |
| BP | RMSE | 0.653 | 1.365 | 0.480 |
| PSO-BP | MAE | 0.222 | 0.654 | 0.355 |
| PSO-BP | MAPE | 0.041 | 0.157 | 0.300 |
| PSO-BP | RMSE | 0.264 | 0.948 | 0.319 |
| LSTM | MAE | 0.371 | 0.676 | 0.860 |
| LSTM | MAPE | 0.070 | 0.184 | 0.793 |
| LSTM | RMSE | 0.395 | 0.956 | 0.535 |
| PSO-LSTM | MAE | 0.429 | 0.604 | 0.474 |
| PSO-LSTM | MAPE | 0.094 | 0.166 | 0.415 |
| PSO-LSTM | RMSE | 0.466 | 0.895 | 0.304 |
| SSA-LSTM | MAE | 0.266 | 0.584 | 0.349 |
| SSA-LSTM | MAPE | 0.047 | 0.128 | 0.269 |
| SSA-LSTM | RMSE | 0.313 | 0.843 | 0.163 |
| ISSA-LSTM | MAE | 0.103 | 0.507 | 0.263 |
| ISSA-LSTM | MAPE | 0.021 | 0.134 | 0.194 |
| ISSA-LSTM | RMSE | 0.146 | 0.689 | 0.166 |
Table 7. Comparison of model results in typical seasons in 2021.

| Model | Metric | 2021/02 | 2021/05 | 2021/08 | 2021/11 |
|---|---|---|---|---|---|
| BP | MAE | 0.76 | 0.63 | 0.75 | 1.16 |
| BP | MAPE | 0.53 | 7.27 | 10.64 | 0.77 |
| BP | RMSE | 1.26 | 0.7 | 0.82 | 1.24 |
| PSO-BP | MAE | 1.06 | 0.25 | 0.27 | 0.27 |
| PSO-BP | MAPE | 0.43 | 3.19 | 6.02 | 0.31 |
| PSO-BP | RMSE | 1.89 | 0.31 | 0.35 | 0.37 |
| LSTM | MAE | 0.86 | 0.48 | 0.5 | 0.6 |
| LSTM | MAPE | 0.49 | 5.86 | 3.01 | 0.34 |
| LSTM | RMSE | 1.28 | 0.58 | 0.61 | 0.66 |
| PSO-LSTM | MAE | 0.79 | 0.64 | 0.63 | 0.7 |
| PSO-LSTM | MAPE | 0.48 | 6.38 | 8.25 | 0.49 |
| PSO-LSTM | RMSE | 1.29 | 0.77 | 0.77 | 0.79 |
| SSA-LSTM | MAE | 0.6 | 0.14 | 0.18 | 0.35 |
| SSA-LSTM | MAPE | 0.32 | 2.19 | 11.04 | 0.28 |
| SSA-LSTM | RMSE | 1.12 | 0.23 | 0.25 | 0.4 |
| ISSA-LSTM | MAE | 0.39 | 0.16 | 0.15 | 0.26 |
| ISSA-LSTM | MAPE | 0.21 | 2.26 | 3.34 | 0.27 |
| ISSA-LSTM | RMSE | 0.84 | 0.22 | 0.22 | 0.35 |