Article

Numerical Simulation of Wind Wave Using Ensemble Forecast Wave Model: A Case Study of Typhoon Lingling

1
Operational Systems Development Department, National Institute of Meteorological Sciences, Jeju 63568, Korea
2
Department of Civil Engineering, Kunsan National University, Kunsan 54150, Korea
*
Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2021, 9(5), 475; https://doi.org/10.3390/jmse9050475
Submission received: 19 March 2021 / Revised: 16 April 2021 / Accepted: 26 April 2021 / Published: 28 April 2021
(This article belongs to the Section Ocean Engineering)

Abstract

A wave forecast numerical simulation was performed for Typhoon Lingling around the Korean Peninsula and in the East Asia region using sea surface winds from the 24 members produced by the Ensemble Prediction System for Global (EPSG) of the Korea Meteorological Administration (KMA). Significant wave heights observed by ocean data buoys were used to verify the ensemble wave model, and the results of the ensemble members were analyzed through probabilistic verification. The forecast performance for significant wave height improved by approximately 18% in root mean square error at the three-day lead time compared to that of the deterministic model, and the difference in performance was particularly distinct towards mid-to-late lead times. The ensemble spread remained appropriate even at longer lead times, and each ensemble member run was stable. The probabilistic verification provided information on forecast uncertainty that a deterministic model cannot supply. The areas under all Relative Operating Characteristic (ROC) curves were 0.9 or above, demonstrating good predictive performance, and the ensemble wave model is therefore expected to be useful in identifying and determining hazardous weather conditions.

1. Introduction

The ensemble prediction technique is widely used to compensate for the limitations of a deterministic forecast model, whose prediction errors arise from the initial conditions of the numerical model and the uncertainty of the prediction model itself [1,2,3]. The technique performs probabilistic prediction by considering a range of possible initial conditions, physical processes, and boundary conditions, and it provides both the prediction information of the conventional deterministic forecast model and information on forecast uncertainty [4,5,6]. It therefore has better predictive performance than a deterministic forecast model and is extremely useful for assessing hazardous weather conditions, since it is based on an ensemble of possible marine weather scenarios. Recently, studies have been conducted to improve the prediction accuracy of ensemble models using machine learning and related methods [7,8]. However, economic aspects and computational efficiency must be considered when using an ensemble model, and a high-performance supercomputer capable of processing a large amount of data is required. In addition, because a large volume of prediction data is produced, the results may deviate widely depending on the numerical model, and probabilistic analysis of the data may be difficult. Nevertheless, it remains an extremely useful prediction technique, as it can quantify prediction uncertainty and provide information on various hazardous weather conditions.
Wave forecast models to which the ensemble technique is applied provide prediction information through operational forecasting and are used for probabilistic wave forecast research on hazardous weather [9,10,11,12]. Although the prediction uncertainty of a wave forecast model is affected by physical processes, initial conditions, and boundary conditions, it is known that wave model results are most directly affected by the prediction accuracy of the sea surface winds supplied by atmospheric models [13,14,15]. The result of a wave forecast model thus differs depending on the accuracy of the input sea surface wind, but a deterministic forecast model cannot quantify this prediction uncertainty, since its information is limited to a single realization. The ensemble model, on the other hand, can identify the uncertainty of the prediction, as each ensemble member supplies a different sea surface wind scenario to the wave model [16].
Verification indices for the probabilistic forecast performance of an ensemble model include the Brier score (BS), Brier skill score (BSS), reliability diagram, and Relative Operating Characteristic (ROC), which are generally used to determine the accuracy of probabilistic forecasts and the uncertainty in prediction [17,18,19]. In addition, the ensemble spread, the standard deviation of the members about the ensemble mean, is a representative index for diagnosing the prediction results of an ensemble model [20,21,22]. It is an important diagnostic tool for determining the accuracy of each member's prediction and the performance of the numerical simulation of an actual event, and it is used to verify the probabilistic prediction of the ensemble model.
In this study, a numerical simulation of wave prediction was performed for Typhoon Lingling, which caused great damage to the Korean Peninsula in September 2019, using sea surface wind data produced by the Ensemble Prediction System for Global (EPSG) of the Korea Meteorological Administration (KMA) based on the ensemble prediction technique, and the performance of the third-generation wave model WAVEWATCH III was evaluated by applying probabilistic verification methods. In addition, the prediction accuracy of the ensemble model for marine weather changing rapidly under hazardous conditions was evaluated by comparison with the prediction results of the deterministic forecast model, and the prediction uncertainty was analyzed.

2. Methodology

2.1. Ensemble Wave Model Setup

The computational domain of the ensemble wave model is the Northeast Asian region, spanning latitudes 20–50° N and longitudes 115–150° E, the same as that of the regional wave model operated by KMA. The bathymetry, shown in Figure 1, was established from ETOPO1 and the global self-consistent hierarchical high-resolution shoreline (GSHHS) coastline data provided by the National Geophysical Data Center (NGDC). The model uses a spherical coordinate system with a resolution of 1/12° (approximately 8 km).
The best track of Typhoon Lingling, provided by the Joint Typhoon Warning Center (JTWC), is shown in Figure 1. Lingling formed at 9 a.m. on 2 September 2019 over the sea about 560 km east of Manila, Philippines. At 3 p.m. on 5 September, about 320 km southwest of Okinawa, Japan, it intensified to a central pressure of 940 hPa, a maximum wind speed of 46 m/s, and a wind radius of 390 km (east radius). It then moved north through the East China Sea and passed the southern coast of the Korean Peninsula at 6 a.m. on 7 September 2019. It weakened as it moved north-northeast at a rapid pace along the western coast of the Korean Peninsula and transformed into an extratropical cyclone of 985 hPa over land about 160 km northwest of Vladivostok, Russia, at 9 a.m. on 8 September.
The ensemble wave forecast model is based on WAVEWATCH III, developed by the U.S. National Weather Service, with the wave energy physics package proposed by Ardhuin et al. [23], and uses a version whose accuracy was improved through optimization of the physical variables of the wave model [24,25,26]. For the wind forcing, 10 m sea surface wind data from a total of 24 ensemble members produced by the EPSG of KMA were used. For the initial field, each ensemble member used the prediction results of the model run 12 h earlier, and for the boundary field, boundary data from the regional wave model, a deterministic forecast model with the same computational domain, were used. The ensemble wave model was run twice a day (00 UTC, 12 UTC), predicting up to 120 h from each start time. In this study, the analysis was performed by setting the lead time for prediction performance verification up to 120 h at 3-h intervals (Table 1).
The overall operation diagram of the EPSG and the ensemble wave forecast system is shown in Figure 2. To prevent overloading of computing resources, the EPSG introduces a time-lag technique: it produces the sea surface winds of 11 members predicted 6 h in advance and of 13 members, including the control member, predicted at the model execution time. The prediction was performed up to 240 h at 3-h intervals, and the sea surface wind prediction data of all 24 members were used as input for the wave model.

2.2. Numerical Method

In this study, the ensemble wave model was used to perform numerical simulations of Typhoon Lingling, which passed near the Korean Peninsula in September 2019. The model was run with a total of 24 members, including the control member, at 00 UTC and 12 UTC, and the analysis period was from 00 UTC on 1 September 2019 to 12 UTC on 9 September 2019. To verify the predictive performance of the model, the results predicted up to 120 h at 3-h intervals from each model run were used, and the prediction results were classified and analyzed by lead time relative to the time the model was executed.
The significant wave height data observed at 17 ocean data buoys around the Korean Peninsula operated by KMA (Table 2) were used for the probabilistic verification of the ensemble wave model; cases in which data were lost due to the influence of the typhoon were excluded from the verification. In addition, the prediction performance of the ensemble wave model was compared against a deterministic forecast model, the regional wave model run on the same computational domain as the ensemble model. Like the ensemble wave model, the regional wave model was run twice a day at 00 UTC and 12 UTC, with predictions up to 120 h at 3-h intervals.
As a verification method, the distribution of the model prediction results and the ability of the ensemble members to reproduce the actual event were first diagnosed through the rank histogram and the spread-skill graph, based on comparison of the prediction of each ensemble member with the ocean data buoy observations. The probabilistic wave model's prediction accuracy and its prediction uncertainty under hazardous weather conditions were then evaluated through the probabilistic verification indices of the BS and ROC.

3. Results

3.1. Sea Wind Prediction Results of 24 Ensemble Members

When the initial field of the atmospheric model is produced, the initial fields of 23 members are generated through a perturbation process using the ensemble transform Kalman filter (ETKF). With the control member, which runs the model without added perturbation, and the perturbation members, whose initial fields include the ETKF analysis increments, 24 different sea surface wind predictions were obtained. The sea surface wind predictions of three members out of the 24, from the run at 00 UTC on 5 September 2019, are shown in Figure 3. The sea surface winds of the control member at forecast lead times of +00 h, +24 h, and +48 h are shown in Figure 3a–c; the results of the 13th member are shown in Figure 3d–f, and those of the 24th member in Figure 3g–i. The prediction of each member differed slightly from that of the control member, and the deviation among predictions increased with lead time. This increasing deviation reflects the uncertainty of the prediction model and shows that a deterministic forecast model may carry a large prediction error at the final lead time.

3.2. Ensemble Wave Model Forecasting Results

The wave model was run for 5 days before the typhoon moved north, using the sea surface wind data of the 24 members, in order to simulate a realistic state in which the waves are sufficiently developed by the wind. The spaghetti contours of the 3 m and 5 m significant wave heights predicted at 00 UTC on 3 September 2019 are shown in Figure 4 and Figure 5, respectively, for forecast lead times of 00 h, 24 h, 48 h, and 72 h. Overall, high significant wave heights were predicted around the eye of the typhoon along its track, and differences were found between the ensemble mean and the significant wave height predicted by each of the 24 members. Although the results of the deterministic forecast model tended to be similar to those of the ensemble model, they do not reflect the prediction uncertainty captured by the ensemble, because the deterministic model is run under a single condition; it is therefore of limited use for probabilistic forecasting.
The prediction results of each member were compared with the observations. The significant wave heights predicted by each ensemble member were compared with the observed significant wave heights, and the appropriateness of the ensemble spread was diagnosed through the rank histogram over all buoy locations. The rank histogram is a representative diagnostic tool for evaluating the ensemble spread; it is generated by ranking each observed value within the ensemble members sorted from lowest to highest [17]. A left- or right-skewed rank histogram indicates a bias in the ensemble; a U-shaped histogram indicates that the ensemble spread is too narrow, with observations frequently falling below the lowest or above the highest member; and a dome-shaped histogram indicates that the ensemble spread is too wide. Finally, a flat rank histogram indicates that the observed values fall into each interval of the ensemble with similar probability and that the ensemble spread is appropriate.
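The rank histogram construction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the array shapes and the toy data are assumptions for demonstration.

```python
import numpy as np

def rank_histogram(ensemble, observations):
    """Rank histogram: for each case, count how many ensemble members
    fall below the observation, then tally those ranks over all cases.

    ensemble:     (n_cases, n_members) forecast values
    observations: (n_cases,) observed values
    Returns counts over the n_members + 1 possible rank bins.
    """
    n_cases, n_members = ensemble.shape
    # Rank of each observation within its sorted ensemble (0 .. n_members)
    ranks = np.sum(ensemble < observations[:, None], axis=1)
    return np.bincount(ranks, minlength=n_members + 1)

# Toy check: if observations are statistically indistinguishable from the
# 24 members, the histogram should be roughly flat (the "appropriate
# spread" case); an under-dispersive ensemble piles counts at both ends.
rng = np.random.default_rng(42)
ens = rng.normal(size=(5000, 24))
obs = rng.normal(size=5000)
counts = rank_histogram(ens, obs)
print(len(counts), counts.sum())  # 25 bins covering all 5000 cases
```

With a 24-member ensemble there are 25 rank bins, matching the histograms evaluated per lead time in this section.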
The rank histograms at lead times of 00 h, 24 h, 48 h, and 72 h are shown in Figure 6a–d. Up to a lead time of 24 h, the rank histogram shows a U-shape with biases at both ends, indicating that the ensemble does not spread out enough. After 24 h, the histogram is flat and unbiased, so the spread of the ensemble members can be judged appropriate and the range predicted by the model reliable. The ensemble mean and spread were then compared to determine whether the ensemble members could represent the actual event: the root-mean-square error (RMSE) between the ensemble mean and the observed data was calculated, and the spread, the standard deviation of the members about the ensemble mean, was calculated through Equation (1) [27].
$$\mathrm{SPREAD} = \sqrt{\frac{1}{M}\sum_{m=1}^{M}\left(f_m - \bar{f}\right)^2}$$
where $M$ denotes the ensemble size, $f_m$ the predicted value of member $m$, and $\bar{f}$ the ensemble mean. In general, the correlation between RMSE and spread is known to be high at the beginning of the forecast lead time and to decrease as the forecast continues [20]. When the points lie close to the diagonal of the spread-skill graph, the ensemble spread represents the actual forecast error well. The relationship between the spread and the RMSE of the significant wave height over all buoy observation locations is shown in Figure 7, with the frequency of each ensemble spread interval shown as a histogram. The 72 h forecast deviated slightly more from the diagonal than the early lead times did, while matching the diagonal well up to a spread of approximately 1 m. Because this verification covers a short-term probabilistic prediction for the typhoon period, no significant lead-time dependence of the correlation was found; the results were generally consistent with the diagonal, indicating that the ensemble spread represents the actual event well.
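Equation (1) can be computed directly as below; the five significant wave height values are hypothetical and serve only to illustrate the calculation.

```python
import numpy as np

def ensemble_spread(members):
    """Equation (1): standard deviation of the members about the
    ensemble mean, with the 1/M normalization used in the equation."""
    members = np.asarray(members, dtype=float)
    return float(np.sqrt(np.mean((members - members.mean()) ** 2)))

# Hypothetical significant wave heights (m) from a 5-member subset
hs_members = [2.1, 2.4, 1.9, 2.6, 2.0]
print(round(ensemble_spread(hs_members), 3))  # 0.261
```

In the spread-skill diagnosis, this value would be plotted against the RMSE of the ensemble mean; points near the diagonal indicate a well-calibrated spread.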

3.3. Ensemble Wave Model Forecast Performance and Probabilistic Verification Result

The prediction results of the deterministic forecast model and the ensemble mean were compared with the observed significant wave height, and the bias and RMSE for each lead time were calculated through Equations (2) and (3).
$$\mathrm{bias} = \frac{1}{n}\sum_{i=1}^{n}\left(F_i - A_i\right)$$
$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(F_i - A_i\right)^2}$$
where $F_i$ is the predicted value, $A_i$ is the observed value, and $n$ is the sample size. The results predicted up to 120 h at 3-h intervals from each model run were compared with the significant wave heights observed at the 17 ocean data buoys. For comparison with the deterministic model, the bias and RMSE were averaged over all buoy observation points for each ensemble run. As a result, both the ensemble model and the deterministic model tended to overestimate the waves during the typhoon. The prediction error increased with lead time in both models, as shown in Figure 8. The prediction error of the deterministic forecast model was notably larger than that of the ensemble model, and the deterministic model tended to overestimate after a 3-day lead time. The forecast performance of the ensemble wave model for significant wave height improved by approximately 18% in RMSE at the 3-day lead time compared to the deterministic forecast model. In addition, the typhoon intensity declined after a 96 h forecast lead time, and the positive bias decreased at the same lead time. To explain the variations in bias and RMSE seen in Figure 8, the time series of the observed significant wave height and the predictions of both models at all buoy observation points were compared. Differences were found between wave model runs initialized before and after the typhoon passed the buoy observation points. Overall, the runs initialized before the typhoon passed predicted the maximum significant wave height earlier than observed, which explains why the positive bias and RMSE variations appear later in the forecast lead time.
Wave forecasts initialized after the typhoon had passed showed no significant differences between predicted and observed values over the entire forecast lead time, which appears to have reduced the positive bias and forecast errors.
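Equations (2) and (3) can be sketched as follows; the wave height values are hypothetical, and the averaging over buoys and lead times described above is omitted for brevity.

```python
import numpy as np

def bias(forecast, observed):
    """Equation (2): mean forecast-minus-observation; positive values
    indicate overestimation, as found for both models during the typhoon."""
    f, a = np.asarray(forecast, dtype=float), np.asarray(observed, dtype=float)
    return float(np.mean(f - a))

def rmse(forecast, observed):
    """Equation (3): root-mean-square error."""
    f, a = np.asarray(forecast, dtype=float), np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((f - a) ** 2)))

# Hypothetical significant wave heights (m) at one buoy
hs_pred = [3.2, 4.1, 5.0, 4.6]
hs_obs = [3.0, 4.0, 4.5, 4.3]
print(round(bias(hs_pred, hs_obs), 3), round(rmse(hs_pred, hs_obs), 3))  # 0.275 0.312
```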
The BS and ROC were used to verify the probabilistic prediction performance of the ensemble wave model. The BS is a representative probabilistic forecast verification index that determines the probabilistic prediction accuracy of the ensemble model. It consists of a total of three terms, as shown in Equation (4) [28].
$$\mathrm{BS} = \frac{1}{N}\sum_{k=1}^{K} n_k \left(f_k - \bar{o}_k\right)^2 - \frac{1}{N}\sum_{k=1}^{K} n_k \left(\bar{o}_k - \bar{o}\right)^2 + \bar{o}\left(1 - \bar{o}\right)$$
where $N$ is the number of forecasts of the actual event, $f_k$ and $\bar{o}_k$ are the forecast probability and the mean observation frequency in probability interval $k$, $n_k$ is the number of forecasts in that interval, and $\bar{o}$ is the mean of the total observation frequencies. The first term is the reliability, which measures how close the forecast probability is to the actual frequency of occurrence; a reliability of 0 indicates a perfect prediction. The second term is the resolution, which measures how far the observed frequencies in the different probability intervals are from the overall mean observed frequency. The third term is the uncertainty, which expresses the uncertainty contained in the actual phenomenon; it represents the difficulty of the prediction situation and is unrelated to the accuracy of the forecast probabilities. The sum of these three terms gives the BS, and a value close to 0 indicates high probabilistic prediction accuracy. The total BS and its components for the ensemble wave model, calculated using Equation (4), are shown in Figure 9. The BS increased as the lead time increased, and although the frequency of occurrence decreased for large significant wave height thresholds, the accuracy of the ensemble prediction was higher than for small thresholds.
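The three-term decomposition in Equation (4) can be sketched as follows. The binning scheme (11 equal-width probability bins) and the toy data are assumptions for illustration; the exact binning used in the study is not stated.

```python
import numpy as np

def brier_decomposition(prob, event, n_bins=11):
    """Equation (4): BS = reliability - resolution + uncertainty.

    prob:  forecast probabilities in [0, 1], e.g. the fraction of the 24
           members exceeding a significant wave height threshold
    event: binary observed outcomes (1 if the threshold was exceeded)
    """
    prob = np.asarray(prob, dtype=float)
    event = np.asarray(event, dtype=float)
    N = len(prob)
    o_bar = event.mean()                      # mean observed frequency
    # Assign each forecast to one of n_bins equal-width probability bins
    idx = np.minimum((prob * n_bins).astype(int), n_bins - 1)
    rel = res = 0.0
    for k in range(n_bins):
        mask = idx == k
        n_k = int(mask.sum())
        if n_k == 0:
            continue
        f_k = prob[mask].mean()               # mean forecast prob in bin k
        o_k = event[mask].mean()              # observed frequency in bin k
        rel += n_k * (f_k - o_k) ** 2         # reliability term
        res += n_k * (o_k - o_bar) ** 2       # resolution term
    unc = o_bar * (1.0 - o_bar)               # uncertainty term
    return rel / N - res / N + unc

# Toy example: forecast probabilities of 0, 0.5, and 1 with mixed outcomes
p = [0.0, 0.0, 0.5, 0.5, 1.0, 1.0]
o = [0, 0, 0, 1, 1, 1]
print(round(brier_decomposition(p, o), 4))  # 0.0833
```

For forecasts that take a single probability value per bin, as here, the decomposition equals the direct score mean((p − o)²).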
The ROC is a method of evaluating the predictive performance of a binary classification system; it is a graph of the true-positive rate against the false-positive rate for events above a certain threshold [29,30]. The x-axis of the graph is the false-positive rate, and the y-axis is the true-positive rate. An area under the ROC curve close to 1 indicates a perfect forecast, while an area of 0.5 or below indicates no forecast skill. The ROC graph for a significant wave height threshold of 2 m is shown in Figure 10. Overall, the forecast accuracy was relatively high, since the area under the curve was close to 1, although it tended to decrease slightly as the forecast lead time increased. The same trend was found in the ROC graph for a significant wave height threshold of 4 m, shown in Figure 11; although the observation frequency decreased as the significant wave height threshold increased, the probabilistic predictive performance tended to increase.
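The ROC construction described above can be sketched as follows; the set of probability thresholds and the toy data are assumptions for demonstration.

```python
import numpy as np

def roc_curve(prob, event, n_thresholds=26):
    """True-positive (hit) rate vs false-positive (false alarm) rate as
    the probability threshold for issuing a 'yes' forecast is varied."""
    prob = np.asarray(prob, dtype=float)
    event = np.asarray(event, dtype=bool)
    pts = []
    for t in np.linspace(0.0, 1.0, n_thresholds):
        yes = prob >= t
        hits = np.sum(yes & event)
        misses = np.sum(~yes & event)
        false_alarms = np.sum(yes & ~event)
        correct_negs = np.sum(~yes & ~event)
        tpr = hits / max(hits + misses, 1)
        fpr = false_alarms / max(false_alarms + correct_negs, 1)
        pts.append((fpr, tpr))
    return np.array(pts)

def roc_area(pts):
    """Area under the ROC curve by the trapezoidal rule; 1 is a perfect
    forecast, 0.5 or below indicates no skill."""
    pts = pts[np.argsort(pts[:, 0])]
    x, y = pts[:, 0], pts[:, 1]
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))

# A perfect probabilistic forecast gives an area of 1
pts = roc_curve([0.0, 0.0, 1.0, 1.0], [0, 0, 1, 1])
print(round(roc_area(pts), 2))  # 1.0
```

In the study, `prob` would be the fraction of the 24 members exceeding the 2 m or 4 m significant wave height threshold at each buoy and lead time.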

4. Conclusions

In this study, a numerical simulation of the wave forecast during Typhoon Lingling's northward movement was performed using the sea surface wind forecast fields of the EPSG, and the forecast performance of the ensemble wave model was verified using the significant wave heights observed by the ocean data buoys around the Korean Peninsula.
As a result of the probabilistic verification, the appropriateness of the ensemble spread distribution was diagnosed through the rank histogram, and the spread was found to be appropriate and unbiased after a 24 h lead time. Verification with the BS showed that the accuracy of the ensemble model's probabilistic prediction decreased as the lead time increased, and that the BS approached 0 as the significant wave height threshold increased. For the ROC at a significant wave height of 2 m or above, the area under the curve remained near 1 but gradually decreased as the lead time increased, and the same trend was observed at a threshold of 4 m or above. This means that the accuracy of probabilistic prediction decreased with lead time; although the observation frequency decreased as the significant wave height threshold increased, the forecast performance tended to improve for the higher thresholds.
The averaged RMSE of the ensemble members became smaller than that of the deterministic forecast model run over the same period as the lead time increased, particularly after a 3-day lead time. The forecast performance of the ensemble model was relatively stable despite the rapid changes in the sea surface wind caused by the typhoon. Furthermore, the simulation results of the ensemble forecast model include uncertainty information: the prediction uncertainty is reflected in the wave model through the ensemble technique, which shows improved accuracy and forecast performance compared to a deterministic forecast model. Although the ensemble model requires more computation time than the deterministic model, it achieves greater prediction accuracy. Probabilistic forecast performance was secured even though this was a short-term verification performed during the typhoon period, and with sufficient verification data the system is expected to provide further improved accuracy for probabilistic forecasting.

Author Contributions

Conceptualization, M.R. and H.-S.K.; methodology, M.R., H.-S.K. and P.-H.C.; software, M.R. and S.-M.O.; validation, M.R. and S.-M.O.; formal analysis, M.R.; data curation, H.-S.K.; writing—original draft preparation, M.R.; writing—review and editing, M.R. and H.-S.K.; supervision, P.-H.C. and S.-M.O.; project administration, P.-H.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Korea Meteorological Administration Research and Development Program “Development of Marine Meteorology Monitoring and Next-generation Ocean Forecasting System” under Grant (KMA2018-00323).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data and model output that support the findings of this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Molteni, F.; Buizza, R.; Palmer, T.N.; Petroliagis, T. The ECMWF ensemble prediction system: Methodology and validation. Q. J. R. Meteorol. Soc. 1996, 122, 73–119. [Google Scholar] [CrossRef]
  2. Palmer, T.N. A nonlinear dynamical perspective on model error: A proposal for non-local stochastic-dynamic parameterization in weather and climate prediction models. Q. J. R. Meteorol. Soc. 2001, 127, 279–304. [Google Scholar]
  3. Campos, R.M.; Alves, J.-H.G.M.; Penny, S.G.; Krasnopolsky, V. Global assessments of the NCEP Ensemble Forecast System using altimeter data. Ocean Dyn. 2019, 70, 405–419. [Google Scholar] [CrossRef]
  4. Buizza, R.; Miller, M.; Palmer, T.N. Stochastic representation of model uncertainties in the ECMWF ensemble prediction system. Q. J. R. Meteorol. Soc. 1999, 125, 2887–2908. [Google Scholar] [CrossRef]
  5. Harrison, M.S.J.; Palmer, T.N.; Richardson, D.S.; Buizza, R. Analysis and model dependencies in medium-range ensembles: Two transplant case-studies. Q. J. R. Meteorol. Soc. 1999, 125, 2487–2515. [Google Scholar] [CrossRef]
  6. Lenartz, F.; Beckers, J.-M.; Chiggiato, J.; Mourre, B.; Troupin, C.; Vandenbulcke, L.; Rixen, M. Super-ensemble techniques applied to wave forecast: Performance and limitations. Ocean Sci. 2010, 6, 595–604. [Google Scholar] [CrossRef] [Green Version]
  7. O’Donncha, F.; Zhang, Y.; Chen, B.; James, S.C. Ensemble model aggregation using a computationally lightweight machine-learning model to forecast ocean waves. J. Mar. Syst. 2019, 199, 103206. [Google Scholar] [CrossRef] [Green Version]
  8. Campos, R.M.; Krasnopolsky, V.; Alves, J.-H.G.M.; Penny, S.G. Improving NCEP’s global-scale wave ensemble averages using neural networks. Ocean Model. 2020, 149, 101617. [Google Scholar] [CrossRef]
  9. Saetra, Ø.; Bidlot, J.R. Potential benefits of using probabilistic forecasts for waves and marine winds based on the ECMWF Ensemble Prediction System. Weather Forecast. 2004, 19, 673–689. [Google Scholar] [CrossRef]
  10. Chen, H.S. Ensemble Prediction of Ocean Waves at NCEP. In Proceedings of the 28th Ocean Engineering Conference, NSYSU, Taipei, Taiwan, 30 November–2 December 2006; pp. 25–37. [Google Scholar]
  11. Cao, D.; Tolman, H.L.; Chen, H.S.; Chawla, A.; Gerald, V.M. Performance of the ocean wave ensemble forecast system at NCEP. In Proceedings of the 11th International Workshop on Wave Hindcasting & Forecasting and 2nd Coastal Hazards Symposium, Halifax, NS, Canada, 18–23 October 2009. [Google Scholar]
  12. Chang, H.-W.; Yen, C.-C.; Lin, M.-C.; Chu, C.-H. Establishment and performance of the ocean wave ensemble forecast system at CWB. J. Mar. Sci. Technol. 2017, 25, 588–598. [Google Scholar]
  13. Tuomi, L.; Pettersson, H.; Fortelius, C.; Tikka, K.; Björkqvist, J.-V.; Kahma, K.K. Wave modelling in archipelagos. Coast. Eng. 2014, 83, 205–220. [Google Scholar] [CrossRef]
  14. Gorman, R.M.; Oliver, H.J. Automated model optimisation using the Cylc workflow engine (Cyclops v1.0). Geosci. Model Dev. 2018, 11, 2153–2173. [Google Scholar] [CrossRef] [Green Version]
  15. Osinski, R.D.; Radtke, H. Ensemble hindcasting of wind and wave conditions with WRF and WAVEWATCH IIIr driven by ERA5. Ocean Sci. 2020, 16, 355–371. [Google Scholar] [CrossRef] [Green Version]
  16. Pardowitz, T.; Befort, D.J.; Leckebusch, G.C.; Ulbrich, U. Estimating uncertainties from high resolution simulations of extreme wind storms and consequences for impacts. Meteorol. Z. 2016, 25, 531–541. [Google Scholar] [CrossRef]
  17. Hamill, T.M. Interpretation of rank histograms for verifying ensemble forecasts. Mon. Weather Rev. 2001, 129, 550–560. [Google Scholar] [CrossRef]
  18. Cao, D.; Chen, H.S.; Tolman, H.L. Verification of Ocean Wave Ensemble Forecast at NCEP. In Proceedings of the 10th International Workshop on Wave Hindcasting and Forecasting & Coastal Hazards Symposium, Oahu, HI, USA, 11–16 November 2007. [Google Scholar]
  19. Bunney, C.; Saulter, A. An ensemble forecast system for prediction of Atlantic–UK wind waves. Ocean Model. 2015, 96, 103–116. [Google Scholar] [CrossRef]
  20. Barker, T.W. The relationship between spread and forecast error in extended-range forecasts. J. Clim. 1991, 4, 733–742. [Google Scholar] [CrossRef] [Green Version]
  21. Whitaker, J.S.; Loughe, A.F. The relationship between ensemble spread and ensemble mean skill. Mon. Weather Rev. 1998, 126, 3292–3302. [Google Scholar] [CrossRef] [Green Version]
  22. Hopson, T.M. Assessing the ensemble spread–error relationship. Mon. Weather Rev. 2014, 142, 1125–1142. [Google Scholar] [CrossRef]
  23. Tolman, H.L.; Chalikov, D. Source terms in a third-generation wind wave model. J. Phys. Oceanogr. 1996, 26, 2497–2518. [Google Scholar] [CrossRef] [Green Version]
  24. Hasselmann, K.; Barnett, T.P.; Bouws, E.; Carlson, H.; Cartwright, D.E.; Enke, K.; Ewing, A.; Gienapp, H.; Hasselmann, D.E.; Kruseman, P.; et al. Measurements of windwave growth and swell decay during the Joint North Sea Wave Project (JONSWAP). Dtsch. Hydrogr. Z. 1973, 8, 1–95. [Google Scholar]
  25. Ardhuin, F.; Rogers, E.; Babanin, A.V.; Filipot, J.-F.; Magne, R.; Roland, A.; Westhuysen, A.; Queffeulou, P.; Lefevre, J.-M.; Aouf, L.; et al. Semiempirical dissipation source functions for ocean waves. Part I: Definition, Calibration, and Validation. J. Phys. Oceanogr. 2010, 40, 1917–1941. [Google Scholar] [CrossRef] [Green Version]
  26. Park, J.S.; Kang, K.K.; Kang, H.-S. Development and evaluation of an ensemble forecasting system for the regional ocean wave of Korea. J. Korean Soc. Coast. Ocean Eng. 2018, 30, 84–94. [Google Scholar] [CrossRef]
  27. Palmer, T.; Buizza, R.; Hagedorn, R.; Lawrence, A.; Leutbecher, M.; Smith, L. Ensemble prediction: A pedagogical perspective. ECMWF Newsl. 2005, 106, 10–17. [Google Scholar]
  28. Murphy, A.H.; Winkler, R.L. Diagnostic verification of probability forecasts. Int. J. Forecast. 1992, 7, 435–455. [Google Scholar] [CrossRef]
  29. Mason, I. A model for assessment of weather forecasts. Aust. Meteor. Mag. 1982, 30, 291–303. [Google Scholar]
  30. Chang, H.-L.; Yang, S.-C.; Yuan, H.; Lin, P.-L.; Liou, Y.-C. Analysis of the relative operating characteristic and economic value using the LAPS Ensemble Prediction System in Taiwan. Mon. Weather Rev. 2015, 143, 1833–1848. [Google Scholar] [CrossRef]
Figure 1. Ensemble wave model computational domain and Typhoon Lingling best track.
Figure 2. Operation diagram of the forecast system for EPSG and the ensemble wave model. The 24 EPSG ensemble sea-wind members comprise 13 members (including the control) initialized at the same times as the wave model runs (00 UTC and 12 UTC) and 11 members initialized 6 h before the wave model runs (18 UTC and 06 UTC).
Figure 3. Simulated wind vectors and speeds (m/s) of the lead time +00 h, +24 h, and +48 h on the EPSG ((ac): control member, (df): 13th member, and (gi): 24th member).
Figure 4. Comparison of the results of each ensemble model member with a significant wave height of 3 m, the result of the ensemble mean, and the result of a deterministic forecast model (forecast lead times of (a) +00 h, (b) +24 h, (c) +48 h, and (d) +72 h).
Figure 5. Comparison of the results of each ensemble model member with a significant wave height of 5 m, the result of the ensemble mean, and the result of a deterministic forecast model (forecast lead times of (a) +00 h, (b) +24 h, (c) +48 h, and (d) +72 h).
Figure 6. Rank histogram to determine the suitability of ensemble spread for all observation points with forecasting lead times of (a) +00 h, (b) +24 h, (c) +48 h, and (d) +72 h.
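The rank histograms in Figure 6 count how often the observed significant wave height falls into each position among the sorted ensemble members; a roughly flat histogram indicates an appropriate ensemble spread. A minimal sketch of this standard computation (illustrative toy arrays, not the study's verification data):

```python
import numpy as np

def rank_histogram(ensemble, obs):
    """Histogram of observation ranks among the ensemble members.

    ensemble: (n_cases, n_members) forecast values (e.g. wave height, m)
    obs:      (n_cases,) matching buoy observations
    Returns counts for each possible rank 0..n_members.
    """
    n_members = ensemble.shape[1]
    # Rank = number of members strictly below the observation
    ranks = (ensemble < obs[:, None]).sum(axis=1)
    return np.bincount(ranks, minlength=n_members + 1)

# Toy example: 3 members, 2 verification cases
ens = np.array([[1.0, 2.0, 3.0],
                [0.5, 1.5, 2.5]])
obs = np.array([2.5, 0.2])  # ranks 2 and 0
print(rank_histogram(ens, obs).tolist())  # [1, 0, 1, 0]
```

An under-dispersive ensemble piles counts into the outermost bins (observations falling outside the member envelope), which is what Figure 6 is designed to reveal.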
Figure 7. Spread–skill relationship for significant wave height for forecast lead times of (a) +00 h and (b) +72 h.
Figure 8. (a) bias and (b) RMSE for significant wave heights of ensemble model and deterministic model by forecast lead time (Analysis period: 00 UTC 1 September 2019, to 12 UTC 9 September 2019).
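The bias and RMSE in Figure 8 are the usual deterministic-style error measures applied to the ensemble mean against buoy observations. A minimal sketch with made-up numbers (not the paper's data):

```python
import numpy as np

def bias_rmse(forecast, obs):
    """Mean error (bias) and root mean square error of a forecast series."""
    err = np.asarray(forecast) - np.asarray(obs)
    return float(err.mean()), float(np.sqrt((err ** 2).mean()))

# Toy series of significant wave heights (m)
b, r = bias_rmse([2.0, 3.0, 4.0], [2.5, 2.5, 3.5])
print(b, r)  # 0.1666... 0.5
```

Computing these per lead time, as in Figure 8, shows how the error of the ensemble mean grows more slowly than that of a single deterministic run.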
Figure 9. Comparison of the Brier score of forecast lead time according to threshold for significant wave heights of (a) 2 m or above and (b) 4 m or above.
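The Brier scores in Figure 9 measure the mean squared error of the probabilistic forecast for a threshold-exceedance event, with the forecast probability taken as the fraction of ensemble members exceeding the threshold. A hedged sketch of that definition (toy inputs only):

```python
import numpy as np

def brier_score(ensemble, obs, threshold):
    """Brier score for the event 'significant wave height >= threshold'.

    ensemble: (n_cases, n_members) forecast values
    obs:      (n_cases,) observed values
    """
    p = (ensemble >= threshold).mean(axis=1)  # ensemble-derived probability
    o = (obs >= threshold).astype(float)      # observed occurrence (0 or 1)
    return float(np.mean((p - o) ** 2))

# Two cases, two members, 2 m threshold
ens = np.array([[1.0, 3.0],
                [5.0, 5.0]])
obs = np.array([4.0, 1.0])
print(brier_score(ens, obs, 2.0))  # 0.625
```

A score of 0 is perfect; lower is better, so the curves in Figure 9 can be compared directly across lead times and thresholds.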
Figure 10. ROC with significant wave height of 2 m or above with forecasting lead times of (a) +00 h, (b) +24 h, (c) +48 h, and (d) +72 h.
Figure 11. ROC with significant wave height of 4 m or above with forecasting lead times of (a) +00 h, (b) +24 h, (c) +48 h, and (d) +72 h.
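Each point on the ROC curves in Figures 10 and 11 pairs the hit rate against the false-alarm rate obtained by issuing a warning whenever the ensemble exceedance probability passes a given decision threshold; the area under the curve (0.9 or above here) summarizes the discrimination skill. A simplified sketch of how such points and a trapezoidal area could be computed (synthetic inputs for illustration):

```python
def roc_points(prob, occurred, thresholds):
    """(false-alarm rate, hit rate) for each probability decision threshold.

    prob:     forecast probabilities of the event (e.g. Hs >= 4 m)
    occurred: booleans, whether the event was observed
    """
    n_event = sum(occurred)
    n_nonevent = len(occurred) - n_event
    pts = []
    for t in thresholds:
        warned = [p >= t for p in prob]
        hits = sum(w and o for w, o in zip(warned, occurred))
        false_alarms = sum(w and not o for w, o in zip(warned, occurred))
        pts.append((false_alarms / n_nonevent, hits / n_event))
    return pts

def roc_area(pts):
    """Trapezoidal area under the ROC curve, closed at (0,0) and (1,1)."""
    pts = sorted(pts + [(0.0, 0.0), (1.0, 1.0)])
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# Perfectly discriminating toy forecast: area = 1.0
pts = roc_points([0.9, 0.8, 0.2, 0.1], [True, True, False, False], [0.5])
print(roc_area(pts))  # 1.0
```

An area of 0.5 corresponds to no skill (the diagonal), so areas of 0.9 or above, as reported in the abstract, indicate strong discrimination between exceedance and non-exceedance cases.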
Table 1. Description of ensemble wave forecast model.
Model: WAVEWATCH III Ver. 4.18
Coordinate: Spherical coordinate
Domain: 115° E–150° E, 20° N–50° N
Resolution: 1/12° × 1/12° (421 × 361)
Forecast Time: +120 h (3 h time interval)
Initial Condition: 12 h forecast from the previous run
Boundary Condition: From the regional wave model
Wind Forcing Data: EPSG (UM N400 L70 M49) 10 m sea winds
Model Cycle: 2/day (00 UTC, 12 UTC)
Ensemble Members: 24
Table 2. Location information of ocean data buoys (KMA).
No. | Buoy ID | Location | Longitude (Deg.) | Latitude (Deg.)
1 | 22104 | Geojedo | 128.90 | 34.77
2 | 22106 | Pohang | 129.78 | 36.35
3 | 22188 | Tongyeong | 128.23 | 34.39
4 | 22189 | Ulsan | 129.84 | 35.35
5 | 22101 | Deokjeokdo | 126.02 | 37.23
6 | 22108 | Oeyeondo | 125.75 | 36.25
7 | 22185 | Incheon | 125.43 | 37.09
8 | 22105 | Donghae | 130.00 | 37.53
9 | 21229 | Ulleungdo | 131.11 | 37.46
10 | 22190 | Uljin | 129.87 | 36.91
11 | 22102 | Chilbaldo | 125.77 | 34.80
12 | 22103 | Geomundo | 127.50 | 34.00
13 | 22184 | Chujado | 126.14 | 33.79
14 | 22183 | Shinan | 126.24 | 34.73
15 | 22186 | Buan | 125.81 | 35.66
16 | 22187 | Seogwipo | 127.02 | 33.13
17 | 22107 | Marado | 126.03 | 33.08

Roh, M.; Kim, H.-S.; Chang, P.-H.; Oh, S.-M. Numerical Simulation of Wind Wave Using Ensemble Forecast Wave Model: A Case Study of Typhoon Lingling. J. Mar. Sci. Eng. 2021, 9, 475. https://doi.org/10.3390/jmse9050475