Numerical Simulation and Analysis in Applied Mechanics and Engineering

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Computational and Applied Mathematics".

Deadline for manuscript submissions: 31 December 2024 | Viewed by 7628

Special Issue Editors


Prof. Dr. Kai Zhang
Guest Editor
School of Civil Engineering, Qingdao University of Technology, Qingdao 266520, China
Interests: evolutionary computation; machine learning; production optimization; history matching; development of unconventional reservoirs

Dr. Piyang Liu
Guest Editor
School of Civil Engineering, Qingdao University of Technology, Qingdao 266520, China
Interests: numerical modeling; machine learning; computational fluid dynamics

Special Issue Information

Dear Colleagues,

Numerical simulations based on methods such as the finite difference, finite volume, finite element, boundary element, and meshless methods have developed rapidly in recent decades across applied mechanics and engineering, and a wide variety of numerical methods have been proposed for problems in different fields. With the development of machine learning theory, numerical simulation combined with machine learning will play an increasingly important role in studying problems in applied mechanics and engineering.

This Special Issue intends to collect original research and review articles on the most recent developments in numerical simulation and analysis, especially for complicated problems. Manuscripts on the theory of numerical simulation and analysis for problems in applied mechanics, engineering, or science are also welcome. We are particularly interested in manuscripts that explore the interplay between numerical simulation and machine learning, in applications of machine learning in civil engineering, petroleum engineering, water resources, geotechnical engineering, and other emerging fields, and in the development of numerical simulation and analysis methods based on machine learning.

Prof. Dr. Kai Zhang
Dr. Piyang Liu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • new theories for numerical simulation
  • numerical simulation for complicated problems
  • machine learning applications in numerical simulation
  • reservoir characterization and modeling
  • pore- and reservoir-scale simulation studies
  • production optimization
  • future strategies of numerical simulation and machine learning

Published Papers (6 papers)


Research


31 pages, 17358 KiB  
Article
Non-Invasive Identification of Vehicle Suspension Parameters: A Methodology Based on Synthetic Data Analysis
by Alfonso de Hoyos Fernández de Córdova, José Luis Olazagoitia and Carlos Gijón-Rivera
Mathematics 2024, 12(3), 397; https://doi.org/10.3390/math12030397 - 26 Jan 2024
Viewed by 768
Abstract
In this study, we introduce an innovative approach for the identification of vehicle suspension parameters, employing a methodology that utilizes synthetic and experimental data for non-invasive analysis. Central to our approach is the application of a basic local optimization algorithm, chosen to establish a baseline for parameter identification in increasingly complex vehicle models, ranging from quarter-vehicle to half-vehicle (bicycle) models. This methodology enables the accurate simulation of the vehicle dynamics and the identification of suspension parameters under various conditions, including road perturbations such as speed bumps and curbs, as well as in the presence of noise. A significant aspect of our work is the ability to process real-world data, making it applicable in practical scenarios where data are obtained from onboard sensor equipment. The methodology was developed in MATLAB, ensuring portability across platforms that support this software. Furthermore, the study explores the application of this methodology as a tool for denoising, enhancing its utility in real-world data analysis and predictive maintenance. The findings of this research provide valuable insights for vehicle suspension design, offering a cost-effective and efficient solution for dynamic parameter identification without the need for physical disassembly.
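As a toy illustration of this kind of baseline identification (a sketch only, not the authors' code: the simplified 1-DOF model, the mass, the "true" stiffness and damping, and the bump profile below are all invented), suspension parameters can be recovered from a synthetic response with a local least-squares fit:

```python
# Hypothetical sketch (not the paper's code): recover suspension stiffness k and
# damping c of a simplified 1-DOF quarter-car model from a synthetic response.
# All parameter values and the bump profile are invented.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

M = 300.0                            # sprung mass [kg] (assumed)
K_TRUE, C_TRUE = 20000.0, 1500.0     # "unknown" stiffness [N/m] and damping [N s/m]

def road(t):
    # half-sine speed bump, 0.05 m high, crossed between t = 0.5 s and 1.0 s
    return np.where((t > 0.5) & (t < 1.0), 0.05 * np.sin(2.0 * np.pi * (t - 0.5)), 0.0)

def simulate(k, c, t_eval):
    def rhs(t, s):
        x, v = s                     # body displacement and velocity
        y = road(t)
        return [v, (k * (y - x) - c * v) / M]   # base-velocity term dropped for brevity
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [0.0, 0.0],
                    t_eval=t_eval, max_step=0.01, rtol=1e-8, atol=1e-10)
    return sol.y[0]

t = np.linspace(0.0, 3.0, 300)
measured = simulate(K_TRUE, C_TRUE, t)   # synthetic "onboard sensor" data

def residual(p):
    return simulate(p[0], p[1], t) - measured

fit = least_squares(residual, x0=[10000.0, 500.0],
                    bounds=([1e3, 1e2], [1e5, 1e4]), diff_step=1e-3)
k_hat, c_hat = fit.x                 # identified parameters
```

In the paper's setting the same idea is applied to the full quarter- and half-vehicle equations and to noisy experimental signals rather than this noiseless toy problem.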

26 pages, 1500 KiB  
Article
Construction of Software Supply Chain Threat Portrait Based on Chain Perspective
by Maoyang Wang, Peng Wu and Qin Luo
Mathematics 2023, 11(23), 4856; https://doi.org/10.3390/math11234856 - 02 Dec 2023
Viewed by 1060
Abstract
With the rapid growth of the software industry, the software supply chain (SSC) has become the most intricate system in the complete software life cycle, and the security threat situation is becoming increasingly severe. For the description of the SSC, the relevant research mainly focuses on the perspective of developers, lacking a comprehensive understanding of the SSC. This paper proposes a chain portrait framework of the SSC based on a resource perspective, which comprehensively depicts the threat model and threat surface indicator system of the SSC. The portrait model includes an SSC threat model and an SSC threat indicator matrix. The threat model has 3 levels and 32 dimensions and is based on a generative artificial intelligence model. The threat indicator matrix is constructed using the Attack Net model comprising 14-dimensional attack strategies and 113-dimensional attack techniques. The proposed portrait model’s effectiveness is verified through existing SSC security events, domain experts, and event visualization based on security analysis models.

16 pages, 1614 KiB  
Article
Mathematical Modeling and Multi-Criteria Optimization of Design Parameters for the Gyratory Crusher
by Vitalii P. Kondrakhin, Nikita V. Martyushev, Roman V. Klyuev, Svetlana N. Sorokova, Egor A. Efremenkov, Denis V. Valuev and Qi Mengxu
Mathematics 2023, 11(10), 2345; https://doi.org/10.3390/math11102345 - 17 May 2023
Cited by 24 | Viewed by 1433
Abstract
There are a sufficient number of works devoted to modeling crushing machines. Nevertheless, the large number of working conditions and the ongoing development of science and technology require continuous improvement and refinement of models of crushing processes and the devices concerned. However, there are few studies of single-roll gyratory crushers. Such crushers are promising for use in mines to crush rocks laid in the developed space. In this paper, mathematical modeling and optimization of the design parameters of the working chamber and the executive body (roll) of a single-roll gyratory shaft crusher, designed for crushing strong rocks, were performed. A differential equation was derived whose solution establishes the rational shape of the working chamber cheek of the single-roll gyratory crusher: a logarithmic spiral arc. Analytical expressions were derived for the rational rotation speed and the productivity of the crusher under consideration. Expressions for calculating the kinematic load components acting on the roll were formulated; they are periodic functions of the shaft rotation angle. A Fourier series expansion showed that the loads contain harmonics of the first, second, third and fourth orders. Using the concept of fuzzy sets, a multi-criteria optimization of the design parameters of the working chamber was performed, including the values of the eccentricity and the central angle of the beginning of the cheek profile. The variation coefficients of the kinematic load components acting on the working body decreased owing to the optimal choice of the working chamber profile and the angular coordinates of the installation of the fixed cheeks: the torque was reduced by a factor of 1.67 and the radial load by a factor of 1.2.
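The harmonic decomposition step can be illustrated numerically. In the small sketch below (the torque profile and its amplitudes are invented, not taken from the paper), the Fourier amplitudes of a load sampled over one shaft revolution follow directly from an FFT:

```python
# Hypothetical sketch: check the harmonic content of a periodic load sampled
# over one shaft revolution. The torque profile and amplitudes are invented.
import numpy as np

N = 360                                          # samples per revolution
phi = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)

# synthetic torque: mean value plus harmonics of orders 1-4 (made-up amplitudes)
torque = (5.0 + 2.0 * np.cos(phi) + 1.0 * np.sin(2.0 * phi)
          + 0.5 * np.cos(3.0 * phi) + 0.25 * np.sin(4.0 * phi))

spec = np.fft.rfft(torque) / N                   # one-sided spectrum, normalized
amplitudes = np.abs(spec)
amplitudes[1:] *= 2.0                            # restore the discarded mirror half

orders = amplitudes[:5]                          # harmonic amplitudes of orders 0-4
```

Because the synthetic signal is sampled exactly on the grid, the recovered amplitudes equal the assumed ones (5.0, 2.0, 1.0, 0.5, 0.25) with no spectral leakage.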

23 pages, 8439 KiB  
Article
An Axially Compressed Moving Nanobeam Based on the Nonlocal Couple Stress Theory and the Thermoelastic DPL Model
by Ahmed E. Abouelregal, S. S. Askar and Marin Marin
Mathematics 2023, 11(9), 2155; https://doi.org/10.3390/math11092155 - 04 May 2023
Cited by 3 | Viewed by 924
Abstract
This article introduces a new model that can be used to describe elastic thermal vibrations caused by changes in temperature in elastic nanobeams in response to transverse external excitations. Using the idea of nonlocal elasticity and the dual-phase lagging thermoelastic model (DPL), the coupled equations of motion and heat transfer were derived to explain small-scale effects. Additionally, modified couple stress theory (MCST) and Euler–Bernoulli (EB) beam assumptions were considered. The proposed theory was verified by considering the thermodynamic response of nanobeams moving horizontally at a constant speed while one end is subjected to a periodic thermal load. The system of governing equations has been solved numerically with the help of Laplace transforms and one of the tested evolutionary algorithms. The effects of changing the nonlocal modulus, the magnitude of the external force, and the length scale parameter on the system fields were investigated. It is also shown how the behavior of the thermal nanobeam changes depending on the phase delay factors in addition to the horizontal velocity of the beam. To determine this model’s accuracy, its results were compared with the results of the classical continuity model and thermoelastic concepts. The numerical results show that when the nanobeam moves, the length scale can change the studied thermal and mechanical vibration wave patterns and physical fields. Additionally, during thermally stimulated vibrations, thermodynamic effects that have implications for the dynamic design and performance improvement of nanostructures must be considered.
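For orientation, the dual-phase-lag conduction law referred to above can be written in its standard first-order Tzou form as follows; this is the generic constitutive relation only, and the paper's actual formulation additionally carries nonlocal elasticity and couple-stress terms:

```latex
% first-order dual-phase-lag (DPL) heat conduction law
\mathbf{q} + \tau_q \frac{\partial \mathbf{q}}{\partial t}
  = -k \left( \nabla T + \tau_\theta \, \frac{\partial (\nabla T)}{\partial t} \right)
```

where \(\mathbf{q}\) is the heat flux, \(T\) the temperature, \(k\) the thermal conductivity, and \(\tau_q\), \(\tau_\theta\) the phase lags of the heat flux and the temperature gradient, respectively.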

15 pages, 7565 KiB  
Article
High-Enthalpy Geothermal Simulation with Continuous Localization in Physics
by Yang Wang and Denis Voskov
Mathematics 2022, 10(22), 4328; https://doi.org/10.3390/math10224328 - 18 Nov 2022
Cited by 1 | Viewed by 1050
Abstract
Simulation of heat production in high-enthalpy geothermal systems is associated with a complex physical process in which cold water invades steam-saturated control volumes. A fully implicit, fully coupled numerical strategy is commonly adopted to solve the governing system composed of mass and energy conservation equations. A conventional nonlinear solver is challenged by the strong nonlinearity present during phase transition, caused by the huge contrast in thermodynamic properties between hot steam and cold water. During the solution process, steam condensation reduces the fluid volume and hence the pressure in the control volume. This generates multiple local minima in the physical parameter space, which means the Newton initial guess must be carefully selected to guarantee nonlinear convergence. Otherwise, the Newton iteration approaches a local minimum, the solution based on Newton's update cannot converge, and the timestep must be repeated at a smaller size. This drives the simulation into stalling behavior, in which the nonlinear solver wastes many computations and advances only at very small timesteps. To tackle this problem, we formulated a continuous localization of Newton's method based on an Operator-Based Linearization (OBL) approach. In OBL, the physical space is parameterized in terms of operators with supporting points at different levels of resolution. During the simulation, the operator values at the supporting points are obtained from the reference physics and the remaining part of the space is interpolated. In this way, the nonlinear physical parameter space can be characterized flexibly with different degrees of accuracy. In our proposed approach, the nonlinear Newton iterations are performed in parameter space at resolutions from coarse to fine. Specifically, the Newton solution at a coarser resolution is taken as the initial guess at the finer one. A coarser parameter space represents more linear physics, under which the nonlinear solver quickly converges to a localized solution near the true solution. As the physics is refined, the Newton iteration approaches the true solution and stalling behavior is avoided. Therefore, a larger timestep can be used in the simulation compared with conventional nonlinear solvers.
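The coarse-to-fine continuation idea can be sketched in one dimension (a toy illustration only: the nonlinear "operator" f, the parameter range, and the resolution levels below are invented, and the real method works on multidimensional parameter spaces of mass/energy operators):

```python
# Hypothetical 1-D sketch of coarse-to-fine continuation: Newton runs on
# piecewise-linear interpolants of a nonlinear "operator" f, and the root found
# at a coarse resolution seeds the solve at the next finer one. f is invented.
import numpy as np

def f(x):
    # stand-in for the reference physics evaluated at supporting points
    return np.tanh(4.0 * (x - 0.6)) + 0.3 * x - 0.1

def newton_on_interpolant(nodes, x0, tol=1e-10, max_iter=50):
    vals = f(nodes)                     # operator values at the supporting points
    h = nodes[1] - nodes[0]
    x = x0
    for _ in range(max_iter):
        fx = np.interp(x, nodes, vals)  # interpolated operator value
        dfx = (np.interp(x + h, nodes, vals) - np.interp(x - h, nodes, vals)) / (2.0 * h)
        step = fx / dfx
        x = min(max(x - step, nodes[0]), nodes[-1])  # stay inside the parameter space
        if abs(step) < tol:
            break
    return x

x = 0.0                                 # initial guess on the coarsest level
for n_points in (5, 17, 65, 257):       # refine the parameterization of physics
    nodes = np.linspace(-1.0, 2.0, n_points)
    x = newton_on_interpolant(nodes, x) # coarse solution seeds the finer level
```

On the coarsest level the interpolant smooths out the steep nonlinearity, so Newton converges from a poor initial guess; each finer level then only has to correct a nearby solution.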

Review


44 pages, 9485 KiB  
Review
Progress and Challenges of Integrated Machine Learning and Traditional Numerical Algorithms: Taking Reservoir Numerical Simulation as an Example
by Xu Chen, Kai Zhang, Zhenning Ji, Xiaoli Shen, Piyang Liu, Liming Zhang, Jian Wang and Jun Yao
Mathematics 2023, 11(21), 4418; https://doi.org/10.3390/math11214418 - 25 Oct 2023
Viewed by 1626
Abstract
Machine learning techniques have garnered significant attention in various engineering disciplines due to their potential and benefits. Specifically, in reservoir numerical simulations, the core process revolves around solving the partial differential equations delineating oil, gas, and water flow dynamics in porous media. Discretizing these partial differential equations via numerical methods is one cornerstone of this simulation process. The synergy between traditional numerical methods and machine learning can enhance the precision of partial differential equation discretization. Moreover, machine learning algorithms can be employed to solve partial differential equations directly, yielding rapid convergence, heightened computational efficiency, and accuracies surpassing 95%. This manuscript offers an overview of the predominant numerical methods in reservoir simulations, focusing on integrating machine learning methodologies. The innovations in fusing deep learning techniques to solve reservoir partial differential equations are illuminated, coupled with a concise discussion of their inherent advantages and constraints. As machine learning continues to evolve, its conjunction with numerical methods is poised to be pivotal in addressing complex reservoir engineering challenges.
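A minimal concrete instance of the discretization step the review surveys is sketched below (a toy only: the geometry, rock, and fluid values are invented, and real simulators handle multiphase flow, wells, and complex grids): backward-Euler finite differences for 1-D single-phase pressure diffusion.

```python
# Hypothetical sketch: implicit finite-difference discretization of the 1-D
# single-phase pressure-diffusion equation  phi*ct*dp/dt = (k/mu)*d2p/dx2,
# the simplest member of the PDE family the review surveys. Values invented.
import numpy as np

nx, L = 50, 100.0               # grid blocks, reservoir length [m]
dx = L / nx
phi, ct = 0.2, 1e-9             # porosity [-], total compressibility [1/Pa]
k_perm, mu = 1e-13, 1e-3        # permeability [m^2], viscosity [Pa s]
eta = k_perm / (mu * phi * ct)  # hydraulic diffusivity [m^2/s]

dt = 3600.0                     # timestep [s]; unconditionally stable implicitly
lam = eta * dt / dx**2

# backward Euler: (I - lam*D2) p^{n+1} = p^n, with Dirichlet pressures at both ends
A = np.zeros((nx, nx))
for i in range(nx):
    A[i, i] = 1.0 + 2.0 * lam
    if i > 0:
        A[i, i - 1] = -lam
    if i < nx - 1:
        A[i, i + 1] = -lam

p = np.full(nx, 1.5e7)          # initial reservoir pressure [Pa]
p_left, p_right = 1e7, 2e7      # boundary pressures [Pa]
for _ in range(100):
    rhs = p.copy()
    rhs[0] += lam * p_left      # fold boundary values into the right-hand side
    rhs[-1] += lam * p_right
    p = np.linalg.solve(A, rhs)
```

After enough timesteps the pressure relaxes to the expected linear steady-state profile between the two boundary values; the ML approaches the review discusses either accelerate this linear solve or replace the discretization entirely.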
