Journal Description
Computation is a peer-reviewed journal of computational science and engineering published monthly online by MDPI.
- Open Access: free for readers, with article processing charges (APC) paid by authors or their institutions.
- High Visibility: indexed within Scopus, ESCI (Web of Science), CAPlus / SciFinder, Inspec, dblp, and other databases.
- Journal Rank: CiteScore - Q2 (Applied Mathematics)
- Rapid Publication: manuscripts are peer-reviewed and a first decision is provided to authors approximately 16.2 days after submission; acceptance to publication takes 4.8 days (median values for papers published in this journal in the second half of 2022).
- Recognition of Reviewers: reviewers who provide timely, thorough peer-review reports receive vouchers entitling them to a discount on the APC of their next publication in any MDPI journal, in appreciation of the work done.
Latest Articles
Mathematical Modeling of Multi-Phase Filtration in a Deformable Porous Medium
Computation 2023, 11(6), 112; https://doi.org/10.3390/computation11060112 - 08 Jun 2023
Abstract
In this paper, a mathematical model of multiphase filtration in a deformable porous medium is presented. Based on the proposed model, the influence of the deformation of a porous medium on the filtration processes is studied. Numerical calculations are performed and the characteristics of the process are determined. This paper shows that an increase in the compressibility coefficient leads to a sharp decrease in porosity, absolute permeability and internal pressure of the medium near the well, and a decrease in the distance between wells leads to a sharp decrease in hydrodynamic parameters in the inter-well zone.
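As a toy illustration of how a compressibility coefficient couples porosity to pressure drawdown, one can sketch an exponential porosity-pressure law; the law, constants, and names below are illustrative assumptions, not the paper's model:

```python
import math

def porosity(p, phi0=0.25, p0=30.0e6, c_phi=1e-8):
    """Illustrative exponential porosity-pressure law for a deformable
    porous medium: phi = phi0 * exp(-c_phi * (p0 - p)).
    Porosity drops as pore pressure p falls below the initial pressure p0."""
    return phi0 * math.exp(-c_phi * (p0 - p))

# Near a producing well the pressure is drawn down, so porosity decreases;
# a larger compressibility coefficient c_phi makes the decrease sharper,
# qualitatively matching the behavior reported in the abstract.
drawdown = porosity(20.0e6, c_phi=1e-8)
sharper = porosity(20.0e6, c_phi=5e-8)
```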
Full article
(This article belongs to the Special Issue Computational Techniques for Fluid Dynamics Problems)
Open Access Feature Paper Article
Determination of Characteristics of Associative Storage Devices in Radio Telemetry Systems with Data Compression
Computation 2023, 11(6), 111; https://doi.org/10.3390/computation11060111 - 06 Jun 2023
Abstract
In spacecraft radio telemetry systems, various data compression methods are used for data processing. With any compression method, the compressed data are generated at random times, whereas transmission over radio communication channels should be carried out evenly over time. This makes special buffer storage devices necessary. In addition, existing spacecraft radio telemetry systems require compressed data streams to be grouped by certain characteristics, so the compressed data must be sorted into streams. It is therefore advisable to use associative buffer storage devices in such systems. This article is devoted to the analysis of the output streams of compressed data generated at the output of an associative storage device (ASD). Since the output stream of compressed data is random, queueing theory and probability theory are used for the analysis, with the associative memory represented as a queueing system: writing to and reading from the ASD can be interpreted as servicing orders. The purpose of the analysis is to determine the characteristics of the ASD: the queue length M{N}, the deviation of the queue length D{N}, and the probability of a given volume of compressed data in the ASD (including the probabilities of emptying and of memory overflow). The results are of great practical importance, since they can be used to select the memory size of an ASD when designing compression devices for spacecraft telemetry systems.
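The quantities M{N}, D{N}, and the emptying/overflow probabilities can be illustrated with the simplest queueing model; the sketch below assumes the ASD behaves like an M/M/1 queue, which is only a stand-in for the model analyzed in the paper:

```python
def mm1_stats(lam, mu, capacity):
    """Steady-state statistics of an M/M/1 queue with utilisation
    rho = lam/mu, used here as a stand-in for an associative storage
    device (ASD) buffering a random stream of compressed telemetry data."""
    rho = lam / mu
    assert rho < 1, "queue must be stable"
    mean_len = rho / (1 - rho)          # M{N}: expected queue length
    var_len = rho / (1 - rho) ** 2      # D{N}: variance of queue length
    p_empty = 1 - rho                   # probability the ASD is empty
    p_overflow = rho ** (capacity + 1)  # P(N > capacity): memory overflow
    return mean_len, var_len, p_empty, p_overflow

m, v, pe, po = mm1_stats(lam=8.0, mu=10.0, capacity=20)
```

Sizing the ASD memory then amounts to choosing the smallest capacity whose overflow probability is acceptably low.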
Full article
(This article belongs to the Section Computational Engineering)
Open Access Article
Buckling Analysis of Laminated Stiffened Plates with Material Anisotropy Using the Rayleigh–Ritz Approach
Computation 2023, 11(6), 110; https://doi.org/10.3390/computation11060110 - 30 May 2023
Abstract
An energy-based solution for calculating the buckling loads of partially anisotropic stiffened plates is presented, such as antisymmetric cross-ply and angle-ply laminations. A discrete approach, for the mathematical modelling and formulations of the stiffened plates, is followed. The developed formulations extend the Rayleigh–Ritz method and explore the available anisotropic unstiffened plate buckling solutions to the interesting cases of stiffened plates with some degree of material anisotropy. The examined cases consider simply supported unstiffened and stiffened plates under uniform and linearly varying compressive loading. Additionally, a reference finite element (FE) model is developed to compare the calculated buckling loads and validate the modelling approach for its accuracy. The results of the developed method are also compared with the respective experimental results for the cases where they were available in the literature. Finally, an extended discussion regarding the assumptions and restrictions of the applied Rayleigh–Ritz method is made, so that the limitations of the developed method are identified and documented.
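The energy quotient at the heart of the Rayleigh–Ritz method can be sketched for the simplest case: a one-term approximation for a simply supported column, a scalar toy version of the multi-term stiffened-plate formulation (all names below are illustrative):

```python
import math

def ritz_buckling_load(EI=1.0, L=1.0, n=2000):
    """One-term Rayleigh-Ritz estimate of the buckling load of a simply
    supported column using the admissible shape w(x) = sin(pi x / L):
        N_cr = integral(EI * w''^2 dx) / integral(w'^2 dx)
    evaluated by midpoint quadrature."""
    dx = L / n
    num = den = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        w1 = (math.pi / L) * math.cos(math.pi * x / L)        # w'
        w2 = -(math.pi / L) ** 2 * math.sin(math.pi * x / L)  # w''
        num += EI * w2 * w2 * dx
        den += w1 * w1 * dx
    return num / den

N_cr = ritz_buckling_load()
# For this shape the quotient reproduces the exact Euler load pi^2 EI / L^2.
```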
Full article
(This article belongs to the Special Issue Finite Element Methods with Applications in Civil and Mechanical Engineering)
Open Access Article
Calculation of Linear Buckling Load for Frames Modeled with One-Finite-Element Beams and Columns
Computation 2023, 11(6), 109; https://doi.org/10.3390/computation11060109 - 30 May 2023
Abstract
Critical linear buckling load calculation is one of the possible ways to check structural stability. Structural analysis programs usually model beams and columns with just one element, but this is not enough to obtain an accurate value of the critical buckling load when the buckling mode is associated with an effective length that is less than twice the element length. This paper presents a method for the accurate calculation of the buckling load of frames modeled with only one finite element per structural element. For this purpose, a local correction is applied to some elements a few times until convergence is achieved. The validity of the presented method is confirmed by several examples ranging from simple canonical cases to large structures.
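The underlying linear buckling problem is a generalized eigenproblem between the elastic and geometric stiffness matrices. A minimal sketch for a cantilever column modeled with a single beam element (standard textbook matrices, not the paper's corrected formulation) reproduces exactly the one-element error that such a correction targets:

```python
import numpy as np

# Linear buckling of a cantilever column modelled with ONE beam finite
# element: solve the generalized eigenproblem  K v = lambda * Kg v,
# where K is the elastic and Kg the geometric stiffness matrix.
EI, L = 1.0, 1.0

# 4x4 element matrices reduced to the free DOFs (tip deflection, tip
# rotation) after clamping the base.
K = EI / L**3 * np.array([[12.0, -6.0 * L],
                          [-6.0 * L, 4.0 * L**2]])
Kg = 1.0 / (30.0 * L) * np.array([[36.0, -3.0 * L],
                                  [-3.0 * L, 4.0 * L**2]])

lam = np.linalg.eigvals(np.linalg.solve(Kg, K))
P_cr = min(lam.real)
# One element gives P_cr ≈ 2.486 EI/L^2 versus the exact Euler value
# pi^2/4 ≈ 2.467 EI/L^2, illustrating why one-element-per-member models
# need refinement or correction when effective lengths are short.
```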
Full article
(This article belongs to the Special Issue Finite Element Methods with Applications in Civil and Mechanical Engineering)
Open Access Article
Application of DFT and TD-DFT on Langmuir Adsorption of Nitrogen and Sulfur Heterocycle Dopants on an Aluminum Surface Decorated with Magnesium and Silicon
Computation 2023, 11(6), 108; https://doi.org/10.3390/computation11060108 - 29 May 2023
Abstract
In this study, we investigated the abilities of the nitrogen and sulfur heterocyclic carbenes of benzotriazole, 2-mercaptobenzothiazole, 8-hydroxyquinoline, and 3-amino-1,2,4-triazole-5-thiol to adsorb on an Al-Mg-Si alloy for corrosion inhibition of the surface. The Al-Si(14), Al-Si(19), and Al-Si(21) sites in the Al-Mg-Si alloy surface, showing the highest fluctuation in the NMR shielding tensors generated by intra-atomic interaction, pointed us to the strongest influence on neighboring atoms arising from the interatomic N → Al, O → Al, and S → Al interactions during the coating and adsorption process of Langmuir adsorption. The values of the various thermodynamic properties and dipole moments of benzotriazole, 2-mercaptobenzothiazole, 8-hydroxyquinoline, and 3-amino-1,2,4-triazole-5-thiol adsorbed on the Al-Mg-Si surface increased with the molecular weight of these compounds, as well as with the charge distribution between the organic compounds (electron donors) and the alloy surface (electron acceptor). Finally, this research can build up our knowledge of the electronic structure, relative stability, and surface bonding of various metal alloy surfaces, metal-doped alloy nanosheets, and other dependent mechanisms such as heterogeneous catalysis, friction lubrication, and biological systems.
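The Langmuir adsorption model referenced above reduces to a simple isotherm for fractional surface coverage; a minimal sketch (the constants are illustrative, not fitted to this alloy system):

```python
def langmuir_coverage(K, c):
    """Langmuir adsorption isotherm: fractional surface coverage
    theta = K*c / (1 + K*c) for adsorption equilibrium constant K
    and adsorbate concentration (or pressure) c."""
    return K * c / (1.0 + K * c)

# Coverage saturates toward 1 (a full monolayer) as concentration grows.
low = langmuir_coverage(K=2.0, c=0.1)    # dilute inhibitor
high = langmuir_coverage(K=2.0, c=50.0)  # near-saturated surface
```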
Full article
(This article belongs to the Special Issue Density Functional Theory: Theory, Methods, Applications, and Recent Advances)
Open Access Article
Application of Graph Theory and Automata Modeling for the Study of the Evolution of Metabolic Pathways with Glycolysis and Krebs Cycle as Case Studies
Computation 2023, 11(6), 107; https://doi.org/10.3390/computation11060107 - 28 May 2023
Abstract
Today, graph theory represents one of the most important modeling techniques in biology. One of the most important applications is in the study of metabolic networks. During metabolism, a set of sequential biochemical reactions takes place, which convert one or more molecules into one or more final products. In a biochemical reaction, the transformation of one metabolite into the next requires a class of proteins called enzymes that are responsible for catalyzing the reaction. Whether by applying differential equations or automata theory, it is not easy to explain how the evolution of metabolic networks could have taken place within living organisms. Obviously, in the past, the assembly of biochemical reactions into a metabolic network depended on the independent evolution of the enzymes involved in the isolated biochemical reactions. In this work, a simulation model is presented where enzymes are modeled as automata, and their evolution is simulated with a genetic algorithm. This protocol is applied to the evolution of glycolysis and the Krebs cycle, two of the most important metabolic networks for the survival of organisms. The results obtained show how Darwinian evolution is able to optimize a biological network, such as in the case of glycolysis and Krebs metabolic networks.
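A genetic algorithm of the kind used to evolve the enzyme automata can be sketched in a few lines; the bitstring encoding and fitness below are generic stand-ins, not the paper's protocol:

```python
import random

def evolve(fitness, n_bits=16, pop_size=40, generations=120,
           p_mut=0.02, seed=1):
    """Minimal genetic algorithm: tournament selection, one-point
    crossover, bit-flip mutation. A stand-in for the enzyme-automaton
    evolution described above (the paper's encoding is more elaborate)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)          # binary tournament
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(fitness=sum)  # maximise the number of 1-bits
```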
Full article
(This article belongs to the Special Issue 10th Anniversary of Computation—Computational Biology)
Open Access Article
Addressing the Folding of Intermolecular Springs in Particle Simulations: Fixed Image Convention
Computation 2023, 11(6), 106; https://doi.org/10.3390/computation11060106 - 26 May 2023
Abstract
Mesoscopic simulations of long polymer chains and soft matter systems are conducted routinely in the literature in order to assess the long-lived relaxation processes manifested in these systems. Coarse-grained chains are, however, prone to unphysical intercrossing due to their inherent softness. This issue can be resolved by introducing long intermolecular bonds (the so-called slip-springs) which restore these topological constraints. The separation vector of intermolecular bonds can be determined by enforcing the commonly adopted minimum image convention (MIC). Because these bonds are soft and long (ca 3–20 nm), subjecting the samples to extreme deformations can lead to topology violations when enforcing the MIC. We propose the fixed image convention (FIC) for determining the separation vectors of overextended bonds, which is more stable than the MIC and applicable to extreme deformations. The FIC is simple to implement and, in general, more efficient than the MIC. Side-by-side comparisons between the MIC and FIC demonstrate that, when using the FIC, the topology remains intact even in situations with extreme particle displacement and nonaffine deformation. The accuracy of these conventions is the same when applying affine deformation. The article is accompanied by the corresponding code for implementing the FIC.
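The minimum image convention (MIC) that the paper contrasts with its fixed image convention can be sketched for an orthorhombic box:

```python
import numpy as np

def minimum_image(dr, box):
    """Minimum image convention: wrap a separation vector dr into the
    primary periodic cell of an orthorhombic box, so each component lies
    within [-box/2, box/2]. The paper's fixed image convention instead
    keeps the image chosen when the bond is created, avoiding the
    discontinuous jumps the MIC produces for bonds that grow longer than
    half the box under extreme deformation."""
    return dr - box * np.round(dr / box)

box = np.array([10.0, 10.0, 10.0])
dr = np.array([9.0, -6.0, 0.5])   # raw separation of two bonded beads
wrapped = minimum_image(dr, box)
```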
Full article
(This article belongs to the Special Issue 10th Anniversary of Computation - Computational Chemistry)
Open Access Article
CrohnDB: A Web Database for Expression Profiling of Protein-Coding and Long Non-Coding RNA Genes in Crohn Disease
Computation 2023, 11(6), 105; https://doi.org/10.3390/computation11060105 - 24 May 2023
Abstract
Crohn disease (CD) is a type of inflammatory bowel disease that causes inflammation in the digestive tract. Cases of CD are increasing worldwide, calling for more research to elucidate the pathogenesis of CD. For this purpose, the usage of the RNA-sequencing (RNA-seq) technique is increasingly appreciated, as it captures RNA expression patterns at a particular time point in a high-throughput manner. Although many RNA-seq datasets are generated from CD patients and compared to those of healthy donors, most of these datasets are analyzed only for protein-coding genes, leaving non-coding RNAs (ncRNAs) undiscovered. Long non-coding RNAs (lncRNAs) are any ncRNAs that are longer than 200 nucleotides. Interest in studying lncRNAs is increasing rapidly, as lncRNAs bind other macromolecules (DNA, RNA, and/or proteins) to finetune signaling pathways. To fill the gap in knowledge about lncRNAs in CD, we performed secondary analysis of published RNA-seq data of CD patients compared to healthy donors to identify lncRNA genes and their expression changes. To further facilitate lncRNA research in CD, we built a web database, CrohnDB, to provide a one-stop-shop for expression profiling of protein-coding and lncRNA genes in CD patients compared to healthy donors.
Full article
Open Access Article
Explainable Ensemble-Based Machine Learning Models for Detecting the Presence of Cirrhosis in Hepatitis C Patients
Computation 2023, 11(6), 104; https://doi.org/10.3390/computation11060104 - 23 May 2023
Abstract
Hepatitis C is a liver infection caused by a virus, which results in mild to severe inflammation of the liver. Over many years, hepatitis C gradually damages the liver, often leading to permanent scarring, known as cirrhosis. Patients sometimes have moderate or no symptoms of liver illness for decades before developing cirrhosis. Cirrhosis typically worsens to the point of liver failure. Patients with cirrhosis may also experience brain and nerve system damage, as well as gastrointestinal hemorrhage. Treatment for cirrhosis focuses on preventing further progression of the disease. Detecting cirrhosis earlier is therefore crucial for avoiding complications. Machine learning (ML) has been shown to be effective at providing precise and accurate information for use in diagnosing several diseases. Despite this, no studies have so far used ML to detect cirrhosis in patients with hepatitis C. This study obtained a dataset consisting of 28 attributes of 2038 Egyptian patients from the ML Repository of the University of California at Irvine. Four ML algorithms were trained on the dataset to diagnose cirrhosis in hepatitis C patients: a Random Forest, a Gradient Boosting Machine, an Extreme Gradient Boosting model, and an Extra Trees model. The Extra Trees model outperformed the other models, achieving an accuracy of 96.92%, a recall of 94.00%, a precision of 99.81%, and an area under the receiver operating characteristic curve of 96% using only 16 of the 28 features.
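The evaluation metrics quoted above follow directly from confusion-matrix counts; a minimal sketch (the counts are illustrative, not the study's confusion matrix):

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1-score from confusion-matrix
    counts: the metrics used to compare the ensemble models above."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Illustrative counts only (not taken from the paper).
acc, prec, rec, f1 = classification_metrics(tp=94, fp=1, fn=6, tn=99)
```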
Full article
(This article belongs to the Special Issue Interpretable Statistical Machine for Decision Making: Modeling, Learning and Application Perspectives)
Open Access Article
Opinion Formation on Social Networks—The Effects of Recurrent and Circular Influence
Computation 2023, 11(5), 103; https://doi.org/10.3390/computation11050103 - 22 May 2023
Abstract
We present a generalised complex contagion model for describing behaviour and opinion spreading on social networks. Recurrent interactions between adjacent nodes and circular influence in loops in the network structure enable the modelling of influence spreading on the network scale. We have presented details of the model in our earlier studies. Here, we focus on the interpretation of the model and discuss its features by using conventional concepts in the literature. In addition, we discuss how the model can be extended to account for specific social phenomena in social networks. We demonstrate the differences between the results of our model and a simple contagion model. Results are provided for a small social network and a larger collaboration network. As an application of the model, we present a method for profiling individuals based on their out-centrality, in-centrality, and betweenness values in the social network structure. These measures have been defined consistently with our spreading model based on an influence spreading matrix. The influence spreading matrix captures the directed spreading probabilities between all node pairs in the network structure. Our results show that recurrent and circular influence has considerable effects on node centrality values and spreading probabilities in the network structure.
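Given an influence spreading matrix, the out- and in-centrality measures described above can be read off as row and column sums; the matrix below is a hypothetical example, not data from the paper:

```python
import numpy as np

# Sketch: P[i, j] is the probability that influence starting at node i
# reaches node j (a hypothetical influence spreading matrix).
P = np.array([[0.0, 0.8, 0.4],
              [0.1, 0.0, 0.6],
              [0.1, 0.2, 0.0]])

out_centrality = P.sum(axis=1)  # how widely each node spreads influence
in_centrality = P.sum(axis=0)   # how strongly each node is influenced
# Node 0 is a strong spreader here, while nodes 1 and 2 are mainly
# receivers: the kind of profile the paper uses to classify individuals.
```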
Full article
(This article belongs to the Special Issue Computational Social Science and Complex Systems)
Open Access Article
The Behavior of Hybrid Reinforced Concrete-Steel Buildings under Sequential Ground Excitations
Computation 2023, 11(5), 102; https://doi.org/10.3390/computation11050102 - 18 May 2023
Abstract
In common construction practice, various examples can be found involving a building type consisting of a lower, older, reinforced concrete structure and a more recent upper steel part, forming a so-called “hybrid” building. Conventional seismic design rules give full guidelines for the earthquake design of buildings constructed with the same material throughout. The current seismic codes neglect to provide specific design and detailing guidelines for vertical hybrid buildings and limited existing research is available in the literature, thus leaving a scientific gap that needs to be investigated. In the present work, an effort is made to fill this gap in the knowledge about the behavior of this hybrid building type in sequential earthquakes, which are found in the literature to burden the seismic structural response. Three-dimensional models of hybrid reinforced concrete–steel frames are exposed to sequential ground excitations in horizontal and vertical directions while considering the elastoplastic behavior of these structural elements in the time domain. The lower reinforced concrete parts of the hybrid buildings are detailed here as corresponding to a former structure by a simple approximation. In addition, two boundary connections of the structural steel part upon the r/c part are distinguished for examination in the elastoplastic analyses. Comparisons of the arithmetical analysis results of the hybrid frames for the examined connections are carried out. The seismic response plots of the current non-linear dynamic time-domain analyses of the 3D hybrid frames subjected to sequential ground excitations yield useful conclusions to provide guidelines for a safer seismic design of the hybrid building type, which is not covered by the current codes despite being a common practice.
Full article
(This article belongs to the Special Issue Finite Element Methods with Applications in Civil and Mechanical Engineering)
Open Access Article
Social Learning and the Exploration-Exploitation Tradeoff
Computation 2023, 11(5), 101; https://doi.org/10.3390/computation11050101 - 18 May 2023
Abstract
Cultures around the world show varying levels of conservatism. While maintaining traditional ideas prevents wrong ones from being embraced, it also slows or prevents adaptation to new times. Without exploration there can be no improvement, but this effort is often wasted when it fails to produce better results, making it better to exploit the best known option. This tension is known as the exploration/exploitation issue, and it occurs at the individual and group levels whenever decisions are made. As such, it has been investigated across many disciplines. We extend previous work by approximating a continuum of traits under local exploration, employing the method of adaptive dynamics, and studying multiple fitness functions. In this work, we ask how nature would solve the exploration/exploitation issue by allowing natural selection to operate on an exploration parameter in a variety of contexts, thinking of exploration as mutation in a trait space with a varying fitness function. Specifically, we study how exploration rates evolve by applying adaptive dynamics to the replicator-mutator equation under two types of fitness functions. For the first, payoffs are accrued from playing a two-player, two-action symmetric game; we consider representatives of all games in this class, including the Prisoner's Dilemma, Hawk-Dove, and Stag Hunt games, finding that exploration rates often evolve downwards but can also undergo neutral selection, depending on the game's parameters or initial conditions. Second, we study time-dependent fitness with a function having a single oscillating peak. By increasing the period, we see a jump in the optimal exploration rate, which then decreases towards zero as the frequency of environmental change increases. These results establish several possible evolutionary scenarios for exploration rates, providing insight into many applications, including why we see such diversity in rates of cultural change.
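The replicator-mutator dynamics underlying the first set of results can be sketched as a discrete-time update; the payoff matrix, mutation rate, and step size below are illustrative choices, not the paper's parameterisation:

```python
import numpy as np

def replicator_mutator_step(x, A, Q, dt=0.1):
    """One Euler step of the replicator-mutator equation
        dx_i/dt = sum_j x_j f_j Q_ji - phi * x_i,
    with fitness f = A @ x and mean fitness phi = x . f.
    Q[j, i] is the probability that type j reproduces as type i
    (off-diagonal mass plays the role of the exploration/mutation rate)."""
    f = A @ x
    phi = x @ f
    dx = (x * f) @ Q - phi * x
    return x + dt * dx

# Prisoner's Dilemma payoffs (rows: cooperate, defect) with a small
# mutation rate between the two strategies.
A = np.array([[3.0, 0.0],
              [5.0, 1.0]])
mu = 0.01
Q = np.array([[1 - mu, mu],
              [mu, 1 - mu]])
x = np.array([0.5, 0.5])
for _ in range(500):
    x = replicator_mutator_step(x, A, Q)
# Defectors dominate, but mutation maintains a small residue of
# cooperators (a mutation-selection balance).
```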
Full article
(This article belongs to the Special Issue Computational Social Science and Complex Systems)
Open Access Feature Paper Article
A Multiple Response Prediction Model for Dissimilar AA-5083 and AA-6061 Friction Stir Welding Using a Combination of AMIS and Machine Learning
Computation 2023, 11(5), 100; https://doi.org/10.3390/computation11050100 - 15 May 2023
Abstract
This study presents a methodology that combines artificial multiple intelligence systems (AMISs) and machine learning to forecast the ultimate tensile strength (UTS), maximum hardness (MH), and heat input (HI) of AA-5083 and AA-6061 friction stir welding. The machine learning model integrates two machine learning methods, Gaussian process regression (GPR) and a support vector machine (SVM), into a single model, and then uses the AMIS as the decision fusion strategy to merge SVM and GPR. The generated model was utilized to anticipate three objectives based on seven controlled/input parameters. These parameters were: tool tilt angle, rotating speed, travel speed, shoulder diameter, pin geometry, type of reinforcing particles, and tool pin movement mechanism. The effectiveness of the model was evaluated using a two-experiment framework. In the first experiment, we used two newly produced datasets, (1) the 7PI-V1 dataset and (2) the 7PI-V2 dataset, and compared the results with state-of-the-art approaches. The second experiment used existing datasets from the literature with varying base materials and parameters. The computational results revealed that the proposed method produced more accurate prediction results than the previous methods. For all datasets, the proposed strategy outperformed existing methods and state-of-the-art processes by an average of 1.35% to 6.78%.
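Decision fusion of the two base learners can be sketched, in its simplest form, as a convex blend of their predictions; the fixed weight and numbers below are hypothetical stand-ins for the learned AMIS fusion strategy:

```python
def fuse_predictions(pred_gpr, pred_svm, w=0.5):
    """Toy decision-fusion step: blend a GPR prediction and an SVM
    prediction with a convex weight w. The paper's AMIS learns the
    fusion strategy; a fixed weight is the simplest stand-in."""
    return [w * a + (1 - w) * b for a, b in zip(pred_gpr, pred_svm)]

# Hypothetical UTS predictions (MPa) from the two base models.
uts_gpr = [310.0, 295.0]
uts_svm = [300.0, 305.0]
fused = fuse_predictions(uts_gpr, uts_svm, w=0.6)
```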
Full article
(This article belongs to the Section Computational Engineering)
Open Access Article
Development of Trading Strategies Using Time Series Based on Robust Interval Forecasts
Computation 2023, 11(5), 99; https://doi.org/10.3390/computation11050099 - 15 May 2023
Abstract
The task of time series forecasting is to estimate future values based on available observational data. Prediction interval methods aim to find not the next point, but the interval into which the future value, or several values on the forecast horizon, can fall given current and historical data. This article proposes an approach for modeling a robust interval forecast for a stock portfolio, and a trading strategy was developed to profit from trading stocks in the market. The study used real trading data for real stocks: the forty securities used to calculate the IMOEX index, of which the securities with the highest weight were GAZP, LKOH, and SBER. This definition of the strategy allows operating with large portfolios. The accuracy of the forecast was increased by estimating its interval: a range of values, rather than specific point estimates, was taken as the result of forecasting, which guarantees the reliability of the forecast. The use of a predictive interval approach for share prices makes it possible to increase their profitability.
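An interval forecast can be sketched by widening the last observation with an empirical quantile of past one-step changes; this naive construction and its parameters are illustrative assumptions, not the paper's robust method:

```python
import statistics

def interval_forecast(prices, alpha=0.10):
    """Naive interval forecast: centre the next-step interval on the last
    price and widen it by the (1 - alpha) empirical quantile of absolute
    one-step changes. A stand-in for the paper's interval construction."""
    changes = [abs(b - a) for a, b in zip(prices, prices[1:])]
    idx = round((1 - alpha) * 100) - 1
    width = statistics.quantiles(changes, n=100)[idx]
    last = prices[-1]
    return last - width, last + width

prices = [100.0, 101.5, 100.8, 102.2, 101.9, 103.0, 102.4, 104.1,
          103.8, 105.0, 104.2, 105.9]
lo, hi = interval_forecast(prices)
# A trading rule might then buy when the market price falls below `lo`
# and sell when it rises above `hi`.
```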
Full article
Open Access Article
Reconstruction of Meteorological Records by Methods Based on Dimension Reduction of the Predictor Dataset
Computation 2023, 11(5), 98; https://doi.org/10.3390/computation11050098 - 12 May 2023
Abstract
The reconstruction or prediction of meteorological records through the Analog Ensemble (AnEn) method is very efficient when the number of predictor time series is small. Thus, in order to take advantage of the richness and diversity of information contained in a large number of predictors, it is necessary to reduce their dimensions. This study presents methods to accomplish such reduction, allowing the use of a high number of predictor variables. In particular, the techniques of Principal Component Analysis (PCA) and Partial Least Squares (PLS) are used to reduce the dimension of the predictor dataset without loss of essential information. The combination of the AnEn and PLS techniques results in a very efficient hybrid method (PLSAnEn) for reconstructing or forecasting unstable meteorological variables, such as wind speed. This hybrid method is computationally demanding but its performance can be improved via parallelization or the introduction of variants in which all possible analogs are previously clustered. The multivariate linear regression methods used on the new variables resulting from the PCA or PLS techniques also proved to be efficient, especially for the prediction of meteorological variables without local oscillations, such as the pressure.
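The PCA step of such a pipeline can be sketched via the singular value decomposition; the synthetic data below simply illustrates nearly collinear predictors collapsing onto one component:

```python
import numpy as np

def pca_reduce(X, k):
    """Reduce an (n_samples, n_predictors) matrix to its first k
    principal components via SVD: the kind of dimension reduction applied
    to the predictor dataset before running AnEn."""
    Xc = X - X.mean(axis=0)         # centre each predictor
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T            # scores on the first k components

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))    # one shared signal
X = np.hstack([base + 0.01 * rng.normal(size=(200, 1)) for _ in range(5)])
Z = pca_reduce(X, k=1)
# Five nearly collinear predictors collapse to a single component that
# retains almost all of the variance.
```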
Full article
(This article belongs to the Special Issue Applications of Computational Mathematics to Simulation and Data Analysis II)
Open Access Article
A Flexible and General-Purpose Platform for Heterogeneous Computing
Computation 2023, 11(5), 97; https://doi.org/10.3390/computation11050097 - 11 May 2023
Abstract
In the big data era, processing large amounts of data imposes several challenges, mainly in terms of performance. Complex operations in data science, such as deep learning, large-scale simulations, and visualization applications, can consume a significant amount of computing time. Heterogeneous computing is an attractive alternative for algorithm acceleration, using not one but several different kinds of computing devices (CPUs, GPUs, or FPGAs) simultaneously. Accelerating an algorithm for a specific device under a specific framework, i.e., CUDA/GPU, provides a solution with the highest possible performance at the cost of a loss in generality and requires an experienced programmer. On the contrary, heterogeneous computing allows one to hide the details pertaining to the simultaneous use of different technologies in order to accelerate computation. However, effective heterogeneous computing implementation still requires mastering the underlying design flow. Aiming to fill this gap, in this paper we present a heterogeneous computing platform (HCP). Regarding its main features, this platform allows non-experts in heterogeneous computing to deploy, run, and evaluate high-computational-demand algorithms following a semi-automatic design flow. Given the implementation of an algorithm in C with minimal format requirements, the platform automatically generates the parallel code using a code analyzer, which is adapted to target a set of available computing devices. Thus, while an experienced heterogeneous computing programmer is not required, the process can run over the available computing devices on the platform as it is not an ad hoc solution for a specific computing device. The proposed HCP relies on the OpenCL specification for interoperability and generality. The platform was validated and evaluated in terms of generality and efficiency through a set of experiments using the algorithms of the Polybench/C suite (version 3.2) as the input. 
Different configurations for the platform were used, considering CPUs only, GPUs only, and a combination of both. The results revealed that the proposed HCP was able to achieve accelerations of up to 270× for specific classes of algorithms, i.e., parallel-friendly algorithms, while its use required almost no expertise in either OpenCL or heterogeneous computing from the programmer/end-user.
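The core idea of splitting one data-parallel computation across several devices of different speeds can be sketched in plain Python. The snippet below is a conceptual illustration only, not the paper's HCP: the device list, the throughput figures, and the `kernel` function are hypothetical, and threads stand in for real OpenCL command queues.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-device throughputs (work items per unit time); a real
# heterogeneous platform would discover devices via OpenCL, not hard-code them.
DEVICES = {"cpu": 1.0, "gpu": 8.0}

def partition(n_items, devices):
    """Split n_items into ranges proportional to each device's throughput."""
    total = sum(devices.values())
    sizes, start = {}, 0
    for name, speed in devices.items():
        size = round(n_items * speed / total)
        sizes[name] = (start, min(start + size, n_items))
        start += size
    # Give any rounding remainder to the last device.
    last = list(sizes)[-1]
    sizes[last] = (sizes[last][0], n_items)
    return sizes

def kernel(data, lo, hi):
    # Stand-in for a generated parallel kernel: square each element.
    return [x * x for x in data[lo:hi]]

def run_heterogeneous(data):
    """Run the kernel on all devices concurrently and merge the results."""
    parts = partition(len(data), DEVICES)
    with ThreadPoolExecutor(max_workers=len(parts)) as pool:
        futures = [pool.submit(kernel, data, lo, hi)
                   for lo, hi in parts.values()]
        out = []
        for f in futures:
            out.extend(f.result())
    return out
```

The proportional split mirrors the intuition behind heterogeneous scheduling: a device eight times faster receives roughly eight times the work, so all devices finish at about the same time.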
(This article belongs to the Section Computational Engineering)
Open Access Article
Diabetes Classification Using Machine Learning Techniques
Computation 2023, 11(5), 96; https://doi.org/10.3390/computation11050096 - 10 May 2023
Abstract
Machine learning techniques play an increasingly prominent role in medical diagnosis. With the use of these techniques, patients’ data can be analyzed to find patterns or facts that are difficult to explain, making diagnoses more reliable and convenient. The purpose of this research was to compare the efficiency of diabetic classification models using four machine learning techniques: decision trees, random forests, support vector machines, and K-nearest neighbors. In addition, new diabetic classification models are proposed that incorporate hyperparameter tuning and the addition of some interaction terms into the models. These models were evaluated based on accuracy, precision, recall, and the F1-score. The results of this study show that the proposed models with interaction terms have better classification performance than those without interaction terms for all four machine learning techniques. Among the proposed models with interaction terms, random forest classifiers had the best performance, with 97.5% accuracy, 97.4% precision, 96.6% recall, and a 97% F1-score. The findings from this study can be further developed into a program that can effectively screen potential diabetes patients.
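The reported scores are tied together by standard definitions. The sketch below computes accuracy, precision, recall, and F1 from binary confusion-matrix counts; the counts used in the example are hypothetical, not the paper's data.

```python
def metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)  # a.k.a. sensitivity
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1
```

As a consistency check, the reported precision (97.4%) and recall (96.6%) give F1 = 2 * 0.974 * 0.966 / (0.974 + 0.966) ≈ 0.970, matching the reported 97% F1-score.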
(This article belongs to the Special Issue Application of Machine learning and Operation Research in Healthcare Management)
Open Access Article
A Two-Step Machine Learning Method for Predicting the Formation Energy of Ternary Compounds
Computation 2023, 11(5), 95; https://doi.org/10.3390/computation11050095 - 09 May 2023
Abstract
Predicting the chemical stability of yet-to-be-discovered materials is an important aspect of the discovery and development of virtual materials. The conventional approach for computing the enthalpy of formation based on ab initio methods is time-consuming and computationally demanding. In this regard, alternative machine learning approaches are proposed to predict the formation energies of different classes of materials with decent accuracy. In this paper, one such machine learning approach, a novel two-step method that predicts the formation energy of ternary compounds, is presented. In the first step, with a classifier, we determine the accuracy of heuristically calculated formation energies in order to increase the size of the training dataset for the second step. The second step is a regression model that predicts the formation energy of the ternary compounds. The first step leads to at least a 100% increase in the size of the dataset with respect to the data available in the Materials Project database. The results from the regression model match those from the existing state-of-the-art prediction models. In addition, we propose a slightly modified version of the Adam optimizer, namely centered Adam, and report the results from testing the centered Adam optimizer.
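For reference, the baseline the authors modify is the standard Adam optimizer (Kingma and Ba). The sketch below is a minimal pure-Python Adam applied to a one-dimensional quadratic; the paper's "centered" modification is not specified in the abstract, so only the unmodified baseline is shown here.

```python
import math

def adam(grad, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=200):
    """Standard Adam update with bias correction; the paper's 'centered'
    variant modifies this baseline, but its exact form is not given in
    the abstract, so it is not reproduced."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g       # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g   # second-moment estimate
        m_hat = m / (1 - beta1 ** t)          # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Minimize f(x) = (x - 3)^2, so grad f = 2(x - 3); the optimum is x = 3.
x_star = adam(lambda x: 2 * (x - 3), x0=0.0)
```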
(This article belongs to the Section Computational Chemistry)
Open Access Article
An Algebraic Approach to the Solutions of the Open Shop Scheduling Problem
Computation 2023, 11(5), 94; https://doi.org/10.3390/computation11050094 - 08 May 2023
Abstract
The open shop scheduling problem (OSSP) is one of the standard scheduling problems. It consists of scheduling jobs associated with a finite set of tasks processed by different machines. In this case, each machine processes at most one operation at a time, and the job processing order on the machines does not matter. The goal is to determine the completion times of the operations processed on the machines so as to minimize the largest job completion time, called Cmax. This paper proves that each OSSP has an associated path algebra, called a Brauer configuration algebra, whose representation theory (particularly its dimension and the dimension of its center) can be given in terms of the corresponding Cmax value. It is also proved that the dimensions of the centers of Brauer configuration algebras associated with OSSPs with minimal Cmax are congruent modulo the number of machines.
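As a toy illustration of the OSSP objective, the sketch below builds one feasible schedule for a small instance with a longest-processing-time-first heuristic and returns its makespan Cmax. The instance, the heuristic, and all names are illustrative only; the paper's contribution is algebraic, not a scheduling algorithm.

```python
def cmax_greedy(p):
    """Greedy schedule for a tiny open shop instance.
    p[j][m] = processing time of job j on machine m; each machine runs one
    operation at a time and the job order on machines is free (OSSP).
    Returns the makespan Cmax of the schedule built."""
    n_jobs, n_mach = len(p), len(p[0])
    job_free = [0] * n_jobs    # time each job becomes available again
    mach_free = [0] * n_mach   # time each machine becomes available again
    ops = [(j, m) for j in range(n_jobs) for m in range(n_mach)]
    # Longest-processing-time-first: one simple (not optimal) ordering.
    ops.sort(key=lambda jm: -p[jm[0]][jm[1]])
    for j, m in ops:
        start = max(job_free[j], mach_free[m])
        job_free[j] = mach_free[m] = start + p[j][m]
    return max(job_free)
```

A standard lower bound on the optimal Cmax is the larger of the maximum job load and the maximum machine load (maximum row sum and column sum of `p`); for `[[1, 2], [2, 1]]` both bounds equal 3, which the greedy schedule attains.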
(This article belongs to the Section Computational Engineering)
Open Access Article
A Novel Finite Element Model for the Study of Harmful Vibrations on the Aging Spine
Computation 2023, 11(5), 93; https://doi.org/10.3390/computation11050093 - 05 May 2023
Abstract
The human spine is susceptible to a wide variety of adverse consequences from vibrations, including lower back discomfort. These effects are often seen in the drivers of vehicles, earth-moving equipment, and trucks, and also in those who drive for long hours in general. The human spine is composed of vertebrae, discs, and tissues that work together to provide it with a wide range of movements and the significant load-carrying capability needed for daily physical exercise. However, there is a limited understanding of vibration characteristics in different age groups and of the effect of vibration transmission in the spinal column, which may be harmful to its different sections. In this work, a novel finite element model (FEM) was developed to study the variation in vibration absorption capacity due to the aging of the different sections of the human spine. These variations were observed from the first three natural frequencies of the human spine structure, which were obtained by solving the eigenvalue problem of the novel finite element model for different ages. From the results, aging was observed to lead to an increase in the natural frequencies of all three spinal segments. As age increased beyond 30 years, the natural frequency increased significantly for the thoracic segment compared to the lumbar and cervical segments. A range of such novel findings indicated the harmful frequencies at which resonance may occur, causing spinal pain and possible injuries. This information would be indispensable for spinal surgeons in the prognosis of spinal column injury (SCI) patients affected by harmful vibrations in workplaces, as well as for manufacturers of automotive and aerospace equipment when designing effective dampers for better whole-body vibration mitigation.
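The natural frequencies come from the eigenvalue problem of the FE model. As a self-contained illustration of that step only, the sketch below solves the generalized eigenproblem K x = w^2 M x for a hypothetical 2-degree-of-freedom system; the spine model's matrices are far larger and are not given in the abstract.

```python
import math

def natural_frequencies_2dof(K, M):
    """Natural frequencies (Hz) of a 2-DOF system from K x = w^2 M x,
    solved via det(K - l*M) = 0, which is quadratic in l for 2x2 matrices.
    K and M are 2x2 nested lists (stiffness and mass); a toy stand-in for
    the much larger spine FEM eigenvalue problem."""
    a = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    b = -(K[0][0] * M[1][1] + K[1][1] * M[0][0]
          - K[0][1] * M[1][0] - K[1][0] * M[0][1])
    c = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    disc = math.sqrt(b * b - 4 * a * c)
    lams = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    # l = w^2, and frequency f = w / (2*pi).
    return [math.sqrt(l) / (2 * math.pi) for l in lams]

# Hypothetical stiffness/mass values: two unit masses, unit spring coupling.
freqs = natural_frequencies_2dof([[2, -1], [-1, 2]], [[1, 0], [0, 1]])
```

For this toy system the eigenvalues are 1 and 3, so the frequencies are 1/(2π) and √3/(2π) Hz; stiffening the structure (as aging does, per the abstract) raises these values.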
(This article belongs to the Special Issue Application of Finite Element Methods)

Topics
Topic in Entropy, Algorithms, Computation, Fractal Fract
Computational Complex Networks
Topic Editors: Alexandre G. Evsukoff, Yilun Shang. Deadline: 30 June 2023

Topic in Axioms, Computation, Dynamics, Mathematics, Symmetry
Structural Stability and Dynamics: Theory and Applications
Topic Editors: Harekrushna Behera, Chia-Cheng Tsai, Jen-Yi Chang. Deadline: 30 September 2023

Topic in Entropy, Algorithms, Computation, MAKE, Energies, Materials
Artificial Intelligence and Computational Methods: Modeling, Simulations and Optimization of Complex Systems
Topic Editors: Jaroslaw Krzywanski, Yunfei Gao, Marcin Sosnowski, Karolina Grabowska, Dorian Skrobek, Ghulam Moeen Uddin, Anna Kulakowska, Anna Zylka, Bachil El Fil. Deadline: 20 October 2023

Topic in Applied Sciences, BioMedInformatics, BioTech, Genes, Computation
Computational Intelligence and Bioinformatics (CIB)
Topic Editors: Marco Mesiti, Giorgio Valentini, Elena Casiraghi, Tiffany J. Callahan. Deadline: 31 October 2023

Special Issues
Special Issue in Computation
Intelligent Computing, Modeling and its Applications
Guest Editors: José Antonio Marmolejo Saucedo, Joshua Thomas, Leo Mrsic, Román Rodríguez Aguilar, Pandian Vasant. Deadline: 30 June 2023

Special Issue in Computation
Computation to Fight SARS-CoV-2 (CoVid-19)
Guest Editors: Simone Brogi, Vincenzo Calderone. Deadline: 31 July 2023

Special Issue in Computation
Applications of Statistics and Machine Learning in Electronics
Guest Editors: Stefan Hensel, Marin B. Marinov, Malinka Ivanova, Maya Dimitrova, Hiroaki Wagatsuma. Deadline: 31 August 2023

Special Issue in Computation
Solstice 2023—International Conference on Discrete Models of Complex Systems
Guest Editors: Franco Bagnoli, Anna T. Lawniczak. Deadline: 15 September 2023