Feature Papers in Computers 2023

A special issue of Computers (ISSN 2073-431X).

Deadline for manuscript submissions: closed (31 December 2023) | Viewed by 34236

Special Issue Editor


Guest Editor
Faculty of Applied Mathematics, Silesian University of Technology, 44-100 Gliwice, Poland
Interests: disease diagnostics using artificial intelligence methods

Special Issue Information

Dear Colleagues,

This Special Issue collects high-quality, open-access papers in computer science by Editorial Board Members, or by authors invited by the Editorial Office and the Editor-in-Chief.

Prof. Dr. Robertas Damaševičius
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, you can access the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Computers is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (9 papers)


Research


16 pages, 1372 KiB  
Article
Custom ASIC Design for SHA-256 Using Open-Source Tools
by Lucas Daudt Franck, Gabriel Augusto Ginja, João Paulo Carmo, José A. Afonso and Maximiliam Luppe
Computers 2024, 13(1), 9; https://doi.org/10.3390/computers13010009 - 25 Dec 2023
Viewed by 2341
Abstract
The growth of digital communications has driven the development of numerous cryptographic methods for secure data transfer and storage. The SHA-256 algorithm is a cryptographic hash function widely used for validating data authenticity, identity, and integrity. The inherent SHA-256 computational overhead has motivated the search for more efficient hardware solutions, such as application-specific integrated circuits (ASICs). This work presents a custom ASIC hardware accelerator for the SHA-256 algorithm entirely created using open-source electronic design automation tools. The integrated circuit was synthesized using SkyWater SKY130 130 nm process technology through the OpenLANE automated workflow. The proposed final design is compatible with 32-bit microcontrollers, has a total area of 104,585 µm², and operates at a maximum clock frequency of 97.9 MHz. Several optimization configurations were tested and analyzed during the synthesis phase to enhance the performance of the final design.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
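The accelerator implements the standard SHA-256 function defined in FIPS 180-4. As a point of reference for what the hardware computes, here is a minimal software baseline using Python's built-in hashlib; this illustrates the algorithm's interface and fixed-size output, not the authors' ASIC design:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Compute the SHA-256 digest of `data` as a hex string."""
    return hashlib.sha256(data).hexdigest()

# SHA-256 always produces a 256-bit (64-hex-character) digest regardless of
# input length -- the fixed 64-round compression function is what hardware
# accelerators pipeline for throughput.
digest = sha256_hex(b"abc")
```

The input `b"abc"` is the standard test vector from the SHA-2 specification, so the resulting digest can be checked against published values.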

17 pages, 2708 KiB  
Article
Chance-Constrained Optimization Formulation for Ship Conceptual Design: A Comparison of Metaheuristic Algorithms
by Jakub Kudela
Computers 2023, 12(11), 225; https://doi.org/10.3390/computers12110225 - 03 Nov 2023
Cited by 2 | Viewed by 1144
Abstract
This paper presents a new chance-constrained optimization (CCO) formulation for the bulk carrier conceptual design. The CCO problem is modeled through the scenario design approach. We conducted extensive numerical experiments comparing the convergence of both canonical and state-of-the-art metaheuristic algorithms on the original and CCO formulations and showed that the CCO formulation is substantially more difficult to solve. The two best-performing methods were both found to be differential evolution-based algorithms. We then provide an analysis of the resulting solutions in terms of the dependence of the distribution functions of the unit transportation costs and annual cargo capacity of the ship design on the probability of violating the chance constraints.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
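The scenario design approach mentioned in the abstract replaces a probabilistic (chance) constraint with a finite set of sampled constraints that must all hold. A minimal sketch of that idea, using a hypothetical toy constraint and uncertainty distribution rather than the paper's bulk carrier model:

```python
import random

def scenario_feasible(x, scenarios, constraint):
    """Scenario approach: accept x only if the constraint holds for every
    sampled realization of the uncertain parameter."""
    return all(constraint(x, w) for w in scenarios)

random.seed(0)
# Hypothetical uncertain constraint x + w <= 1, with w ~ Uniform[0, 0.3].
scenarios = [random.uniform(0.0, 0.3) for _ in range(200)]
constraint = lambda x, w: x + w <= 1.0

feasible = scenario_feasible(0.6, scenarios, constraint)    # holds for all w in [0, 0.3]
infeasible = scenario_feasible(0.9, scenarios, constraint)  # fails whenever w > 0.1
```

The number of sampled scenarios governs the probability that the scenario solution also satisfies the original chance constraint, which is why the resulting problems become substantially harder as that guarantee is tightened.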

18 pages, 521 KiB  
Article
Efficient Day-Ahead Scheduling of PV-STATCOMs in Medium-Voltage Distribution Networks Using a Second-Order Cone Relaxation
by Oscar Danilo Montoya, Oscar David Florez-Cediel and Walter Gil-González
Computers 2023, 12(7), 142; https://doi.org/10.3390/computers12070142 - 18 Jul 2023
Cited by 1 | Viewed by 767
Abstract
This paper utilizes convex optimization to implement a day-ahead scheduling strategy for operating a photovoltaic distribution static compensator (PV-STATCOM) in medium-voltage distribution networks. The nonlinear non-convex programming model of the day-ahead scheduling strategy is transformed into a convex optimization model using the second-order cone programming approach in the complex domain. The main goal of efficiently operating PV-STATCOMs in distribution networks is to dynamically compensate for the active and reactive power generated by renewable energy resources such as photovoltaic plants. This is achieved by controlling power electronic converters, usually voltage source converters, to manage reactive power with lagging or leading power factors. Numerical simulations were conducted to analyze the effects of different power factors on the IEEE 33- and 69-bus systems. The simulations considered operations with a unity power factor (active power injection only), a zero power factor (reactive power injection only), and a variable power factor (active and reactive power injections). The results demonstrated the benefits of dynamic active and reactive power compensation in reducing grid power losses, voltage profile deviations, and energy purchasing costs at the substation terminals. These simulations were conducted using the CVX tool and the Gurobi solver in the MATLAB programming environment.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
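The second-order cone relaxation rests on constraints of the form ||x|| ≤ t, which are convex. A small illustrative membership check for that cone shape; this shows only the kind of constraint the relaxation produces, not the authors' full power-flow model:

```python
import math

def in_second_order_cone(x, t):
    """Membership test for the second-order (Lorentz) cone:
    (x, t) belongs to the cone iff the Euclidean norm of x is at most t."""
    return math.sqrt(sum(v * v for v in x)) <= t

# In the relaxation, each non-convex product constraint of the power-flow
# model is replaced by a convex constraint of this shape, which off-the-shelf
# conic solvers (such as Gurobi, used in the paper) handle efficiently.
inside = in_second_order_cone([3.0, 4.0], 5.0)    # ||(3, 4)|| = 5 <= 5
outside = in_second_order_cone([3.0, 4.0], 4.9)
```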

16 pages, 3050 KiB  
Article
A Temporal Transformer-Based Fusion Framework for Morphological Arrhythmia Classification
by Nafisa Anjum, Khaleda Akhter Sathi, Md. Azad Hossain and M. Ali Akber Dewan
Computers 2023, 12(3), 68; https://doi.org/10.3390/computers12030068 - 21 Mar 2023
Cited by 1 | Viewed by 2107
Abstract
Supported by computer-aided arrhythmia diagnosis tools, the electrocardiogram (ECG) signal plays a vital role in lowering the fatality rate associated with cardiovascular diseases (CVDs) and in providing specialists with information about a patient's cardiac health. Current deep-learning-based approaches to multivariate time series data analysis, such as ECG classification, include LSTM, Bi-LSTM, CNN combined with Bi-LSTM, and other sequential networks. However, these networks often struggle to accurately capture long-range dependencies among data instances, which can result in problems such as vanishing or exploding gradients for longer data sequences. To address these shortcomings of sequential models, a hybrid arrhythmia classification system combining recurrence with a self-attention mechanism is developed. This system uses convolutional layers for representation learning, designed to capture the salient features of raw ECG data. The latent embedding layer is then fed to a self-attention-assisted transformer encoder model. Because ECG data are highly influenced by the absolute order, position, and proximity of time steps due to interdependent relationships among immediate neighbors, a recurrence component using Bi-LSTM is added to the encoder model to capture this characteristic of the data. Both the classification accuracy and the F1-score of the model were found to be 99.2%, indicating that combining recurrence with a self-attention-assisted architecture yields improved classification of arrhythmia from raw ECG signals compared with state-of-the-art models.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
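The self-attention mechanism at the core of the transformer encoder computes, for each time step, a weighted average over all steps, with weights derived from scaled dot products. A minimal pure-Python sketch without the learned query/key/value projections of a full transformer; the input sequence is a hypothetical toy example, not ECG features:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(seq, d):
    """Scaled dot-product self-attention over a sequence of d-dimensional
    vectors; queries = keys = values = the input (no learned projections)."""
    out = []
    for q in seq:
        # Similarity of this step to every step, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in seq]
        weights = softmax(scores)  # how strongly this step attends to each step
        out.append([sum(w * v[j] for w, v in zip(weights, seq)) for j in range(d)])
    return out

seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # toy length-3 sequence
attended = self_attention(seq, 2)           # each output row is a convex combination of seq
```

Because every step attends to every other step directly, long-range dependencies do not have to propagate through a recurrent chain, which is the weakness of sequential models that the abstract highlights.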

13 pages, 810 KiB  
Article
An Enhanced Virtual Cord Protocol Based Multi-Casting Strategy for the Effective and Efficient Management of Mobile Ad Hoc Networks
by Sohaib Latif, Xianwen Fang, Syed Muhammad Mohsin, Syed Muhammad Abrar Akber, Sheraz Aslam, Hana Mujlid and Kaleem Ullah
Computers 2023, 12(1), 21; https://doi.org/10.3390/computers12010021 - 16 Jan 2023
Cited by 1 | Viewed by 2399
Abstract
To solve problems with limited resources such as power, storage, bandwidth, and connectivity, efficient and effective data management solutions are needed. The most successful algorithms for circumventing these constraints are believed to be those that self-organise and collaborate. To make the best use of available bandwidth, mobile ad hoc networks (MANETs) employ multi-casting, which can significantly reduce the communication cost of any network and save resources by transmitting a single set of data to numerous receivers at a time. In this study, we implemented multi-casting in the virtual cord protocol (VCP), which uses virtual coordinates (VCs) to improve routing and control wireless data transmission. We enhanced the classic VCP protocol so that intermediate nodes can also forward or re-transmit the dataset to interested nodes, improving data transmission from the sender to multiple receivers. Simulation results demonstrated the efficacy of our proposed enhanced VCP-based multi-casting strategy over the traditional VCP protocol, reducing the number of MAC transmissions, minimizing end-to-end delay, and maximizing the packet delivery ratio.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
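VCP assigns each node a virtual coordinate on a cord, and packets are forwarded greedily toward the destination's coordinate. A minimal sketch of this base routing idea on a five-node cord; the coordinates and topology are hypothetical, and the paper's multicast extension additionally lets intermediate nodes re-transmit to interested nodes:

```python
def greedy_route(coords, neighbors, src, dst):
    """Greedy forwarding on virtual coordinates: at each hop, hand the packet
    to the neighbor whose coordinate is closest to the destination's.
    `coords` maps node -> virtual coordinate; `neighbors` maps node -> list."""
    path = [src]
    current = src
    while current != dst:
        nxt = min(neighbors[current], key=lambda n: abs(coords[n] - coords[dst]))
        if abs(coords[nxt] - coords[dst]) >= abs(coords[current] - coords[dst]):
            break  # local minimum: no neighbor makes progress toward dst
        path.append(nxt)
        current = nxt
    return path

# Five nodes placed on a virtual cord in [0, 1]; each knows its cord neighbors.
coords = {0: 0.0, 1: 0.25, 2: 0.5, 3: 0.75, 4: 1.0}
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
path = greedy_route(coords, neighbors, 0, 4)  # hops along the cord: 0-1-2-3-4
```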

11 pages, 10664 KiB  
Article
Multistage Spatial Attention-Based Neural Network for Hand Gesture Recognition
by Abu Saleh Musa Miah, Md. Al Mehedi Hasan, Jungpil Shin, Yuichi Okuyama and Yoichi Tomioka
Computers 2023, 12(1), 13; https://doi.org/10.3390/computers12010013 - 05 Jan 2023
Cited by 21 | Viewed by 3175
Abstract
The definition of human-computer interaction (HCI) has changed in recent years as people interact through an increasing variety of ergonomic devices. Many researchers have worked to develop hand gesture recognition systems with kinetic sensor-based datasets, but their performance accuracy is not satisfactory. In our work, we propose a multistage spatial attention-based neural network for hand gesture recognition to overcome these challenges. The proposed model comprises three stages, each built on a CNN. In the first stage, we apply a feature extractor and a spatial attention module using self-attention on the original dataset, and then multiply the feature vector with the attention map to highlight the effective features of the dataset. These features are then concatenated with the original dataset to obtain the modality feature embedding. In the same way, we generate a feature vector and attention map in the second stage with the feature extraction architecture and the self-attention technique. After multiplying the attention map and the features, we produce the final feature, which feeds into the third stage, a classification module that predicts the label of the corresponding hand gesture. Our model achieved 99.67%, 99.75%, and 99.46% accuracy on the senz3D, Kinematic, and NTU datasets, respectively.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
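The core operation in each stage, multiplying a feature vector element-wise by a normalized attention map to emphasize salient positions, can be sketched as follows; the feature values and scores here are hypothetical, not the model's learned attention:

```python
import math

def apply_spatial_attention(features, scores):
    """Normalize raw attention scores with softmax and reweight the feature
    vector element-wise, emphasizing high-scoring spatial positions."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    attention_map = [e / total for e in exps]  # non-negative, sums to 1
    return [f * a for f, a in zip(features, attention_map)]

features = [2.0, 2.0, 2.0, 2.0]  # hypothetical uniform feature activations
scores = [0.0, 0.0, 0.0, 3.0]    # the last spatial position is most salient
weighted = apply_spatial_attention(features, scores)
```

Because the attention map sums to one, the reweighting redistributes rather than amplifies the total activation, concentrating it on the positions the attention module scores highest.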

Review


22 pages, 1278 KiB  
Review
Model Compression for Deep Neural Networks: A Survey
by Zhuo Li, Hengyi Li and Lin Meng
Computers 2023, 12(3), 60; https://doi.org/10.3390/computers12030060 - 12 Mar 2023
Cited by 19 | Viewed by 12571
Abstract
Currently, with the rapid development of deep learning, deep neural networks (DNNs) have been widely applied in various computer vision tasks. However, in the pursuit of performance, advanced DNN models have become more complex, which has led to a large memory footprint and high computation demands. As a result, the models are difficult to apply in real time. To address these issues, model compression has become a focus of research, and model compression techniques play an important role in deploying models on edge devices. This study analyzes various model compression methods to assist researchers in reducing device storage space, speeding up model inference, reducing model complexity and training costs, and improving model deployment. Hence, this paper summarizes the state-of-the-art techniques for model compression, including model pruning, parameter quantization, low-rank decomposition, knowledge distillation, and lightweight model design. In addition, it discusses research challenges and directions for future work.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
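Among the techniques the survey covers, magnitude-based pruning is one of the simplest: weights with the smallest absolute values are zeroed out, shrinking the model with little accuracy loss at moderate sparsity. A minimal unstructured-pruning sketch over a flat weight list with illustrative values:

```python
def magnitude_prune(weights, sparsity):
    """Unstructured magnitude pruning: zero out the `sparsity` fraction of
    weights with the smallest absolute values, keeping the rest intact."""
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    pruned, removed = [], 0
    for w in weights:
        if abs(w) <= threshold and removed < k:
            pruned.append(0.0)  # this weight is dropped from the model
            removed += 1
        else:
            pruned.append(w)
    return pruned

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
pruned = magnitude_prune(weights, 0.5)  # zeroes the three smallest magnitudes
```

In practice the zeroed entries are stored in a sparse format or removed structurally (whole channels or filters), which is where the memory and inference savings discussed in the survey come from.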

14 pages, 575 KiB  
Review
A Short Survey on Deep Learning for Multimodal Integration: Applications, Future Perspectives and Challenges
by Giovanna Maria Dimitri
Computers 2022, 11(11), 163; https://doi.org/10.3390/computers11110163 - 18 Nov 2022
Cited by 4 | Viewed by 4907
Abstract
Deep learning has achieved state-of-the-art performance in many research applications: from computer vision to bioinformatics, from object detection to image generation. In the context of these deep-learning approaches, we can define the concept of multimodality, a research field whose objective is to implement methodologies that can use several modalities as input features to perform predictions. There is a strong analogy here with human cognition, since we rely on several different senses to make decisions. In this article, we present a short survey on multimodal integration using deep-learning methods. We first comprehensively review the concept of multimodality, describing it from a two-dimensional perspective: the first dimension is a taxonomical description of the multimodality concept, and the second describes the fusion approaches in multimodal deep learning. Finally, we describe four applications of multimodal deep learning in the fields of speech recognition, sentiment analysis, forensic applications, and image processing.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
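The fusion approaches that form the survey's second dimension are commonly grouped into early (feature-level) and late (decision-level) fusion. A minimal sketch of both, using hypothetical audio/video feature vectors and class scores:

```python
def early_fusion(modalities):
    """Early (feature-level) fusion: concatenate per-modality feature
    vectors into one joint representation before any prediction is made."""
    fused = []
    for features in modalities:
        fused.extend(features)
    return fused

def late_fusion(predictions):
    """Late (decision-level) fusion: average per-modality class scores
    after each modality has produced its own prediction."""
    n = len(predictions)
    return [sum(scores) / n for scores in zip(*predictions)]

audio = [0.1, 0.9]           # hypothetical audio feature vector
video = [0.3, 0.7, 0.5]      # hypothetical video feature vector
joint = early_fusion([audio, video])          # 5-dim joint feature vector

scores_a = [0.25, 0.75]      # class scores from an audio-only model
scores_v = [0.75, 0.25]      # class scores from a video-only model
combined = late_fusion([scores_a, scores_v])  # averaged scores: [0.5, 0.5]
```

Early fusion lets a single model learn cross-modal interactions, while late fusion keeps per-modality models independent, a trade-off the survey's taxonomy organizes in more detail.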

27 pages, 1907 KiB  
Review
A Systemic Mapping Study of Business Intelligence Maturity Models for Higher Education Institutions
by Christopher Lee Stewart and M. Ali Akber Dewan
Computers 2022, 11(11), 153; https://doi.org/10.3390/computers11110153 - 28 Oct 2022
Cited by 1 | Viewed by 2427
Abstract
Higher education institutions (HEIs) are investing in business intelligence (BI) to meet the increasing demand for information stemming from their operations. Information technology (IT) managers in higher education may turn to BI maturity models to evaluate the current state of HEIs' BI operation capabilities and their readiness for future improvements. However, generic BI maturity models lack the domain-specific attributes that would ensure a high degree of compatibility with HEIs. This study's objective is to survey maturity models that could be used in HEIs, identify those used for BI, analyze their qualities, and identify future avenues for research into HEI-specific BI maturity models. A systemic mapping was undertaken via both keyword and snowball searches of five indexing services; 6037 articles were processed using inclusion and exclusion criteria, resulting in the identification of forty-one academic works on maturity model uses, which were mapped to ten categories. The mapping reveals an increasing number of publications featuring maturity models for HEIs, particularly since 2018, focused on e-learning and ICT. A single instance of a BI maturity model for HEIs, the HE-BIA MM, emerged in 2022 within the European HEI context. The HE-BIA MM has more dimensions than most other models identified, yet only a single co-occurrence of dimensions, in name only, was found. We conclude that BI maturity models for HEIs are an emerging field of research, with future directions including exploring the co-occurrence of dimensions with existing maturity models, performing case studies, and validating the HE-BIA MM outside the European HEI context.
(This article belongs to the Special Issue Feature Papers in Computers 2023)
