Big Data Analytics and Artificial Intelligence in Electronics

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: closed (15 July 2023) | Viewed by 10688

Special Issue Editors

BK21 Chungbuk Information Technology Education and Research Center, Chungbuk National University, Cheongju 28644, Republic of Korea
Interests: robotics; Internet of Things; wireless sensor networks; big data; artificial intelligence; deep learning; machine learning
Department of Computer Science, College of Computer, Qassim University, Buraydah 52571, Saudi Arabia
Interests: bioinformatics; cryptography and steganography; computer networks; wireless systems; artificial intelligence; optimization; deep learning

Special Issue Information

Dear Colleagues,

In the big data era, the enormous volumes of data generated by every industry are typically of ultra-high dimensionality, necessitating innovative solutions to discover the required patterns. Big data analysis and intelligent computing solutions are increasingly being used to reduce the complexity and cognitive burden of processing such large amounts of data. Big data analytics, the application of advanced analytic techniques to very large, diverse data sets that include structured, semi-structured, and unstructured data from various sources and range in size from terabytes to zettabytes, is a central concern of data science and is expanding rapidly in fields such as industry, engineering, and healthcare. It is also critical for modeling and forecasting future outcomes, improving business intelligence, and ultimately fueling faster and better decisions. Big data analytics plays an important role in organizations that deal with massive amounts of data containing critical information, for example in cybersecurity, fraud detection, health informatics, intelligence reporting, and marketing. AI-based methods such as optimization, deep learning, machine learning, convolutional neural networks (CNN), artificial neural networks (ANN), natural language processing (NLP), support vector machines (SVM), transfer and active learning, data mining, and distributed learning can be used for big data analysis.

Furthermore, the integration of big data, the Internet of Things (IoT), and AI has paved the way for a plethora of novel real-world applications. These converged technologies encompass not only information and communication technology but also industries such as business, smart farming, industrial automation, and healthcare, to name a few. IoT connects the physical world to the Internet and generates massive amounts of data that must be refined using machine learning and deep learning to extract valuable features. Numerous challenges arise when this IoT-generated big data is underutilized by machine learning, deep learning, and other AI methods; addressing them requires AI-driven big data analytics.

This Special Issue concentrates on the most recent advancements in the application of big data analytics and artificial intelligence in electronics, and aims to bring together the latest developments in the aforementioned areas within this framework.

Potential topics of interest include, but are not limited to, the following:

  • Big data analytics and big data applications
  • Big data artificial intelligence (AI)
  • Big data systems
  • Deep learning and big data analytics in medical imaging informatics
  • Intelligent IoT and IIoT
  • Healthcare and medical images security
  • Big data in healthcare and medical big data analytics
  • Big data techniques and their applications
  • BPNN-based particle swarm optimization in data analysis for transportation big data
  • IoT, medical imaging AI systems, and smart wearable body sensors
  • Deep learning for healthcare
  • Convolutional neural networks (CNN), artificial neural networks (ANN), natural language processing (NLP)
  • Deep learning for image/video-based object detection
  • Deep learning algorithm/architectures/theory
  • Deep learning-based object detection for real-world applications
  • Deep learning for secure fingerprint techniques
  • Machine learning techniques for propagation modeling and wireless communications

Dr. Inam Ullah
Dr. Rehmat Ullah
Dr. Ateeq Ur Rehman
Prof. Dr. Mohamed Tahar Ben Othman
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • big data analytics and big data applications
  • big data artificial intelligence (AI)
  • big data systems
  • deep learning and big data analytics in medical imaging informatics
  • intelligent IoT and IIoT
  • healthcare and medical images security
  • big data in healthcare and medical big data analytics
  • big data techniques and their applications
  • BPNN-based particle swarm optimization in data analysis for transportation big data
  • IoT, medical imaging AI systems, and smart wearable body sensors
  • deep learning for healthcare
  • convolutional neural networks (CNN), artificial neural networks (ANN), natural language processing (NLP)
  • deep learning for image/video-based object detection
  • deep learning algorithm/architectures/theory
  • deep learning-based object detection for real-world applications
  • deep learning for secure fingerprint techniques
  • machine learning techniques for propagation modeling and wireless communications

Published Papers (5 papers)


Research

13 pages, 3925 KiB  
Article
A Signal-Denoising Method for Electromagnetic Leakage from USB Keyboards
by Yihua Peng, Jiemin Zhang, Jian Mao and Mengmeng Cui
Electronics 2023, 12(17), 3647; https://doi.org/10.3390/electronics12173647 - 29 Aug 2023
Viewed by 843
Abstract
USB keyboards are commonly used as computer input devices and inevitably generate electromagnetic (EM) leakage signals during their operation, which carry input information. However, due to the weak energy of a keyboard’s EM signal and the small amount of effective information, the received leakage signal is often characterized by a low signal-to-noise ratio (SNR). This low SNR affects the subsequent detection and restoration of the information. To solve this problem, this paper proposes a denoising method for USB keyboard EM leakage signals and designs a self-attentive denoising adversarial network (SADAN) based on generative adversarial networks (GANs). The denoiser continuously improves its denoising ability during the adversarial training process, and the self-attention mechanism enables it to better learn the dependencies in the keyboard EM leakage signal sequences, modeling the long-range relationships between the sequence sample points and reducing the impact of the number of network layers on capturing those relationships. The method suppresses noise in the keyboard leakage signal, improving its SNR while preserving the effective information, and finally yields a denoised leakage signal from which the original information can be effectively recovered.
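
To illustrate the general idea of a self-attention-based denoising generator for 1-D signals, the minimal sketch below assumes PyTorch; the layer shapes, names, and training setup are illustrative only and do not reproduce the SADAN architecture described in the paper.

```python
# Minimal, illustrative sketch (PyTorch assumed); layer sizes are arbitrary
# and do NOT reproduce the SADAN architecture described in the paper.
import torch
import torch.nn as nn

class SelfAttentionDenoiser(nn.Module):
    """Maps a noisy 1-D EM signal to a denoised estimate."""
    def __init__(self, channels: int = 64, heads: int = 4):
        super().__init__()
        self.encode = nn.Conv1d(1, channels, kernel_size=9, padding=4)
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.decode = nn.Conv1d(channels, 1, kernel_size=9, padding=4)

    def forward(self, x):                        # x: (batch, 1, length)
        h = torch.relu(self.encode(x))           # local features
        h_seq = h.transpose(1, 2)                # (batch, length, channels)
        ctx, _ = self.attn(h_seq, h_seq, h_seq)  # long-range dependencies
        h = (h_seq + ctx).transpose(1, 2)        # residual connection
        return self.decode(h)                    # denoised signal estimate

# In a GAN setting, a generator like this would be trained against a
# discriminator that tries to tell denoised outputs from clean reference signals.
```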

26 pages, 2872 KiB  
Article
Grapharizer: A Graph-Based Technique for Extractive Multi-Document Summarization
by Zakia Jalil, Muhammad Nasir, Moutaz Alazab, Jamal Nasir, Tehmina Amjad and Abdullah Alqammaz
Electronics 2023, 12(8), 1895; https://doi.org/10.3390/electronics12081895 - 17 Apr 2023
Cited by 1 | Viewed by 1498
Abstract
In the age of big data, the amount of data on the Internet is growing rapidly, and it becomes frustrating for users to locate the desired information. Text summarization emerges as a solution to this problem: it condenses the provided documents and presents users with their gist. However, summarizer systems face challenges such as poor grammaticality, missing important information, and redundancy, particularly in multi-document summarization (MDS). This study develops a graph-based extractive generic MDS technique, named Grapharizer (GRAPH-based summARIZER), focused on resolving these challenges. Grapharizer addresses the grammaticality of the summary using lemmatization during pre-processing; furthermore, synonym mapping, multi-word expression mapping, and anaphora and cataphora resolution contribute to improving the grammaticality of the generated summary. Challenges such as redundancy and proper coverage of all topics are addressed to achieve informativity and representativeness. Grapharizer is a novel approach that can also be used in combination with different machine learning models. The system was tested on the DUC 2004 and Recent News Article datasets against various state-of-the-art techniques. Using Grapharizer with machine learning increased accuracy by up to 23.05% on ROUGE scores compared with different baseline techniques. Expert evaluation of the proposed system indicated an accuracy of more than 55%.
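
As a rough illustration of graph-based extractive summarization in general (not the Grapharizer pipeline itself), the sketch below ranks sentences by PageRank over a TF-IDF similarity graph; it assumes scikit-learn and networkx are available, and the function name is hypothetical.

```python
# Generic graph-based extractive summarization sketch (not Grapharizer itself):
# rank sentences by PageRank over a sentence-similarity graph.
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def summarize(sentences, top_k=3):
    # Build a sentence-similarity graph from TF-IDF cosine similarity.
    tfidf = TfidfVectorizer().fit_transform(sentences)
    sim = cosine_similarity(tfidf)
    graph = nx.from_numpy_array(sim)
    # Rank sentences and keep the top_k highest-scoring ones, in document order.
    scores = nx.pagerank(graph)
    ranked = sorted(scores, key=scores.get, reverse=True)[:top_k]
    return [sentences[i] for i in sorted(ranked)]
```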

24 pages, 2336 KiB  
Article
Impact of 3G and 4G Technology Performance on Customer Satisfaction in the Telecommunication Industry
by Inayatul Haq, Jahangeer Ahmed Soomro, Tehseen Mazhar, Ikram Ullah, Tamara Al Shloul, Yazeed Yasin Ghadi, Inam Ullah, Aldosary Saad and Amr Tolba
Electronics 2023, 12(7), 1697; https://doi.org/10.3390/electronics12071697 - 03 Apr 2023
Cited by 1 | Viewed by 3215
Abstract
This study investigates the impact of factors (network coverage, customer service, video calls, and downloading speed) of 3G and 4G telecommunication service performance on customer satisfaction in the Punjab region of Pakistan. The research indicates how to build strong relationships with customers and which aspects of the 3G and 4G networks need to be improved to enhance the revenue of telecom operator companies. The study formulates four main hypotheses for assessing the level of customer satisfaction in the telecommunication industry of Pakistan's Punjab region. It is based on primary data gathered from 300 randomly selected clients of major telecom operators in the Punjab area; the respondents were chosen at random and invited to express their opinions through a structured survey. A comprehensive questionnaire was used for data collection, and the responses were examined through descriptive statistics, correlation, and regression analysis using SmartPLS software. The results indicate that the independent variables network coverage, customer service, video calls, and downloading speed are key drivers of customer satisfaction, and that among them "Internet downloading speed" has the strongest impact on the dependent variable "customer satisfaction" based on 3G and 4G network performance. A limited number of studies focus on customer satisfaction in the telecommunication sector in Pakistan; this study fills that gap in the literature and helps service providers increase the satisfaction level of their existing customers and attract new ones.
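
For readers unfamiliar with this kind of analysis, the sketch below shows how a plain multiple regression of satisfaction on the four factors might look in Python; the paper itself used SmartPLS (partial least squares), and the column names and data file here are hypothetical.

```python
# Illustrative only: an ordinary-least-squares sketch of regressing customer
# satisfaction on the four survey factors. The study itself used SmartPLS;
# the column names and data file below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("survey_responses.csv")   # hypothetical survey export
model = smf.ols(
    "satisfaction ~ network_coverage + customer_service + video_calls + download_speed",
    data=survey,
).fit()
print(model.summary())   # coefficients show each factor's contribution
```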

19 pages, 3724 KiB  
Article
Judging Stock Trends According to the Sentiments of Stock Comments in Expert Forums
by Zhichao Chang and Zuping Zhang
Electronics 2023, 12(3), 722; https://doi.org/10.3390/electronics12030722 - 01 Feb 2023
Viewed by 1184
Abstract
Machine learning has been proven to be very effective and can help to boost the performance of stock price prediction. However, most researchers mainly focus on the historical data of stocks and predict future price trends by designing prediction models, believing that past data must hide information useful for the future. Due to the lack of human participation, the results of this practice may be merely coincidental. To address this problem, we propose a novel model called Convolutional Neural Network with Sentiment Check (CNN-SC). The proposed model draws on and expands the ideas of experts, taking the sentiment values in expert comments as the basis for stock price prediction. This model brings human judgment into stock price prediction and addresses the lack of supervision in machine learning. To demonstrate the effectiveness of the novel method, we compare it with five other popular and well-performing methods. Although the C-E-SVR&RF and GC-CNN models are also quite effective, our results indicate the superiority of CNN-SC, which accurately predicts the short-term (seven days ahead) price fluctuation of a single stock.
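
As a rough, hedged illustration of the general idea (not the authors' CNN-SC model), the sketch below assumes PyTorch and maps a window of daily comment-sentiment scores to an up/down trend probability with a small 1-D CNN; all layer sizes are arbitrary.

```python
# Minimal sketch (PyTorch assumed): a 1-D CNN over a window of daily
# comment-sentiment scores producing an upward-trend probability.
# Layer sizes are illustrative and do not reproduce the CNN-SC model.
import torch
import torch.nn as nn

class SentimentTrendCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, sentiment):                   # (batch, 1, window_days)
        return torch.sigmoid(self.net(sentiment))   # probability of an upward trend
```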

21 pages, 5984 KiB  
Article
A Hybrid Attention Network for Malware Detection Based on Multi-Feature Aligned and Fusion
by Xing Yang, Denghui Yang and Yizhou Li
Electronics 2023, 12(3), 713; https://doi.org/10.3390/electronics12030713 - 01 Feb 2023
Cited by 2 | Viewed by 2087
Abstract
With the widespread use of computers, the amount of malware has increased exponentially. Since dynamic detection is costly in both time and resources, most existing malware detection methods are based on static features. However, existing static methods mainly rely on a single type of feature, and few pay attention to multi-feature fusion. This paper presents a novel multi-feature extraction and fusion method that effectively detects malware variants by combining binary and opcode features. We propose a stacked convolutional network to capture the temporal and discontinuity information in the function calls of the malware binary file, and adopt a triangular attention algorithm to extract code-level features from the assembly code. These two extracted features are then aligned and fused by cross-attention, which provides a stable feature representation. We evaluate our method on two different datasets: it achieves an accuracy of 0.9954 on the Kaggle Malware Classification dataset and 0.9544 on a large real-world dataset. To optimize the detection model, we conduct in-depth discussions of different feature extractors and multi-feature fusion strategies, and a visualized attention module in the model is provided to explain its superiority in opcode feature extraction. An experimental analysis against five baseline deep learning models and five state-of-the-art malware detection models reveals that our strategy outperforms competing approaches in all evaluation circumstances.
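
To illustrate the fusion step in general terms, the sketch below (PyTorch assumed) shows one way to align and fuse two feature sequences with cross-attention; the dimensions, pooling, and class names are arbitrary, and this is not the authors' exact architecture.

```python
# Illustrative sketch (PyTorch assumed) of cross-attention fusion of two
# feature sequences, e.g. binary-level and opcode-level embeddings.
# Dimensions are arbitrary; this is not the paper's exact architecture.
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classify = nn.Linear(dim, 1)

    def forward(self, binary_feats, opcode_feats):
        # Binary features query the opcode features, aligning the two views.
        fused, _ = self.attn(binary_feats, opcode_feats, opcode_feats)
        pooled = fused.mean(dim=1)                   # (batch, dim)
        return torch.sigmoid(self.classify(pooled))  # malware probability
```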
