
Statistical Signal Processing, Detection and Estimation: Dealing with the Data Deluge

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Signal and Data Analysis".

Deadline for manuscript submissions: closed (31 May 2022) | Viewed by 5552

Special Issue Editors


Prof. Dr. Todd K. Moon
Guest Editor
Electrical and Computer Engineering Department, Utah State University, Logan, UT 84322, USA
Interests: statistical signal processing; digital communications; error correction coding; machine learning

Dr. Mohammad Shekaramiz
Co-Guest Editor
Electrical and Computer Engineering Department, Utah Valley University, Orem, UT 84058, USA
Interests: variational Bayes; compressive sensing; statistical signal processing; machine learning

Dr. Rodrigo de Lamare
Co-Guest Editor
Department of Electronics, University of York / PUC-Rio, York YO10 5DD, UK
Interests: MIMO; digital communications

Special Issue Information

Dear Colleagues,

The present deluge of data and the ubiquity of information appliances provide both opportunity and motivation for continued advances in statistical techniques that extract useful information from that deluge. These techniques may need to be adapted to the nature of the data, such as streaming data (with a concomitant need to update distribution estimates at low complexity) or data that are textual or categorical. Detection and estimation theory draws on elements of classical statistics (such as hypothesis testing), statistical signal processing, pattern recognition, and machine learning. These disciplines combine with techniques from a variety of application areas, such as tracking, navigation, financial modeling, bioinformatics, and cybersecurity, to provide a broad and powerful set of tools. Within this toolkit, detection and estimation techniques often employ information-theoretic quantities such as the Kullback–Leibler divergence (used, for example, in variational Bayes, or to characterize the performance of a distribution-based classifier).
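As a small illustration of the information-theoretic machinery mentioned above, the Kullback–Leibler divergence between two discrete distributions can be computed directly. This is a minimal NumPy sketch; the function name and the example distributions are illustrative, not drawn from any paper in this issue:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Terms with p[i] == 0 contribute zero by the usual convention.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a biased coin diverges from a fair one by a small positive amount.
print(kl_divergence([0.7, 0.3], [0.5, 0.5]))
```

The divergence is zero exactly when the two distributions agree, which is what makes it useful as a fitting criterion in variational Bayes.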

This Special Issue of Entropy aims to be a forum for the presentation of new, improved, and developing techniques in the broad area of detection and estimation theory, and applications of these techniques, particularly methods having a connection with the “entropic” theme of the journal.

This Special Issue will accept unpublished original papers and comprehensive reviews on topics in the following or related areas:

  • Online, incremental updates and learning methods;
  • Detection and estimation on graphs;
  • Variational and message passing methods;
  • Machine learning, including updating models from streaming data;
  • Addressing the “big p, small n” problem: dealing with high-dimensional data from relatively few training instances;
  • Detection, estimation, and tracking;
  • Natural language and textual processing;
  • Techniques to deal with categorical data;
  • Statistical modeling techniques;
  • Applications of modern detection and estimation techniques to pertinent datasets;
  • Detection and estimation applied to engineered signals, such as communication signals;
  • Detection and estimation under sparsity conditions.

Prof. Dr. Todd K. Moon
Dr. Mohammad Shekaramiz
Dr. Rodrigo de Lamare
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, use the online submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)


Research

32 pages, 1876 KiB  
Article
Compressive Sensing via Variational Bayesian Inference under Two Widely Used Priors: Modeling, Comparison and Discussion
by Mohammad Shekaramiz and Todd K. Moon
Entropy 2023, 25(3), 511; https://doi.org/10.3390/e25030511 - 16 Mar 2023
Cited by 2 | Viewed by 1206
Abstract
Compressive sensing is a sub-Nyquist sampling technique for efficient signal acquisition and reconstruction of sparse or compressible signals. In order to account for the sparsity of the underlying signal of interest, it is common to use sparsifying priors such as the Bernoulli–Gaussian-inverse Gamma (BGiG) and Gaussian-inverse Gamma (GiG) priors on the components of the signal. With the introduction of variational Bayesian inference, sparse Bayesian learning (SBL) methods for solving the inverse problem of compressive sensing have received significant interest, as the SBL methods become more efficient in terms of execution time. In this paper, we consider the sparse signal recovery problem using compressive sensing and the variational Bayesian (VB) inference framework. More specifically, we consider the two widely used Bayesian models, BGiG and GiG, for modeling the underlying sparse signal in this problem. Although these two models have been widely used for sparse recovery problems under various signal structures, the question of which model outperforms the other for sparse signal recovery under no specific structure has yet to be fully addressed in the VB inference setting. Here, we study these two models under VB inference in detail, provide motivating examples of the signal-reconstruction issues that may occur under each model, perform comparisons, and provide suggestions on how to improve the performance of each model.
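The measurement model underlying compressive sensing can be illustrated with a short sketch. The following is not the paper's variational Bayesian method; it only shows the y = Ax setup with a k-sparse x recovered by basic orthogonal matching pursuit, with all dimensions and names chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Measurement model y = A x: a k-sparse signal of length n observed
# through m < n random Gaussian measurements.
n, m, k = 100, 50, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)
y = A @ x

def omp(A, y, k):
    """Basic orthogonal matching pursuit: greedily add the column most
    correlated with the residual, then least-squares refit on the support."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
print(np.linalg.norm(x_hat - x))  # typically near machine precision here
```

The Bayesian approaches studied in the paper replace this greedy support search with priors (BGiG or GiG) on the components of x and infer the posterior variationally.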

21 pages, 768 KiB  
Article
Optimal Passive Source Localization for Acoustic Emissions
by Carlos A. Prete, Jr., Vítor H. Nascimento and Cássio G. Lopes
Entropy 2021, 23(12), 1585; https://doi.org/10.3390/e23121585 - 27 Nov 2021
Cited by 1 | Viewed by 1545
Abstract
Acoustic emission is a non-destructive testing method where sensors monitor an area of a structure to detect and localize passive sources of elastic waves such as expanding cracks. Passive source localization methods based on times of arrival (TOAs) use TOAs estimated from the noisy signals received by the sensors to estimate the source position. In this work, we derive the probability distribution of TOAs assuming they were obtained by the fixed threshold technique—a popular low-complexity TOA estimation technique—and show that, if the sampling rate is high enough, TOAs can be approximated by a random variable distributed according to a mixture of Gaussian distributions, which reduces to a Gaussian in the low noise regime. The optimal source position estimator is derived assuming the parameters of the mixture are known, in which case its MSE matches the Cramér–Rao lower bound, and an algorithm to estimate the mixture parameters from noisy signals is presented. We also show that the fixed threshold technique produces biased time differences of arrival (TDOAs) and propose a modification of this method to remove the bias. The proposed source position estimator is validated in simulation using biased and unbiased TDOAs, performing better than other TOA-based passive source localization methods in most scenarios.
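The fixed-threshold TOA technique discussed in the abstract is simple to illustrate. The sketch below (a synthetic burst with illustrative parameters, not the authors' code) declares the TOA at the first sample whose magnitude exceeds a fixed threshold; because the burst rises from zero amplitude, the crossing occurs slightly after the true onset, which is the kind of bias the paper analyzes:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1_000_000            # sampling rate (Hz), illustrative
t = np.arange(2000) / fs
true_arrival = 800        # sample index of the emission onset

# Synthetic acoustic-emission burst: decaying sinusoid in white noise.
tail = t[: len(t) - true_arrival]
signal = np.zeros_like(t)
signal[true_arrival:] = np.exp(-3e3 * tail) * np.sin(2 * np.pi * 5e4 * tail)
noisy = signal + 0.01 * rng.standard_normal(len(t))

# Fixed-threshold TOA: first sample whose magnitude exceeds the threshold
# (set here to 10x the noise standard deviation).
threshold = 0.1
toa_index = int(np.argmax(np.abs(noisy) > threshold))
print(toa_index, true_arrival)  # the estimate lags the true onset slightly
```

Modeling the distribution of this crossing time is what yields the Gaussian-mixture approximation derived in the paper.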

21 pages, 615 KiB  
Article
Kurtosis-Based Symbol Timing and Carrier Phase/Frequency Tracking
by Todd K. Moon and Jacob H. Gunther
Entropy 2021, 23(7), 819; https://doi.org/10.3390/e23070819 - 27 Jun 2021
Viewed by 1688
Abstract
Kurtosis is known to be effective at estimating signal timing and carrier phase offset when the processing is performed in a “burst mode,” that is, operating on a block of received signal in an offline fashion. In this paper, kurtosis-based estimation is extended to provide tracking of timing and carrier phase, and frequency offsets. The algorithm is compared with conventional PLL-type timing/phase estimation and shown to be superior in terms of speed of convergence, with comparable variance in the matched filter output symbols.
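The principle behind kurtosis-based timing can be sketched briefly: a BPSK constellation sampled at the correct instant is maximally sub-Gaussian (excess kurtosis of -2), while a timing error introduces intersymbol interference that pushes the samples toward Gaussian. The block below is a noiseless, block-mode illustration with made-up parameters, not the authors' tracking algorithm; it scans candidate offsets and picks the kurtosis minimizer:

```python
import numpy as np

def excess_kurtosis(x):
    x = x - x.mean()
    return float(np.mean(x**4) / np.mean(x**2) ** 2 - 3.0)

rng = np.random.default_rng(2)
sps, n_sym = 8, 20000
symbols = rng.choice([-1.0, 1.0], size=n_sym)   # BPSK

# Nyquist (triangular) pulse of support 2*sps: zero ISI at the correct
# sampling instant, linear interpolation between symbols elsewhere.
pulse = np.concatenate([np.linspace(0, 1, sps, endpoint=False),
                        np.linspace(1, 0, sps, endpoint=False)])
upsampled = np.zeros(n_sym * sps)
upsampled[::sps] = symbols
waveform = np.convolve(upsampled, pulse)

# Scan candidate timing offsets; the minimizer of the excess kurtosis
# estimates the symbol timing (offset 0 for this construction).
kurt = [excess_kurtosis(waveform[off::sps][: n_sym - 2]) for off in range(sps)]
print(int(np.argmin(kurt)))
```

The paper's contribution is turning this block-mode criterion into a sample-by-sample tracker for timing, phase, and frequency.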
