Application of Entropy to Computer Vision and Medical Imaging

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: closed (15 July 2023) | Viewed by 12728

Special Issue Editors

Department of Medicine, University of Rouen Normandy, 76130 Mont-Saint-Aignan, France
Interests: image processing; pattern recognition; medical image analysis; information fusion

Guest Editor
Department of Medicine, University of Rouen Normandy, 76130 Mont-Saint-Aignan, France
Interests: Bayesian and statistical inference; information theory; deep learning; signal processing; medical image analysis

Special Issue Information

Dear Colleagues,

Shannon entropy was originally introduced to quantify the minimum number of bits necessary to encode a signal without loss of information; it represents the asymptotic limit of the compression ratio achieved by the Huffman algorithm. Moreover, Shannon entropy measures the amount of disorder in random signals. Since Shannon's work, generalizations of entropy (Rényi, Havrda–Charvat) as well as various applications have emerged. In statistics, as well as in machine learning, different entropies have been used to model uncertainty in data and in parameter estimation, and they can also be used to evaluate the amount of information in data. From entropies, one can define divergences, which serve as "distances" between probability distributions. In deep learning, these entropies are commonly used as loss functions for probabilistic neural networks.
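
As a small illustration of the quantities mentioned above (an illustrative sketch, not part of the Special Issue itself), the code below computes Shannon entropy, the Havrda–Charvat (Tsallis) generalization, and the Kullback–Leibler divergence for discrete distributions:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits: H(p) = -sum_i p_i log2 p_i."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def havrda_charvat_entropy(p, alpha):
    """Havrda-Charvat (Tsallis) entropy H_a(p) = (1 - sum_i p_i^a) / (a - 1);
    tends to the Shannon entropy (in nats) as alpha -> 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits,
    a 'distance' between probability distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))              # 1.5 bits
print(havrda_charvat_entropy(p, 2.0))  # 0.625
```

An optimal prefix code (e.g. Huffman) for `p` needs on average at least 1.5 bits per symbol, which is the compression-limit interpretation described above.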

This Special Issue is devoted to applications of probabilistic neural networks for computer vision and medical image analysis.

This Special Issue will accept unpublished original papers and comprehensive reviews focused on (but not restricted to) the following research areas:

- Modeling new loss functions in neural networks;

- Use of entropies and information measures for uncertainty quantification in a neural network;

- Choice of relevant entropy depending on the data and the task;

- Influence of activation functions on the choice of entropy;

- Axioms behind the choice of entropy;

- Entropy measures for the evaluation of image quality;

- Applications to medical image analysis and computer vision.

Dr. Su Ruan
Dr. Jérôme Lapuyade-Lahorgue
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Shannon entropy
  • generalized entropies
  • uncertainty quantification
  • deep learning
  • medical image analysis
  • computer vision
  • amount of information

Published Papers (6 papers)


Research


12 pages, 3625 KiB  
Article
Composite Attention Residual U-Net for Rib Fracture Detection
by Xiaoming Wang and Yongxiong Wang
Entropy 2023, 25(3), 466; https://doi.org/10.3390/e25030466 - 07 Mar 2023
Cited by 2 | Viewed by 1265
Abstract
Computed tomography (CT) images play a vital role in diagnosing rib fractures and determining the severity of chest trauma. However, quickly and accurately identifying rib fractures in a large number of CT images is an arduous task for radiologists. We propose a U-Net-based detection method designed to extract rib fracture features at the pixel level and find rib fractures rapidly and precisely. Two modules are added to the segmentation network: a combined attention module (CAM) and a hybrid dense dilated convolution module (HDDC). The features of the same layer of the encoder and the decoder are fused through the CAM, strengthening the local features of the subtle fracture area and enhancing the edge features. The HDDC is used between the encoder and decoder to obtain sufficient semantic information. Experiments on a public dataset show that the model achieves a Recall of 81.71%, an F1 score of 81.86%, and a Dice coefficient of 53.28%. Experienced radiologists produce fewer false positives per scan than neural network models, but they achieve lower detection sensitivities and need longer diagnosis times. With the aid of our model, radiologists can achieve higher detection sensitivities than either computer-only or human-only diagnosis.
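
For readers unfamiliar with the reported metrics, the sketch below shows how pixel-level Recall, F1, and Dice can be computed from binary masks. This is illustrative only, not the paper's evaluation code; note that Dice and F1 coincide at the pixel level, so the distinct F1 and Dice values reported above are presumably computed at different levels (detection vs. segmentation).

```python
import numpy as np

def segmentation_metrics(pred, target):
    """Pixel-level Recall, F1, and Dice for binary masks (illustrative only)."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    tp = np.sum(pred & target)   # true positive pixels
    fp = np.sum(pred & ~target)  # false positive pixels
    fn = np.sum(~pred & target)  # false negative pixels
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f1 = 2 * precision * recall / (precision + recall)
    dice = 2 * tp / (2 * tp + fp + fn)  # identical to F1 at the pixel level
    return recall, f1, dice
```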
(This article belongs to the Special Issue Application of Entropy to Computer Vision and Medical Imaging)

19 pages, 4448 KiB  
Article
Improved Complementary Pulmonary Nodule Segmentation Model Based on Multi-Feature Fusion
by Tiequn Tang, Feng Li, Minshan Jiang, Xunpeng Xia, Rongfu Zhang and Kailin Lin
Entropy 2022, 24(12), 1755; https://doi.org/10.3390/e24121755 - 30 Nov 2022
Cited by 1 | Viewed by 1520
Abstract
Accurate segmentation of lung nodules from pulmonary computed tomography (CT) slices plays a vital role in the analysis and diagnosis of lung cancer. Convolutional neural networks (CNNs) have achieved state-of-the-art performance in the automatic segmentation of lung nodules. However, they are still challenged by the large diversity of segmentation targets and the small inter-class variance between a nodule and its surrounding tissues. To tackle this issue, we propose a feature-complementary network modeled on the process of clinical diagnosis, which makes full use of the complementarity among lung nodule location information, the global coarse area, and edge information. Specifically, we first consider the importance of the global features of nodules in segmentation and propose a cross-scale weighted high-level feature decoder module. Then, we develop a low-level feature decoder module for edge feature refinement. Finally, we construct a complementary module that allows the two kinds of information to complement and reinforce each other. Furthermore, we weight the pixels located at the nodule edge in the loss function and add edge supervision to the deep supervision, both of which emphasize the importance of edges in segmentation. The experimental results demonstrate that our model achieves robust pulmonary nodule segmentation and more accurate edge segmentation.
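
The edge-weighting idea mentioned above can be sketched as a generic edge-weighted binary cross-entropy. This is not the paper's exact loss; the weight `w_edge` is an assumed hyperparameter for illustration:

```python
import numpy as np

def edge_weighted_bce(prob, target, edge_mask, w_edge=5.0, eps=1e-7):
    """Binary cross-entropy with extra weight on edge pixels.
    prob: predicted probabilities, target: binary ground truth,
    edge_mask: 1 where a pixel lies on the nodule edge, else 0."""
    prob = np.clip(np.asarray(prob, dtype=float), eps, 1.0 - eps)
    target = np.asarray(target, dtype=float)
    weights = 1.0 + (w_edge - 1.0) * np.asarray(edge_mask, dtype=float)
    bce = -(target * np.log(prob) + (1.0 - target) * np.log(1.0 - prob))
    return np.sum(weights * bce) / np.sum(weights)
```

With `edge_mask` all zero the loss reduces to plain mean binary cross-entropy; a misprediction on an edge pixel is penalized `w_edge` times more heavily than the same error elsewhere.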

19 pages, 1599 KiB  
Article
Entropy Measurements for Leukocytes’ Surrounding Informativeness Evaluation for Acute Lymphoblastic Leukemia Classification
by Krzysztof Pałczyński, Damian Ledziński and Tomasz Andrysiak
Entropy 2022, 24(11), 1560; https://doi.org/10.3390/e24111560 - 29 Oct 2022
Viewed by 1262
Abstract
The study of leukemia classification using deep learning techniques has been conducted by multiple research teams worldwide. Although deep convolutional neural networks achieve high-quality discrimination of sick vs. healthy patients, their inherent lack of human interpretability in the decision-making process hinders the adoption of deep learning techniques in medicine. Research involving deep learning has proved that distinguishing between healthy and sick patients using microscopic images of lymphocytes is possible. However, it could not provide information on the intermediate steps of the diagnosis process. As a result, despite numerous examinations, it is still unclear whether the lymphocyte is the only object in the microscopic image containing leukemia-related information, or whether the leukocyte's surroundings also contain the desired information. In this work, entropy measures and machine learning models were applied to study the informativeness of both whole images and the lymphocytes' surroundings alone for leukemia classification. The aim is to provide human-interpretable features that indicate the probability of disease. The results show that the hue distribution of images with the lymphocytes obfuscated is, on its own, informative enough to achieve 93.0% accuracy in healthy vs. sick classification. The research was conducted on the ALL-IDB2 dataset.
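
A sketch of the kind of human-interpretable feature this study examines: the Shannon entropy of an image's hue histogram. The function name and the bin count are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def hue_entropy(hue, bins=32):
    """Shannon entropy (bits) of a hue histogram. `hue` holds hue values
    in [0, 1), e.g. from an RGB -> HSV conversion of a microscopy image."""
    hist, _ = np.histogram(np.asarray(hue).ravel(), bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()   # normalize counts to a probability distribution
    p = p[p > 0]            # drop empty bins (0 * log 0 = 0)
    return -np.sum(p * np.log2(p))
```

A flat hue distribution yields the maximum entropy log2(bins), while an image with a single dominant hue yields an entropy near zero; such scalar features can then feed a classical, interpretable classifier.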

11 pages, 594 KiB  
Article
A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction
by Thibaud Brochet, Jérôme Lapuyade-Lahorgue, Alexandre Huat, Sébastien Thureau, David Pasquier, Isabelle Gardin, Romain Modzelewski, David Gibon, Juliette Thariat, Vincent Grégoire, Pierre Vera and Su Ruan
Entropy 2022, 24(4), 436; https://doi.org/10.3390/e24040436 - 22 Mar 2022
Cited by 5 | Viewed by 2198 | Correction
Abstract
In this paper, we quantitatively compare loss functions based on the parameterized Tsallis–Havrda–Charvat entropy and the classical Shannon entropy for training a deep network on small datasets, which are commonly encountered in medical applications. Shannon cross-entropy is widely used as a loss function for most neural networks applied to the segmentation, classification, and detection of images. Tsallis–Havrda–Charvat cross-entropy is a parameterized cross-entropy with parameter α, and Shannon entropy is the particular case α=1. In this work, we compare these two entropies through a medical application: predicting recurrence in patients with head–neck and lung cancers after treatment. Based on both CT images and patient information, a multitask deep neural network is proposed to perform a recurrence prediction task, using cross-entropy as a loss function, together with an image reconstruction task. The influence of the parameter α on the final prediction results is studied. The experiments are conducted on two datasets comprising 580 patients in total, of whom 434 suffered from head–neck cancers and 146 from lung cancers. The results show that Tsallis–Havrda–Charvat entropy achieves better prediction accuracy for some values of α.
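
The loss family compared here can be sketched with the α-logarithm. The code below is a generic implementation of Tsallis–Havrda–Charvat cross-entropy for a one-hot target, not the authors' training code:

```python
import numpy as np

def thc_cross_entropy(y, p, alpha, eps=1e-7):
    """Tsallis-Havrda-Charvat cross-entropy -sum_i y_i ln_a(p_i), where
    ln_a(x) = (x^(1-a) - 1) / (1 - a) is the alpha-logarithm.
    As alpha -> 1, ln_a -> ln, and the loss reduces to Shannon cross-entropy."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0)
    y = np.asarray(y, dtype=float)
    if abs(alpha - 1.0) < 1e-12:
        return -np.sum(y * np.log(p))  # Shannon special case
    ln_alpha = (p ** (1.0 - alpha) - 1.0) / (1.0 - alpha)
    return -np.sum(y * ln_alpha)
```

Sweeping `alpha` over a grid such as (0.5, 1.0, 1.5, 2.0) reproduces the kind of sensitivity study described above.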

11 pages, 341 KiB  
Article
Reconstructing Binary Signals from Local Histograms
by Jon Sporring and Sune Darkner
Entropy 2022, 24(3), 433; https://doi.org/10.3390/e24030433 - 21 Mar 2022
Cited by 2 | Viewed by 2217
Abstract
In this paper, we consider the representation power of local overlapping histograms for discrete binary signals. We give an algorithm, linear in signal size and factorial in window size, for producing the set of signals that share a given sequence of densely overlapping histograms. We also state the number of unique signals for a given set of histograms and give bounds on the number of metameric classes, where a metameric class is a set of more than one signal sharing the same set of densely overlapping histograms.
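
To make the representation concrete: for a binary signal, each length-w window's histogram is fully described by its count of ones, so a densely overlapping histogram sequence reduces to a sliding sum. This is a sketch of the representation, not the paper's reconstruction algorithm:

```python
def local_histograms(signal, w):
    """Densely overlapping local histograms of a binary signal: one window per
    position (stride 1); for binary values each histogram is just the ones count."""
    return [sum(signal[i:i + w]) for i in range(len(signal) - w + 1)]

# Two distinct signals with identical histogram sequences form a metameric class:
# [1, 0] and [0, 1] both give [1] for window size 2.
```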

Other


1 page, 156 KiB
Correction
Correction: Brochet et al. A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction. Entropy 2022, 24, 436
by Thibaud Brochet, Jérôme Lapuyade-Lahorgue, Alexandre Huat, Sébastien Thureau, David Pasquier, Isabelle Gardin, Romain Modzelewski, David Gibon, Juliette Thariat, Vincent Grégoire, Pierre Vera and Su Ruan
Entropy 2022, 24(5), 685; https://doi.org/10.3390/e24050685 - 13 May 2022
Cited by 3 | Viewed by 2673
Abstract
Alexandre Huat, Sébastien Thureau, David Pasquier, Isabelle Gardin, Romain Modzelewski, David Gibon, Juliette Thariat and Vincent Grégoire were not included as authors in the original publication [...]