
Finite-Length Information Theory

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (15 September 2021) | Viewed by 7503

Special Issue Editors


Guest Editor
Signal Theory and Communications Department, Universidad Carlos III de Madrid, Avenida de la Universidad, 30, 28911 Leganés, Spain
Interests: finite-length information theory; performance analysis of communication systems; joint source-channel coding; statistical hypothesis testing

Guest Editor
Electrical Engineering, California Institute of Technology, 1200 E California Blvd, MC 136-93, Pasadena, CA 91125, USA
Interests: information theory; theory of random processes; coding; wireless communications; delay-sensitive communications; control

Special Issue Information

Dear Colleagues,

Several classical information-theoretic results follow from considering sequences of infinite duration. While this assumption is adequate for systems using long error-correcting codes or long compression sequences, it limits the relevance of information theory to problems that require sequences of moderate length, such as low-latency communications, signal processing under delay constraints, or control of dynamical systems. It is therefore not surprising that the field of finite-length information theory, which considers the impact of using sequences of finite duration, has recently gained a great deal of attention. The ultimate goal of this line of research is to establish fundamental limits and to develop strategies that attain them in the regime of finite coding delay.

This Special Issue aims at collecting recent results in finite-length information theory and its intersection with neighboring fields. Possible topics include, but are not limited to:

  • One-shot information theory and information spectrum methods.
  • Nonasymptotic performance bounds for point-to-point and multiterminal communication systems.
  • Refined asymptotics: error exponents, dispersion, and moderate deviations analysis (see the numerical sketch after this list).
  • Error-correcting codes: design guidelines and performance analysis in the finite-length regime.
  • Lossless and lossy data compression at finite blocklengths.
  • Delay-constrained joint source-channel coding.
  • Exploiting channel feedback in code design to improve complexity–delay–reliability tradeoffs.
  • Receiver design: constellation, quantization, and iterative decoding.
  • Information theory for the control of dynamical systems.
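As a rough numerical illustration of the dispersion analysis mentioned above, the following minimal Python sketch evaluates the normal approximation log2 M*(n, ε) ≈ nC − sqrt(nV) Q^{-1}(ε) for a binary symmetric channel. The channel model and parameter values are illustrative assumptions, not taken from any of the papers in this issue.

```python
# Minimal sketch: normal approximation to the maximal coding rate of a
# binary symmetric channel (BSC) at finite blocklength n and error eps,
# log2 M*(n, eps)/n ~ C - sqrt(V/n) * Qinv(eps).
# The BSC and its parameters are illustrative assumptions.
import math
from statistics import NormalDist

def bsc_normal_approx(n: int, eps: float, p: float) -> float:
    """Approximate maximal rate (bits/channel use) for a BSC(p)."""
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)      # binary entropy
    capacity = 1.0 - h2                                       # BSC capacity
    dispersion = p * (1 - p) * math.log2((1 - p) / p) ** 2    # BSC dispersion
    q_inv = NormalDist().inv_cdf(1.0 - eps)                   # Q^{-1}(eps)
    return capacity - math.sqrt(dispersion / n) * q_inv

# The backoff from capacity shrinks like 1/sqrt(n):
for n in (100, 500, 2000):
    print(n, round(bsc_normal_approx(n, eps=1e-3, p=0.11), 4))
```

The point of the sketch is that, at short blocklengths, the achievable rate sits well below capacity; quantifying exactly this gap is what finite-length analyses are about.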

Dr. Gonzalo Vazquez
Dr. Victoria Kostina
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • delay-constrained information theory
  • nonasymptotic performance bounds
  • error exponents
  • dispersion analysis
  • moderate deviations analysis
  • coding theory
  • channel coding
  • source coding
  • joint source-channel coding
  • feedback communications
  • information theory for control

Published Papers (3 papers)


Research

32 pages, 542 KiB  
Article
Multi-Class Cost-Constrained Random Coding for Correlated Sources over the Multiple-Access Channel
by Arezou Rezazadeh, Josep Font-Segura, Alfonso Martinez and Albert Guillén i Fàbregas
Entropy 2021, 23(5), 569; https://doi.org/10.3390/e23050569 - 3 May 2021
Viewed by 1726
Abstract
This paper studies a generalized version of the multi-class cost-constrained random-coding ensemble with multiple auxiliary costs for the transmission of N correlated sources over an N-user multiple-access channel. For each user, the set of messages is partitioned into classes, and codebooks are generated according to a distribution depending on the class index of the source message and under the constraint that the codewords satisfy a set of cost functions. Proper choices of the cost functions recover different coding schemes, including message-dependent and message-independent versions of independent and identically distributed, independent conditionally distributed, constant-composition, and conditional constant-composition ensembles. The transmissibility region of the scheme is related to the Cover-El Gamal-Salehi region. A related family of correlated-source Gallager source exponent functions is also studied. The achievable exponents are compared for correlated and independent sources, both numerically and analytically.
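A schematic sketch of the cost-constrained random-coding idea described in this abstract, reduced to a single user and a single class: codewords are drawn i.i.d. and kept only if their empirical cost lies close to a target value. The alphabet, cost function, and parameters below are illustrative assumptions, not the paper's multi-class, multi-cost construction.

```python
# Schematic sketch of cost-constrained random codebook generation:
# draw i.i.d. codewords and keep those whose empirical cost is within
# delta of a target -- a single-class stand-in for the multi-class,
# multi-cost ensembles studied in the paper. All parameters are assumed.
import random

def cost_constrained_codebook(num_codewords, n, input_dist, cost, target, delta):
    """Rejection-sample length-n codewords with |avg cost - target| <= delta."""
    symbols, probs = zip(*input_dist.items())
    codebook = []
    while len(codebook) < num_codewords:
        x = random.choices(symbols, weights=probs, k=n)
        if abs(sum(cost(s) for s in x) / n - target) <= delta:
            codebook.append(tuple(x))
    return codebook

# Example: binary input with a weight (power-like) cost constraint.
cb = cost_constrained_codebook(8, 64, {0: 0.5, 1: 0.5},
                               cost=float, target=0.5, delta=0.05)
print(len(cb), "codewords of length", len(cb[0]))
```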

21 pages, 519 KiB  
Article
Time-Limited Codewords over Band-Limited Channels: Data Rates and the Dimension of the W-T Space
by Youssef Jaffal and Ibrahim Abou-Faycal
Entropy 2020, 22(9), 924; https://doi.org/10.3390/e22090924 - 23 Aug 2020
Cited by 1 | Viewed by 2095
Abstract
We consider a communication system whereby T-second time-limited codewords are transmitted over a W-Hz band-limited additive white Gaussian noise channel. In the asymptotic regime as WT → ∞, it is known that the maximal achievable rates with such a scheme converge to Shannon's capacity with 2WT degrees of freedom. In this work we study the degrees of freedom and the achievable information rates for finite values of WT. We use prolate spheroidal wave functions to obtain an information-lossless equivalent discrete formulation and then apply Polyanskiy's results on coding in the finite blocklength regime. We derive upper and lower bounds on the achievable rates and the corresponding degrees of freedom, and we numerically evaluate them for sample values of 2WT. The bounds are asymptotically tight, and numerical computations show that the gap between them decreases as 2WT increases. Additionally, the possible decrease from 2WT in the available degrees of freedom is upper-bounded by a logarithmic function of 2WT.
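As a side illustration of the "2WT degrees of freedom" statement in this abstract, the following sketch discretizes the time- and band-limiting operator (whose eigenfunctions are the prolate spheroidal wave functions) and counts the eigenvalues near 1, which roughly recovers 2WT. The discretization and parameter values are assumptions for illustration only, not the paper's method.

```python
# Minimal sketch: eigenvalues of the time/band-limiting operator on [0, T]
# with bandwidth W concentrate near 1 for about the first 2WT indices --
# the classical "2WT degrees of freedom" phenomenon (Slepian et al.).
# Grid size and parameters are illustrative assumptions.
import numpy as np

W, T = 4.0, 2.0                  # bandwidth (Hz), duration (s); 2WT = 16
N = 400                          # discretization points on [0, T]
t = np.linspace(0.0, T, N)
dt = t[1] - t[0]
# Kernel K(t, s) = 2W sinc(2W (t - s)), with numpy sinc(x) = sin(pi x)/(pi x).
K = 2 * W * np.sinc(2 * W * (t[:, None] - t[None, :])) * dt
eigs = np.sort(np.linalg.eigvalsh((K + K.T) / 2))[::-1]
print("2WT =", 2 * W * T)
print("eigenvalues > 0.5:", int((eigs > 0.5).sum()))   # roughly 2WT
print("leading eigenvalues:", np.round(eigs[:20], 3))
```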

58 pages, 681 KiB  
Article
Finite-Length Analyses for Source and Channel Coding on Markov Chains
by Masahito Hayashi and Shun Watanabe
Entropy 2020, 22(4), 460; https://doi.org/10.3390/e22040460 - 18 Apr 2020
Cited by 11 | Viewed by 2780
Abstract
We derive finite-length bounds for two problems involving Markov chains: source coding with side information, where the source and side information form a joint Markov chain, and channel coding for channels with Markovian conditional additive noise. For this purpose, we point out two important aspects of finite-length analysis that must be addressed when finite-length bounds are proposed. The first is asymptotic tightness; the other is the efficient computability of the bound. We then derive finite-length upper and lower bounds on the coding length in both settings such that their computational complexity is low. We address the first of these aspects by deriving large deviation bounds, moderate deviation bounds, and second-order bounds for these two topics, and we show that these finite-length bounds achieve asymptotic optimality in these senses. Several kinds of information measures for transition matrices are introduced for this purpose.
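As context for the Markov-chain setting of this abstract, the following sketch computes the entropy rate of a stationary Markov chain, H = Σᵢ πᵢ H(P[i, ·]), the first-order quantity that finite-length source-coding bounds refine. The transition matrix is an illustrative assumption.

```python
# Minimal sketch: entropy rate of a stationary two-state Markov chain,
# the first-order limit that finite-length bounds refine.
# The transition matrix is an illustrative assumption.
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])                     # row-stochastic transitions

# Stationary distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()

row_H = -np.sum(P * np.log2(P), axis=1)        # entropy of each row, bits
entropy_rate = float(pi @ row_H)
print("stationary distribution:", np.round(pi, 4))
print("entropy rate (bits/symbol):", round(entropy_rate, 4))
```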
