Abdelaziz Amraoui, Andrea Montanari, Tom Richardson, Ruediger Urbanke
In this paper we investigate the behavior of iteratively decoded low-density parity-check codes over the binary erasure channel in the so-called ``waterfall region''. We show that the performance curves in this region follow a very basic scaling law. We conjecture that essentially the same scaling behavior applies in a much more general setting and we provide some empirical evidence to support this conjecture. The scaling law, together with the error floor expressions developed previously, can be used for fast finite-length optimization.
Kapil Bhattad, Vishwambhar Rathi, Ruediger Urbanke
The min-sum (MS) algorithm is arguably the second most fundamental algorithm in the realm of message passing due to its optimality (for a tree code) with respect to the {\em block error} probability \cite{Wiberg}. There also seems to be a fundamental relationship of MS decoding with the linear programming decoder \cite{Koetter}. Despite its importance, its fundamental properties have not been studied nearly as well as those of the sum-product (also known as BP) algorithm. We address two questions related to the MS rule. First, we characterize the stability condition under MS decoding. It turns out to be essentially the same condition as under BP decoding. Second, we perform a degree distribution optimization. Contrary to the case of BP decoding, under MS decoding the thresholds of the best degree distributions for standard irregular LDPC ensembles are significantly bounded away from the Shannon threshold. More precisely, on the AWGN channel, for the best codes that we find, the gap to capacity is 1dB for a rate 0.3 code and it is 0.4dB when the rate is 0.9 (the gap decreases monotonically as we increase the rate). We also used the optimization procedure to design codes for a modified MS algorithm in which the output of the check node is scaled by a constant $1/\alpha$. For $\alpha = 1.25$, we observed that the gap to capacity was smaller for the modified MS algorithm than for the plain MS algorithm. However, it was still quite large, varying from 0.75 dB to 0.2 dB for rates between 0.3 and 0.9. We conclude by posing what we consider to be the most important open questions related to the MS algorithm.
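As a concrete illustration, a minimal sketch (not the authors' implementation) of the MS check-node rule and the attenuated variant in which each outgoing message is scaled by $1/\alpha$:

```python
def ms_check_update(incoming, alpha=1.0):
    """Min-sum check-node update: the outgoing message on each edge is
    the product of the signs of the other incoming LLRs times their
    minimum magnitude, scaled by 1/alpha (alpha = 1 is plain min-sum)."""
    out = []
    for i in range(len(incoming)):
        others = incoming[:i] + incoming[i + 1:]
        sign = 1.0
        for v in others:
            sign *= 1.0 if v >= 0 else -1.0
        mag = min(abs(v) for v in others)
        out.append(sign * mag / alpha)
    return out

# e.g. ms_check_update([2.0, -0.5, 1.0]) -> [-0.5, 1.0, -0.5]
```

With alpha = 1.25 (the value mentioned above), every outgoing magnitude is simply shrunk by a factor 0.8, which partially compensates the known overestimation of reliabilities by min-sum relative to sum-product.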
Cyril Measson, Andrea Montanari, Rudiger Urbanke
We show that iterative coding systems cannot surpass capacity using only quantities which naturally appear in density evolution. Although the result in itself is trivial, the method which we apply shows that in order to achieve capacity the various components in an iterative coding system have to be perfectly matched. This generalizes the perfect matching condition which was previously known for the case of transmission over the binary erasure channel to the general class of binary-input memoryless output-symmetric channels. Potential applications of this perfect matching condition are the construction of capacity-achieving degree distributions and the determination of the number of required iterations as a function of the multiplicative gap to capacity.
Abdelaziz Amraoui, Andrea Montanari, Ruediger Urbanke
We explain how to optimize finite-length LDPC codes for transmission over the binary erasure channel. Our approach relies on an analytic approximation of the erasure probability. This is in turn based on a finite-length scaling result to model large-scale erasures and a union bound involving minimal stopping sets to take into account small error events. We show that the performance of optimized ensembles as observed in simulations is well described by our approximation. Although we only address the case of transmission over the binary erasure channel, our method should be applicable to a more general setting.
Abdelaziz Amraoui, Andrea Montanari, Tom Richardson, Rudiger Urbanke
Consider communication over the binary erasure channel BEC using random low-density parity-check codes with finite blocklength n from `standard' ensembles. We show that large error events are conveniently described within a scaling theory, and explain how to estimate their effect heuristically. Among other quantities, we consider the finite-length threshold e(n), defined by requiring a block error probability P_B = 1/2. For ensembles with minimum variable degree larger than two, the following expression is argued to hold: e(n) = e - e_1 n^{-2/3} + Θ(n^{-1}), with a calculable shift parameter e_1 > 0.
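The scaling form above is easy to evaluate once the asymptotic threshold and shift parameter are known for a given ensemble; a minimal sketch with purely hypothetical values of e and e_1 (the paper explains how to compute them, these numbers are for illustration only):

```python
def finite_length_threshold(eps_star, e1, n):
    """Leading-order finite-length threshold e(n) = e* - e1 * n^(-2/3),
    ignoring the Theta(1/n) correction."""
    return eps_star - e1 * n ** (-2.0 / 3.0)

# Hypothetical parameters (NOT from the paper), just to show the
# n^(-2/3) approach of e(n) to its asymptotic value e*:
for n in (1024, 4096, 16384):
    e_n = finite_length_threshold(0.5, 0.6, n)
```

Note the slow n^(-2/3) convergence: quadrupling the blocklength shrinks the gap to the asymptotic threshold only by a factor of about 2.5.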
Cyril Measson, Andrea Montanari, Tom Richardson, Rudiger Urbanke
There is a fundamental relationship between belief propagation and maximum a posteriori decoding. The case of transmission over the binary erasure channel was investigated in detail in a companion paper. This paper investigates the extension to general memoryless channels (paying special attention to the binary case). An area theorem for transmission over general memoryless channels is introduced and some of its many consequences are discussed. We show that this area theorem gives rise to an upper bound on the maximum a posteriori threshold for sparse graph codes. In situations where this bound is tight, the extrinsic soft bit estimates delivered by the belief propagation decoder coincide with the correct a posteriori probabilities above the maximum a posteriori threshold. More generally, it is conjectured that the fundamental relationship between the maximum a posteriori and the belief propagation decoder which was observed for transmission over the binary erasure channel carries over to the general case. We finally demonstrate that in order for the design rate of an ensemble to approach the capacity under belief propagation decoding the component codes have to be perfectly matched, a statement which is well known for the special case of transmission over the binary erasure channel.
Vishwambhar Rathi, Ruediger Urbanke
The Extended BP (EBP) Generalized EXIT (GEXIT) function introduced in \cite{MMRU05} plays a fundamental role in the asymptotic analysis of sparse graph codes. For transmission over the binary erasure channel (BEC) the analytic properties of the EBP GEXIT function are relatively simple and well understood. The general case is much harder and even the existence of the curve is not known in general. We introduce some tools from non-linear analysis which can be useful to prove the existence of EXIT-like curves in some cases. The main tool is the Krasnoselskii-Rabinowitz (KR) bifurcation theorem.
Vishwambhar Rathi, Rudiger Urbanke
We derive the density evolution equations for non-binary low-density parity-check (LDPC) ensembles when transmission takes place over the binary erasure channel. We introduce ensembles defined with respect to the general linear group over the binary field. For these ensembles the density evolution equations can be written compactly. The density evolution for the general linear group helps us understand the density evolution for codes defined with respect to finite fields. We compute thresholds for different alphabet sizes for various LDPC ensembles. Surprisingly, the threshold is not a monotonic function of the alphabet size. We state the stability condition for non-binary LDPC ensembles over any binary memoryless symmetric channel. We also give upper bounds on the MAP thresholds for various non-binary ensembles based on EXIT curves and the area theorem.
Andrea Montanari, Rudiger Urbanke
These are the notes for a set of lectures delivered by the two authors at the Les Houches Summer School on `Complex Systems' in July 2006. They provide an introduction to the basic concepts in modern (probabilistic) coding theory, highlighting connections with statistical mechanics. We also stress common concepts with other disciplines dealing with similar problems that can be generically referred to as `large graphical models'. While most of the lectures are devoted to the classical channel coding problem over simple memoryless channels, we present a discussion of more complex channel models. We conclude with an overview of the main open challenges in the field.
Satish Babu Korada, Ruediger Urbanke
We consider communication over binary-input memoryless output-symmetric channels using low-density parity-check codes and message-passing decoding. The asymptotic (in the length) performance of such a combination for a fixed number of iterations is given by density evolution. Letting the number of iterations tend to infinity we get the density evolution threshold, the largest channel parameter so that the bit error probability tends to zero as a function of the iterations. In practice we often work with short codes and perform a large number of iterations. It is therefore interesting to consider what happens if in the standard analysis we exchange the order in which the blocklength and the number of iterations diverge to infinity. In particular, we can ask whether both limits give the same threshold. Although empirical observations strongly suggest that the exchange of limits is valid for all channel parameters, we limit our discussion to channel parameters below the density evolution threshold. Specifically, we show that under some suitable technical conditions the bit error probability vanishes below the density evolution threshold regardless of how the limit is taken.
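On the BEC the density evolution recursion mentioned above takes a particularly simple closed form; a minimal sketch (illustrative, for the (l, r)-regular ensemble rather than the general setting of the paper) that locates the density evolution threshold by bisection:

```python
def de_threshold(l=3, r=6, tol=1e-12, max_iter=100000):
    """Bisection for the BP (density evolution) threshold of the
    (l, r)-regular LDPC ensemble on the BEC.  The DE recursion for the
    erasure probability x of a variable-to-check message is
        x <- eps * (1 - (1 - x)^(r-1))^(l-1)."""
    def decodes(eps):
        x = eps
        for _ in range(max_iter):
            x_new = eps * (1.0 - (1.0 - x) ** (r - 1)) ** (l - 1)
            if abs(x_new - x) < tol:
                x = x_new
                break
            x = x_new
        return x < 1e-4  # converged to the zero fixed point?
    lo, hi = 0.0, 1.0
    for _ in range(40):
        mid = (lo + hi) / 2
        if decodes(mid):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For the (3, 6)-regular ensemble this recovers the well-known BP threshold of roughly 0.4294 on the BEC; the fixed-iteration versus infinite-iteration subtlety discussed in the abstract is hidden here in the `max_iter` cap.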
Andrea Montanari, Ruediger Urbanke
We consider communication over a noisy network under randomized linear network coding. Possible error mechanisms include node or link failures, Byzantine behavior of nodes, or an overestimate of the network min-cut. Building on the work of Koetter and Kschischang, we introduce a probabilistic model for errors. We compute the capacity of this channel and we define an error-correction scheme based on random sparse graphs and a low-complexity decoding algorithm. By optimizing over the code degree profile, we show that this construction achieves the channel capacity with complexity which is jointly quadratic in the number of coded information bits and sublogarithmic in the error probability.
Cyril Measson, Andrea Montanari, Tom Richardson, Rudiger Urbanke
We consider communication over memoryless channels using low-density parity-check code ensembles above the iterative (belief propagation) threshold. What is the computational complexity of decoding (i.e., of reconstructing all the typical input codewords for a given channel output) in this regime? We define an algorithm accomplishing this task and analyze its typical performance. The behavior of the new algorithm can be expressed in purely information-theoretical terms. Its analysis provides an alternative proof of the area theorem for the binary erasure channel. Finally, we explain how the area theorem is generalized to arbitrary memoryless channels. We note that the recently discovered relation between mutual information and minimum mean-square error is an instance of the area theorem in the setting of Gaussian channels.
Cyril Measson, Andrea Montanari, Ruediger Urbanke
There is a fundamental relationship between belief propagation and maximum a posteriori decoding. A decoding algorithm, which we call the Maxwell decoder, is introduced and provides a constructive description of this relationship. Both the algorithm itself and the analysis of the new decoder are reminiscent of the Maxwell construction in thermodynamics. This paper investigates in detail the case of transmission over the binary erasure channel, while the extension to general binary memoryless channels is discussed in a companion paper.
H. Pfister, I. Sason, R. Urbanke
We present two sequences of ensembles of non-systematic irregular repeat-accumulate codes which asymptotically (as their block length tends to infinity) achieve capacity on the binary erasure channel (BEC) with bounded complexity per information bit. This is in contrast to all previous constructions of capacity-achieving sequences of ensembles whose complexity grows at least like the log of the inverse of the gap (in rate) to capacity. The new bounded complexity result is achieved by puncturing bits, and allowing in this way a sufficient number of state nodes in the Tanner graph representing the codes. We also derive an information-theoretic lower bound on the decoding complexity of randomly punctured codes on graphs. The bound holds for every memoryless binary-input output-symmetric channel, and is refined for the BEC.
Shrinivas Kudekar, Cyril Measson, Tom Richardson, Ruediger Urbanke
We consider spatially coupled code ensembles. A particular instance is the family of convolutional LDPC ensembles. It was recently shown that, for transmission over the binary erasure channel, this coupling increases the belief propagation threshold of the ensemble to the maximum a posteriori threshold of the underlying component ensemble. We report on empirical evidence which suggests that the same phenomenon also occurs when transmission takes place over a general binary memoryless symmetric channel. This is confirmed both by simulations as well as by computing EBP GEXIT curves and by comparing the empirical BP thresholds of coupled ensembles to the empirically determined MAP thresholds of the underlying regular ensembles. We further consider ways of reducing the rate loss incurred by such constructions.
Shrinivas Kudekar, Tom Richardson, Ruediger Urbanke
We investigate spatially coupled code ensembles. For transmission over the binary erasure channel, it was recently shown that spatial coupling increases the belief propagation threshold of the ensemble to essentially the maximum a posteriori threshold of the underlying component ensemble. This explains why convolutional LDPC ensembles, originally introduced by Felstrom and Zigangirov, perform so well over this channel. We show that the equivalent result holds true for transmission over general binary-input memoryless output-symmetric channels. More precisely, given a desired error probability and a gap to capacity, we can construct a spatially coupled ensemble which fulfills these constraints universally on this class of channels under belief propagation decoding. In fact, most codes in that ensemble have this property. The quantifier ``universal'' refers to a single ensemble/code which is good for all channels, although we assume that the channel is known at the receiver. The key technical result is a proof that under belief propagation decoding spatially coupled ensembles achieve essentially the area threshold of the underlying uncoupled ensemble. We conclude by discussing some interesting open problems.
S. Hamed Hassani, Rudiger Urbanke
Polar codes provably achieve the capacity of a wide array of channels under successive decoding. This assumes infinite precision arithmetic. Given the successive nature of the decoding algorithm, one might worry about the sensitivity of the performance to the precision of the computation. We show that even very coarsely quantized decoding algorithms lead to excellent performance. More concretely, we show that under successive decoding with an alphabet of cardinality only three, the decoder still has a threshold and this threshold is a sizable fraction of capacity. More generally, we show that if we are willing to transmit at a rate $\delta$ below capacity, then we need only $c \log(1/\delta)$ bits of precision, where $c$ is a universal constant.
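On the BEC the polarization of the synthesized channels can be tracked exactly through their Bhattacharyya (erasure) parameters; a minimal sketch of this full-precision evolution, i.e. the baseline that the coarsely quantized decoders discussed above approximate:

```python
def polar_bhattacharyya(eps, n_levels):
    """Evolve the erasure parameters of the 2^n_levels synthesized
    channels of a polar code on BEC(eps): a channel with parameter z
    splits into 2z - z^2 (the degraded '-' channel) and z^2 (the
    upgraded '+' channel)."""
    zs = [eps]
    for _ in range(n_levels):
        nxt = []
        for z in zs:
            nxt.append(2.0 * z - z * z)  # '-' channel: degrades
            nxt.append(z * z)            # '+' channel: improves
        zs = nxt
    return zs

zs = polar_bhattacharyya(0.5, 10)  # 1024 synthesized channels
frac_good = sum(z < 1e-3 for z in zs) / len(zs)
```

The pairwise average of the two children equals the parent parameter, so the mean erasure probability (and hence the total capacity) is preserved at every level, while the individual channels polarize toward 0 or 1.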
Vishwambhar Rathi, Ruediger Urbanke, Mattias Andersson, Mikael Skoglund
We consider transmission over a wiretap channel where both the main channel and the wiretapper's channel are Binary Erasure Channels (BEC). We use convolutional LDPC ensembles based on the coset encoding scheme. More precisely, we consider regular two-edge-type convolutional LDPC ensembles. We show that such a construction achieves the whole rate-equivocation region of the BEC wiretap channel. Convolutional LDPC ensembles were introduced by Felström and Zigangirov and are known to have excellent thresholds. Recently, Kudekar, Richardson, and Urbanke proved that the phenomenon of "Spatial Coupling" converts the MAP threshold into the BP threshold for transmission over the BEC. The phenomenon of spatial coupling has been observed to hold for general binary memoryless symmetric channels. Hence, we conjecture that our construction is a universal rate-equivocation achieving construction when the main channel and wiretapper's channel are binary memoryless symmetric channels, and the wiretapper's channel is degraded with respect to the main channel.
Shrinivas Kudekar, Tom Richardson, Ruediger Urbanke
Convolutional LDPC ensembles, introduced by Felstrom and Zigangirov, have excellent thresholds and these thresholds are rapidly increasing as a function of the average degree. Several variations on the basic theme have been proposed to date, all of which share the good performance characteristics of convolutional LDPC ensembles. We describe the fundamental mechanism which explains why "convolutional-like" or "spatially coupled" codes perform so well. In essence, the spatial coupling of the individual code structure has the effect of increasing the belief-propagation (BP) threshold of the new ensemble to its maximum possible value, namely the maximum-a-posteriori (MAP) threshold of the underlying ensemble. For this reason we call this phenomenon "threshold saturation." This gives an entirely new way of approaching capacity. One significant advantage of such a construction is that one can create capacity-approaching ensembles with an error correcting radius which is increasing in the blocklength. Our proof makes use of the area theorem of the BP-EXIT curve and the connection between the MAP and BP threshold recently pointed out by Measson, Montanari, Richardson, and Urbanke. Although we prove the connection between the MAP and the BP threshold only for a very specific ensemble and only for the binary erasure channel, empirically a threshold saturation phenomenon occurs for a wide class of ensembles and channels. More generally, we conjecture that for a large range of graphical systems a similar saturation of the "dynamical" threshold occurs once individual components are coupled sufficiently strongly. This might give rise to improved algorithms as well as to new techniques for analysis.
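Threshold saturation can be observed directly in the one-dimensional coupled density evolution recursion on the BEC; a minimal sketch with illustrative chain length L, coupling width w, and erasure probability (values chosen for a quick demo, not taken from the paper):

```python
def coupled_de(eps, l=3, r=6, L=16, w=3, iters=40000):
    """Spatially coupled DE for the (l, r)-regular ensemble on BEC(eps).
    Positions outside [0, L) are perfectly known (x = 0), which seeds
    the decoding wave at the two chain boundaries.  Setting L = 1 and
    w = 1 recovers plain (uncoupled) density evolution."""
    x = [eps] * L

    def get(i):
        return x[i] if 0 <= i < L else 0.0

    for _ in range(iters):
        new = []
        for i in range(L):
            acc = 0.0
            for j in range(w):
                # average erasure probability entering check position i+j
                avg_in = sum(get(i + j - k) for k in range(w)) / w
                acc += 1.0 - (1.0 - avg_in) ** (r - 1)
            new.append(eps * (acc / w) ** (l - 1))
        x = new
        if max(x) < 1e-7:
            return 0.0
    return max(x)
```

At an erasure probability between the BP threshold of the uncoupled (3, 6) ensemble (about 0.4294) and its MAP threshold (about 0.4881), plain DE stalls at a nonzero fixed point while the coupled chain decodes: the zero boundary conditions launch a wave that sweeps inward, which is exactly the saturation mechanism described above.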