Algebraic Decoding Principles for Linear Block Codes
The decoding of linear block codes is a difficult task in general, since a naive decoding algorithm, i.e., an exhaustive search, has exponential time complexity. The definition of the decoding problem is not unified in the literature and many different formulations exist. In this chapter, we describe elementary decoding principles for linear block codes in Hamming metric. The refinement for algebraic block codes, and in particular for GRS codes, is outlined.
In a first step, we give relevant definitions for hard-decision decoding problems in Section 3.1. For each problem, we identify when a decoder fails. In a second step, the hard-decision decoding problem is generalized to the case where the channel, in addition to the received vector, outputs information on the reliability of the received symbols, the so-called soft-information.
Syndrome-based hard- and soft-decision decoding approaches for GRS codes are considered in Section 3.2. We show in Section 3.3 how the Extended Euclidean Algorithm (EEA), as originally modified by Sugiyama, Kasahara, Hirasawa and Namekawa [A-SKHN75; A-SKHN76] for Goppa codes, can be used for error/erasure decoding of GRS codes. The Fundamental Iterative Algorithm (FIA) of Feng and Tzeng [A-FT85; A-FT91a] finds the minimal number of linearly dependent columns (and the corresponding vanishing linear combination) of a given arbitrary matrix. The FIA can be adapted to a structured matrix such as a Hankel matrix. The homogeneous linear equations that originate from a so-called Key Equation, i.e., a polynomial equation that algebraically connects the input and the output of the unique decoding problem of GRS codes, have Hankel structure. Therefore, we show the adjustment of the FIA and prove the complexity-reducing initialization rule, which is generalized in Chapters 4 and 5. We illustrate the native and the adjusted FIA when used for Bounded Minimum Distance (BMD) decoding of a GRS code.
In Section 3.4, collaborative decoding of IGRS codes, as defined in Subsection 2.4.2, is considered. We define the model of burst errors and give the set of Key Equations for the collaborative decoding scheme. The definition of decoding failure is given and the corresponding homogeneous set of equations is outlined.
We prove the main theorem of the interpolation-based hard-decision list decoding of GRS codes by Guruswami and Sudan [A-Sud97; A-GS99] in Section 3.5. We show that the Johnson radius is achieved asymptotically. Furthermore, the extension of Kötter–Vardy [A-KV03a] to a soft-decision scenario is discussed. A comparison of existing realizations of the interpolation step concludes this chapter.
3.1 Decoding Principles and Complexity Issues
3.1.1 Hard-Decision Decoding of Linear Block Codes and Generalized Reed–Solomon Codes
Throughout this subsection, let $\mathcal{C} \subseteq \mathbb{F}_q^n$ be a linear $[n, k, d]_q$ block code and let $c \in \mathcal{C}$. Let $e \in \mathbb{F}_q^n$ denote the error word with $\varepsilon = \mathrm{wt}(e)$. Let $r = c + e$ denote the received word.
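The setting above can be made concrete with a small numerical sketch (the restriction to $\mathbb{F}_2$ and the particular vectors are assumptions chosen for illustration; the chapter works over general $\mathbb{F}_q$):

```python
# Illustrative channel model sketch over F_2: a codeword c, an error word e
# with Hamming weight wt(e), and the received word r = c + e.

def hamming_weight(v):
    """wt(v): number of nonzero positions of v."""
    return sum(1 for x in v if x != 0)

def add_gf2(c, e):
    """Component-wise addition over F_2, giving the received word r = c + e."""
    return [(ci + ei) % 2 for ci, ei in zip(c, e)]

c = [1, 0, 1, 1, 0, 0, 1]   # a hypothetical codeword of length n = 7
e = [0, 0, 1, 0, 0, 1, 0]   # error word with epsilon = wt(e) = 2
r = add_gf2(c, e)           # received word r = c + e
```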
Definition 3.1 (Bounded Distance Decoder)
A bounded distance decoder for a given linear block code $\mathcal{C}$ is a function that returns one codeword or a decoding failure for a given decoding radius $\tau$ and a given received vector $r$, i.e.:

$\mathrm{BDD}\colon\; \mathbb{F}_q^n \times \mathbb{N} \to \mathcal{C} \cup \{\text{Decoding Failure}\}, \quad (r, \tau) \mapsto \mathrm{BDD}(r, \tau).$

The bounded distance decoder returns one codeword $c$ from a given received vector $r$ if

$|\mathcal{B}_\tau(r) \cap \mathcal{C}| = 1, \qquad (3.1)$

where $\mathcal{B}_\tau(r)$ is the Hamming ball of radius $\tau$ around the vector $r$, and otherwise it declares a decoding failure.
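Definition 3.1 can be sketched directly as a brute-force decoder, realized by the exhaustive search mentioned at the start of this chapter (exponential complexity, so a reference implementation only; the code is assumed to be given as an explicit list of codewords, and all names are illustrative):

```python
# Brute-force bounded distance decoder in the sense of Definition 3.1.

def hamming_distance(u, v):
    """d(u, v): number of positions in which u and v differ."""
    return sum(1 for a, b in zip(u, v) if a != b)

def bdd(code, r, tau):
    """Return the unique codeword in B_tau(r) ∩ C, cf. condition (3.1);
    declare a decoding failure if |B_tau(r) ∩ C| != 1."""
    candidates = [c for c in code if hamming_distance(r, c) <= tau]
    return candidates[0] if len(candidates) == 1 else "Decoding Failure"

# Binary repetition code of length 3, i.e. a [3, 1, 3]_2 code.
code = [(0, 0, 0), (1, 1, 1)]
```

For $r = (0, 1, 0)$ and $\tau = 1$ the decoder returns $(0, 0, 0)$; choosing $\tau = 2$ places both codewords inside the ball, so $|\mathcal{B}_\tau(r) \cap \mathcal{C}| = 2$ and a decoding failure is declared.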
With Definition 3.1, we can easily define a Bounded Minimum Distance (BMD) decoder.
Definition 3.2 (Bounded Minimum Distance (BMD) Decoder)
A Bounded Minimum Distance (BMD) decoder for an $[n, k, d]_q$ block code is a bounded distance decoder as in Definition 3.1 with decoding radius $\tau = \lfloor (d-1)/2 \rfloor$.
If the number of errors $\varepsilon$ is at most $\lfloor (d-1)/2 \rfloor$, a BMD decoder does not fail. For a bounded distance decoder with decoding radius $\tau > \lfloor (d-1)/2 \rfloor$, a decoding failure can occur if $\varepsilon > \lfloor (d-1)/2 \rfloor$; this includes cases where the number of errors is smaller than the decoding radius $\tau$.
For algebraic block codes, BMD decoding was first realized by syndrome-based decoding algorithms. The first polynomial-time approaches were proposed by Peterson [A-Pet60], Gorenstein–Zierler [A-GZ61] and Chien [A-Chi64]. Many efficient BMD decoding algorithms for GRS codes exist. The different steps of syndrome-based decoding have quadratic or even sub-quadratic time and space complexity.
We derive the Key Equation for syndrome-based BMD decoding of GRS codes as in Definition 3.2 from the simplest interpolation-based approach in Section 3.2. It is a special case of the derivation of the Key Equation for the Sudan principle by Roth and Ruckenstein [I-RR98; A-RR00] and resembles the ones of Welch–Berlekamp [O-WB86] and Gao [O-Gao03]. We consider the derivation for the case of erasures, which was not considered in [A-RR00].
A decoder capable of decoding GRS codes beyond half the minimum distance is given in Section 4.1; it was first developed by Schmidt, Sidorenko and Bossert [I-SSB06; O-Sch07; A-SSB10]. It is not clear whether it is a bounded distance decoder as in Definition 3.1 or whether it declares in some cases a failure even if (3.1) is fulfilled. The Schmidt–Sidorenko–Bossert decoding approach is based on a virtual extension of a GRS code to an IGRS code. Therefore, we first investigate the collaborative decoding principle of IGRS codes in Section 3.4.
Definition 3.3 (List Decoder)
A list decoder for a linear block code $\mathcal{C}$ is a function that returns a list of up to $\ell$ codewords or a decoding failure for a given received vector $r$, i.e.:

$\mathrm{LD}\colon\; \mathbb{F}_q^n \times \mathbb{N} \to \left( (\mathcal{C} \cup \{\emptyset\})^{\ell} \setminus \{\emptyset\}^{\ell} \right) \cup \{\text{Decoding Failure}\}, \quad (r, \ell) \mapsto \mathrm{LD}(r, \ell).$

The decoding radius $\tau$ is such that

$|\mathcal{B}_\tau(r) \cap \mathcal{C}| \le \ell,$

where $\mathcal{B}_\tau(r)$ is the Hamming ball of radius $\tau$ around the vector $r$. The list decoder returns at most $\ell$ codewords $c_0, c_1, \dots, c_{\ell-1} \in \mathcal{C}$ for a given received vector $r$. If there is no codeword $c$ such that $d(r, c) \le \tau$ holds, i.e., the list is empty, a decoding failure is declared.
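A list decoder in the sense of Definition 3.3 can likewise be sketched by exhaustive search, again with exponential complexity and illustrative names; $\tau$ is assumed to be chosen such that $|\mathcal{B}_\tau(r) \cap \mathcal{C}| \le \ell$:

```python
# Brute-force list decoder: collect all codewords within distance tau of r
# and declare a decoding failure when the list is empty.

def hamming_distance(u, v):
    """d(u, v): number of positions in which u and v differ."""
    return sum(1 for a, b in zip(u, v) if a != b)

def list_decode(code, r, tau, ell):
    lst = [c for c in code if hamming_distance(r, c) <= tau]
    return lst[:ell] if lst else "Decoding Failure"

# [3, 1, 3]_2 repetition code as a toy example.
code = [(0, 0, 0), (1, 1, 1)]
```

With $\tau = 2$ and $r = (0, 1, 0)$, both codewords lie in $\mathcal{B}_\tau(r)$ and the list $(c_0, c_1)$ is returned; with $\tau = 0$ the list is empty and a decoding failure is declared.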
Similar to a bounded distance decoder, a list decoder with decoding radius $\tau > \lfloor (d-1)/2 \rfloor$ can return a decoding failure if $\varepsilon > \lfloor (d-1)/2 \rfloor$. The decoding spheres of a BMD decoder and of a decoder with larger decoding radius are shown in Figure 3.1.
Figure 3.1: Comparison of the decoding spheres of a BMD decoder (Subfigure 3.1a) and a decoder with radius larger than $\lfloor (d-1)/2 \rfloor$ (Subfigure 3.1b). In the case of list decoding, the illustrated received vector $r$ can be mapped to the codewords $c_0$ and $c_1$.
The principle of list decoding was first considered in the work of Elias [O-Eli57] and Wozencraft [O-Woz58]. The first polynomial-time list decoders for GRS and Algebraic-Geometry codes were developed by Sudan [A-Sud97] and Shokrollahi–Wasserman [A-SW99], and extended by Guruswami–Sudan [A-GS99]. We give an introduction to their interpolation-based principle in Section 3.5.
A nearest-codeword decoder returns the closest codeword, i.e., a codeword with smallest Hamming distance to the received word (and therefore generalizes the bounded distance decoder of Definition 3.1). The definition of a maximum likelihood decoder coincides with that of a nearest-codeword decoder in Hamming metric for channels where an error word of higher Hamming weight is less probable than one of lower Hamming weight.
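A nearest-codeword decoder can also be sketched by brute force (exponential in the dimension $k$, with ties broken arbitrarily by list order; the names are illustrative):

```python
# Brute-force nearest-codeword decoder: return a codeword with smallest
# Hamming distance to r. For channels where heavier error words are less
# probable, this coincides with maximum likelihood decoding.

def hamming_distance(u, v):
    """d(u, v): number of positions in which u and v differ."""
    return sum(1 for a, b in zip(u, v) if a != b)

def nearest_codeword(code, r):
    return min(code, key=lambda c: hamming_distance(r, c))

# [3, 1, 3]_2 repetition code as a toy example.
code = [(0, 0, 0), (1, 1, 1)]
```

Unlike a bounded distance decoder, this decoder never declares a failure: it always returns some closest codeword, even when the received vector lies far outside every decoding sphere.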
Maximum likelihood decoding of linear codes in general, and of RS codes in particular, is NP-hard [A-BMVT78; A-GV05]. It remains an open problem to find polynomial-time decoding algorithms with near-maximum-likelihood performance for GRS codes as well as for general linear block codes.
Table of contents:
2 Linear Block Codes over Finite Fields
2.1 Basic Notations
2.1.1 Hamming Metric and Polynomials over Finite Fields
2.1.2 The Univariate Lagrange Interpolation Problem
2.1.3 Bivariate Polynomials over Finite Fields
2.2 Basics of Linear Block Codes and Cyclic Codes
2.2.1 Basics of Linear Block Codes
2.2.2 Cyclic Codes
2.3 Product Codes
2.3.1 Basic Principle
2.3.2 Cyclic Product Codes
2.3.3 Generalized Concatenated Codes
2.4 Generalized Reed–Solomon Codes
2.4.1 Definition and Notation
2.4.2 Interleaved Generalized Reed–Solomon Codes
3 Algebraic Decoding of Linear Block Codes
3.1 Decoding Principles and Complexity Issues
3.1.1 Hard-Decision Decoding of Linear Block Codes and GRS Codes
3.1.2 Soft-Decision Decoding of GRS Codes
3.2 Syndrome-Based Decoding of GRS Codes
3.2.1 Welch–Berlekamp Approach as List-One Decoder and Explicit Syndromes
3.2.2 Error/Erasure Decoding of GRS Codes
3.2.3 Welch–Berlekamp-like Approach for Error/Erasure Decoding
3.3 Decoding Algorithms Based on the Key Equation
3.3.2 EEA and Error/Erasure Decoding of GRS Codes
3.3.3 The Fundamental Iterative Algorithm
3.4 Collaborative Decoding of Interleaved GRS Codes
3.4.1 Error Model
3.4.2 Syndromes and Collaborative Decoding Algorithms
3.5 Interpolation-Based Decoding of GRS Codes
3.5.2 The Guruswami–Sudan Principle
3.5.3 Soft-Decision Decoding Based on Guruswami–Sudan
3.5.4 Some Realizations of Guruswami–Sudan for GRS Codes
4 Key Equations for Decoding of GRS Codes Beyond Half the Minimum Distance
4.1 A Unique Decoding Approach for GRS Codes
4.1.1 Basic Idea
4.1.2 Virtual Extension of a GRS Code to an IGRS Code
4.1.3 Decoding and Failure Probability
4.2 Key Equation for Sudan
4.2.1 Modified Univariate Reformulation
4.2.2 Adjustment of the Fundamental Iterative Algorithm
4.2.3 Example: Sudan Decoding GRS Code with Adapted FIA
4.3 Key Equations for Guruswami–Sudan
4.3.1 Univariate Reformulation
4.3.2 The Fundamental Iterative Algorithm for a Block-Hankel Matrix
4.3.3 Example: Guruswami–Sudan Decoding of a GRS Code with Adapted FIA
4.3.4 Reduced Set of Equations
4.4 Explicit Guruswami–Sudan Syndromes and Hermite Interpolation
4.4.1 Univariate Hermite Interpolation
4.4.2 Modified Reformulation for Guruswami–Sudan
4.4.3 Explicit Syndromes
4.5 Conclusion and Future Work
5 Key Equations for Interpolation-Based Soft-Decision Decoding of GRS Codes
5.1 Key Equations for the Kötter–Vardy Algorithm
5.1.1 From the Multiplicity Matrix to Univariate Polynomials
5.1.2 Block-Hankel Structure and FIA
5.2 Re-Encoding Transformation with Key Equations
5.2.1 Ordering of Multiplicities
5.2.2 Reduced Set of Univariate Equations
5.2.3 Reduced Set of Homogeneous Equations in Block-Hankel Form
5.3 Conclusion and Future Work
6 Bounding the Minimum Distance of Cyclic Codes
6.1 Bounding the Minimum Distance by Rational Functions
6.1.2 More Preliminaries on Cyclic Codes
6.1.3 Bound I-a: Rational Functions
6.1.4 Some Classes of Cyclic Codes
6.1.5 Reversible Codes
6.1.6 Non-Reversible Codes
6.1.7 Generalizing Boston’s Bounds
6.1.8 Generalized Key Equation and Decoding Algorithm
6.2 Bounding the Minimum Distance by Embedding a Cyclic Code Into a Cyclic Product Code
6.2.1 Bound I-b: Basic Idea
6.2.2 Syndrome-Based Error/Erasure Decoding Approach up to Bound I-b
6.2.3 Bound II: Generalized Hartmann–Tzeng Bound Using Cyclic Product Code
6.2.4 Bound III: Using a Second Cyclic Code
6.2.5 Decoding up to Bound II
6.3 Lowest-Code-Rate Binary Cyclic Codes with Minimum Distance Two and Three
6.3.1 Motivation and Previous Work
6.3.2 Implications for Bounding the Minimum Distance
6.4 Cyclic Generalized Product Codes
6.4.1 Related Work and Basic Idea
6.4.2 Definition and Defining Set
6.4.3 Example of a Cyclic Generalized Product Code
6.4.4 Using Cyclic Generalized Product Codes for Bounding the Minimum Distance
6.5 Conclusion and Future Work