Title: Stochastic List Decoding of High-Density Parity-Check Codes
Authors: Lee, Chang-Ming
Su, Yu-Teh
Institute of Communications Engineering
Keywords: Stochastic List Decoding; High-Density Parity-Check Codes; Cross-Entropy (CE) Method
Issue Date: 2008
Abstract:
In this thesis, we present several novel stochastic decoding algorithms for linear high-density parity-check (HDPC) codes. Our approach can be regarded as a randomized sphere decoder with a moving center that selects candidate codewords around a center vector according to a sphere-symmetric probability distribution. The center (median) vector of the distribution is updated by a Monte Carlo based approach called the Cross-Entropy (CE) method. In every iteration, the CE method produces a set of random samples which can be transformed into valid codewords. Based on the Euclidean distances between the received word and the random codewords, we select the best E candidates to form an elite set, which is then used to modify the probability distribution that governs the generation of random samples in the ensuing iteration. To ensure that the newly generated samples concentrate more and more on a small neighborhood of the correct codeword, the center vector should move toward the transmitted codeword and the underlying distribution should eventually degenerate to a singularity at it; accordingly, the parameters of the updated distribution are chosen so that the new distribution is closest to the optimal distribution in the sense of the Kullback-Leibler distance (i.e., the cross-entropy).
We propose three classes of stochastic decoding algorithms in this thesis. The first two are specifically designed for decoding (n,k) Reed-Solomon (RS) codes. For the first decoder, the random samples represent a set of random error locator vectors, each indicating n-k possible erasure positions within the received word. We associate each error locator vector with a candidate codeword by erasures-only (EO) decoding of the received word, assuming that the erasure locations are those indicated by the error locator vector. The n-dimensional real random vectors in the second algorithm represent reliability vectors whose n-k least reliable coordinates are assumed to be erasures. For each sample, we make component-wise hard decisions on the k most reliable coordinates and EO-decode the resulting vector. The third algorithm uses a sequential bit-flipping procedure to convert each random sample into a legitimate codeword. The first two algorithms are valid for maximum-distance separable (MDS) codes only, while the third can be used to decode any linear block code. Our algorithms offer both complexity and performance advantages over belief propagation (BP) and some existing algebraic decoding algorithms, especially for high-rate linear HDPC codes of short or medium lengths.
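The elite-set recursion described above can be sketched generically. The toy below assumes a Bernoulli sampling family over binary n-vectors (the thesis actually uses sphere-symmetric continuous distributions over real vectors); `ce_search`, its parameters, and the smoothing step are illustrative assumptions, not the thesis' algorithm. For a Bernoulli family, the KL-minimizing parameter update has the closed form "mean of the elite samples".

```python
import numpy as np

def ce_search(score_fn, n, num_samples=200, num_elite=20,
              iterations=30, smoothing=0.7, seed=0):
    """Toy cross-entropy (CE) search over binary n-vectors.

    score_fn maps a 0/1 candidate vector to its distance from the
    received word (lower is better). Returns the best candidate seen.
    """
    rng = np.random.default_rng(seed)
    p = np.full(n, 0.5)                  # Bernoulli parameters: the "center"
    best, best_score = None, np.inf
    for _ in range(iterations):
        samples = (rng.random((num_samples, n)) < p).astype(int)
        scores = np.array([score_fn(s) for s in samples])
        order = np.argsort(scores)
        elite = samples[order[:num_elite]]   # the E best candidates
        # CE update: the elite sample mean is the closed-form
        # KL-minimizing update for a Bernoulli family; smoothing keeps
        # the distribution from degenerating prematurely.
        p = smoothing * elite.mean(axis=0) + (1 - smoothing) * p
        if scores[order[0]] < best_score:
            best, best_score = samples[order[0]], scores[order[0]]
    return best
```

With a Hamming-distance score against a fixed target word, the sampling distribution quickly concentrates on the target, mirroring how the elite set drags the center toward the transmitted codeword.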
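For the third class of decoders, one standard way to realize the "convert a sample into a legitimate codeword" step is a Gallager-style greedy bit flip against the parity-check matrix H. This stand-in is an assumption for illustration, not the thesis' exact sequential procedure; `bit_flip_project` is a hypothetical name.

```python
import numpy as np

def bit_flip_project(sample, H, max_iters=50):
    """Greedy bit flipping: repeatedly flip the bit that participates in
    the most unsatisfied parity checks until the syndrome H c = 0, i.e.
    until c is a valid codeword (or the iteration budget runs out)."""
    c = (np.asarray(sample) % 2).copy()
    for _ in range(max_iters):
        syndrome = H.dot(c) % 2
        if not syndrome.any():
            return c                     # all checks satisfied: a codeword
        counts = syndrome.dot(H)         # unsatisfied checks touched per bit
        c[np.argmax(counts)] ^= 1        # flip the worst offender
    return c                             # may still violate some checks

# Example with the (7,4) Hamming code: a single-bit error is flipped back.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
noisy = np.zeros(7, dtype=int)
noisy[6] = 1                             # corrupt the all-zeros codeword
decoded = bit_flip_project(noisy, H)
```

For dense H this projection touches every check per flip, which is consistent with the abstract's focus on keeping per-iteration work low for HDPC codes.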
URI: http://140.113.39.130/cdrfb3/record/nctu/#GT008913809
http://hdl.handle.net/11536/77202
Appears in Collections: Thesis


Files in This Item:

  1. 380901.pdf