Pruning and Quantizing Neural Belief Propagation Decoders
- Buchberger, Andreas, 1990 (author) — Chalmers University of Technology
- Häger, Christian, 1986 (author) — Chalmers University of Technology
- Pfister, Henry D. (author) — Duke University
- Schmalen, Laurent (author) — Karlsruhe Institute of Technology (KIT)
- Graell i Amat, Alexandre, 1976 (author) — Chalmers University of Technology
- 2021
- English.
- Published in: IEEE Journal on Selected Areas in Communications, ISSN 0733-8716, E-ISSN 1558-0008, vol. 39, no. 7, pp. 1957-1966
- Related links:
- https://doi.org/10.1...
- https://research.cha...
Abstract
- We consider near maximum-likelihood (ML) decoding of short linear block codes. In particular, we propose a novel decoding approach based on neural belief propagation (NBP) decoding, recently introduced by Nachmani et al., in which we allow a different parity-check matrix in each iteration of the algorithm. The key idea is to consider NBP decoding over an overcomplete parity-check matrix and use the weights of NBP as a measure of the importance of the check nodes (CNs) to decoding. The unimportant CNs are then pruned. In contrast to NBP, which performs decoding on a given fixed parity-check matrix, the proposed pruning-based neural belief propagation (PB-NBP) typically results in a different parity-check matrix in each iteration. For a given complexity in terms of CN evaluations, we show that PB-NBP yields significant performance improvements with respect to NBP. We apply the proposed decoder to a Reed-Muller code, a short low-density parity-check (LDPC) code, and a polar code. PB-NBP outperforms NBP decoding over an overcomplete parity-check matrix by 0.27–0.31 dB while reducing the number of required CN evaluations by up to 97%. For the LDPC code, PB-NBP outperforms conventional belief propagation with the same number of CN evaluations by 0.52 dB. We further extend the pruning concept to offset min-sum decoding and introduce a pruning-based neural offset min-sum (PB-NOMS) decoder, for which we jointly optimize the offsets and the quantization of the messages and offsets. We demonstrate performance within 0.5 dB of ML decoding with 5-bit quantization for the Reed-Muller code.
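The core idea in the abstract — run weighted ("neural") min-sum belief propagation over an overcomplete parity-check matrix, score each check node (CN) by its trained weight magnitude, and keep only the most important CNs — can be sketched as follows. This is a minimal illustration, not the paper's method: the matrix, weights, and LLRs are toy values, the function names are hypothetical, and the trained NBP weights are replaced by hand-picked scalars.

```python
import numpy as np

def cn_update(H, llr, weights):
    """One weighted min-sum check-node (CN) update.

    H       : (m, n) binary parity-check matrix (possibly overcomplete)
    llr     : (n,) channel log-likelihood ratios
    weights : (m,) per-CN weights, a stand-in for trained NBP weights
    Returns the (m, n) CN-to-variable-node messages.
    """
    m, n = H.shape
    msgs = np.zeros((m, n))
    for i in range(m):
        idx = np.flatnonzero(H[i])
        for j in idx:
            others = idx[idx != j]
            # min-sum rule: product of signs, minimum of magnitudes,
            # scaled by this CN's weight
            sign = np.prod(np.sign(llr[others]))
            msgs[i, j] = weights[i] * sign * np.min(np.abs(llr[others]))
    return msgs

def prune_cns(H, weights, keep):
    """Keep the `keep` check nodes with the largest weight magnitudes."""
    order = np.sort(np.argsort(-np.abs(weights))[:keep])
    return H[order], weights[order]

# Toy overcomplete matrix with three CNs; row 1 has a small weight,
# so it is treated as unimportant and pruned away.
H = np.array([[1, 1, 0, 1],
              [0, 1, 1, 1],
              [1, 0, 1, 0]])
w = np.array([0.9, 0.1, 0.8])
llr = np.array([1.5, -2.0, 0.5, 3.0])

msgs = cn_update(H, llr, w)
H_pruned, w_pruned = prune_cns(H, w, keep=2)
```

In the paper, the weights come from training the unrolled decoder and a different pruned matrix can result in each iteration; here a single update with fixed weights only illustrates how weight magnitudes can rank CNs for pruning.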
Subject headings
- ENGINEERING AND TECHNOLOGY -- Electrical Engineering, Electronic Engineering, Information Engineering -- Telecommunications (hsv//eng)
- ENGINEERING AND TECHNOLOGY -- Electrical Engineering, Electronic Engineering, Information Engineering -- Communication Systems (hsv//eng)
- ENGINEERING AND TECHNOLOGY -- Electrical Engineering, Electronic Engineering, Information Engineering -- Signal Processing (hsv//eng)
Keywords
- Belief propagation
- neural decoders
- quantization
- min-sum decoding
- deep learning
- pruning
Publication and content type
- art (subject category)
- ref (subject category)