Paper Title

On the Role of Quantization of Soft Information in GRAND

Authors

Peihong Yuan, Ken R. Duffy, Evan P. Gabhart, Muriel Médard

Abstract

In this work, we investigate guessing random additive noise decoding (GRAND) with quantized soft input. First, we analyze the achievable rate of ordered reliability bits GRAND (ORBGRAND), which uses the rank order of the reliabilities as quantized soft information. We show that multi-line ORBGRAND can approach capacity at any signal-to-noise ratio (SNR). We then introduce discretized soft GRAND (DSGRAND), which uses information from a conventional quantizer. Simulation results show that DSGRAND closely approximates maximum-likelihood (ML) decoding with a number of quantization bits in line with current soft decoding implementations. For a (128,106) CRC-concatenated polar code, basic ORBGRAND matches or outperforms CRC-aided successive cancellation list (CA-SCL) decoding with a codeword list size of 64, using 3 bits of quantized soft information, while DSGRAND outperforms CA-SCL decoding with a list size of 128 codewords. Both ORBGRAND and DSGRAND exhibit approximately an order of magnitude lower average complexity and two orders of magnitude smaller memory requirements than CA-SCL.
