Abstract: A new low-complexity deep learning method is proposed to improve the neural-network belief propagation algorithm and to reduce the complexity of its hardware implementation. By unrolling the belief propagation decoding algorithm onto an alternative graph and combining it with the min-sum algorithm, the hyperbolic function operations are removed and the multiplications, which consume considerable resources in engineering practice, are converted into simple additions. A recurrent neural network structure further constrains the different parameters of the multiple layers to a single shared set of layer parameters. Considering the different roles of the messages, no additional parameters are attached to the log-likelihood ratio messages from the channel; instead, parameters are attached to the edges in the check-node update. Examining the distribution of the trained edge parameters shows that only part of the weight values deviate significantly from 1, and these are identified as the edges that need weights, which reduces the number of neural network parameters by about 20%. The performance of the proposed algorithm is improved by about 1 dB compared with the belief propagation algorithm in the high-SNR regime, and the method is convenient for hardware implementation and has strong practicability.
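To make the described check-node operation concrete, the following is a minimal NumPy sketch of a weighted min-sum decoding pass in the spirit of the abstract: hyperbolic functions are avoided, only signs, minima and additions are used, a single set of per-edge weights is reused across all iterations (RNN-style weight tying), and the channel log-likelihood ratios carry no weights. The parity-check matrix H, the LLR vector, the constant weight value 0.8 and all variable names are illustrative assumptions, not the paper's implementation.

# Minimal sketch (not the authors' code) of one weighted min-sum decoder.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],                 # toy parity-check matrix (assumed)
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1]])
llr = np.array([1.2, -0.8, 0.6, 2.1, -1.5, 0.4])  # channel LLRs, left unweighted
edge_weight = np.where(H == 1, 0.8, 0.0)          # learnable weights on check-node edges
num_iters = 5                                     # same weights reused every iteration

m, n = H.shape
v2c = H * llr                                     # variable-to-check messages from the channel
for _ in range(num_iters):
    c2v = np.zeros((m, n))
    for i in range(m):                            # check-node update: weighted min-sum
        idx = np.flatnonzero(H[i])
        for j in idx:
            others = v2c[i, idx[idx != j]]
            sign = np.prod(np.sign(others))
            c2v[i, j] = edge_weight[i, j] * sign * np.min(np.abs(others))
    # variable-node update: channel LLR plus incoming check messages (additions only)
    total = llr + c2v.sum(axis=0)
    v2c = H * (total - c2v)                       # exclude the message on the edge being updated

hard_decision = (total < 0).astype(int)           # positive LLR decoded as bit 0
print(hard_decision)

In a trained decoder, edge_weight would hold the learned parameters; edges whose weights stay close to 1 could then drop their parameter entirely, which is the pruning the abstract reports as an approximately 20% parameter reduction.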