VHTMIMOPacketErrorRateExample - Why is LDPC on IEEE 802.11ac worse than BCC?

Hi All, I ran the WLAN System Toolbox example VHTMIMOPacketErrorRateExample, which provides a PER simulation of IEEE 802.11ac in an 8x8 MIMO configuration using BCC as the channel coding.
I then tried to use LDPC instead of BCC, expecting it to perform a bit better (it does on IEEE 802.11n), but the output is much worse.
Would anybody know where this poor performance could come from? I used the TGac channel from the example and did not change anything besides the channel coding.
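For reference, the only change I made was the channel coding property on the example's VHT configuration object, roughly as sketched below (the other property values are written from memory of the example, not copied from a specific release):

% Sketch of the single change made to the example configuration
cfgVHT = wlanVHTConfig;             % VHT format configuration object, as in the example
cfgVHT.NumTransmitAntennas = 8;     % 8x8 MIMO, as in the shipped example
cfgVHT.NumSpaceTimeStreams = 8;
cfgVHT.ChannelCoding = 'LDPC';      % the shipped example uses 'BCC' here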
Thanks in advance,
BR,
Jérôme

Answers (2)

BABA CHOUDHURY
BABA CHOUDHURY on 2 Jan 2019
Hi Jerome, I know it's very late now to respond to your query.
Still, I was also running similar simulations and found LDPC to outperform BCC in every scenario. Maybe some other parameter is affecting the calculations.

Darcy Poulin
Darcy Poulin on 12 Aug 2020
I had exactly the same issue, and spoke with MathWorks support.
It turns out that you should configure the LDPC decoding method to use 'norm-min-sum' rather than the default 'bp' algorithm. When I made this change, I saw the predicted improvement in link performance.
For 11ac, you configure it like this:
rxPSDU = wlanVHTDataRecover(vhtdata, chanEst, nVarVHT, cfgVHT, 'LDPCDecodingMethod', 'norm-min-sum');
The same thing occurs in 11ax. Here you configure it like this:
rxPSDU = wlanHEDataBitRecover(eqDataSym,nVarEst,csi,cfgHE,'LDPCDecodingMethod','norm-min-sum');
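If it helps, this is roughly where that call sits in the example's per-packet processing. This is only a sketch: variable names such as rxVHTData and txPSDU, and the use of biterr for the error check, follow the pattern of the shipped example and are assumptions here rather than verified code.

% Recover the data bits with the normalized min-sum LDPC decoder
rxPSDU = wlanVHTDataRecover(rxVHTData, chanEst, nVarVHT, cfgVHT, 'LDPCDecodingMethod', 'norm-min-sum');
% Count a packet error if any recovered bit differs from the transmitted PSDU
packetError = any(biterr(txPSDU, rxPSDU));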
