Relative entropy at the channel output of a capacity-achieving code
Author(s)
Polyanskiy, Yury; Verdu, Sergio
Download: optcodes_allerton.pdf (155.6 KB)
Access policy
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
In this paper we establish a new inequality tying together the coding rate, the probability of error, and the relative entropy between the channel output distribution induced by the code and an auxiliary output distribution. This inequality is then used to prove the strong converse and to show that the output distribution of a code must be close, in relative entropy, to the capacity-achieving output distribution (for DMCs and the AWGN channel). One of the key tools in our analysis is concentration of measure (isoperimetry).
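For orientation, a schematic sketch of the inequality the abstract announces is given below. The notation is assumed rather than quoted from the paper: an (n, M, \epsilon) code over a channel of capacity C, with P_{Y^n} the output distribution induced by the code and P^*_{Y^n} the capacity-achieving output distribution; the O(\sqrt{n}) remainder stands in loosely for the paper's explicit finite-blocklength terms.

% Schematic LaTeX statement of the announced inequality (assumed notation,
% not the paper's exact constants):
\[
  D\bigl(P_{Y^n} \,\big\|\, P^{*}_{Y^n}\bigr) \;\le\; nC - \log M + O\!\bigl(\sqrt{n}\bigr).
\]
% Two consequences, matching the abstract: since relative entropy is
% nonnegative, \log M \le nC + O(\sqrt{n}) for any fixed error
% probability (the strong converse); and for a capacity-achieving code,
% \log M = nC - o(n), so D(P_{Y^n} \| P^{*}_{Y^n}) = o(n), i.e., the
% code's output distribution is close to P^{*}_{Y^n} in relative entropy.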
Date issued
2011-09
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Journal
Proceedings of the 2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Polyanskiy, Yury, and Sergio Verdu. "Relative Entropy at the Channel Output of a Capacity-Achieving Code." In Proceedings of the 2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 52-59. Institute of Electrical and Electronics Engineers (IEEE), 2011.
Version: Author's final manuscript
ISBN
978-1-4577-1818-2
978-1-4577-1817-5
978-1-4577-1816-8