{{Short description|Lossless data compression technique}}
[[File:Shannon-example.svg|thumb|400px|Example of constructing a code scheme. ''l''<sub>''i''</sub> is the smallest integer such that 2<sup>−''l''<sub>''i''</sub></sup> is less than or equal to the symbol's probability; the fifth row gives the binary form of the cumulative sum.]]
In the field of [[data compression]], '''Shannon coding''', named after its creator, [[Claude Shannon]], is a [[lossless data compression]] technique for constructing a [[prefix code]] based on a set of symbols and their probabilities (estimated or measured). It is suboptimal in the sense that it does not achieve the lowest possible expected code word length, as [[Huffman coding]] does; it is never better than, but sometimes equal to, [[Shannon–Fano coding]] (Fano's method).


The method was the first of its type; the technique was used to prove [[Shannon's noiseless coding theorem]] in his 1948 article "A Mathematical Theory of Communication",<ref>{{cite journal |author=Shannon, Claude E. |date=July 1948 |title=A Mathematical Theory of Communication [reprint with corrections] |url=http://math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf |journal=[[Bell System Technical Journal]] |volume=27 |issue=3 |pages=379–423 |doi=10.1002/j.1538-7305.1948.tb01338.x |hdl-access=free |authorlink=Claude Shannon |hdl=11858/00-001M-0000-002C-4314-2}}</ref> and is therefore a centerpiece of the information age.


[[Shannon–Fano coding]] methods gave rise to the field of information theory; without their contributions, the world would not have the many successor methods, such as Huffman coding and [[arithmetic coding]]. Much of day-to-day life is significantly influenced by [[digital data]], and this would not be possible without Shannon–Fano coding and the ongoing evolution of its methods.<ref name="LiDrew2014">{{cite book|author1=Ze-Nian Li|author2=Mark S. Drew|author3=Jiangchuan Liu|title=Fundamentals of Multimedia|url=https://books.google.com/books?id=R6vBBAAAQBAJ|date=9 April 2014|publisher=Springer Science & Business Media|isbn=978-3-319-05290-8}}</ref>{{page needed|date=June 2023}}


In Shannon coding, the symbols are arranged in order from most probable to least probable, and assigned codewords by taking the first <math>l_i = \left\lceil -\log_2 p_i \right\rceil </math> bits from the binary expansions of the cumulative probabilities <math> \sum\limits_{k=1}^{i-1} p_k.</math> Here <math>\lceil x \rceil</math> denotes the [[ceiling function]] (which rounds <math>x</math> up to the next integer value).
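
A minimal sketch of this construction in Python (the function name <code>shannon_code</code> and its bit-extraction loop are illustrative assumptions, not part of the article or any standard library) might look like the following. It assumes the probabilities are already sorted from most probable to least probable.

<syntaxhighlight lang="python">
from math import ceil, log2

def shannon_code(probabilities):
    """Return Shannon codewords for probabilities sorted in decreasing order."""
    codewords = []
    cumulative = 0.0
    for p in probabilities:
        # Codeword length: l_i = ceil(-log2 p_i)
        length = ceil(-log2(p))
        # Take the first l_i bits of the binary expansion of the
        # cumulative probability of all preceding symbols.
        value = cumulative
        bits = ""
        for _ in range(length):
            value *= 2
            bit = int(value)
            bits += str(bit)
            value -= bit
        codewords.append(bits)
        cumulative += p
    return codewords
</syntaxhighlight>

Because <math>2^{-l_i} \le p_i</math>, successive cumulative sums differ by at least <math>2^{-l_i}</math>, so truncating them to <math>l_i</math> bits yields a prefix-free code.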

== Example ==
<!-- This example actually requires 2.92 bits/symbol on average, compared to a noncompressed value of ~2.58, so this is probably not a very good example -->
In the table below is an example of creating a code scheme for symbols ''a''<sub>1</sub> to ''a''<sub>6</sub>. The value of ''l''<sub>''i''</sub> gives the number of bits used to represent the symbol ''a''<sub>''i''</sub>. The last column is the bit code of each symbol.
{| class="wikitable"
!''i''
!''p''<sub>''i''</sub>
!''l''<sub>''i''</sub>
!<math>\sum_{k=1}^{i-1} p_k</math>
!Cumulative sum in binary
!Codeword for ''a''<sub>''i''</sub>
|-
!1
|0.36
|2
|0.0
|0.0000
|00
|-
!2
|0.18
|3
|0.36
|0.0101...
|010
|-
!3
|0.18
|3
|0.54
|0.1000...
|100
|-
!4
|0.12
|4
|0.72
|0.1011...
|1011
|-
!5
|0.09
|4
|0.84
|0.1101...
|1101
|-
!6
|0.07
|4
|0.93
|0.1110...
|1110
|}
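
As a check, running the hypothetical sketch above on the table's probabilities reproduces the listed codewords and gives the expected code length of this scheme:

<syntaxhighlight lang="python">
probs = [0.36, 0.18, 0.18, 0.12, 0.09, 0.07]
codes = shannon_code(probs)
print(codes)   # ['00', '010', '100', '1011', '1101', '1110']

# Expected code length: sum of p_i * l_i, about 2.92 bits per symbol,
# compared with a source entropy of about 2.37 bits per symbol.
print(sum(p * len(c) for p, c in zip(probs, codes)))
</syntaxhighlight>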


==References==
{{Reflist}}


Shannon, Claude Elwood. [https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf "A Mathematical Theory of Communication."] ''ACM SIGMOBILE Mobile Computing and Communications Review'' 5.1 (2001): 3–55.

{{Compression Methods}}


[[Category:Lossless compression algorithms]]
[[Category:Claude Shannon]]
[[Category:Data compression]]
