Huffman coding, the Glossary
In computer science and information theory, a Huffman code is a particular type of optimal prefix code that is commonly used for lossless data compression.[1]
Table of Contents
67 relations: A Mathematical Theory of Communication, Adaptive Huffman coding, Adriano Garsia, Alan Tucker, Arithmetic coding, Array (data type), ASCII, Asymmetric numeral systems, Bernoulli process, Binary search tree, Binary tree, Block code, Canonical Huffman code, Charles E. Leiserson, Claude Shannon, Clifford Stein, Codec, Computer science, David A. Huffman, Deflate, Doctor of Science, Dyadic distribution, Entropy (information theory), Entropy coding, Exam, Expected value, Frequency (statistics), Garsia–Wachs algorithm, Golomb coding, Greedy algorithm, IEEE Transactions on Information Theory, Independent and identically distributed random variables, Information theory, Institute of Electrical and Electronics Engineers, Introduction to Algorithms, JPEG, Lossless compression, Massachusetts Institute of Technology, Michelle L. Wachs, Modified Huffman coding, Morse code, MP3, Office of Naval Research, Package-merge algorithm, Patent, PKZIP, Power of two, Prefix code, Priority queue, Probability mass function, Proceedings of the IEEE, Proportionality (mathematics), Quantization (signal processing), Queue (abstract data type), Robert Fano, Ron Rivest, Run-length encoding, Scientific American, Shannon's source coding theorem, Shannon–Fano coding, T. C. Hu, Term paper, Thomas H. Cormen, Time complexity, Total order, Tree (data structure), Variable-length code.
- 1952 in computing
- Binary trees
A Mathematical Theory of Communication
"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948.
See Huffman coding and A Mathematical Theory of Communication
Adaptive Huffman coding
Adaptive Huffman coding (also called Dynamic Huffman coding) is an adaptive coding technique based on Huffman coding. Huffman coding and adaptive Huffman coding are both data compression techniques.
See Huffman coding and Adaptive Huffman coding
Adriano Garsia
Adriano Mario Garsia (born 20 August 1928) is a Tunisian-born Italian American mathematician who works in analysis, combinatorics, representation theory, and algebraic geometry.
See Huffman coding and Adriano Garsia
Alan Tucker
Alan Curtiss Tucker is an American mathematician.
See Huffman coding and Alan Tucker
Arithmetic coding
Arithmetic coding (AC) is a form of entropy encoding used in lossless data compression. Huffman coding and arithmetic coding are both data compression techniques.
See Huffman coding and Arithmetic coding
Array (data type)
In computer science, array is a data type that represents a collection of elements (values or variables), each selected by one or more indices (identifying keys) that can be computed at run time during program execution.
See Huffman coding and Array (data type)
ASCII
ASCII, an acronym for American Standard Code for Information Interchange, is a character encoding standard for electronic communication.
See Huffman coding and ASCII
Asymmetric numeral systems
Asymmetric numeral systems (ANS) is a family of entropy encoding methods, introduced by Jarosław Duda, used in lossless data compression. Huffman coding and asymmetric numeral systems are both data compression techniques.
See Huffman coding and Asymmetric numeral systems
Bernoulli process
In probability and statistics, a Bernoulli process (named after Jacob Bernoulli) is a finite or infinite sequence of binary random variables, so it is a discrete-time stochastic process that takes only two values, canonically 0 and 1.
See Huffman coding and Bernoulli process
Binary search tree
In computer science, a binary search tree (BST), also called an ordered or sorted binary tree, is a rooted binary tree data structure with the key of each internal node being greater than all the keys in that node's left subtree and less than those in its right subtree. Huffman coding and binary search trees both involve binary trees.
See Huffman coding and Binary search tree
Binary tree
In computer science, a binary tree is a tree data structure in which each node has at most two children, referred to as the left child and the right child. Huffman codes are represented as binary trees.
See Huffman coding and Binary tree
Block code
In coding theory, block codes are a large and important family of error-correcting codes that encode data in blocks.
See Huffman coding and Block code
Canonical Huffman code
In computer science and information theory, a canonical Huffman code is a particular type of Huffman code with unique properties which allow it to be described in a very compact manner. Huffman coding and the canonical Huffman code are both data compression techniques.
See Huffman coding and Canonical Huffman code
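The compactness comes from the fact that a canonical code is fully determined by its codeword lengths. A hedged sketch of the length-to-codeword assignment in Python (the function name and representation are illustrative, not from any particular library):

```python
def canonical_codes(lengths):
    """Assign canonical codewords given only symbol -> bit-length.

    Symbols are taken in (length, symbol) order; each codeword is the
    previous one plus one, shifted left whenever the length increases.
    A decoder can rebuild exactly the same table from the lengths alone.
    """
    out, code, prev = {}, 0, 0
    for sym, n in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
        code <<= n - prev              # append zeros when the length grows
        out[sym] = format(code, "0%db" % n)
        code += 1
        prev = n
    return out

codes = canonical_codes({"a": 1, "b": 3, "c": 3, "d": 3, "r": 3})
```

With these lengths the table comes out as a→0, b→100, c→101, d→110, r→111: still prefix-free, but describable by five small integers instead of a full tree.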
Charles E. Leiserson
Charles Eric Leiserson (born 1953) is a computer scientist and professor at Massachusetts Institute of Technology (M.I.T.). He specializes in the theory of parallel computing and distributed computing.
See Huffman coding and Charles E. Leiserson
Claude Shannon
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist and cryptographer known as the "father of information theory" and as the "father of the Information Age".
See Huffman coding and Claude Shannon
Clifford Stein
Clifford Seth Stein (born December 14, 1965), a computer scientist, is a professor of industrial engineering and operations research at Columbia University in New York, NY, where he also holds an appointment in the Department of Computer Science.
See Huffman coding and Clifford Stein
Codec
A codec is a device or computer program that encodes or decodes a data stream or signal. Huffman coding and codecs are both data compression topics.
See Huffman coding and Codec
Computer science
Computer science is the study of computation, information, and automation.
See Huffman coding and Computer science
David A. Huffman
David Albert Huffman (August 9, 1925 – October 7, 1999) was an American pioneer in computer science, known for his Huffman coding.
See Huffman coding and David A. Huffman
Deflate
In computing, Deflate (stylized as DEFLATE, and also called Flate) is a lossless data compression file format that uses a combination of LZ77 and Huffman coding. Huffman coding and Deflate are both data compression topics.
See Huffman coding and Deflate
Doctor of Science
A Doctor of Science (Scientiae Doctor; most commonly abbreviated DSc or ScD) is a science doctorate awarded in a number of countries throughout the world.
See Huffman coding and Doctor of Science
Dyadic distribution
A dyadic (or 2-adic) distribution is a specific type of discrete probability distribution that is of some theoretical importance in data compression. Huffman coding and the dyadic distribution are both data compression topics.
See Huffman coding and Dyadic distribution
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Huffman coding and entropy (information theory) are both data compression topics.
See Huffman coding and Entropy (information theory)
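The entropy is the lower bound on the expected codeword length of any lossless code; Huffman coding approaches it and, for dyadic distributions, attains it exactly. A small Python illustration (the helper name is ours):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A dyadic distribution: every probability is a negative power of two.
# Optimal Huffman code lengths 1, 2, 3, 3 equal -log2(p) exactly, so
# the expected code length matches the entropy.
H = entropy([0.5, 0.25, 0.125, 0.125])
expected_len = 0.5 * 1 + 0.25 * 2 + 0.125 * 3 + 0.125 * 3
```

Here both quantities come out to 1.75 bits per symbol; for non-dyadic distributions the Huffman expected length sits strictly between H and H + 1.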
Entropy coding
In information theory, an entropy coding (or entropy encoding) is any lossless data compression method that attempts to approach the lower bound declared by Shannon's source coding theorem, which states that any lossless data compression method must have an expected code length greater than or equal to the entropy of the source. Huffman coding and entropy coding are both data compression topics.
See Huffman coding and Entropy coding
Exam
An examination (exam or evaluation) or test is an educational assessment intended to measure a test-taker's knowledge, skill, aptitude, or standing in a given topic.
See Huffman coding and Exam
Expected value
In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average.
See Huffman coding and Expected value
Frequency (statistics)
In statistics, the frequency or absolute frequency of an event i is the number n_i of times the observation was recorded in an experiment or study.
See Huffman coding and Frequency (statistics)
Garsia–Wachs algorithm
The Garsia–Wachs algorithm is an efficient method for constructing optimal binary search trees and alphabetic Huffman codes in linearithmic time. Huffman coding and the Garsia–Wachs algorithm both involve binary trees.
See Huffman coding and Garsia–Wachs algorithm
Golomb coding
Golomb coding is a lossless data compression method using a family of data compression codes invented by Solomon W. Golomb in the 1960s. Huffman coding and Golomb coding are both data compression techniques.
See Huffman coding and Golomb coding
Greedy algorithm
A greedy algorithm is any algorithm that follows the problem-solving heuristic of making the locally optimal choice at each stage.
See Huffman coding and Greedy algorithm
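Huffman's construction is itself a textbook greedy algorithm: repeatedly merge the two lowest-weight subtrees until one tree remains. A minimal sketch in Python (the function name and code-table representation are illustrative, not from any particular library):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Greedily build a Huffman code for the symbols in `text`."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate one-symbol alphabet
        return {sym: "0" for sym in freq}
    # Heap entries: (weight, tie_breaker, {symbol: code_suffix}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)      # two lightest subtrees...
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        tie += 1
        heapq.heappush(heap, (w1 + w2, tie, merged))   # ...merged greedily
    return heap[0][2]

codes = huffman_codes("abracadabra")
```

Here 'a' (frequency 5 of 11) receives a one-bit codeword while the rare symbols get three bits, so the whole message encodes in 23 bits instead of the 33 a fixed three-bit code would need.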
IEEE Transactions on Information Theory
IEEE Transactions on Information Theory is a monthly peer-reviewed scientific journal published by the IEEE Information Theory Society.
See Huffman coding and IEEE Transactions on Information Theory
Independent and identically distributed random variables
In probability theory and statistics, a collection of random variables is independent and identically distributed if each random variable has the same probability distribution as the others and all are mutually independent.
See Huffman coding and Independent and identically distributed random variables
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. Huffman coding and information theory are both data compression topics.
See Huffman coding and Information theory
Institute of Electrical and Electronics Engineers
The Institute of Electrical and Electronics Engineers (IEEE) is an American 501(c)(3) professional association for electronics engineering, electrical engineering, and other related disciplines.
See Huffman coding and Institute of Electrical and Electronics Engineers
Introduction to Algorithms
Introduction to Algorithms is a book on computer programming by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein.
See Huffman coding and Introduction to Algorithms
JPEG
JPEG (short for Joint Photographic Experts Group) is a commonly used method of lossy compression for digital images, particularly for those images produced by digital photography.
See Huffman coding and JPEG
Lossless compression
Lossless compression is a class of data compression that allows the original data to be perfectly reconstructed from the compressed data with no loss of information. Huffman coding and lossless compression are both data compression topics.
See Huffman coding and Lossless compression
Massachusetts Institute of Technology
The Massachusetts Institute of Technology (MIT) is a private land-grant research university in Cambridge, Massachusetts.
See Huffman coding and Massachusetts Institute of Technology
Michelle L. Wachs
Michelle Lynn Wachs is an American mathematician who specializes in algebraic combinatorics and works as a professor of mathematics at the University of Miami.
See Huffman coding and Michelle L. Wachs
Modified Huffman coding
Modified Huffman coding is used in fax machines to encode black-on-white images (bitmaps). Huffman coding and modified Huffman coding are both data compression techniques.
See Huffman coding and Modified Huffman coding
Morse code
Morse code is a telecommunications method which encodes text characters as standardized sequences of two different signal durations, called dots and dashes, or dits and dahs.
See Huffman coding and Morse code
MP3
MP3 (formally MPEG-1 Audio Layer III or MPEG-2 Audio Layer III) is a coding format for digital audio developed largely by the Fraunhofer Society in Germany under the lead of Karlheinz Brandenburg, with support from other digital scientists in other countries. Huffman coding and MP3 are both data compression topics.
See Huffman coding and MP3
Office of Naval Research
The Office of Naval Research (ONR) is an organization within the United States Department of the Navy responsible for the science and technology programs of the U.S. Navy and Marine Corps.
See Huffman coding and Office of Naval Research
Package-merge algorithm
The package-merge algorithm is an O(nL)-time algorithm for finding an optimal length-limited Huffman code for a given distribution on a given alphabet of size n, where no code word is longer than L. It is a greedy algorithm, and a generalization of Huffman's original algorithm.
See Huffman coding and Package-merge algorithm
Patent
A patent is a type of intellectual property that gives its owner the legal right to exclude others from making, using, or selling an invention for a limited period of time in exchange for publishing an enabling disclosure of the invention.
See Huffman coding and Patent
PKZIP
PKZIP is a file archiving computer program, notable for introducing the popular ZIP file format.
See Huffman coding and PKZIP
Power of two
A power of two is a number of the form 2^n where n is an integer, that is, the result of exponentiation with the number two as the base and an integer n as the exponent.
See Huffman coding and Power of two
Prefix code
A prefix code is a type of code system distinguished by its possession of the "prefix property", which requires that there is no whole code word in the system that is a prefix (initial segment) of any other code word in the system. Huffman coding and prefix codes are both data compression topics.
See Huffman coding and Prefix code
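The prefix property is what lets a Huffman-coded bit stream be decoded without separators between codewords. A quick way to test it, sketched in Python (the function name is illustrative):

```python
def is_prefix_code(codewords):
    """True if no codeword is a prefix of another (the prefix property)."""
    words = sorted(codewords)
    # In lexicographic order, a prefix lands immediately before one of
    # its extensions, so checking adjacent pairs suffices.
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

assert is_prefix_code(["0", "10", "110", "111"])   # Huffman-style code
assert not is_prefix_code(["0", "01"])             # "0" prefixes "01"
```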
Priority queue
In computer science, a priority queue is an abstract data type similar to a regular queue or stack abstract data type.
See Huffman coding and Priority queue
Probability mass function
In probability and statistics, a probability mass function (sometimes called probability function or frequency function) is a function that gives the probability that a discrete random variable is exactly equal to some value.
See Huffman coding and Probability mass function
Proceedings of the IEEE
The Proceedings of the IEEE is a monthly peer-reviewed scientific journal published by the Institute of Electrical and Electronics Engineers (IEEE).
See Huffman coding and Proceedings of the IEEE
Proportionality (mathematics)
In mathematics, two sequences of numbers, often experimental data, are proportional or directly proportional if their corresponding elements have a constant ratio.
See Huffman coding and Proportionality (mathematics)
Quantization (signal processing)
Quantization, in mathematics and digital signal processing, is the process of mapping input values from a large set (often a continuous set) to output values in a (countable) smaller set, often with a finite number of elements. Huffman coding and quantization (signal processing) are both data compression topics.
See Huffman coding and Quantization (signal processing)
Queue (abstract data type)
In computer science, a queue is a collection of entities that are maintained in a sequence and can be modified by the addition of entities at one end of the sequence and the removal of entities from the other end of the sequence.
See Huffman coding and Queue (abstract data type)
Robert Fano
Roberto Mario "Robert" Fano (11 November 1917 – 13 July 2016) was an Italian-American computer scientist and professor of electrical engineering and computer science at the Massachusetts Institute of Technology.
See Huffman coding and Robert Fano
Ron Rivest
Ronald Linn Rivest (born May 6, 1947) is a cryptographer and computer scientist whose work has spanned the fields of algorithms and combinatorics, cryptography, machine learning, and election integrity.
See Huffman coding and Ron Rivest
Run-length encoding
Run-length encoding (RLE) is a form of lossless data compression in which runs of data (consecutive occurrences of the same data value) are stored as a single occurrence of that data value and a count of its consecutive occurrences, rather than as the original run. Huffman coding and run-length encoding are both data compression techniques.
See Huffman coding and Run-length encoding
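As a sketch of the idea in Python (function names are ours; this string-based version is a toy, not a production codec):

```python
from itertools import groupby

def rle_encode(data):
    """Collapse runs of equal characters into (value, count) pairs."""
    return [(v, sum(1 for _ in run)) for v, run in groupby(data)]

def rle_decode(pairs):
    """Expand (value, count) pairs back into the original string."""
    return "".join(v * n for v, n in pairs)

pairs = rle_encode("aaabbbbcc")   # [('a', 3), ('b', 4), ('c', 2)]
```

The run values and counts can themselves be entropy-coded afterwards, which is how schemes such as the fax standard's Modified Huffman coding combine RLE with Huffman codes.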
Scientific American
Scientific American, informally abbreviated SciAm or sometimes SA, is an American popular science magazine.
See Huffman coding and Scientific American
Shannon's source coding theorem
In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the statistical limits to possible data compression for data whose source is an independent identically-distributed random variable, and the operational meaning of the Shannon entropy. Huffman coding and Shannon's source coding theorem are both data compression topics.
See Huffman coding and Shannon's source coding theorem
Shannon–Fano coding
In the field of data compression, Shannon–Fano coding, named after Claude Shannon and Robert Fano, is one of two related techniques for constructing a prefix code based on a set of symbols and their probabilities (estimated or measured). Huffman coding and Shannon–Fano coding are both data compression techniques.
See Huffman coding and Shannon–Fano coding
T. C. Hu
Te Chiang Hu (胡德强, 1930–2021) was a Chinese-American computer scientist and operations researcher known for his work in the design and analysis of algorithms.
See Huffman coding and T. C. Hu
Term paper
A term paper is a research paper written by students over an academic term, accounting for a large part of a grade.
See Huffman coding and Term paper
Thomas H. Cormen
Thomas H. Cormen is an American computer scientist, retired academic, and politician, best known as a co-author of Introduction to Algorithms.
See Huffman coding and Thomas H. Cormen
Time complexity
In theoretical computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm.
See Huffman coding and Time complexity
Total order
In mathematics, a total order or linear order is a partial order in which any two elements are comparable.
See Huffman coding and Total order
Tree (data structure)
In computer science, a tree is a widely used abstract data type that represents a hierarchical tree structure with a set of connected nodes.
See Huffman coding and Tree (data structure)
Variable-length code
In coding theory, a variable-length code is a code which maps source symbols to a variable number of bits. Huffman coding and variable-length codes are both data compression topics.
See Huffman coding and Variable-length code
See also
1952 in computing
- Huffman coding
- Laning and Zierler system
Binary trees
- AVL tree
- Binary expression tree
- Binary heap
- Binary search tree
- Binary space partitioning
- Binary tree
- Cartesian tree
- Garsia–Wachs algorithm
- Geometry of binary search trees
- Huffman coding
- Interleave lower bound
- Join-based tree algorithms
- Left-child right-sibling binary tree
- Optimal binary search tree
- Random binary tree
- Red–black tree
- Rope (data structure)
- Rotation distance
- Scapegoat tree
- Segment tree
- Self-balancing binary search tree
- Skew heap
- Splay tree
- T-tree
- Tango tree
- Threaded binary tree
- Top tree
- Treap
- Tree rotation
- Vantage-point tree
- WAVL tree
References
[1] https://en.wikipedia.org/wiki/Huffman_coding
Also known as Alphabetic Huffman coding, Alphabetic Huffman tree, Applications of Huffman coding, Coding tree, Data compression/Huffman coding, Fixed-length Huffman code, Hoffman coding, Hu-Tucker, Hu-Tucker coding, Huffman Code, Huffman Compression, Huffman Compression algorithm, Huffman Decoding, Huffman Encoding, Huffman algorithm, Huffman codes, Huffman tree, Huffman's algorithm, Huffman-Shannon-Fano coding, Huffmann code, Huffmann coding, Huffmann decoding, Huffmann encoding, Hufman coding, Hufman encoding, K-ary Huffman coding, K-ary Huffman encoding, K-ary Hufman encoding, Length-limited Huffman code, Length-limited Huffman coding, Limited-length Huffman code, Minimum variance Huffman code, Minimum variance Huffman coding, Static Huffman coding.