Information is organised data that has meaning or value for its receiver; it is the processed data on which actions and decisions are based. In information technology, data is stored in databases and is accessed and processed digitally. Information and data are not the same thing: data refers to raw qualitative or quantitative facts about an event or object, while information is produced when data is presented in a form that is meaningful to the receiver. To convert data into information, it must be organised and processed according to specified parameters. Organising data so that it has value and meaning is known as information design, an important aspect of both human-computer interaction and information architecture.
What is Information Theory?
Information theory is a mathematical representation of the parameters and conditions that affect the processing and transmission of information. Electrical engineer Claude Shannon laid the foundation of information theory in the mid-20th century. It is predominantly concerned with communication engineering, although some of its ideas have been adopted in fields such as linguistics and psychology. Information theory overlaps considerably with communication theory, but it is directed more towards the fundamental limits on communicating and processing information and less towards the operation of particular communication devices.
Information theory examines the utilisation, processing, transmission and extraction of information. Abstractly, information can be regarded as the resolution of uncertainty. Claude Shannon formalised this idea for communication over noisy channels in his 1948 paper "A Mathematical Theory of Communication". In his formulation, a source chooses from a set of possible messages, and the goal is to transmit the chosen message over a noisy channel so that the receiver can reconstruct it with negligible probability of error, despite the channel noise. The central result of Shannon's work is the noisy-channel coding theorem.
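To make the capacity idea behind the noisy-channel coding theorem concrete, here is a minimal Python sketch (not from the original article) for one assumed channel model, the binary symmetric channel that flips each transmitted bit with probability p. Its capacity is C = 1 − H(p) bits per channel use, the highest rate at which the receiver can recover messages with negligible error probability.

```python
# A minimal sketch (illustrative assumption: a binary symmetric channel).
# The noisy-channel coding theorem guarantees reliable communication at
# any rate below the channel capacity C; for this channel C = 1 - H(p).
import math

def binary_entropy(p):
    """Entropy H(p) in bits of a Bernoulli(p) distribution."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0  -> noiseless channel carries 1 bit per use
print(bsc_capacity(0.11))  # ~0.5 -> noise halves the usable rate
print(bsc_capacity(0.5))   # 0.0  -> pure noise, nothing gets through
```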
Evolution of Information Theory
The landmark event that established the field of information theory and brought it to worldwide attention was the publication of Claude E. Shannon's "A Mathematical Theory of Communication" in 1948. Before this work, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. In 1924, Harry Nyquist published the paper "Certain Factors Affecting Telegraph Speed", whose theoretical section quantifies "intelligence" and the "line speed" at which it can be transmitted by a communication system.
Much of the mathematics behind information theory for events of unequal probability had already been developed for thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. The relationship between thermodynamic entropy and information-theoretic entropy, including the important contributions of Rolf Landauer, is explored under entropy in thermodynamics and information theory. In his groundbreaking paper, completed at Bell Labs, Shannon for the first time put forward a qualitative and quantitative model of communication as a statistical process underlying information theory. The paper begins by asserting that the fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point.
Information and Source Theory
Any process that generates successive messages can be regarded as a source of information. A memoryless source is one in which each message is an independent, identically distributed random variable, whereas the properties of stationarity and ergodicity impose less restrictive constraints. All such sources are stochastic. These concepts are studied in depth in their own right, outside the scope of information theory.
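As a small illustrative sketch (the alphabet and probabilities below are assumptions, not from the article), the following simulates a memoryless source: every symbol is drawn independently from the same fixed distribution, so the output sequence is i.i.d.

```python
# A minimal sketch of an assumed memoryless source over {'a', 'b', 'c'}:
# each emitted symbol ignores all previous ones (independent draws from
# one fixed distribution), which is the defining property of such a source.
import random
from collections import Counter

alphabet = ['a', 'b', 'c']
probabilities = [0.5, 0.3, 0.2]   # assumed source distribution

# Emit 10,000 successive messages from the source.
sequence = random.choices(alphabet, weights=probabilities, k=10_000)

# The empirical symbol frequencies approach the source distribution.
counts = Counter(sequence)
for symbol in alphabet:
    print(symbol, counts[symbol] / len(sequence))
```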
Entropy and Information Theory
Entropy is a key measure in information theory. It quantifies the amount of uncertainty in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than identifying the outcome of a fair die roll (six equally likely outcomes). Other important measures in information theory are relative entropy, channel capacity, mutual information, and error exponents. Important subfields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
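A minimal sketch of the coin-versus-die comparison above, computing the Shannon entropy H = −Σ p log₂ p in bits for each distribution:

```python
# Entropy of a discrete distribution; a uniform distribution over n
# outcomes has H = log2(n) bits.
import math

def entropy(probabilities):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_coin = [0.5, 0.5]
fair_die = [1 / 6] * 6

print(entropy(fair_coin))  # 1.0 bit
print(entropy(fair_die))   # log2(6) ≈ 2.585 bits, so the die roll is more uncertain
```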
Information Theory and Probability Theory
Information theory is fundamentally based on probability theory and statistics. It usually deals with measures of information of the distributions associated with random variables. The main quantities of information are entropy (a measure of the information in a single random variable) and mutual information (a measure of the information shared between two random variables). Mutual information is a property of the joint distribution of two random variables, and it equals the maximum rate of reliable communication across a noisy channel, in the limit of long block lengths, when the channel statistics are determined by that joint distribution.
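The sketch below (the joint distribution is an assumed toy example) computes the quantities just described for two binary random variables X and Y: their marginal distributions and their mutual information.

```python
# Mutual information I(X; Y) = sum over x, y of
#   p(x, y) * log2( p(x, y) / (p(x) * p(y)) )
# computed from an assumed joint distribution of two binary variables.
import math

# Assumed joint distribution p(x, y): rows index X, columns index Y.
joint = [
    [0.4, 0.1],
    [0.1, 0.4],
]

p_x = [sum(row) for row in joint]         # marginal distribution of X
p_y = [sum(col) for col in zip(*joint)]   # marginal distribution of Y

mutual_information = sum(
    joint[x][y] * math.log2(joint[x][y] / (p_x[x] * p_y[y]))
    for x in range(2)
    for y in range(2)
    if joint[x][y] > 0
)
print(mutual_information)  # ≈ 0.278 bits of information shared between X and Y
```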
Coding Theory
Coding theory is one of the most important and direct applications of information theory. It can be divided into source coding theory and channel coding theory. Using a statistical description of the data, information theory quantifies the number of bits needed to represent the data, which is the information entropy of the source.
Coding theory is the study of the properties of codes and their suitability for particular applications. Codes are used for data compression, data transmission, cryptography, data storage, error detection and error correction. Codes are studied extensively in scientific fields such as computer science, electrical engineering, mathematics, and linguistics in order to design reliable and efficient data transmission techniques. This generally involves removing redundancy and detecting or correcting errors in the transmitted data. There are four fundamental types of coding: data compression (source coding), error control (channel coding), line coding, and cryptographic coding.
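As a sketch of source coding (an illustration with an assumed four-symbol source, not code from the article), the following builds a Huffman code and compares its average codeword length with the source entropy; the average length can approach, but never fall below, the entropy.

```python
# A minimal source-coding sketch for an assumed four-symbol source:
# build a Huffman code and compare its average length with the entropy.
import heapq
import math

probabilities = {'a': 0.5, 'b': 0.25, 'c': 0.15, 'd': 0.10}  # assumed source

# Each heap entry is (probability, tie-breaker, {symbol: code_so_far}).
heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probabilities.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    # Merge the two least probable subtrees, prefixing their codes with 0/1.
    p1, _, codes1 = heapq.heappop(heap)
    p2, _, codes2 = heapq.heappop(heap)
    merged = {s: '0' + c for s, c in codes1.items()}
    merged.update({s: '1' + c for s, c in codes2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1
codes = heap[0][2]

average_length = sum(probabilities[s] * len(codes[s]) for s in codes)
entropy = -sum(p * math.log2(p) for p in probabilities.values())
print(codes)            # e.g. {'a': '0', 'b': '10', 'd': '110', 'c': '111'}
print(average_length)   # 1.75 bits per symbol
print(entropy)          # ≈ 1.74 bits per symbol (the lower bound)
```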
Applications of Information Theory
Basic concepts of information theory are applied in source coding/data compression (for ZIP files, for example) and in channel coding/error detection and correction. Its role has been essential to the launch and success of the Voyager missions to deep space, the evolution of the Internet, the practicality of mobile phones, and the development of the compact disc. Information theory has also been applied in fields including quantum computing, molecular codes, thermal physics, molecular dynamics, black holes, anomaly detection, intelligence gathering, cryptography, linguistics, information retrieval, statistical inference, and even art.
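As a small illustration of the data-compression application mentioned above (a sketch, not part of the original article), the DEFLATE algorithm used by ZIP and zlib shrinks redundant, low-entropy data dramatically but can do little with data that is already close to maximum entropy:

```python
# A minimal sketch of source coding in practice: compress repetitive text
# versus random bytes and compare the resulting sizes.
import os
import zlib

redundant = b"information theory " * 1000   # highly repetitive, low entropy
random_bytes = os.urandom(len(redundant))   # near-maximum entropy

print(len(redundant), len(zlib.compress(redundant)))        # 19000 -> roughly a hundred bytes
print(len(random_bytes), len(zlib.compress(random_bytes)))  # barely shrinks at all
```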
Frequently Asked Questions – FAQs
Define information theory.
Information theory is a mathematical representation of the parameters and conditions that affect the processing and transmission of information.
Who laid the foundation of information theory?
Electrical engineer Claude Shannon laid down the foundation of information theory.
What is meant by information?
Information is organised data which possesses some meaningful application for the receiver.
What is the basic scope of information theory?
Information theory examines the utilisation, processing, transmission and extraction of information. Theoretically, information can be considered as the resolution of uncertainty.
Explain the fundamentals of the research paper called “A Mathematical Theory of Communication”.
The idea of information as the resolution of uncertainty in communication over noisy channels was formalised by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication". In this work, information is treated as a set of possible messages, and the goal is to transmit these messages over noisy channels so that the receiver can reconstruct the message with negligible error probability, despite the channel noise.
What is coding theory?
Coding theory is the study of the properties of codes and their suitability for particular applications. Codes are used for data compression, data transmission, cryptography, data storage, error detection and error correction.
Which are the scientific fields directly connected to coding theory?
Codes are extensively studied in many scientific fields like computer science, electrical engineering, mathematics, and linguistics for the purpose of developing reliable and efficient data transmission techniques.