Tuesday, 9 July 2019

THEORY OF INFORMATION


Information theory is a branch of mathematics that overlaps with communications engineering, biology, medical science, sociology, and psychology. The theory is devoted to the discovery and exploration of mathematical laws that govern the behavior of data as it is transmitted, stored, or retrieved.
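One of the most fundamental of these mathematical laws is Shannon entropy, which measures the average information content of a message source in bits. The short Python sketch below illustrates the standard formula H = -Σ p·log₂(p); the function name and sample distributions are my own illustrative choices, not part of the original text.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: H = -sum(p * log2(p))."""
    # Terms with zero probability contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits
```

The biased coin's lower entropy captures the intuition that predictable sources convey less new information per symbol.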
The first component of Shannon's communication model, the message source, is simply the entity that originally creates the message. Often the message source is a human, but in Shannon's model it could also be an animal, a computer, or some other inanimate object. The encoder is the object that converts the message into the actual physical signals that are sent. In a telephone conversation, for example, the speaker is the message source and the telephone handset acts as the encoder, converting sound waves into electrical signals.
Shannon's 1948 paper, "A Mathematical Theory of Communication," caught the immediate attention of mathematicians and scientists worldwide. Several disciplines grew out of reactions to it, including information theory, coding theory, and the entropy theory of abstract dynamical systems.
Whenever data is transmitted, stored, or retrieved, a number of variables come into play, such as bandwidth, noise, data transfer rate, storage capacity, number of channels, propagation delay, signal-to-noise ratio, accuracy (or error rate), intelligibility, and reliability. In audio systems, additional variables include fidelity and dynamic range.
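Several of these variables are tied together by the Shannon–Hartley theorem, which bounds the error-free data rate of a channel by its bandwidth and signal-to-noise ratio: C = B·log₂(1 + S/N). The sketch below applies the formula; the 3 kHz / 30 dB figures are hypothetical values chosen for illustration, not from the original text.

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical voice-grade channel: 3 kHz bandwidth, 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)            # convert 30 dB to a linear power ratio (1000)
capacity = channel_capacity(3000, snr)
print(round(capacity))           # roughly 30,000 bits per second
```

The formula makes the trade-off concrete: raising either the bandwidth or the signal-to-noise ratio raises the maximum achievable data rate, but noise places a hard ceiling on any channel.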
Information theory is an evolving discipline and continues to generate interest among experimentalists and theorists.

Happy Learning!
Anamika Gupta
IAAN
