Coding theory: source coding, channel coding
Source coding and channel coding are the fundamental concerns of information theory.
Information theory, however, does not consider message importance or meaning, as these are matters of the quality of data rather than the quantity and readability of data, the latter of which is determined solely by probabilities. "Thank you, come again" conveys less information than "Call an ambulance!" not because it is less important or less urgent, but because it is said more often.
The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel.
The most fundamental results of this theory are Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy; and Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided that the rate of communication is below a certain threshold, called the channel capacity. The channel capacity can be approached in practice by using appropriate encoding and decoding systems.
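As a small illustration of the source coding theorem, the sketch below (a hypothetical four-symbol source; the probabilities and code are chosen for the example, not taken from the text) computes the entropy of a distribution and shows that a prefix-free code matched to the probabilities achieves exactly that many bits per symbol on average:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical four-symbol source with dyadic probabilities.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
H = entropy(probs.values())
print(f"entropy: {H} bits/symbol")  # 1.75

# A prefix-free code whose codeword lengths equal -log2(p)
# meets the entropy bound exactly for this source.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}
avg_len = sum(p * len(code[s]) for s, p in probs.items())
print(f"average code length: {avg_len} bits/symbol")  # 1.75
```

For sources whose probabilities are not powers of two, the best prefix-free code's average length exceeds the entropy by less than one bit; coding over long blocks closes the gap, which is the content of the theorem.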
In practice, these two concerns take the form of data compression (source coding) and error correction (channel coding).
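The channel-capacity threshold can be made concrete for the simplest noisy channel, the binary symmetric channel, whose capacity has the closed form C = 1 - H(p), where H is the binary entropy function and p the crossover (bit-flip) probability. A minimal sketch (the function names are illustrative, not from the text):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use."""
    return 1 - h2(p)

for p in (0.0, 0.1, 0.5):
    print(f"p = {p}: capacity = {bsc_capacity(p):.3f} bits/use")
```

A noiseless channel (p = 0) carries a full bit per use, while at p = 0.5 the output is independent of the input and the capacity drops to zero; the noisy-channel coding theorem says reliable communication is possible at any rate strictly below C, and impossible above it.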