What do you understand by Information in Data Compression?
The amount of information conveyed by a message increases with the uncertainty about that message.
The more we know about the message a source will produce, the lower the uncertainty and the less information the message conveys.
In communication theory, entropy is the measure of this uncertainty about the messages a source produces.
The starting point of information theory is the concept of uncertainty.
Let us define an event as an occurrence that can result in one of many possible outcomes.
The outcome is known only after the event has occurred; before its occurrence, we do not know which of the possible outcomes will result.
We are thus uncertain about the outcome before the occurrence of the event.
After the event has occurred, we are no longer uncertain about it.
If we know or can assign a probability to each outcome, then we have some information about which outcome is most likely to occur.
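These ideas can be made quantitative with Shannon's self-information, I(x) = -log2 p(x), which assigns more information to less probable outcomes, and entropy, the average self-information over all outcomes of a source. A minimal sketch in Python (the function names here are illustrative, not standard library APIs):

```python
import math

def self_information(p):
    """Information (in bits) conveyed by an outcome with probability p."""
    return -math.log2(p)

def entropy(probs):
    """Average information (entropy, in bits) of a source with the given
    outcome probabilities. Zero-probability outcomes contribute nothing."""
    return sum(p * self_information(p) for p in probs if p > 0)

# A fair coin has maximum uncertainty for two outcomes: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss conveys less information.
print(entropy([0.9, 0.1]))   # about 0.469

# A rare outcome conveys more information than a likely one.
print(self_information(0.1) > self_information(0.9))  # True
```

This matches the intuition above: the more predictable the source (lower uncertainty), the lower its entropy, and hence the less information each message carries.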