How to calculate the average amount of information in an hour?

It takes an hour's rest on average.

The formula for the information content of a message is I = log2(1/p), where p is the probability of the message and log2 is the logarithm with base 2. The in-depth, systematic study of information began with C. E. Shannon's pioneering work in 1948. In information theory, the messages output by a source are treated as random: until you receive a symbol, you cannot be certain which symbol the source sent. The purpose of communication is for the receiver, upon receiving the symbol, to eliminate its doubt (uncertainty) about the source, reducing that uncertainty to zero. The amount of information the receiver obtains from the source is therefore a relative quantity, H(U) - 0.

Information content and information entropy are distinct concepts. Information entropy is a physical quantity describing the statistical characteristics of the source itself: it represents the average uncertainty of the symbols the source generates, and it is an objective quantity whether or not a receiver exists.
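As a quick illustration of the two quantities discussed above, here is a small Python sketch (the function names are my own, not from any standard library) computing the information content I = log2(1/p) of a single event and the entropy H(U), i.e. the probability-weighted average of I over all source symbols:

```python
import math

def self_information(p):
    """Information content of an event with probability p, in bits: I = log2(1/p)."""
    return math.log2(1.0 / p)

def entropy(probs):
    """Shannon entropy H(U): the average information content of the source,
    sum over symbols of p * log2(1/p). Zero-probability symbols contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin flip: each outcome has probability 1/2, carrying 1 bit.
print(self_information(0.5))   # 1.0
# Entropy of the fair coin source: 1 bit per symbol on average.
print(entropy([0.5, 0.5]))     # 1.0
# A biased coin is more predictable, so its entropy is below 1 bit.
print(entropy([0.9, 0.1]))
```

Note how the rarer an event is (smaller p), the larger its information content, while entropy measures the source's average uncertainty regardless of any particular received symbol.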