Entropy Simplified
Information theory defines entropy as the average information conveyed by the outcomes of a random variable. A random variable is one whose possible outcomes are not deterministic: we cannot predict its behavior exactly in analytical form. In such cases, we can only study the patterns of its outcomes …
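The "average information" idea can be made concrete with Shannon's formula, H(X) = −Σ p(x) log₂ p(x), which weights the surprise of each outcome by its probability. A minimal sketch (the function name and example distributions are illustrative, not from the text):

```python
import math

def shannon_entropy(probs):
    """Average information, in bits, of a distribution given as outcome probabilities."""
    # Outcomes with zero probability contribute nothing, so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

Note how entropy shrinks as the distribution becomes more predictable: the measure captures exactly the uncertainty described above.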