Entropy of a probability distribution — in layman’s terms

The entropy of a probability distribution is the average "element of surprise," or amount of information, you get when drawing a sample from that distribution.
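For reference, this intuition is exactly what Shannon's entropy formula captures: each outcome's surprise is $-\log_2 p_i$ (rare outcomes are more surprising), and entropy averages that surprise, weighting each outcome by how often it occurs:

$$H(p) = -\sum_i p_i \log_2 p_i$$

Using base-2 logarithms gives entropy in bits; other bases just rescale the units.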

Let's consider a probability distribution with two outcomes: "the sun will rise tomorrow" (probability 1) and "the sun will not rise tomorrow" (probability 0). The values 0 and 1 are chosen purely for illustration; a more realistic distribution might be something like 0.999999999 and 0.000000001.
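Here is a minimal sketch in Python of what that looks like numerically (the helper function `entropy` is my own illustration, not from the original text). It shows that both the idealized and the "more realistic" sun-rise distributions carry essentially zero surprise, while a fair coin flip carries the maximum surprise a two-outcome distribution can have:

```python
import math

def entropy(probs):
    """Shannon entropy in bits; 0 * log(0) is treated as 0 by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The sun-rise distribution from the text: almost all mass on one outcome.
print(entropy([1.0, 0.0]))              # 0.0 bits  -- no surprise at all
print(entropy([0.999999999, 1e-9]))     # ~3e-08 bits -- almost no surprise
print(entropy([0.5, 0.5]))              # 1.0 bit   -- maximum for two outcomes
```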