A measure of the disorder of a system. Systems tend to go from a state of order (low entropy) to a state of maximum disorder (high entropy).

The entropy of a system is related to the amount of information it contains. A highly ordered system can be described using fewer bits of information than a disordered one. For example, a string containing one million "0"s can be described using run-length encoding as [("0", 1000000)], whereas a string of random symbols (e.g. bits or characters) will be much harder, if not impossible, to compress in this way.

Shannon's formula gives the entropy H(M) of a message M in bits:

	H(M) = -log2 p(M)

Where p(M) is the probability of message M.

(1998-11-23)
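Both ideas in the entry, run-length encoding of a highly ordered string and Shannon's formula for the information in a message, can be sketched in a few lines of Python. The function names here are illustrative, not part of any standard library:

```python
import math
from itertools import groupby

def run_length_encode(s):
    # Collapse runs of identical symbols into (symbol, count) pairs.
    return [(ch, len(list(group))) for ch, group in groupby(s)]

def information_bits(p):
    # Shannon's formula: H(M) = -log2 p(M), the information content
    # in bits of a message with probability p.
    return -math.log2(p)

# A highly ordered string compresses to a single pair:
ordered = "0" * 1_000_000
print(run_length_encode(ordered))   # [('0', 1000000)]

# A message with probability 1/2 (e.g. one fair coin flip) carries one bit:
print(information_bits(0.5))        # 1.0
```

A random string, by contrast, would yield roughly as many (symbol, count) pairs as it has characters, making the "compressed" form no shorter than the original.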