en·tro·py /ˈɛntrəpi/ [en-truh-pee] noun
(on a macroscopic scale) a function of thermodynamic variables, such as temperature, pressure, or composition, that is a measure of the energy that is not available for work during a thermodynamic process. A closed system evolves toward a state of maximum entropy.
(in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S
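The statistical-mechanics sense is conventionally summarized by Boltzmann's relation, which ties the entropy S to the number of microscopic arrangements W compatible with the system's macroscopic state (k_B is Boltzmann's constant):

```latex
S = k_B \ln W
```

More randomness means more available microstates, hence a larger W and a larger S.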
(in data transmission and information theory) a measure of the uncertainty or average information content of a message; also, a measure of the loss of information in a transmitted signal or message.
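The information-theoretic sense can be illustrated with a short computation. The sketch below implements Shannon's entropy formula, H = −Σ p·log₂(p), for a discrete probability distribution; the function name is chosen here for illustration:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete probability distribution."""
    # Terms with p == 0 contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin toss carries one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # → 1.0

# A certain outcome carries none.
print(shannon_entropy([1.0]))  # → 0.0
```

Maximum entropy corresponds to a uniform distribution, echoing the macroscopic sense above: the more evenly spread the probabilities, the greater the uncertainty.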
(in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).
a doctrine of inevitable social decline and degeneration.