Entropy

In the early nineteenth century, the French engineer Sadi Carnot studied how well engines can convert thermal energy into mechanical energy. He proved that even an ideal engine must give off some waste heat. The result is best described in terms of a quantity called entropy.

Entropy is often defined as a measure of the randomness of a system: the more random or disordered the system, the greater its entropy. For example, as the aroma from a cup of coffee spreads through a room, the entropy increases, since the aroma molecules become far more spread out than when they were all inside the cup.
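This picture can be made quantitative with Boltzmann's formula, S = k ln W, where W counts the number of microscopic arrangements (microstates) available to the system and k is Boltzmann's constant. The short sketch below is an illustration added here, not part of the original lesson: it models the aroma molecules as particles that can each sit in one of a number of equally likely "cells" of space, so that spreading into a larger volume (more cells) means more microstates and therefore higher entropy.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in joules per kelvin


def gas_entropy(n_molecules: int, cells: int) -> float:
    """Entropy S = k * ln(W) for n molecules, each free to occupy
    any one of `cells` equally likely positions, so W = cells ** n.
    We use ln(cells ** n) = n * ln(cells) to avoid huge integers."""
    return K_B * n_molecules * math.log(cells)


# Aroma confined to the cup (few cells) versus spread through
# the room (many more cells): same molecules, more microstates.
n = 1_000_000                      # a toy number of aroma molecules
s_cup = gas_entropy(n, cells=10)
s_room = gas_entropy(n, cells=10_000)

assert s_room > s_cup  # spreading out increases the entropy
```

The numbers of molecules and cells are arbitrary toy values; the point is only that enlarging the accessible volume multiplies the microstate count W, so S = k ln W must grow.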



eTAP
Copyright 2001-2018 B. J. Subbiondo. All rights reserved.