The word entropy is sometimes confused with energy. Although they are related quantities, they are distinct.
As described in previous sections, energy measures the capability of an object or system to do work.
Entropy, on the other hand, is a measure of the "disorder" of a system. What "disorder" refers to is really the number of different microscopic states a system can be in, given that the system has a particular fixed composition, volume, energy, pressure, and temperature. By "microscopic states", we mean the exact states of all the molecules making up the system.
The idea here is that just knowing the composition, volume, energy, pressure, and temperature doesn't tell you very much about the exact state of each molecule making up the system. For even a very small piece of matter, there can be trillions of different microscopic states, all of which correspond to the sample having the same composition, volume, energy, pressure, and temperature. But you're ignorant of exactly which one the system is in at any given moment - and that turns out to be important.
Why should it be important, after all, if you know the bulk properties? Isn't that all one usually needs? It turns out that no: if you want to, say, extract energy from steam and convert it to useful work, those microscopic details turn out to be crucial! (More on this below.)
For those who are technically inclined, the exact definition is
Entropy = (Boltzmann's constant k) x (logarithm of the number of possible states)
= k log(N).
Since the logarithm of a number always increases as the number increases, the more possible states the system can be in (given that it has a particular volume, energy, pressure, and temperature), the greater the entropy.
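For readers who like to see this concretely, here is a small Python sketch of that definition. The use of the natural logarithm and the value of Boltzmann's constant are standard conventions in statistical mechanics, not details taken from this text:

```python
import math

# Boltzmann's constant in joules per kelvin (exact SI value)
k = 1.380649e-23

def entropy(num_states):
    """Entropy S = k * ln(N) for a system with N equally likely microstates."""
    return k * math.log(num_states)

# Because log turns products into sums, doubling the number of accessible
# states always adds the same fixed amount of entropy, k * ln(2),
# no matter how many states you started with.
s1 = entropy(1_000_000)
s2 = entropy(2_000_000)
print(s2 - s1)                                  # k * ln(2)
print(math.isclose(s2 - s1, k * math.log(2)))   # True
```

This additive behavior is one reason the logarithm appears in the definition: the entropy of two independent systems is the sum of their entropies, while their state counts multiply.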
Again, because we can't see which particular microscopic state a system is in, people often like to say that entropy is a quantitative measure of how uncertain or ignorant one is about the exact, detailed, microscopic state of a system. Another popular way of saying this is that entropy measures the microscopic disorder of a system.
As a simple example, suppose that you put a marble in a large box, and shook the box around, and you didn't look inside afterwards. Then the marble could be anywhere in the box. Because the box is large, there are many possible places inside the box that the marble could be, so the marble in the box has a high entropy. Now suppose you put the marble in a tiny box and shook up the box. Now, even though you shook the box, you pretty much know where the marble is, because the box is small. In this case we say that the marble in the box has low entropy.
The same idea applies to the arrangements of atoms of a gas in a jar at room temperature. The smaller the jar, the lower the entropy. But keep in mind that we also have to consider the velocities of the gas particles to have full knowledge of their states. The higher the temperature of the gas, the faster the gas particles are moving on average, so the wider the range of possible velocities for the gas particles, and hence, the more uncertainty we have about the velocity of any particular particle. Thus, higher temperature, as well as greater volume, means higher entropy.
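The marble-and-box picture can be made quantitative with a short sketch, under the idealized assumption that the number of possible positions for a particle simply scales with the volume it can occupy (the function name and the volumes used are illustrative, not from the text):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def positional_entropy_change(v_small, v_large, n_particles=1):
    """Entropy gained when the volume accessible to n particles grows
    from v_small to v_large. If each particle's count of possible
    positions scales with volume, then Delta S = n * k * ln(v_large / v_small)."""
    return n_particles * k * math.log(v_large / v_small)

# One "marble" (a single particle) moved from a 1-litre box to a
# 10-litre box gains only a minuscule k * ln(10) of entropy:
print(positional_entropy_change(1.0, 10.0))

# A mole of gas particles doubling their volume gains a macroscopic
# amount, because every one of the ~6e23 particles contributes k * ln(2):
N_A = 6.02214076e23  # Avogadro's number
print(positional_entropy_change(1.0, 2.0, N_A))   # about 5.76 J/K
```

The contrast between the two printed numbers shows why entropy only becomes noticeable for everyday objects when trillions upon trillions of particles are involved.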
Scientists say that entropy, like energy, volume, temperature, and pressure, is another thermodynamic state variable of a system. It turns out that, for a simple system, if you know any two of these state variables, then the others are all determined. Although the word entropy might seem like a mysterious concept, it's really not. Remember that it's really just a measure of the number of states a system can be in, given the constraints on the system.
What is entropy good for? Knowing the entropy of a system can tell us many things about what can and can't happen. In particular, it's the basis for the second law of thermodynamics: the Universe evolves such that its total entropy always stays the same or increases (the first law of thermodynamics is conservation of energy).
Why is this so? In fact, the basic idea of entropy is simple to understand. Suppose you are floating out in space and you have a jar containing a particular gas, say argon. When you open the jar for a moment, the argon will almost certainly escape out into space. After the argon has escaped, its entropy is greatly increased (and it continues to increase as the gas expands). How do I know that the entropy increased? Because the number of states that the argon gas can be in when it occupies a much larger volume is much greater than when it's confined to the jar. So, the entropy of the gas increases when the argon escapes.

But why must the argon escape? In fact, prior to opening the jar, if you arranged the microscopic states of the argon molecules in just the right way, you could open the jar for a moment and not have the argon escape. The point is that it is highly improbable that the argon is in one of these special non-escaping states when you open the jar - most of the states lead to the gas escaping. This is really the content of the second law: if you begin not knowing the microscopic state of a system, then the system is more than likely to evolve to a state where you are even more ignorant of its exact microscopic state. Just knowing the thermodynamic state variables of a system, such as its temperature and pressure, means you are in fact ignorant about the initial exact microscopic state - all you can know from the state variables is the number of possible microscopic states it can be in, i.e. the entropy. Hence, for most situations we encounter, chances are that entropy will increase with time.
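A quick calculation shows just how improbable those special non-escaping states are. Under the simplifying assumption that each atom independently has a 50/50 chance of being found in the original half of a doubled volume, the chance that all N atoms are simultaneously back in the jar's half is (1/2)^N:

```python
import math

def log10_prob_all_in_half(n_atoms):
    """Base-10 log of the probability that all n_atoms, free to roam a
    doubled volume, happen to sit in the original half at one instant.
    That probability is (1/2)**n_atoms, assuming independent positions."""
    return -n_atoms * math.log10(2)

# Ten atoms: about one chance in a thousand - rare but conceivable.
print(log10_prob_all_in_half(10))        # roughly -3

# A mole of atoms: the exponent is about -1.8e23. A probability of
# 10**(-1.8e23) will never occur in the lifetime of the Universe.
print(log10_prob_all_in_half(6.02e23))
```

This is why the second law is statistical rather than absolute: nothing forbids the gas from crowding back into the jar, but for macroscopic numbers of atoms the odds against it are beyond astronomical.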
It is very interesting to compare the behavior of entropy with that of energy. Unlike energy, entropy can be created (but not generally destroyed). In fact, your body is creating some right now as it generates heat. One of the reasons that your body temperature has to be higher than the surrounding air, or that you have to sweat off water if it isn't, is that you have to get rid of the extra entropy (otherwise, you would become disorganized and eventually die). The energy that your warm body radiates carries away the extra entropy. It does this because losing this energy decreases the number of microscopic states that the atoms and molecules of your body can be in.
Another practical example of entropy is the following. Suppose we want to use a source of heat, say, steam generated by heating water, to drive some kind of turbine. Then, it turns out, by considering entropy, that the maximum efficiency of our process will be less than 100%. The reason is that when heat is brought into the turbine, it carries some entropy with it. We can't keep this entropy in the turbine, because the turbine would become microscopically disordered and eventually break. So some heat energy has to be released to the outside world to get rid of this entropy and protect the turbine. The heat released for this purpose therefore can't be converted into work (otherwise it wouldn't be available anymore to release as heat). We get rid of the unwanted entropy by rejecting this heat to the outside world at a lower temperature than we brought it in at. The reason for the lower temperature is that heat released into a low-temperature environment carries more entropy out of the turbine than the same amount of heat carries in at a high temperature. This is because heat disrupts a cold system more than a hot one, since the hot one is already more disordered. Thus, we need only sacrifice some of the heat carried into the turbine to get rid of the entropy that heat imported in the first place. One can also see from this discussion why power plants need a cold-temperature environment to dump their waste heat.
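This entropy bookkeeping leads to the standard Carnot limit on efficiency, 1 - Tcold/Thot: the entropy brought in with the heat (Qin/Thot) must all be dumped with the rejected heat (Qout/Tcold). Here is a minimal sketch, assuming an ideal reversible engine (real turbines do worse) and illustrative temperatures:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of input heat convertible to work, given that
    the entropy brought in (Q_in / t_hot) must equal the entropy
    dumped to the cold environment (Q_out / t_cold).
    Temperatures are absolute (kelvins)."""
    if not 0 < t_cold < t_hot:
        raise ValueError("need t_hot > t_cold > 0 (kelvins)")
    return 1.0 - t_cold / t_hot

# Illustrative steam turbine: ~800 K steam, ~300 K cooling water.
print(carnot_efficiency(800.0, 300.0))   # 0.625 -> at most 62.5% efficient
```

Notice that the only way to push the efficiency toward 100% is to make the cold reservoir colder or the hot one hotter, which is exactly why power plants seek out cold rivers, cooling towers, or seawater for their waste heat.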
Now this all might seem a little too abstract. Here's another way to look at it: the kinetic energy of the steam molecules is large (because the steam is hot), but the directions of the molecules are disordered. To convert all of the energy of the steam into useful work, you'd somehow have to line them all up in the same direction (at least, say, one at a time or in groups). But you're ignorant of the exact configuration at any given instant, right? And even if you weren't, how are you going to get in there and actually do it for each molecule? Clearly, the microscopic disorder is a barrier. This shows how being ignorant of those details, which might seem a minor matter intuitively, actually has real consequences for things you would like to do!
The example above demonstrates how heat energy, because it can't be completely converted to mechanical energy in a turbine, is, in a sense, of lesser quality than mechanical energy. People have in fact rated the quality of energy in this sense for many different sources. Solar electric energy captured by photovoltaic cells, in particular, is energy of very high "quality": virtually all of it can be converted to mechanical energy.
Entropy can also be applied to many other situations. For example, it can be used to predict the direction that a chemical reaction will proceed.