

7.6 Entropy

Rapidly developing science gives rise to new vocabulary, and this gives us an excuse to reflect on how human languages evolve. It is fascinating to trace this process, spanning from the dawn of mankind to the modern day. For example, officers of the Russian army, after entering Paris in 1814, used to spur on French waiters in Russian. The Russian word bistro (meaning “quickly”) was soon absorbed into French and then into other Western languages; everybody now understands that a “bistro” is a modestly set small restaurant offering inexpensive, simple meals. Much more recently, we witnessed the English language acquire such strangers as “pogrom”, “sputnik”, and “perestroika” from Russian, the French language pick up “airbag” from English, and the Russians borrow words like “computer”; some English abbreviations even became words in Russian, such as PR (public relations). Novel words usually enrich the language, as they represent new things and ideas. For instance, the word “computer” is literally absorbed into Russian to distinguish the modern universal device from a “machine for calculations”; likewise, the word sputnik in English does not mean “satellite” in general (which would have been the correct translation), but rather refers to the first Russian satellite, and so evokes memories and the mood of that period of time.


Speaking more specifically of scientific words, many people are afraid of them, and indeed some of the newly invented ones may sound pretty horrible (like “uniformitarianism” or “compartmentalization”). Such words have a very narrow use, and clutter up the language. We honestly think that their authors must have lacked a sense of moderation! Here is a telling statement by Samuel Goudsmit, an editor of one of the leading scientific journals, Physical Review: “We find that [neologisms] are often ungrammatical, frequently ugly, sometimes chauvinistic, likely to be obscure, and usually unnecessary”. Nevertheless, there have been some really valuable scientific contributions to the world’s vocabularies, and the word “entropy” is among them; moreover, it certainly deserves a place near the top of the list.

Together with energy, time, and so on, entropy is one of the most crucial concepts of physics, and of science in general. Unfortunately, ever since the idea of entropy appeared, it has always been surrounded by a halo of mystery. For instance, the following definition is attributed to the well-known physical chemist Wilhelm Ostwald (1853–1932; he was born in Riga, educated in Tartu, worked most of his life in Leipzig, and was awarded the Nobel Prize in 1909): “Energy is the queen of the world, and entropy is her shadow!”

Such an attitude is not without reason. How do people hear about entropy for the first time? Quite often it gets mentioned in the context of the most global and tantalizing problems, such as the origin of life, or the future of the Universe. Perhaps this explains why there is usually no room for entropy in the school curriculum. However, it is quite a straightforward thing.

To get to know it in the first instance, you do not need to dive into obscure philosophical matters. Moreover, it is hard to manage without entropy if you aim to describe the atomic properties of matter. It would be a bit like trying to explain the rules of football without mentioning the ball!

This is especially true for polymers. Now you understand why we need to digress from the main theme, and talk about entropy in more detail.

Let’s think of energy, for a start. How would you define it? Of course, you can split it up into various forms, e.g. potential energy, kinetic energy, etc., and describe them separately. However, the real meaning of energy is revealed by the conservation law. Consider a complex system. Suppose we know that somewhere in this system a certain form of energy has decreased.

This means that the energy of the other parts must have increased (given that the system is isolated). Thus, we are able to draw the right conclusion straightaway. The great thing is that we don’t need to know anything about the way the system functions or what it is made of.

Now back to entropy. Equation (7.2) can be regarded as its definition.

As we have already said, entropy is the energy equivalent of probability.

In other words, if you look at how much the value (−TS) has changed, it will tell you exactly how much work has been done to transfer the system from a more probable to a less probable state. In this case, yet again, just like with energy, you need not worry about the details, e.g. what did the work (a piston, or an electric field, etc.), how the molecules collided with the object doing the work, and so forth.
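To make this concrete, here is a short sketch in our own notation (the symbol Γ is ours, introduced just for this sketch): Eq. (7.2) is the Boltzmann formula $S = k_B \ln \Gamma$, where $\Gamma$ counts the microscopic realizations of a state. Transferring the system at temperature $T$ from a state with $\Gamma_1$ realizations to a less probable one with $\Gamma_2 < \Gamma_1$ then requires at least the work

$$ W_{\min} = -T\,\Delta S = -T\,(S_2 - S_1) = k_B T \ln\frac{\Gamma_1}{\Gamma_2} > 0 , $$

positive precisely because the final state is less probable.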

What exactly does the Boltzmann principle (7.2) mean? Its main idea is that the quantity U_eff = −TS, defined by (7.3) and (7.2), can be regarded as some sort of potential energy. Indeed, if the system is left to itself, it is most likely to drop down into the most likely state (sorry for this tautology!). According to (7.2) and (7.3), this would mean an increase in entropy, and hence a decrease in U_eff, which is just what the principle of minimum potential energy predicts.
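In the same sketch notation, combining (7.2) with the definition of U_eff gives

$$ U_{\text{eff}} = -TS = -k_B T \ln \Gamma , $$

so the most probable state, the one with the largest $\Gamma$, is exactly the one with the lowest U_eff: “rolling downhill” in U_eff is the same thing as climbing toward maximum entropy.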

Figure 7.6 sketches the function U_eff(R) for an ideal polymer chain, in accordance with (7.5). The graph has the shape of a potential well.

However, you cannot say that “sitting” at the bottom of the well corresponds to equilibrium. We are talking about non-zero temperatures here. Suppose you have a little ball at temperature T, and you put it into a proper potential well (not U_eff, but merely U). What will it do? It will go jittering around the equilibrium position, in a random Brownian way.

The typical size of the swings will be such that the potential energy increases by about k_BT. (By the way, this is just how physicists estimate the amplitudes of thermal oscillations of atoms in a crystal.) A similar thing can be said about U_eff. As you can see from Equation (7.5), the condition U_eff(R) − U_eff(0) ≈ k_BT leads to the result that the distance R ≈ N^{1/2}ℓ (as usual, we have dropped numerical factors of order unity).

Fig. 7.6 The dependence of the effective potential energy U_eff = −TS of a polymer on the x-component of its end-to-end vector R. The picture shows how the amplitude of the fluctuations in R (the typical fluctuation range) can be found from the condition that U_eff reaches up to about k_BT above the minimum.


This result is exactly what we want: the most probable end-to-end distance for a single polymer coil (see Section 6.7).
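Spelled out, the estimate goes as follows; here we assume that (7.5) has the standard Gaussian-coil form (an assumption on our part; only the scaling matters):

$$ U_{\text{eff}}(R) - U_{\text{eff}}(0) \approx \frac{3 k_B T R^2}{2 N \ell^2} \approx k_B T \quad\Longrightarrow\quad R \approx \ell\, N^{1/2} , $$

with numerical factors of order unity dropped, as before.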

Now the Boltzmann equation has become a little clearer, because we have sorted out U_eff, and agreed that it is something like potential energy.
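If you have a computer handy, you can even check the N^{1/2} scaling numerically. Here is a minimal sketch in Python (the freely-jointed-chain model, the chain lengths, and the sample sizes are our illustrative choices, not anything prescribed by the text): it grows chains of N unit segments pointing in random directions and measures the root-mean-square end-to-end distance.

import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def rms_end_to_end(n_segments, n_chains=2000, seg_len=1.0):
    """RMS end-to-end distance of freely-jointed chains with n_segments links."""
    # Random unit vectors: draw Gaussian triples and normalize each segment.
    steps = rng.normal(size=(n_chains, n_segments, 3))
    steps /= np.linalg.norm(steps, axis=2, keepdims=True)
    ends = seg_len * steps.sum(axis=1)  # end-to-end vectors R
    return np.sqrt((ends ** 2).sum(axis=1).mean())

for n in (100, 400, 1600):
    # Ideal-chain prediction: R is about seg_len * sqrt(n)
    print(n, round(rms_end_to_end(n), 1), round(n ** 0.5, 1))

Quadrupling N should roughly double the printed distance, in agreement with R ≈ ℓN^{1/2}.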

Yet it is so tempting to actually try to derive the equation! Let’s do it for an ideal gas, in the next section.
