Entropy, as I understand it, is the amount of energy not available to do work. So is that like the energy in a bond? I am also assuming that entropy is the amount of heat in a system that does not contribute to the temperature, so is that in any way related to specific heat capacity?
pantone159 - 16-7-2006 at 15:59
Entropy is the number of possible combinations of states of all the individual atoms/molecules that result in the overall bulk properties of the
material at the particular conditions involved.
(Actually, it is the logarithm of the number of combinations.)
Magpie - 22-7-2006 at 14:37
Entropy is a property that has always fascinated me ever since I first learned about it during my 3rd year in college. It is abstract enough that I
always have to review it before feeling anywhere near competent to discuss it. Since it is too hot to work in the lab (>100F) this is a good time
for me.
It is very interesting to me that entropy is a state function that can be quantified both as (1) the exponent of the number of molecular positions
possible, and (2) by the thermodynamic equation of the heat transferred reversibly divided by the absolute temperature of the transfer! That is:
delta S = q(rev)/T(abs)
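As a quick numerical illustration of that second form (a minimal Python sketch; the heat of vaporization of water, roughly 40.7 kJ/mol at the normal boiling point, is a standard handbook value):
[code]
# Minimal sketch: delta S = q_rev / T for a reversible phase change at constant T.
# Handbook values for water, used purely for illustration.

H_VAP = 40700.0    # J/mol, enthalpy of vaporization of water at the boiling point (approx.)
T_BOIL = 373.15    # K, normal boiling point of water

delta_S_vap = H_VAP / T_BOIL    # heat absorbed reversibly at constant T, divided by T
print(f"delta S of vaporization ~ {delta_S_vap:.1f} J/(mol*K)")   # ~109 J/(mol*K)
[/code]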
guy's original question about whether this is related to heat capacity is an interesting one for which I will have to continue my review.... Anyone else care to respond?
[Edited on 23-7-2006 by Magpie]
franklyn - 22-7-2006 at 21:20
Ludwig Boltzmann, James Maxwell, Leo Szilard, and many others of repute mused over this entropy thing. It comes down to the simple fact that the greater the number of individual objects, such as atoms, in a system, the greater the number of possibilities for their arrangement.
The arrangements we classify as random are vastly more numerous, and for that reason one of them is by far the most probable outcome.
It is supposed that mass and energy are created and destroyed all the time at the quantum level, but it is only the average of this
guy, I have reviewed the subject of entropy, so I feel I can now discuss it again without making too many mistakes. I trust you have been doing some
reading also.
My understanding is that entropy is a measure of disorder, not energy. An entropy increase occurs when water goes from a liquid to a gas,
for example. And in this case it does include the breaking of some bonds (hydrogen bonding) I would say.
A change in entropy can be computed, however, by dividing the heat transferred reversibly by the absolute temperature of the transfer. In that sense
it seems related to energy.
Since the heat transferred can also be equated to the heat capacity times the change in temperature for some processes, you could also say there's a
relation of entropy change to heat capacity... in a sense.
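One way to see that connection concretely (a minimal Python sketch, using a constant heat capacity roughly that of liquid water, purely for illustration): add the heat in many small steps, take q/T for each step, and the running total approaches C·ln(T2/T1).
[code]
# Minimal sketch: summing q/T = C*dT/T over many small heating steps.
import math

C = 75.3                    # J/(mol*K), roughly the molar heat capacity of liquid water
T1, T2 = 298.15, 348.15     # K
steps = 100000

dT = (T2 - T1) / steps
dS_sum, T = 0.0, T1
for _ in range(steps):
    dS_sum += C * dT / (T + dT / 2)    # q/T for one small, nearly reversible step
    T += dT

print(f"summed q/T    ~ {dS_sum:.3f} J/(mol*K)")
print(f"C*ln(T2/T1)   ~ {C * math.log(T2 / T1):.3f} J/(mol*K)")
[/code]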
I can recommend what I think is an especially good and concise book: ELEMENTARY CHEMICAL THERMODYNAMICS by Bruce H. Mahan, 1964, paperback, 155
pp. A knowledge of calculus is a requirement for a full understanding, however.
guy - 26-7-2006 at 20:34
A measure of disorder is one of the definitions, where entropy reflects the number of possible ways the system can arrange itself, like a permutation. This is S = k·log(N), where N is the number of possible quantum states. That is one way to look at it.
The way I am looking at it is using the definition that S = q(rev)/T. So I see it as the amount of energy in a system that cannot contribute to temperature (kinetic energy). An example is ice melting: heat is added yet the temperature remains the same, so entropy is increasing. The units for heat capacity are J/(K·g) and the units for entropy are J/K. I just thought about how similar they are in definition and units.
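Putting numbers on that ice-melting picture (a minimal Python sketch; the heat of fusion of ice, about 6.01 kJ/mol or 334 J/g, is a standard handbook value):
[code]
# Minimal sketch: entropy of melting ice, delta S = q_rev / T at constant T.
H_FUS_MOLAR = 6010.0    # J/mol, enthalpy of fusion of ice
H_FUS_MASS = 333.6      # J/g, the same quantity per gram
T_MELT = 273.15         # K

print(f"delta S per mole ~ {H_FUS_MOLAR / T_MELT:.1f} J/(mol*K)")   # ~22.0
print(f"delta S per gram ~ {H_FUS_MASS / T_MELT:.2f} J/(K*g)")      # ~1.22
# Compare: the specific heat capacity of liquid water is ~4.18 J/(K*g) -
# same units per gram, but heat capacity is heat per degree of temperature change,
# while this entropy change is heat per kelvin at a fixed temperature.
[/code]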
"Entropy is what the equations define it to be"
(S = q/T) is "the change in entropy represents the amount of energy input to the system which does not participate in mechanical work done by the system."
S = k·log(N) is a measure of probability, and is used when looking at statistical mechanics.
Magpie - 27-7-2006 at 10:19
I read through Tim Thompson's essay and think he has some good points.
His assertion that "entropy is what the mathematical equations define it to be" is unassailable. But this does not leave us with much of an intuitive
grasp, does it? Maybe that's the way it has to be. That's why it is so fascinating! A powerful property that we just can't completely understand!
My thermodynamics text* puts it this way: "Furthermore, it [entropy] is defined in terms of a mathematical operation, and no direct physical picture
of it can be given. For these reasons, you may find at first that the concept of entropy is somewhat nebulous. In order to gain an understanding of
entropy, you should study its uses and keep asking the questions, 'What is it used for?' and 'How is it used?' If you are looking for a physical
description as an answer, the question 'What is entropy?' is fruitless."
Tim Thompson directly asserts that entropy is not disorder, but that it is related to disorder. I can't disagree with that.
I have never seen entropy defined as energy in any peer-reviewed text.
*Engineering Thermodynamics by Jones & Hawkins, 1960
[Edited on 27-7-2006 by Magpie]
Quibbler - 27-7-2006 at 11:58
Entropy of a collection of molecules is a measure of the number of permutations they can have. So for a perfect crystal at zero kelvin there is only one possible arrangement of the atoms/molecules, and all are in their lowest energy levels, so S = 0.
Now heating up the crystal causes the entropy to increase as molecules will be able to move out of the lowest energy level (for a solid crystal this
will just be vibrational energy). So the amount of heat absorbed (the heat capacity) is a measure of the amount of disorder due to the movement of the
molecules amongst energy levels. Mathematically:
S = Integral (C/T) dT where C is heat capacity and T is temperature.
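A small numerical sketch of that integral in Python (the linear C(T) below is made up purely to illustrate the method, not data for any real crystal):
[code]
# Minimal sketch: S(T2) - S(T1) = integral of C(T)/T dT, numeric vs analytic.
import math
from scipy.integrate import quad

a, b = 20.0, 0.01          # J/(mol*K) and J/(mol*K^2), illustrative values only
T1, T2 = 100.0, 300.0      # K

def C(T):
    return a + b * T       # a made-up, smoothly varying heat capacity

dS_numeric, _ = quad(lambda T: C(T) / T, T1, T2)
dS_analytic = a * math.log(T2 / T1) + b * (T2 - T1)   # exact for the linear form

print(f"numeric : {dS_numeric:.3f} J/(mol*K)")
print(f"analytic: {dS_analytic:.3f} J/(mol*K)")
[/code]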
In the gas phase the entropy of any atom/molecule can be calculated, as the availability of energy levels is known - by solving the Schrödinger equation.
In fact most tabulations of entropies are calculated as this is more accurate than experiment.
For something simple like a monatomic gas, only translational energies are available and these only depend on the mass of the molecule. The entropy of
a monatomic gas is:
S = 5/2 R + R ln{Vm/(Na*L^3)}
L = h/sqrt(2*pi*m*k*T)
R gas constant, Vm molar volume, Na Avogadro's constant, h Planck's constant, m mass (of one molecule), k Boltzmann's constant, T temperature (K). If you put in the numbers for Xe (at 298.15 K and 1 bar) you should get about 169.7 J mol-1 K-1.
Amazingly this is essentially the same value as the experimental one (obtained by measuring the heat capacity from zero K up to 298 K).
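For anyone who wants to check the arithmetic, here is a minimal Python sketch of the Sackur-Tetrode calculation, assuming 298.15 K and a standard pressure of 1 bar:
[code]
# Minimal sketch: Sackur-Tetrode entropy of xenon, S = R*(5/2 + ln(Vm/(Na*L^3))),
# with L = h/sqrt(2*pi*m*k*T), the thermal de Broglie wavelength.
import math

R = 8.314462        # J/(mol*K)
Na = 6.02214076e23  # 1/mol
h = 6.62607015e-34  # J*s
k = 1.380649e-23    # J/K

T = 298.15          # K
P = 1.0e5           # Pa (1 bar standard pressure - an assumption about the conditions)
M = 131.293e-3      # kg/mol, molar mass of xenon

m = M / Na                                   # mass of one atom
Vm = R * T / P                               # molar volume of an ideal gas
L = h / math.sqrt(2 * math.pi * m * k * T)   # thermal de Broglie wavelength

S = R * (2.5 + math.log(Vm / (Na * L**3)))
print(f"S(Xe, 298.15 K, 1 bar) ~ {S:.1f} J/(mol*K)")   # ~169.7, close to the tabulated value
[/code]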
For more complex molecules there are vibrational and rotational energies to consider. For water the calculated and experimental values do not agree, meaning that water at zero kelvin has residual disorder.
unionised - 27-7-2006 at 12:28
Because S is calculated as an integral there is no general way of knowing the absolute value. There is always a possibility of the "constant of
integration" messing things up.
delta S values are much more robust.
There is a nice example: in theory, frozen CO has all the molecules lined up. In practice it usually doesn't. You can measure the delta S of fusion for any given sample, and any of the experimental values can be thought of as correct. The perfect crystal simply isn't available. You can look at the X-ray data for a crystal and (with some difficulty) measure the degree of disorder. This can then be related to the residual entropy. Once you take that into account, the entropy values tally up.
Unfortunately, if you use real CO, it has C-13 in it - about 1%. That should give you another entropy term, but you can never get the stuff to crystalise properly (i.e. with the heavy isotope at the bottom), so the calculated entropy is never testable by measurement - it gets mathematically lost in the integral.
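For scale, here is a minimal Python sketch of the two contributions being discussed (the R·ln 2 figure is the textbook upper limit for fully random head-to-tail CO; the isotope term assumes an ideal, fully random mix of about 1.1% C-13):
[code]
# Minimal sketch: residual entropy terms for solid CO, per mole.
import math

R = 8.314   # J/(mol*K)

# Head-vs-tail disorder: two equivalent orientations per molecule, if fully random.
S_orientational = R * math.log(2)    # ~5.8 J/(mol*K); measured residual entropies are
                                     # somewhat lower because the disorder is only partial.

# Isotope disorder: ideal entropy of mixing for ~1.1% C-13 scattered through the crystal.
x = 0.011
S_isotope = -R * (x * math.log(x) + (1 - x) * math.log(1 - x))   # ~0.5 J/(mol*K)

print(f"orientational (upper limit): {S_orientational:.2f} J/(mol*K)")
print(f"isotope mixing             : {S_isotope:.2f} J/(mol*K)")
[/code]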
All the changes in entropy can be calculated but (like electrode potentials where the differences can be measured) the absolute values are based on
convention rather than calculation.
my textbook has a different interpretation
ooja - 17-4-2009 at 14:50
It says that chemistry is actually a device that creates order, in that it is order.
not disorder?
Lambda-Eyde - 17-4-2009 at 15:31
Quote:
I can recommend what I think is an especially good and concise book: ELEMENTARY CHEMICAL THERMODYNAMICS by Bruce H. Mahan, 1964, paperback, 155 pp. A knowledge of calculus is a requirement for a full understanding, however.
Exactly what degree of calculus is required? I have knowledge of quadratic equations, basic functions and basic differentiation - is this enough to grasp at least the bigger picture? Or will it all be useless technobabble to me?
Magpie - 17-4-2009 at 16:00
Quote:
Exactly what degree of calculus is required?
This question can't be answered without seeing exactly what problem you are attempting to solve. The concepts of calculus can be understood without
knowing the mechanics of doing calculus. Simple calculations of entropy changes can be done if the heat capacity does not change over the temperature
range of interest. Even if it does, solutions can be found graphically.
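As an illustration of the graphical approach (a minimal Python sketch using invented heat-capacity points, not data for any real substance): plot C/T against T and take the area under the curve, which is just the trapezoid sum below.
[code]
# Minimal sketch: "graphical" entropy change from tabulated heat-capacity data.
# The (T, C) points are invented for illustration only.
T_pts = [300.0, 350.0, 400.0, 450.0, 500.0]   # K
C_pts = [25.0, 26.0, 26.8, 27.5, 28.0]        # J/(mol*K), hypothetical values

# Area under the C/T-vs-T curve by the trapezoid rule (what you would do on graph paper).
dS = 0.0
for i in range(len(T_pts) - 1):
    y1 = C_pts[i] / T_pts[i]
    y2 = C_pts[i + 1] / T_pts[i + 1]
    dS += 0.5 * (y1 + y2) * (T_pts[i + 1] - T_pts[i])

print(f"delta S (300 K -> 500 K) ~ {dS:.2f} J/(mol*K)")
[/code]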
The level of technobabble used all depends on the approach of the author and the complexity of the problem.
[Edited on 18-4-2009 by Magpie]
Lambda-Eyde - 17-4-2009 at 16:26
Quote:
This question can't be answered without seeing exactly what problem you are attempting to solve.
Well, I was just thinking of understanding most of the book.
So my question really is: is this a good buy for someone possessing my degree of mathematical knowledge (which I roughly described in my last post)?
Magpie - 17-4-2009 at 18:06
I'm sorry; I see what you mean now.
The book is filled with simple equations having the differentials and integrals of calculus. I think it would be frustrating to try to read it
without a knowledge of what these forms mean. I hate to discourage you, but recommend you hold off buying until you've taken some calculus, both
differential and integral.
DJF90 - 17-4-2009 at 18:56
The calculus I have to use when doing classical thermodynamics (revising it at the moment, I have an exam next week) is quite advanced. Not only do differentiation and integration come into play, but also partial differentiation and a minor amount of the associated integration.
Physical chemistry is far from my favourite, but there is a sense of satisfaction when dealing with questions on thermodynamics (not to mention being
an extremely useful part of physical chemistry). To be discouraged is bad; instead go and learn the calculus first (it isn't too difficult, but it's far from easy...), and then come back and enjoy the wonders of thermodynamics.