Sciencemadness Discussion Board
Author: Subject: Entropy
guy
National Hazard
****




Posts: 982
Registered: 14-4-2004
Location: California, USA
Member Is Offline

Mood: Catalytic!

[*] posted on 16-7-2006 at 11:52
Entropy


Entropy, as I understand it, is the amount of energy not available to do work. So is that like energy in bonds? I am also assuming that entropy is the amount of heat in a system that does not contribute to the temperature, so is it in any way related to specific heat capacity?



pantone159
National Hazard
****




Posts: 590
Registered: 27-6-2006
Location: Austin, TX, USA
Member Is Offline

Mood: desperate for shade

[*] posted on 16-7-2006 at 15:59


Entropy is the number of possible combinations of states of all the individual atoms/molecules that result in the overall bulk properties of the material at the particular conditions involved.

(Actually, it is the logarithm of the number of combinations.)
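
To make that concrete, here is a minimal sketch (my own illustration in Python, not part of the original post): take a system of 100 two-state "coins", count the microstates W compatible with a given number of heads from the binomial coefficient, and apply S = k·ln(W). The even 50/50 macrostate has by far the most microstates, and hence the highest entropy.

Code:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

N = 100             # toy system: 100 two-state "coins"
for heads in (0, 10, 25, 50):
    W = math.comb(N, heads)        # number of microstates with this many heads
    S = k_B * math.log(W)          # Boltzmann entropy, S = k ln W
    print(heads, W, S)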
Magpie
lab constructor
*****




Posts: 5939
Registered: 1-11-2003
Location: USA
Member Is Offline

Mood: Chemistry: the subtle science.

[*] posted on 22-7-2006 at 14:37


Entropy is a property that has fascinated me ever since I first learned about it during my 3rd year in college. It is abstract enough that I always have to review it before feeling anywhere near competent to discuss it. Since it is too hot to work in the lab (>100 °F), this is a good time for me.

It is very interesting to me that entropy is a state function that can be quantified both (1) from the number of possible molecular arrangements (via the logarithm of that number), and (2) by the thermodynamic equation of the heat transferred reversibly divided by the absolute temperature of the transfer! That is:

delta S = q(rev)/T(abs)
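
A quick worked illustration of that equation (my own round numbers, not from the post above): melting one mole of ice at its melting point absorbs roughly 6.01 kJ reversibly, so the entropy change comes out near 22 J/(mol·K).

Code:

# Sketch: reversible melting of 1 mol of ice at 273.15 K.
# The enthalpy of fusion (~6010 J/mol) serves as the reversible heat.
q_rev = 6010.0        # J/mol
T_abs = 273.15        # K
delta_S = q_rev / T_abs
print(delta_S)        # about 22 J/(mol*K)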

guy's original question about whether this is related to heat capacity is an interesting one for which I will have to continue my review....:D Anyone else care to respond?

[Edited on 23-7-2006 by Magpie]




The single most important condition for a successful synthesis is good mixing - Nicodem
franklyn
International Hazard
*****




Posts: 3026
Registered: 30-5-2006
Location: Da Big Apple
Member Is Offline

Mood: No Mood

[*] posted on 22-7-2006 at 21:20


Ludwig Boltzmann, James Maxwell, Leo Szilard, and many others of repute mused over this entropy thing. It is a simple fact that the greater the number of individual objects, such as atoms, in a system, the greater the number of possibilities for their arrangement.

The arrangements we classify as random are vastly more numerous, and for this reason, of all the possibilities, a random one is by far the most probable.

It is supposed that mass and energy are created and destroyed all the time at the quantum level, but it is only the average of this chaos that is observed macroscopically. Entropy, then, is only the perception of this.


Boltzmann's entropy and constant

http://en.wikipedia.org/wiki/Boltzman%27s_constant

http://plato.stanford.edu/entries/statphys-Boltzmann


The Demons

http://home.att.net/~numericana/answer/demon.htm


Szilard engine

http://www.slac.stanford.edu/pubs/slacpubs/6250/slac-pub-648...



These are not just speculations about physical systems; there are well-established mathematical analogs. One of the most amusing is Parrondo's paradox _


Ivars Peterson

http://www.maa.org/mathland/mathtrek_3_6_00.html

Wikipedia

http://en.wikipedia.org/wiki/Parrondo%27s_Paradox

Juan's Home page

http://seneca.fis.ucm.es/parr


Brownian ratchet java applet

http://monet.physik.unibas.ch/~elmer/bm/


Brownian ratchets in fact

http://www-lpm2c.grenoble.cnrs.fr/nanosciences/Houches/duke5...

http://www.eleceng.adelaide.edu.au/Personal/gpharmer/games/i...


THE FLUCTUATION THEOREM FOR STOCHASTIC SYSTEMS
( two sources here )

http://eprints.anu.edu.au/archive/00000112/00/stochFT.pdf

http://www.citebase.org/cgi-bin/fulltext?format=application/...

Experimental Demonstration of Violations of
the Second Law of Thermodynamics for
Small Systems and Short Time Scales

( Various reviews and articles )

http://rsc.anu.edu.au/~evans/papers/selectnewsreportsFT.pdf

This paper was posted here but not well received

http://www.sciencemadness.org/talk/viewthread.php?tid=1978#p...

Realization of Maxwell’s Hypothesis

http://www.citebase.org/cgi-bin/fulltext?format=application/...


Quote:
Originally posted by pantone159

( Entropy is the logarithm of the number of combinations.)


I think the word you're searching for is permutation. It is simple to visualize if it is only the possible sequences of a number of coin tosses, but fiendishly complex if spatial positioning, scalar and vector magnitudes, and duration of occurrence are disparate properties of countless elements in a set.


It is theorized by some (myself) that this is why gravity is so weak compared to the atomic forces: it is a side effect of the virtual certainty that matter-energy creation and annihilation cannot be equal - the numbers of events involved are just too great for this to be - so there is a continuous vestigial surplus or deficit of electric charge in the universe. This superposition of transient forces is, by Mach's Principle, what we perceive as gravity.

I posted references to papers allied to this idea here _

http://www.sciencemadness.org/talk/viewthread.php?tid=3216#p...

Mach's Principle

http://en.wikipedia.org/wiki/Mach's_principle

Haisch and Rueda

http://en.wikipedia.org/wiki/Stochastic_electrodynamics

.
Magpie
lab constructor
*****




Posts: 5939
Registered: 1-11-2003
Location: USA
Member Is Offline

Mood: Chemistry: the subtle science.

[*] posted on 26-7-2006 at 20:17


guy, I have reviewed the subject of entropy, so I feel I can now discuss it again without making too many mistakes. I trust you have been doing some reading also. ;)

My understanding is that entropy is a measure of disorder, not energy. An entropy increase occurs when water goes from a liquid to a gas, for example. And in this case it does involve the breaking of some bonds (hydrogen bonds), I would say.

A change in entropy can be computed, however, by dividing the heat transferred, reversibly, by the absolute temperature of the transfer. In that sense it seems related to energy.

Since the heat transferred can also be equated to the heat capacity times the change in temperature for some processes, you could also say there's a relation of entropy change to heat capacity... in a sense.
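
A small sketch of that idea (my own example, with a round-number heat capacity): if the heat capacity C is effectively constant, the entropy change on heating is the integral of C/T, i.e. C·ln(T2/T1).

Code:

import math

# Heating 1 mol of liquid water from 25 C to 75 C, taking the molar heat
# capacity as roughly constant at 75.3 J/(mol*K) over that range.
Cp = 75.3                          # J/(mol*K)
T1, T2 = 298.15, 348.15            # K
delta_S = Cp * math.log(T2 / T1)   # integral of (Cp/T) dT with Cp constant
print(delta_S)                     # about 11.7 J/(mol*K)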

I can recommend what I think is an especially good and concise book: ELEMENTARY CHEMICAL THERMODYNAMICS by Bruce H. Mahan, 1964, paperback, 155 pp. A knowledge of calculus is a requirement for a full understanding, however.




The single most important condition for a successful synthesis is good mixing - Nicodem
guy
National Hazard
****




Posts: 982
Registered: 14-4-2004
Location: California, USA
Member Is Offline

Mood: Catalytic!

[*] posted on 26-7-2006 at 20:34


A measure of disorder is one of the definitions, where entropy reflects the number of possible ways a system can arrange itself, like a permutation. This is S = k·log(N), where N is the number of possible quantum states. That is one way to look at it.

The way I am looking at it uses the definition S = q(rev)/T. So I see it as the amount of energy in a system that cannot contribute to temperature (kinetic energy). An example is ice melting: heat is added yet the temperature remains the same, so entropy is increasing. The units for specific heat capacity are J/(K·g) and the unit for entropy is J/K. I just thought how similar they were in definition and units.

Read this site (http://www.tim-thompson.com/entropy1.html).

"Entropy is what the equations define it to be"
(S=q/T) is "the change in entropy represents the amount of energy input to the system which does not participate in mechanical work done by the system "

S = k·log(N) is a measure of probability, and is used when looking at statistical mechanics.




Magpie
lab constructor
*****




Posts: 5939
Registered: 1-11-2003
Location: USA
Member Is Offline

Mood: Chemistry: the subtle science.

[*] posted on 27-7-2006 at 10:19


I read through Tim Thompson's essay and think he has some good points.

His assertion that "entropy is what the mathematical equations define it to be" is unassailable. But this does not leave us with much of an intuitive grasp, does it? Maybe that's the way it has to be. That's why it is so fascinating! A powerful property that we just can't completely understand!

My thermodynamics text* puts it this way: "Furthermore, it [entropy] is defined in terms of a mathematical operation, and no direct physical picture of it can be given. For these reasons, you may find at first that the concept of entropy is somewhat nebulous. In order to gain an understanding of entropy, you should study its uses and keep asking the questions, 'What is it used for?' and 'How is it used?' If you are looking for a physical description as an answer, the question 'What is entropy?' is fruitless."

Tim Thompson directly asserts that entropy is not disorder, but that it is related to disorder. I can't disagree with that.

I have never seen entropy defined as energy in any peer-reviewed text.

*Engineering Thermodynamics by Jones & Hawkins, 1960

[Edited on 27-7-2006 by Magpie]




The single most important condition for a successful synthesis is good mixing - Nicodem
Quibbler
Hazard to Self
**




Posts: 65
Registered: 15-7-2005
Location: Trinidad and Tobago
Member Is Offline

Mood: Deflagrated

[*] posted on 27-7-2006 at 11:58


Entropy of a collection of molecules is a measure of the number of permutations they can have. So for a perfect crystal at zero kelvin there is only one possible arrangement of the atoms/molecules, and all are in their lowest energy levels, so S = 0.

Now heating up the crystal causes the entropy to increase, as molecules become able to move out of the lowest energy level (for a solid crystal this will just be vibrational energy). So the amount of heat absorbed (through the heat capacity) is a measure of the amount of disorder due to the movement of the molecules amongst energy levels. Mathematically:

S = Integral (C/T) dT where C is heat capacity and T is temperature.
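
As an illustration of how that integral is handled in practice (a sketch only; the heat-capacity values below are made-up placeholders, not data for any real substance), one can integrate Cp/T numerically over tabulated points, e.g. with the trapezoid rule:

Code:

# Third-law style estimate: numerically integrate Cp/T over tabulated data.
# These Cp values are placeholders for illustration only.
T  = [10,  20,  50,   100,  150,  200,  250,  298]    # K
Cp = [0.4, 3.0, 15.0, 24.0, 27.0, 28.5, 29.3, 29.9]   # J/(mol*K)

S = 0.0
for i in range(len(T) - 1):
    # trapezoid rule on Cp/T between successive temperatures
    S += 0.5 * (Cp[i]/T[i] + Cp[i+1]/T[i+1]) * (T[i+1] - T[i])
print(S)   # approximate absolute entropy at 298 K for this toy data set

(In real tabulations the region below the first data point is usually extrapolated with a Debye T^3 law, and the entropies of any phase transitions are added as delta_H/T terms.)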

In the gas phase the entropy of any atom/molecule can be calculated, as the availability of energy levels is known - by solving the Schrödinger equation. In fact most tabulated entropies are calculated, as this is more accurate than experiment.

For something simple like a monatomic gas, only translational energies are available, and these depend only on the mass of the molecule. The entropy of a monatomic gas is:

S = 5/2 R + R ln{Vm/(Na*L^3)}

L = h/SQRT(2*pi*m*k*T)

R is the gas constant, Vm the molar volume, Na Avogadro's constant, h Planck's constant, m the mass of one molecule, k Boltzmann's constant, and T the temperature (K). If you put in the numbers for Xe you should get about 169.7 J mol-1 K-1.
Amazingly this is essentially the same as the experimental value (obtained by measuring the heat capacity from zero K up to 298 K).
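
Here is a sketch of that calculation (my own constants and standard-state choice; 1 bar is used, and 1 atm would shift the answer very slightly):

Code:

import math

h  = 6.62607e-34      # Planck constant, J*s
kB = 1.380649e-23     # Boltzmann constant, J/K
Na = 6.02214e23       # Avogadro constant, 1/mol
R  = 8.31446          # gas constant, J/(mol*K)

T  = 298.15               # K
m  = 0.131293 / Na        # mass of one Xe atom, kg
Vm = R * T / 1.0e5        # molar volume of an ideal gas at 1 bar, m^3/mol

L = h / math.sqrt(2 * math.pi * m * kB * T)   # thermal de Broglie wavelength
S = 2.5 * R + R * math.log(Vm / (Na * L**3))  # Sackur-Tetrode entropy
print(S)   # roughly 170 J/(mol*K), in line with the tabulated value for Xe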

For more complex molecules there are vibrational and rotational energies to consider. For water the calculated and experimental values do not agree, meaning that water at zero kelvin has residual disorder.
unionised
International Hazard
*****




Posts: 5126
Registered: 1-11-2003
Location: UK
Member Is Offline

Mood: No Mood

[*] posted on 27-7-2006 at 12:28


Because S is calculated as an integral, there is no general way of knowing the absolute value. There is always a possibility of the "constant of integration" messing things up.
Delta S values are much more robust.
There is a nice example: in theory, frozen CO has all the molecules lined up. In practice it usually doesn't. You can measure the delta S of fusion for any given sample, and any of the experimental values can be thought of as correct; the perfect crystal simply isn't available. You can look at the X-ray data for a crystal and (with some difficulty) measure the degree of disorder. This can then be related to the residual entropy. Once you take that into account, the entropy values tally up.
Unfortunately, real CO has C-13 in it - about 1%. That should give you another entropy term, but you can never get the stuff to crystallise properly (i.e. with the heavy isotope at the bottom), so the calculated entropy is never testable by measurement - it gets mathematically lost in the integral.
All the changes in entropy can be calculated, but (like electrode potentials, where only the differences can be measured) the absolute values are based on convention rather than calculation.
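
A back-of-the-envelope sketch of the CO case (my own illustration, not unionised's): if each molecule could sit either way round in its lattice site with equal probability, the residual molar entropy would be R·ln(2); the value actually inferred from measurements is reported to be somewhat below that, i.e. the disorder is not complete.

Code:

import math

R = 8.31446                    # gas constant, J/(mol*K)
S_residual = R * math.log(2)   # two equally likely orientations per molecule
print(S_residual)              # about 5.76 J/(mol*K)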
ooja
Harmless
*




Posts: 2
Registered: 17-4-2009
Member Is Offline

Mood: No Mood

[*] posted on 17-4-2009 at 14:50
my textbook has a different interpretation


It says that chemistry is actually a device that creates order, in that it is order.
Not disorder?
Lambda-Eyde
National Hazard
****




Posts: 860
Registered: 20-11-2008
Location: Norway
Member Is Offline

Mood: Cleaved

[*] posted on 17-4-2009 at 15:31


Quote: Originally posted by Magpie  

I can recommend what I think is an especially good and concise book: ELEMENTARY CHEMICAL THERMODYNAMICS by Bruce H. Mahan, 1964, paperback, 155 pp. A knowledge of calculus is a requirement for a full understanding, however.

Exactly what degree of calculus is required? I have knowledge of quadratic equations, basic functions and basic differentiation; is this enough to grasp at least the bigger picture? Or will it all be useless technobabble to me? :(
Magpie
lab constructor
*****




Posts: 5939
Registered: 1-11-2003
Location: USA
Member Is Offline

Mood: Chemistry: the subtle science.

[*] posted on 17-4-2009 at 16:00



Quote:

Exactly what degree of calculus is required?


This question can't be answered without seeing exactly what problem you are attempting to solve. The concepts of calculus can be understood without knowing the mechanics of doing calculus. Simple calculations of entropy changes can be done if the heat capacity does not change over the temperature range of interest. Even if it does, solutions can be found graphically.

The level of technobabble used all depends on the approach of the author and the complexity of the problem.

[Edited on 18-4-2009 by Magpie]
Lambda-Eyde
National Hazard
****




Posts: 860
Registered: 20-11-2008
Location: Norway
Member Is Offline

Mood: Cleaved

[*] posted on 17-4-2009 at 16:26


Quote: Originally posted by Magpie  

This question can't be answered without seeing exactly what problem you are attempting to solve.

Well, I was just thinking of understanding most of the book.
So my question is really: is this a good buy for someone possessing my degree of mathematical knowledge (which I roughly described in my last post)? :)
Magpie
lab constructor
*****




Posts: 5939
Registered: 1-11-2003
Location: USA
Member Is Offline

Mood: Chemistry: the subtle science.

[*] posted on 17-4-2009 at 18:06


I'm sorry; I see what you mean now.

The book is filled with simple equations having the differentials and integrals of calculus. I think it would be frustrating to try to read it without a knowledge of what these forms mean. I hate to discourage you, but recommend you hold off buying until you've taken some calculus, both differential and integral.
DJF90
International Hazard
*****




Posts: 2266
Registered: 15-12-2007
Location: At the bench
Member Is Offline

Mood: No Mood

[*] posted on 17-4-2009 at 18:56


The calculus I have to use when doing classical thermodynamics (revising it at the moment, have an exam next week :( ) is quite advanced. Not only do differentiation and integration come into play, but also partial differentiation and a minor amount of the associated integration.

Physical chemistry is far from my favourite, but there is a sense of satisfaction when dealing with questions on thermodynamics (not to mention that it is an extremely useful part of physical chemistry). To be discouraged is bad; instead go and learn the calculus first (it isn't too difficult, but it's far from easy..), and then come back and enjoy the wonders of thermodynamics :D
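
As a small taste of the partial-derivative bookkeeping involved (my own sketch, using the sympy library), one can check a Maxwell relation, (dS/dV)_T = (dP/dT)_V, for an ideal gas:

Code:

import sympy as sp

n, R, Cv, T, V = sp.symbols('n R C_v T V', positive=True)

S = n * Cv * sp.log(T) + n * R * sp.log(V)   # ideal-gas entropy, up to a constant
P = n * R * T / V                            # ideal-gas equation of state

lhs = sp.diff(S, V)   # (dS/dV) at constant T
rhs = sp.diff(P, T)   # (dP/dT) at constant V
print(sp.simplify(lhs - rhs))   # prints 0, so the Maxwell relation holds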

[Edited on 18-4-2009 by DJF90]
