
I have a few questions about entropy?

May 22, 2019 / By Nevan
Question: I know that entropy is the tendency for all things to reach equilibrium. Is this a logarithmic function? How does one calculate or measure it? What is its value when equilibrium is attained? What are the units called?
Best Answer

King King | 4 days ago
Entropy isn't really the "tendency toward equilibrium." Most people tie it to some sense of 'disorder' in a system, but that's hard to measure. The real definition is tied to the number of 'states' a system can find itself in: the entropy is S = k*ln(number of states), where k is Boltzmann's constant, 1.38066×10⁻²³ joules per kelvin. So the units of entropy are, loosely, "energy per temperature."

It's still hard to imagine counting states. It turns out that entropy changes as heat energy flows into or out of a system, and the change is given by ΔS = (amount of heat flowing into the system)/(temperature of the system). So one can tally up the total entropy by adding up all the heat flow into the system divided by the temperature at which each transfer takes place. This gets tricky because, in most cases, the temperature changes as the heat flows. There are a few situations where heat can flow while the temperature stays constant, like boiling a pan of water or melting a block of ice. Those are easier calculations.
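A minimal numerical sketch of those constant-temperature cases in Python (the function names are mine, and the latent heats are standard textbook values for water):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (current SI exact value)

def entropy_from_states(num_states):
    """Statistical definition: S = k * ln(number of microstates)."""
    return K_B * math.log(num_states)

def entropy_change_constant_T(heat_in_joules, temperature_kelvin):
    """Thermodynamic definition at constant temperature: dS = Q / T."""
    return heat_in_joules / temperature_kelvin

# Melting 1 kg of ice at 273.15 K (latent heat of fusion ~334 kJ/kg):
print(entropy_change_constant_T(334_000, 273.15))    # ~1223 J/K

# Boiling 1 kg of water at 373.15 K (latent heat of vaporization ~2260 kJ/kg):
print(entropy_change_constant_T(2_260_000, 373.15))  # ~6057 J/K
```

When the temperature changes as the heat flows, you would integrate dQ/T over the process instead of using a single ratio.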



King Originally Answered: Entropy: Increasing toward the past?
So I have finally read the corresponding section in the book (pp. 159–167 in my print), and here is how I understood it. The major difference between the future and the past is that we remember the past; we know what happened. The future, on the other hand, we can only model, given the system's current state. If the method of describing the state captures the system fully in the given scope (e.g. single rigid-body dynamics), and provided that we have the correct physical laws, we can model the future behaviour more or less exactly. However, when it comes to a microscopic description, which we never know exactly, there is some statistical error in the initial condition itself, so we can only guess the future states, with the error growing over time. Alternatively, we can assign probabilities to the various outcomes that can occur under such a fuzzy initial condition.

Here is where the second law of thermodynamics comes into play: among these outcomes, the more disordered ones carry a much larger share of the total probability than the more ordered ones. Thus (for example, in terms of expectation values of observable quantities), the system tends towards the most entropic future states. Note, however, that this is a mere guess. If you let the system actually evolve for a given time and measure the corresponding values afterwards, you may get an answer different from the predicted one. Knowing more about your system, you have to adjust your calculations, effectively throwing out the "old", superseded result.

What the author was referring to is the situation where we cannot tell the past of the physical system either and must guess it from the current state, too. If we do not know the current state exactly, then, just as above, we obtain nothing better than a probability distribution over what the system could have looked like at a time prior to the measurement. The intuitive answer, a past state with less entropy than the current one, is one possibility, but among all the others it carries mathematically only a very small probability. If we really have no means of investigating the past, it is quite bold to declare that this was supposedly the past state.

Note that all of this must be treated with the same care as the forward case. Imagine that you computed the current entropy of a system, S1, and the expected entropy one hour ago, S2 > S1. Then you meet someone who actually was there, performed a measurement of their own, and found a value S3 < S1. This again gives you more information about the system, discarding the value of S2.
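A toy illustration (my own, not from the answer) of why the disordered outcomes carry almost all of the probability: for N particles, each equally likely to be in the left or right half of a box, the number of microstates with exactly k particles on the left is the binomial coefficient C(N, k), and the near-even splits dominate overwhelmingly.

```python
from math import comb

N = 100                 # particles, each equally likely in either half
total = 2 ** N          # total number of equally likely microstates

def prob_left(k):
    """Probability that exactly k of the N particles sit in the left half."""
    return comb(N, k) / total

print(prob_left(0))                              # all on one side: ~7.9e-31
print(sum(prob_left(k) for k in range(40, 61)))  # near-even split: ~0.96
```

Run the inference in either time direction and, absent further information, the same counting favours the high-entropy macrostates; that is exactly why guessing a lower-entropy past from the current state alone is such a long shot.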
King Originally Answered: Entropy: Increasing toward the past?
They list alleged "holes" all the time here. I think the question should be: can they find any holes that are NOT based on a misrepresentation of evolution and of what scientists are actually saying? I hear the same cookie-cutter arguments all the time: it violates the second law of thermodynamics, monkeys would have to change into humans, blah blah blah. Have they found some flaw that somehow NO scientist over the past 200 years ever happened to notice or publish a paper on?

>> So, in order for scientists to take Creationism seriously,
>> we need evidence that contradicts the theory of evolution.

Even simpler than that: we'd need a HYPOTHETICAL example of something that would count as actual evidence AGAINST creationism. Unfortunately, the definition of a creator is so vague that you can always rework it to fit any silly idea. You can always fall back on "Well, the creator decided to make it that way to test our faith. And he's all-powerful, so he can do it." But you'll never get any answers about the physical world if you keep bringing up claims like this. After that, we'd still need to know the exact mechanics of creation. What, exactly, does the creator do to a rib to turn it into a woman? Is it some force that changes the atoms? Creationism just isn't a science, no matter how you look at it.
King Originally Answered: Entropy: Increasing toward the past?
"entropy means that to increase it, you must have an increase in disorder of the system under scrutiny." I would tend to think that the universe is actually going towards order, because planets form, solar systems, stars and galaxies form. This would mean entropy is decreasing for the universe. If it were to go to disorder, then all things in this universe would fly apart into sub atomic components and be nothing but chaos. Therefore, entropy actually does increase going into the past. Whereas more likely hood of randomness occured and much heat was present. "When heat is added to a system at high temperature, the increase in entropy is small. When heat is added to a system at low temperature, the increase in entropy is great." For the universe, there would be many many small systems (open ones) with which would have localized increases in entropy. Such as near stars.

Humphrey Humphrey
Entropy is the measure of disorder in a system; equivalently, it is a measure of the information in a system. A "perfectly random" mixture has no usable information and is completely mixed, completely disordered. In communication theory (see Shannon), the information view of entropy is used. In thermodynamics and chemistry we think about the disorder: more disorder, more entropy. The units of entropy are energy per degree.

Just as the idea of an absolute zero of energy is meaningless, an absolute zero of entropy is also meaningless (for us backwards rubes on this third planet of a mediocre yellow dwarf star). We use entropy relatively, as the change in entropy between two states. So we can calculate the relative entropy, while the absolute entropy is either a philosophical or a religious question.

The measurements relate both the energy and the entropy of a system in two different states (say, before and after), and the combined quantity is the "driving force" behind any change: ΔG = ΔH − TΔS, where ΔG is the "free energy" or "potential to change," ΔH is the energy difference between state two and state one, and ΔS is the entropy change between the two states (T is the temperature, assumed here to be constant).
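A small sketch of the information-theoretic view the answer mentions (the function is my own and computes Shannon entropy in bits):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform ('perfectly random') source maximizes Shannon entropy:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
# A fully determined source has zero entropy:
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits
```

(Here a "perfectly random" source has maximal Shannon entropy, i.e. maximal uncertainty; that matches the answer's point that such a mixture carries no usable structure or "information.")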

Humphrey Originally Answered: Hi, why is Clausius's entropy characterised as q/T?
If you want an informative answer on entropy, you really have to read more than can be provided on Y!A. For a derivation of the 2nd law of thermodynamics and a solid explanation of entropy based on thermal physics (Carnot cycles, i.e., not the statistical interpretation), an accessible but still reasonably rigorous treatment can be found in the book linked below. The level of the derivation is appropriate for a student with a solid college freshman/sophomore background in chemistry or physics and a knowledge of elementary calculus.
Humphrey Originally Answered: Hi, why is Clausius's entropy characterised as q/T?
If I recall correctly, the ratio dq/T was not derived, so to speak. It was a quantity that appeared repeatedly in thermodynamic study, and it was then given a label: entropy. Who first did that, or how it was found to relate to the second law of thermodynamics, is beyond me. Hopefully that helps a little bit in your investigation.
