When will all the energy in the universe be exhausted?
parameciumkidDate: Thursday, 23.04.2015, 05:39 | Message # 16
Explorer
Group: Users
United States
Messages: 277
Status: Offline
So here's a discussion question:
Is gravity a force acting against the law of increasing entropy, insofar as it gathers randomly dispersed particles and packages them up into neat spheres, sorted internally by density?

Intel HD Graphics 4000 ;P
 
WatsisnameDate: Thursday, 23.04.2015, 08:03 | Message # 17
Galaxy Architect
Group: Global Moderators
United States
Messages: 2613
Status: Offline
Quote BlueDrache ()
Common sense guess with the assumption that entropy increases as a system becomes more disorganized.


But does it increase to infinity? I can give you a variety of mathematical functions which only increase, but do not go to infinity.

I don't mean to seem like I'm picking on you here, I'm just trying to help you (and others!) to think deeply about it. smile Entropy is one of those things that a lot of people have heard about and understand on some "common sense" level (e.g. entropy as "chaos" or "disorder" which are perhaps the most popular analogies), but the unfortunate reality is that entropy is not always so straightforward, and the analogies can sometimes lead to confusion. I'd like to ask you a couple more questions to try and illustrate that. I don't expect you to answer them, but at least think about them:

Which is more disorganized (and why): a cloud of interstellar gas, or a star? How would you quantify how disorganized a system is? What does it mean to say that something is infinitely disorganized?

Not exactly obvious, eh?

The problem is that although "chaos" and "disorder" are easy to understand, they are not necessarily easy to quantify, and they are only loosely connected to what entropy actually means. Formally, entropy is "the logarithm of the number of possible dynamical states of a system". And that is not so easy to understand -- it requires some knowledge of statistical mechanics.
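That formal definition can be made concrete with a toy sketch (my own illustration, not from the post): Boltzmann's formula S = k ln Ω, where Ω is the number of microstates compatible with a given macrostate, counted here for a handful of coins.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def coin_entropy(n_total, n_heads):
    """Entropy of the macrostate 'n_heads heads out of n_total coins',
    counting its microstates with a binomial coefficient."""
    omega = math.comb(n_total, n_heads)  # number of possible arrangements
    return k * math.log(omega)

# An "ordered" macrostate (all heads) can be realized in exactly one way,
# so its entropy is zero. A "mixed" macrostate has vastly more microstates.
s_ordered = coin_entropy(100, 100)
s_mixed = coin_entropy(100, 50)
```

The half-and-half macrostate can be realized in about 10^29 ways, so its entropy is larger; "disorder" here is nothing mysterious, just a count of possibilities.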

Personally I have a couple ways I like to think about it:

1) It is the amount of "hidden information" contained within the system.
For example, you can store quite a bit of useful information on a computer. But what happens if you decide to wipe the hard drive? Did that information disappear? No. It got scrambled, much of it converted into heat, and dissipated throughout the surrounding air. The information still exists, but now it is much more difficult to access. You would have to precisely trace back the motions of a huge number of particles to extract it. Thus it is "hidden" information, high in entropy.

2) It is a measure of the "unavailability" of energy to do work.
This is a bit more physics oriented since it assumes you know what "work" means. But if you do, then you know that work requires energy, but not all energy is useful in doing work. For example, incoming visible sunlight is short in wavelength compared to the infrared light that the Earth emits back into space. The incoming light warms the earth and does a lot of work -- it is converted by photosynthesis, it drives ocean and atmospheric motions, evaporates water, and so forth. But the outgoing longwave radiation isn't so useful for doing work. You could capture it and maybe power a low energy device, but it would be a lot harder. We can say (and accurately), that the sunlight provides low-entropy, useful energy to the Earth, and the Earth re-radiates high-entropy, less useful energy back into space. We can also say that in this sense the Earth is an "engine", with the Sun as the hot, low entropy reservoir and space as the cold, high entropy reservoir.
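A rough numerical sketch of that "Earth as engine" picture (illustrative figures I'm supplying, not from the post): the entropy flux of thermal radiation scales as power divided by temperature, so the same power in at ~5800 K and out at ~255 K means far more entropy leaves than arrives.

```python
P = 1.2e17        # W, approximate solar power absorbed by Earth (assumed figure)
T_sun = 5778.0    # K, effective temperature of the incoming sunlight
T_earth = 255.0   # K, Earth's effective emission temperature

# Entropy flux of thermal radiation scales as power/temperature
# (ignoring an order-unity 4/3 factor for blackbody radiation):
dS_in = P / T_sun      # J/K per second carried in by sunlight
dS_out = P / T_earth   # J/K per second radiated back to space

net_production = dS_out - dS_in   # positive: Earth is a net entropy producer
```

In this sense the low-entropy sunlight is "fuel": the roughly twenty-fold temperature ratio is what lets the incoming energy drive weather, photosynthesis, and everything else before being re-emitted in degraded form.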

I often think of the second law as a statement that, for a given amount of energy, you can only do so much work with it before it is exhausted. The energy doesn't go away, since it is a conserved quantity, but after performing irreversible processes with it, the energy is converted into forms that you can no longer use as easily.

Often, processes involving an increase in entropy involve an obvious "scrambling of the stuff", which we intuitively characterize as "getting more disordered". But that's not always the case. There are also cases where entropy rises but the "stuff" might be arguably more organized. And that's the whole dang problem. A good understanding of entropy can be elusive.

Parameciumkid's latest discussion question is actually a wonderful example of how entropy can be so non-intuitive. I'll go over it later since this post length is getting a little absurd, but here's a teaser:

It turns out that if you have a self-gravitating system which is capable of collapsing, like a cloud of interstellar gas, then if it collapses, that involves an increase in the entropy, because the gravitational collapse is an irreversible process. Let me underline that. Gravity does not act against entropy. It is a way in which entropy can increase further! To think about it, consider the extreme case where the cloud of particles collapses fully. That is, it forms a black hole. What is the entropy of a black hole? Remember the definition. How many possible ways are there of building a black hole of a given mass? How much information can the black hole be hiding?
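To put a number on that teaser (a back-of-envelope sketch using the standard Bekenstein-Hawking formula, which is not derived in this thread): the entropy of a black hole grows with the square of its mass, and even a solar-mass hole dwarfs the entropy of the star it came from.

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2, gravitational constant
c = 2.998e8        # m/s, speed of light
hbar = 1.055e-34   # J s, reduced Planck constant
k = 1.381e-23      # J/K, Boltzmann constant
M_sun = 1.989e30   # kg, solar mass

def bh_entropy(M):
    """Bekenstein-Hawking entropy S = 4*pi*k*G*M^2 / (hbar*c)."""
    return 4 * math.pi * k * G * M**2 / (hbar * c)

S_sun_bh = bh_entropy(M_sun)   # ~1.5e54 J/K, i.e. ~10^77 in units of k
```

For comparison, the Sun's thermal entropy is only around 10^(35) J/K, some nineteen orders of magnitude smaller, which is why full gravitational collapse is such an enormous entropy increase.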

 
midtskogenDate: Thursday, 23.04.2015, 18:48 | Message # 18
Star Engineer
Group: Users
Norway
Messages: 1674
Status: Offline
When my daughter dropped my brand new Samsung phone on concrete a few years ago, my phone suffered something we can call a significant "gravity assisted entropy increase". So, yes, gravity definitely increases entropy. :P

NIL DIFFICILE VOLENTI
 
WatsisnameDate: Thursday, 23.04.2015, 22:45 | Message # 19
Galaxy Architect
Group: Global Moderators
United States
Messages: 2613
Status: Offline
That's why I love my old dumb flip phone. No cool features, but it's built tough and resists increasing entropy amazingly well. tongue

The shattering of an object after being dropped and hitting a hard surface is definitely an intuitive example of increasing entropy. Thinking about it in terms of the energy transformations, it converted gravitational potential energy into kinetic energy, which then via collision went into doing destructive things. At the end, the physical structure is less ordered, and the energy finally converted into sound and heat. (And the sound ultimately converts into heat as well.)

The second law is intuitive in this case as well: dropping a delicate object, you expect it to shatter. If I played a video of a shattered object coming back together of its own will and then rising up off the ground and into my hand, you know the video was just played backwards. The direction of increasing entropy, "the forward arrow of time", is obvious.

A much less obvious example is the increase in entropy as a gas cloud collapses to form a star or some other compact object. Is the star more orderly than the gas cloud or less? If the cloud naturally collapses to form a star, then the second law tells us the star must be higher entropy. But even that is not quite correct. The collapsing gas cloud heats up, and emits a lot of radiation into surrounding space (not to mention radiation from the star itself). That's where a lot of the entropy increase is manifest: the conversion of gravitational potential energy (in the gas cloud) into heat and light.
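To make that energy bookkeeping concrete (an order-of-magnitude sketch with assumed numbers, not a protostar model): the gravitational energy released in collapsing a solar mass of gas down to the Sun's radius, dumped into radiation at some characteristic temperature, carries off a huge entropy.

```python
G = 6.674e-11      # m^3 kg^-1 s^-2, gravitational constant
M_sun = 1.989e30   # kg, solar mass
R_sun = 6.957e8    # m, solar radius

# Gravitational potential energy released collapsing to a uniform sphere
# of the Sun's radius (order of magnitude; real protostars are messier):
E_released = 3 * G * M_sun**2 / (5 * R_sun)   # ~2e41 J

# If that energy is radiated away at a characteristic temperature of a few
# thousand kelvin (an assumed value), the entropy carried off is roughly E/T:
T_rad = 3000.0
dS_radiated = E_released / T_rad   # ~1e38 J/K dumped into surrounding space
```

So even though the finished star looks "tidier" than the diffuse cloud, the radiation released along the way carries the entropy increase that the second law demands.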

Things get more confusing if you then look at a star exploding. One might naively suppose that the supernova is just a reversal of the formation of a star in the first place -- converting the star back into a diffuse cloud of particles. They may even suppose that the cycle from gas cloud to star to gas cloud is a reversible process, and therefore the net change in entropy is zero. But this is wrong. Both the collapse and the explosion increase the entropy. The reason is that vast amounts of useful energy are lost in both processes: as light and heat during the collapse part, and as light, heat, and radiation (mostly neutrinos!) during the explosion part.

If you sum it all up, you'll find that the star formation and explosion cycle cannot repeat over and over indefinitely. It is not 100% reversible, even if 100% of the material is recycled (no stellar remnant like a neutron star or black hole left over). Of course the stellar remnant does account for much of the entropy gain as well. A black hole will never turn back into a star.

 
parameciumkidDate: Friday, 24.04.2015, 04:32 | Message # 20
Explorer
Group: Users
United States
Messages: 277
Status: Offline
Ah, but what if over billions of years a black hole decays away into a dispersed cloud of elementary particles, which subsequently form atoms and collapse to form a new star?
This would of course be difficult in empty space, but near a galactic core, or in a nebula, where the density of interstellar dust and plasma is high, the emitted particles could be prevented from going too far too fast.

Intel HD Graphics 4000 ;P
 
WatsisnameDate: Friday, 24.04.2015, 05:33 | Message # 21
Galaxy Architect
Group: Global Moderators
United States
Messages: 2613
Status: Offline
Most black hole radiation is extremely long wavelength photons (wavelength is comparable to the diameter of the hole; if you could image a black hole with its own radiation, it would look very fuzzy). This is about as high in entropy as any radiation can be, and isn't really possible to convert back into a star or anything else. In the long term it just adds to the uniform background radiation.
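That "comparable to the diameter" claim can be checked with a quick sketch (mine, treating the Hawking radiation as a blackbody at the Hawking temperature and using Wien's displacement law):

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
hbar = 1.055e-34   # J s
k = 1.381e-23      # J/K
b_wien = 2.898e-3  # m*K, Wien displacement constant

M = 1.989e30       # kg; any mass works, the final ratio is mass-independent

T_hawking = hbar * c**3 / (8 * math.pi * G * M * k)  # Hawking temperature
wavelength = b_wien / T_hawking                      # peak thermal wavelength
diameter = 4 * G * M / c**2                          # Schwarzschild diameter

ratio = wavelength / diameter   # ~8, independent of M
```

The ratio comes out around 8 for any mass, since both the peak wavelength and the diameter grow linearly with M, so every black hole looks "fuzzy" in its own light.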

 
midtskogenDate: Friday, 24.04.2015, 09:38 | Message # 22
Star Engineer
Group: Users
Norway
Messages: 1674
Status: Offline
Wouldn't a big crunch scenario imply that entropy at some point would reverse?

NIL DIFFICILE VOLENTI
 
WatsisnameDate: Friday, 24.04.2015, 11:14 | Message # 23
Galaxy Architect
Group: Global Moderators
United States
Messages: 2613
Status: Offline
If the universe ends as a crunch, then the entropy increases during both the expansion and contraction phase, all the way up to the crunch itself. There is a clear arrow of time through both; the contraction phase played backward looks distinctly different than the expansion phase played forward. They are not reversible processes.

The real unknown is what, if anything, happens after the crunch. Does it burst forth again as a new Big Bang? What if such a universe is cyclical? How does the entropy evolve then?

For now it is impossible to say. We have no physics for singularity states such as Big Bang or Big Crunch, let alone what happens before a bang or after a crunch. I think we would need a more complete theory for this (requiring quantum gravitation at the very least) before we could speak of the entropy throughout multiple universe iterations.

 
FastFourierTransformDate: Friday, 24.04.2015, 15:22 | Message # 24
Pioneer
Group: Local Moderators
Spain
Messages: 542
Status: Offline
Quote Watsisname ()
I think you would enjoy the paper that the wikipedia article cites for that figure, it's extremely fascinating and will help answer your question. smile In particular, check figures 6 and 7. (Also keep in mind that they use entropy with dimensionless units, by dividing it by the Boltzmann constant: k = 1.38×10^(-23) J/K)


Wow, thank you a lot!! So let's see if I have understood:
In 10^(113) years (10^(120) seconds from the big bang), when we reach our final equilibrium, the entropy of the Universe would be k*10^(111) = 1.38*10^(88) J/K, right?

Quote Watsisname ()
It turns out that the total entropy is already pretty close (in a log sense) to the maximum. The vast majority of the entropy gain happened very early on, when the universe was less than a second old.

Yes indeed (in a log sense). Quite interesting. From now (4.3*10^(81) J/K) to the end (1.38*10^(88) J/K), in round numbers, the universe has to gain over three million times the present entropy.

I asked that question because I have a game proposal for SpaceEngine (if I can accept it becoming a game) and because I'm currently trying to explain the concept of entropy to some people without messing it up. In particular I was thinking of the mind-blowing fact that the Universe dies a little (an infinitesimally little, probably) to allow our existence as humans, for example (we are entropy producers).

I've done a little research and I found this paper where they arrive at a figure of 11,404,000 J/K of entropy produced per kg of body mass by a human during its entire lifespan.
Given that the average human has 62 kg of mass (maybe a correction is needed to account for the variation of mass over a lifetime), this means that the life of a person costs an increase of 707,048,000 J/K in the entropy of the Universe.

I think the real number could be quite a bit bigger, because they have only taken into account the increase in entropy due to metabolic processes, heat emission, and specific diets. But what about human production? I think each human life costs more entropy than that, because without us the entropy produced in the assembly of a car, the mining of an ore deposit, or the construction of a skyscraper would never have appeared. So the 7.1*10^(8) J/K of entropy added to the universe by each human doesn't include all the entropy that intelligent human activity produces. I don't know how to calculate that number, but let us assume (a generous rounding) that each of us costs 10^(9) J/K.

Then, to "kill the Universe", nature basically "has" to increase the entropy by 10^(88) J/K more than we have today. Our lives contribute 10^(9) J/K each to this number, as we have seen. Now I want to translate this into time. What I mean is that I want astonishing numbers to fill the gap in the phrase: "The universe will die ________ seconds sooner because of your existence". I want to tell someone: "The Universe, the entire Universe, the Galaxies, Clusters, and hundreds of billions of worlds from here to the edge of the Cosmos, is paying for your existence. All that ever was or ever will be is going to 'die sooner' than expected, because that is the price the Cosmos pays for you to exist. And this is not just new-age nonsense garbage; I can actually quantify it and give you a number!"

I want to know how much I'm wrong with the numbers (in terms of orders of magnitude) and how much I'm wrong in terms of theoretical concepts. So here goes my approach to this issue:
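The worked calculation itself didn't survive in the thread, but it can be reconstructed as a sketch (all values order-of-magnitude; the exponential approach-to-equilibrium model and the 10^(9) J/K per human are the assumptions above):

```python
import math

k = 1.38e-23        # J/K, Boltzmann constant
S_max = k * 1e111   # ~1.38*10^(88) J/K, entropy at "heat death"
S_now = 4.3e81      # J/K, present entropy of the universe (from the paper)
t_now = 4.35e17     # s, current age of the universe (~13.8 Gyr)
dS_human = 1e9      # J/K, assumed entropy cost of one human life

# Model S(t) = S_max*(1 - exp(-q*t)) and fit the rate q to the present value:
q = -math.log(1.0 - S_now / S_max) / t_now

# Adding dS_human today shifts the whole future curve earlier in time by
# dt = dS_human / S'(t_now), where S'(t) = S_max*q*exp(-q*t):
slope_now = S_max * q * math.exp(-q * t_now)
dt_sooner = dS_human / slope_now   # ~10^(-55) s
```

Multiplying by the often-quoted ~100 billion humans who have ever lived (an outside figure, not from the thread) raises this to around 10^(-44) s for all of humanity.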

The result is 10^(-55) seconds!!
That's the time a human life costs in the agony of the Universe.

I don't know if this is even meaningful. 10^(-55) seconds is roughly a trillionth of a Planck time unit (does this even make sense as a time?)

If we talk about all of humanity, the cost to the Universe has been 10^(-44) seconds of its life (more or less one Planck unit of time!!!).


Formula 5.2 on page 279 of this paper about the thermodynamics of the biosphere gives a total entropy production by Earth's biosphere of 2.4*10^(20) J/K every year. This means that life on Earth costs the universe half a Planck unit of time every year!!!!
Correct me if I'm wrong, please.
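As a quick check of that figure (my arithmetic; the ~10^(64) J/K per second slope of the exponential entropy model is an assumed round value):

```python
t_planck = 5.39e-44     # s, Planck time
dS_biosphere = 2.4e20   # J/K per year, biosphere entropy production (cited paper)
slope_now = 1e64        # J/K per second, present slope of the universe's
                        # entropy curve in the exponential model (assumed)

dt_per_year = dS_biosphere / slope_now        # ~2.4*10^(-44) s per year
fraction_of_planck = dt_per_year / t_planck   # ~0.45, i.e. about half
```

So with these assumed inputs, "half a Planck time per year" does come out right to within a factor of order unity.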
 
parameciumkidDate: Friday, 24.04.2015, 21:04 | Message # 25
Explorer
Group: Users
United States
Messages: 277
Status: Offline
Yeah, if I understand Planck units correctly, that really doesn't count. The Planck time (like the Planck length for distance) is effectively the interval below which measurements cannot be made, even by subatomic particles; more or less the "resolution" of the universe, though it's not exactly made of pixels.
Perhaps if the human population increased several times over, it would be possible (in theory) to measure our "damage" to the universe, but at present we're a very small drop in a very large ocean.

Intel HD Graphics 4000 ;P
 
WatsisnameDate: Saturday, 25.04.2015, 08:27 | Message # 26
Galaxy Architect
Group: Global Moderators
United States
Messages: 2613
Status: Offline
Quote FastFourierTransform ()
Wow thank you a lot!! So let see if I have understanded:
In 10^(113) years (10^120 seconds from the big bang), when we reach our final equilibrium, the entropy of the Universe would be k*10^(111)= 1,38*10^(88) J/K right?


Yep, those numbers look good. I think you're smart to calculate from Fig 6, and not include the entropy of the CEH. The result would have been less interesting (and harder to find).

In the interest of correctness, I should point out that heat death doesn't really occur at a particular time -- you can never say "aha, the universe has just reached thermodynamic equilibrium". Physically, even though we may find a time where all structures have evaporated into radiation, that radiation can always become more diffuse and homogenized. The plot in Figure 6 ends at 10^(120) s just because that's a nice round power of ten to stop at after the last interesting and significant process (the decay of the super-massive black holes). And really, the fact that we're looking at time on a log scale that goes out more than a googol years should tell us that heat death is a super slow, asymptotic process.

Of course, you can always define a particular time as the moment of heat death, which is basically what you're doing here. 10^(120) s is good enough. I would say any time after 10^(110) s or so, corresponding to the evaporation of the last super-massive black holes. After that, the universe is basically "dead". An observer would just see the increasingly rare and dim (very dim) bursts of distant exploding black holes, and the ever more attenuated radiation.

Anyway, on to your model and project. You want to model S(t) as an exponential:
S(t) = a(1 - e^(-qt))

which makes S(0) equal to 0,
the limit of S(t) as t goes to infinity is a, which we choose to be 10^(88) or the maximum entropy,
and q (I make q positive and introduce a minus sign to emphasize the function's behavior) is the "growth rate".
The value of q is determined by plugging in a known S(t) and solving for it.

This model is automatically accurate in three places: t=0, t=now, and t-->∞. Its accuracy between t=0 and t=now is horrible (especially for very small t), but you're not interested in that. You're not trying to model the whole evolution of the universe's entropy, for which no simple mathematical formula will work. You only care about the future behavior, for which this model's accuracy is probably not too bad (for order of magnitude estimates). Your question is "if I add some instantaneous boost (the entropy caused by a human) to the curve at t=now, how much sooner does it reach the value it originally had at heat death?"

I haven't checked the numbers you got from that, but obviously they're going to be very small. Really, your math is just showing what we already know: humans don't make up much of the universe. smile

I'm only going to briefly poke at the question of whether or not the increase in entropy from a human equates to the universe going to heat death sooner. I really don't know if that is true or not. I can imagine that all the entropy we make would just happen anyway after the death of the Sun, or by some other process in the distant future but before heat death. Do we really make a long term difference in even the negligibly small sense? Again, I don't know, but for your exercise it's a pretty cool thing to calculate a number from, and say,
"Look, the universe is dying faster because of you!"
devil

 
FastFourierTransformDate: Wednesday, 06.05.2015, 15:14 | Message # 27
Pioneer
Group: Local Moderators
Spain
Messages: 542
Status: Offline
Wow, thanks a lot Watsisname!! It's an honor to have your answer on this issue.

Quote Watsisname ()
Yep, those numbers look good. I think you're smart to calculate from Fig 6, and not include the entropy of the CEH. The result would have been less interesting (and harder to find).


It was quite an ignorant choice, but thanks hahaha

Quote Watsisname ()
In the interest of correctness, I should point out that heat death doesn't really occur at a particular time

Yes, I knew that, but I had to choose a date (I chose the end of the graph because it was "sufficiently dead" biggrin ). In fact, as you have pointed out, thermodynamic equilibrium is never reached, but we can estimate that beyond 10^(120) s there's not much difference from the actual equilibrium state.

Quote Watsisname ()
I'm only going to briefly poke at the question of whether or not the increase in entropy from a human equates to the universe going to heat death sooner. I really don't know if that is true or not. I can imagine that all the entropy we make would just happen anyway after the death of the Sun, or by some other process in the distant future but before heat death.

Yes, sure. I agree that even without humans the universe would die, because of many many other processes, at essentially the same time. We are increasing the entropy for sure, but that happens to be an infinitesimally small help, I think.
The interesting thing would be to calculate the total amount of entropy generated by each planet and compare it to Earth's. Maybe our capability of "killing the cosmos" is noticeable when compared with the average planetary thermodynamic system.
The fact is that planets don't produce much of the entropy, as figure 6 shows hahaha (there are processes in the universe that "do such damage to it" that we are puny in comparison).

Wow, I think my English is getting even worse. I don't know how to express myself without looking stupid
 
VilfateDate: Wednesday, 17.06.2015, 07:09 | Message # 28
Astronaut
Group: Users
China
Messages: 49
Status: Offline
Quote
The stars and Galaxies died and snuffed out, and space grew black after ten trillion years of running down.

One by one Man fused with AC, each physical body losing its mental identity in a manner that was somehow not a loss but a gain.

Man's last mind paused before fusion, looking over a space that included nothing but the dregs of one last dark star and nothing besides but incredibly thin matter, agitated randomly by the tag ends of heat wearing out, asymptotically, to the absolute zero.

Man said, "AC, is this the end? Can this chaos not be reversed into the Universe once more? Can that not be done?"

AC said, "THERE IS AS YET INSUFFICIENT DATA FOR A MEANINGFUL ANSWER."

Man's last mind fused and only AC existed -- and that in hyperspace.

Isaac Asimov, The Last Question biggrin
The best sci-fi for understanding how entropy works, I think.

 