# What I Learned About Entropy

## Fall River

I recently read a relatively new book, "Until The End Of Time" by Brian Greene

Amazingly, to my delight, he doesn't use any math to explain entropy, whether on Earth or in the formation of the universe.

Entropy is going on everywhere, moving from place to place.  

That makes sense because Einstein said, "Energy cannot be created or destroyed; it can only be changed from one form to another."

One of Brian Greene's best examples, in my opinion, is a stick of dynamite.  It represents high quality energy and low entropy.  I like it because it's easy to remember.

But after it explodes it becomes low quality energy, high entropy.  (I hope I got that right, at least I think I did.)

You can't stop entropy, it's going on everywhere, and that, supposedly, is bad news for how the Universe ends.  

No more starlight just darkness, right?

----------

Swedgin (11-02-2021)

----------


## nonsqtr

It's not that simple.

Life acts in opposition to entropy. It's the exact opposite: life represents extreme concentrations of information and structure.

----------

Fall River (10-20-2021)

----------


## UKSmartypants

> I recently read a relatively new book, "Until The End Of Time" by Brian Greene
> 
> Amazingly, to my delight, he doesn't use any math to explain entropy, whether on Earth or to the formation of the universe.
> 
> Entropy is going on everywhere, moving from place to place.  
> 
> That makes sense because Einstein said, "Energy cannot be created or destroyed; it can only be changed from one form to anther."
> 
> One of Brian Greene's best examples, in my opinion, is a stick of dynamite.  It represents high quality energy and low entropy.  I like it because it's easy to remember.
> ...



well me and nonsqtr have slightly different views on this.


The issue I have with entropy is that it is allegedly a consequence of the Second law of Thermodynamics and used as an indicator of the direction of the Arrow of Time.

Let me tell you why this is an issue:

1. The 2nd law assumes the universe is a closed system. There is no proof of this. In fact, if the Universe is infinite, it can't possibly be.

2. The 2nd law assumes the amount of Matter/Energy/Information in the Universe is fixed. There is also no evidence that's true. Again, if the Universe is infinite, then the amount of Matter/Energy/Information may be infinite. On the other hand, we know for a fact information falls into a black hole, and no one has come up with a theory to explain what happens to it when the black hole evaporates.

3. There is nothing in physics that demands a time dimension. Nonsqtr disagrees with this, but he tends to base his arguments on what goes on in the human brain, which I think is irrelevant.

I think (as do a lot of others) that time, and therefore entropy, is a local phenomenon associated with random local collapses, on a macro scale, of wave functions. Otherwise the question of how the universe came into existence with no observers gets a bit awkward.

----------

StanAtStanFan (10-20-2021),teeceetx (10-20-2021)

----------


## StanAtStanFan

> It's not that simple.
> 
> Life acts in opposition to entropy. It's the exact opposite, life represents extreme concentrations of information and structure.



Energy can only change shape or composition, but is never destroyed.

I would agree. Death, in fact (all religious opinions aside), is the elimination of entropy, I would venture.

Stan

----------


## Freewill

> I recently read a relatively new book, "Until The End Of Time" by Brian Greene
> 
> Amazingly, to my delight, he doesn't use any math to explain entropy, whether on Earth or to the formation of the universe.
> 
> Entropy is going on everywhere, moving from place to place.  
> 
> That makes sense because Einstein said, "Energy cannot be created or destroyed; it can only be changed from one form to anther."
> 
> One of Brian Greene's best examples, in my opinion, is a stick of dynamite.  It represents high quality energy and low entropy.  I like it because it's easy to remember.
> ...


One small addition: even though Einstein did say "Energy cannot be created or destroyed," that had to be prior to 1944, when in fact mass was converted to energy. So it should be, "Energy cannot be created or destroyed except in a nuclear reaction."

----------


## Oceander

> One small addition, even though Einstein did say "Energy cannot be created or destroyed"  That had to be prior to 1944 when in fact mass was coverted to energy.  So it should be, "Energy cannot be created or destroyed except in a nuclear reaction.


But energy isn't created or destroyed, it's just put into different forms.  The energy released from a nuclear reaction is part of the binding energy of the subatomic particles.

----------

Physics Hunter (10-21-2021),StanAtStanFan (10-20-2021)

----------


## UKSmartypants

> One small addition, even though Einstein did say "Energy cannot be created or destroyed"  That had to be prior to 1944 when in fact mass was coverted to energy.  So it should be, "Energy cannot be created or destroyed except in a nuclear reaction.



well no, the basic truth is that matter, energy and information are three forms of the same thing



You convert matter to energy when you have a meal then run a marathon

----------


## Fall River

> It's not that simple.


I agree, and the author spends a lot of time explaining why it's not simple.




> Life acts in opposition to entropy. It's the exact opposite, life represents extreme concentrations of information and structure.


Yes, and he explains by looking at what happens in the overall scheme of things - the big picture.  

An obvious example, on a smaller scale, is what happens when a baby is born: it grows and gains more information and structure.

I skipped over some chapters so I'm not too clear on how to explain the overall effect, but I assume, in the future, there will be more stars burning out than new ones being born. So eventually there will be no stars.
I'll have to go back and do more reading to see exactly how he envisions the end of the universe.  

It's in the title: "The End Of Time"   :Smiley20:

----------


## Fall River

> well me and nonsqtr have slightly different views on this.
> 
> 
> The issue I have with entropy is that it is allegedly a consequence of the Second law of Thermodynamics and used as an indicator of the direction of the Arrow of Time.
> 
> Let me tell you why this is an issue
> 
> 1. The 2nd law  assumes the universe is a closed system. Ther is no proof of this. In fact, if the Universe is infinite, it cant possibly be.
> 
> ...


You said, "matter/energy/information may be infinite", but can't it still become totally disorganized over a long period of time so that it no longer looks and operates like the universe we know today (i.e., it becomes low quality energy)?

----------


## nonsqtr

> well me and nonsqtr have slightly different views on this.
> 
> 
> The issue I have with entropy is that it is allegedly a consequence of the Second law of Thermodynamics and used as an indicator of the direction of the Arrow of Time.
> 
> Let me tell you why this is an issue
> 
> 1. The 2nd law  assumes the universe is a closed system. Ther is no proof of this. In fact, if the Universe is infinite, it cant possibly be.
> 
> ...


No, it's simple mathematics.

Stochastic processes are by definition irreversible.

Selection from a set cannot be reversed. That's simple math, and it has nothing to do with neurons.

"Choice" is an irreversible process. Period.

----------


## nonsqtr

> well no, the basic truth is that matter, energy and information are three form of the same thing


That's what I think too.

But, most people consider "energy" to be the fundamental driver of that trio, and I don't.

I believe "information" is the fundamental driver.

Information happens, and energy goes along with it. Not the other way around.

----------


## nonsqtr

> I recently read a relatively new book, "Until The End Of Time" by Brian Greene
> 
> Amazingly, to my delight, he doesn't use any math to explain entropy, whether on Earth or to the formation of the universe.
> 
> Entropy is going on everywhere, moving from place to place.  
> 
> That makes sense because Einstein said, "Energy cannot be created or destroyed; it can only be changed from one form to anther."
> 
> One of Brian Greene's best examples, in my opinion, is a stick of dynamite.  It represents high quality energy and low entropy.  I like it because it's easy to remember.
> ...


I believe entropy is a "result", a consequence. In that view it seems Smarty and I are aligned.

There's way more going on here than we understand. First of all the "volume" of our universe is expanding, and it's not like a balloon where the edges expand, it's happening everywhere all at once. It makes sense that nature abhors a vacuum and things will tend to migrate into the new empty space.

Second, there is NO local measure of information. Such a thing doesn't exist. Every measure of information that we have or know, depends on knowledge of the whole. A probability density "by definition" is a fraction of the whole.

This is why an understanding of topology is now so vital. Math, so far, has been mostly based in "points", and points are kind of a bottom-up view. There is a whole different view that's top-down, where you "can't" have points because the concept is nonsensical. You can have "intervals" and "areas" and "volumes", but if you start talking about points you get into this weird nonsensical math where, for instance, the probability of selecting any single outcome from a continuous distribution is zero. Which is clearly nonsensical, but satisfies the prevailing math.

----------


## nonsqtr

> You said, "matter/energy/information may be infinite", but can't it still become totally disorganized over a long period of time so that it no longer looks and operates like the universe we know today (i.e., it becomes low quality energy)?


Gibbs entropy is defined in terms of "number of possible combinations", or states.

Disorganization does not change the number of possible states.

----------


## nonsqtr

> I agree, and the author spends a lot of time explaining why it's not simple.
> 
> 
> 
> Yes, and he explains by looking at what happens in the overall scheme of things - the big picture.  
> 
> An obvious example, on a smaller scale, is what happens when a baby is born, it grows and gains more information and structure.
> 
> I skipped over some chapters so I'm not too clear on how to explain the overall effect, but I assume, in the future, there will be more stars burning out than new ones being born. So eventually there will be no stars.
> ...


You probably want to understand the issue of commutativity in quantum mechanics.

Commutativity means you can interchange the order in which things are done.

Non-commutativity means that order matters. The expression is called the "commutator": AB - BA, which either equals or does not equal 0.

If order matters, it's "almost" the same thing as irreversibility.
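To make the commutator concrete, here's a minimal Python sketch (the choice of Pauli matrices is my illustration, not something from the thread):

```python
def matmul(A, B):
    """2x2 matrix product, kept dependency-free for illustration."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def commutator(A, B):
    """AB - BA; zero exactly when A and B commute."""
    AB, BA = matmul(A, B), matmul(B, A)
    return [[AB[i][j] - BA[i][j] for j in range(2)] for i in range(2)]

# The Pauli matrices sigma_x and sigma_z famously do not commute:
sigma_x = [[0, 1], [1, 0]]
sigma_z = [[1, 0], [0, -1]]
print(commutator(sigma_x, sigma_z))  # [[0, -2], [2, 0]], nonzero: order matters
```

Swapping the order of the two multiplications flips the sign of the result, which is the "order matters" point in miniature.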

----------


## nonsqtr

The fundamental issue with thermodynamics ("statistical mechanics") is it includes states we can't measure.

The Jaynes formulation of the Gibbs equilibrium is based on optimization of the Von Neumann entropy, which includes states we can't measure.

Um... we're only looking at real parts along diagonals, right? If we can't measure something it's not, strictly speaking, an observable.

----------


## UKSmartypants

> You said, "matter/energy/information may be infinite", but can't it still become totally disorganized over a long period of time so that it no longer looks and operates like the universe we know today (i.e., it becomes low quality energy)?


well if the universe is infinite it can never max out entropy.

and if it can't do that, it's not a closed system

so the second law of thermodynamics can't apply.

so time is bollox.




Also, the Heat Death of the Observable Universe has already happened; we are now at 2.7K. But we have no way of determining how far that has spread out beyond the furthest point we observe, the Epoch of the Last Scattering, if there is a beyond.

----------


## Physics Hunter

> It's not that simple.
> 
> Life acts in opposition to entropy. It's the exact opposite, life represents extreme concentrations of information and structure.


No.  Entropy is.

Life seeks and uses energy to climb out of entropy to create complexity, while creating more entropy.

There is no perpetual motion machine.

----------

Fall River (10-21-2021)

----------


## Physics Hunter

> That's what I think too.
> 
> But, most people consider "energy" to be the fundamental driver of that trio, and I don't.
> 
> I believe "information" is the fundamental driver.
> 
> Information happens, and energy goes along with it. Not the other way around.



No, without energy, information is useless.

----------

Fall River (10-21-2021)

----------


## Physics Hunter

> well if the universe is infinite it can never max out entropy.
> 
> and if it cant do that, its not a closed system
> 
> so the second law of thermodynamics cant apply.
> 
> so time is bollox.
> 
> 
> ...


The universe is not infinite, it is expanding.

----------


## UKSmartypants

> The universe is not infinite, it is expanding.



You have no proof or other evidence of the size of the universe beyond the Epoch of the Last Scattering at the 13.7B years we know of. All you can say is THIS bubble can't be more than 13.7B years old; however, THIS bubble is expanding at roughly 3.5c, so the expanding front must be at 46.5B LY. Beyond that you don't know what there is.

The Cosmological Constant is so close to 1.0 it implies it's flat, which is also suggested by measurements of parallel light beams. Current measurements aren't accurate enough for us to know whether the universe's flat geometry is represented by a piece of paper, a cylinder, a torus, or any other shape that permits the parallel passage of two beams of light. An infinite universe could have a geometry that is totally flat like a piece of paper. Such a universe would go on forever and include every possibility, including endless versions of ourselves. On the other hand, a donut-shaped universe would have to be finite, as it's closed.

But for now we still don't know the shape of the universe, and therefore nor can we know its size, and only when we know the geometry can we make an observation about the true nature of entropy.

----------


## UKSmartypants

> It's not that simple.
> 
> Life acts in opposition to entropy. It's the exact opposite, *life represents extreme concentrations of information* and structure.





> No.  Entropy is.
> 
> Life seeks and *uses energy to climb out of entropy to create complexity, while creating more entropy.*
> 
> There is no perpetual motion machine.





I think you are both arguing about semantics; the bolded parts of both statements are approximately and arguably true. The mistake you both make is using the word 'life'. The universe does not need life to function; its processes continue without observers. Hence entropy, whether it's a local or universal measure, continues to accrue regardless.

----------


## Fall River

https://astronomy.com/news/magazine/...the-big-freeze

*Astronomers once thought the universe could collapse in a big crunch. Now most agree it will end with a Big Freeze.*

"Trillions of years in the future, long after Earth is destroyed, the universe will drift apart until galaxy and star formation ceases. Slowly, stars will fizzle out, turning night skies black."

This seems to be the same line of thinking expressed by Brian Greene, the author I mentioned in my opening post.  The end of the universe and the end of time is inevitable.

----------


## nonsqtr

> No, without energy, information is useless.


Prove it.

----------


## nonsqtr

> No.  Entropy is.
> 
> Life seeks and uses energy to climb out of entropy to create complexity, while creating more entropy.
> 
> There is no perpetual motion machine.


What I meant was, life is a natural phenomenon in apparent violation of the Second Law. Within this six-foot bag of water, there is no tendency to disorder.

----------


## nonsqtr

> https://astronomy.com/news/magazine/...the-big-freeze
> 
> *Astronomers once thought the universe could collapse in a big crunch. Now most agree it will end with a Big Freeze.*
> 
> "Trillions of years in the future, long after Earth is destroyed, the universe will drift apart until galaxy and star formation ceases. Slowly, stars will fizzle out, turning night skies black."
> 
> This seems to be the same line of thinking expressed by Brian Greene, the author I mentioned in my opening post.  The end of the universe and the end of time is inevitable.


Nah. They're just guessing.  :Smile:

----------

Fall River (10-22-2021)

----------


## UKSmartypants

> Nah. They're just guessing.



I agree. At the moment we can't even agree on a value for the expansion rate; you get two values depending how you measure it. And there's always the possibility of a False Vacuum. Or the energy density per unit volume might become too low to support spacetime. Or there are even some maths to show it's cyclic.

Need to solve the Dark Energy/Dark Matter issue, and as I've previously posted, Teleparallel Gravity is the way forward to fix the standard model.

----------


## Physics Hunter

> You have no proof or other evidence to the size of the universe beyond the Epoch of the Last Scattering at 13.7 years we know  of, all you can say is THIS bubble cant be more than 13.7B years old, however  THIS bubble is expanding at roughly 3.5c so the expanding front must be at 46.5B LY. Beyond that you dont  know what there is. The Cosmological  Constant is so close to 1.0 it implies its flat, which is also suggested by measurements of parallel light beams . Current measurements aren’t accurate enough for us to know whether the universe’s flat geometry is represented by a piece of paper, a cylinder, torus, or any other shape that permits the parallel passage of two beams of light. An infinite universe could have a geometry that is totally flat like a piece of paper. Such a universe would go on forever and include every possibility — including endless versions of ourselves. On the other hand, a donut-shaped universe would have to be finite, as it's closed. But for now we still don't know the shape of the universe, and therefore nor can we know its size,  and only when we know the geometry can we make an observation  about the true nature of entropy.


Now you know why I stayed out of astronomy, too many people trying to think up some theory to get it named after themselves.  

I will continue to assume a spherical chicken until informed of some breakthrough...

----------


## Physics Hunter

> Prove it.


Without energy you cannot do work; without work you cannot exploit info.

Done and out.

----------


## Physics Hunter

> What I meant was, life is a natural phenomenon in apparent violation of the Second Law. Within to his six foot bag of water, there is no tendency to disorder.


Apparent is the magic word; take that body in even its local context and it is a strong entropy generator.

----------


## Physics Hunter

> I think you are both arguing about semantics, the bolded partts of both statement are approximately and arguably true.  The mistake you both make is use the word 'life'. The universe does not need  life to function, its processes continue without oberservers. Hence, entropy, whether its a local or universal measure,  continues to accrue regardless.


No; as in all discussions of a system's entropy, the question usually turns to how big the system boundaries are.

----------

Northern Rivers (10-22-2021)

----------


## Northern Rivers

> No, as in all discussions of a system's entropy, the discussion usually turns to how big the system boundaries are.


That's right out of my Anger Management mentoring. Damned near word for word, too: 

"Ignore the anger you feel and it will go away. At worst, you'll see it places boundaries on your own life."

Anger Entropy. Cool!  :Headbang:

----------

Physics Hunter (10-22-2021)

----------


## UKSmartypants

> Without energy you cannot do work, without work you cannot exploit info.
> 
> Done and out.



No, you are misrepresenting what 'information' is. There are only two sorts of basic information: spin and charge. They are independent of energy and mass, and obey the Pauli Exclusion Principle. And that's all mass equivalence applies to. Launders Principle has shown that the mass of a bit of information at room temperature (300K) is 3.19 × 10⁻³⁸ kg.




Now let me chuck this into the ring

The estimated mass of a bit of information at T = 2.73K is m_bit = 2.91 × 10⁻⁴⁰ kg. Assuming that all the missing dark matter is in fact information mass, the initial estimates indicate that ∼10⁹³ bits would be sufficient to explain all the missing dark matter in the visible Universe. Remarkably, this number is reasonably close to another estimate of the Universe information bit content of ∼10⁸⁷ given in 2008 via a different approach.


This means that if 4/5ths of the mass of the universe, supposedly Dark Matter, is in fact D6 information, then the Universe has already all but completed its Heat Death, and the residual entropy is irrelevant, and certainly no indicator of the existence of time.
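For what it's worth, those per-bit figures do follow from Landauer's bound, m = k_B T ln 2 / c². A quick Python check (the constants are standard values; the function name is mine):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light, m/s

def bit_mass(T_kelvin):
    """Mass equivalent of one bit's Landauer erasure energy: k_B*T*ln2 / c^2."""
    return k_B * T_kelvin * math.log(2) / c**2

print(f"{bit_mass(300):.3g} kg")   # about 3.19e-38 kg at room temperature
print(f"{bit_mass(2.73):.3g} kg")  # about 2.91e-40 kg at the CMB temperature
```

Both outputs match the figures quoted above to three significant figures.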

----------


## nonsqtr

You mean Landauer's principle, which, it turns out, is nothing more than a restatement of the Gibbs formulation.

But, it turns out, there is absolutely no relationship between logical reversibility and thermodynamic reversibility. There's at least one group that's already claimed a violation of Landauer's principle.

----------


## nonsqtr

Landauer's principle is based on circular reasoning. It assumes the validity of the Second Law, and then draws conclusions from the assumption.

The Second Law is a STATISTICAL (not mechanistic) description. It "assumes" that only certain combinations are possible, and those assumptions are already provably false.

And yes, there is no such thing as a closed system.

----------


## nonsqtr

You wanna know the easy way to determine that Landauer's law is bullshit, "by inspection"?

Look at the equation. It has the term T in it. It's bullshit.

----------


## nonsqtr

> No, you are misrepresenting what 'information' is.  There is only two sort  of basic information - spin and charge. They are independent of energy and mass, and obey the Pauli Exclusion Principle.  And thats all Mass equivalence applies to.  Launders Principle has shown that the mass of a bit of information at room temperature (300K) is 3.19 × 10⁻³⁸ kg
> 
> 
> 
> 
> Now let me chuck this into the ring
> 
> The estimated mass of a bit of information at T = 2.73K is m_bit = 2.91 × 10⁻⁴⁰ kg. Assuming that all the missing dark matter is in fact information mass, the initial estimates indicate that ∼10⁹³ bits would be sufficient to explain all the missing dark matter in the visible Universe. Remarkably, this number is reasonably close to another estimate of the Universe information bit content of ∼10⁸⁷ given in 2008 via a different approach.
> 
> ...


Parity?

----------


## nonsqtr

> No, as in all discussions of a system's entropy, the discussion usually turns to how big the system boundaries are.


However, the entropy of any joint system is no higher than the sum of the entropies of its parts, but may be lower due to mutual information.
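That subadditivity claim can be sketched numerically. A minimal Python example, with a perfectly correlated pair of fair bits as my illustrative distribution:

```python
import math

def shannon_bits(dist):
    """Shannon entropy in bits of a probability table (dict: outcome -> prob)."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A perfectly correlated pair: X = Y, each a fair coin flip.
joint = {(0, 0): 0.5, (1, 1): 0.5}
marg_x = {0: 0.5, 1: 0.5}
marg_y = {0: 0.5, 1: 0.5}

h_joint = shannon_bits(joint)                          # 1 bit
h_parts = shannon_bits(marg_x) + shannon_bits(marg_y)  # 2 bits
mutual = h_parts - h_joint                             # the shortfall is I(X;Y)
print(h_joint, h_parts, mutual)
```

The joint entropy (1 bit) sits a full bit below the sum of the parts (2 bits), and that 1-bit gap is exactly the mutual information between the two variables.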

----------


## nonsqtr

Warning: the prevailing pop-science definition of entropy as disorder or chaos is WRONG.

It's wrong as the day is long, yet it gets blindly echoed by millions of so-called "experts".

The problem can be easily seen in the Hartley formulation, which is basically the same concept as the Gibbs formulation. (However the Hartley function can be realized without referencing probability, which is interesting)...

Coming at this in a roundabout way, here's the basic problem: there are structures with finite area but infinite perimeter. An example is the Koch snowflake.
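The Koch snowflake's boundary-versus-area behavior can be sketched in a few lines of Python, assuming the standard construction where each edge is replaced by four edges a third as long:

```python
import math

def koch(iterations, side=1.0):
    """Perimeter and area of the Koch snowflake after some iterations."""
    edges, length = 3, side                   # start from an equilateral triangle
    area = math.sqrt(3) / 4 * side**2
    for _ in range(iterations):
        # every edge sprouts one small equilateral triangle of side length/3
        area += edges * math.sqrt(3) / 4 * (length / 3) ** 2
        edges *= 4                            # each edge becomes 4 edges...
        length /= 3                           # ...each a third as long
    return edges * length, area

perimeter, area = koch(12)
# perimeter grows without bound (factor 4/3 per step), while area converges
# to 8/5 of the starting triangle, about 0.693 for a unit side
print(perimeter, area)
```

Running more iterations keeps multiplying the perimeter by 4/3 while the area barely moves, which is the finite-content, infinite-boundary point in miniature.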

----------


## Physics Hunter

> No, you are misrepresenting what 'information' is.  There is only two sort  of basic information - spin and charge. They are independent of energy and mass, and obey the Pauli Exclusion Principle.  And thats all Mass equivalence applies to.  Launders Principle has shown that the mass of a bit of information at room temperature (300K) is 3.19 × 10⁻³⁸ kg
> 
> 
> 
> 
> Now let me chuck this into the ring
> 
> The estimated mass of a bit of information at T = 2.73K is m_bit = 2.91 × 10⁻⁴⁰ kg. Assuming that all the missing dark matter is in fact information mass, the initial estimates indicate that ∼10⁹³ bits would be sufficient to explain all the missing dark matter in the visible Universe. Remarkably, this number is reasonably close to another estimate of the Universe information bit content of ∼10⁸⁷ given in 2008 via a different approach.
> 
> ...


If you guys are talking about information at that level, I will butt out, not much interest.

----------


## UKSmartypants

> If you guys are talking about information at that level, I will butt out, not much interest.



well I always do. Nonsqtr tries to talk about human brains but that's irrelevant. I talk cosmology, 'cos the proposition is entropy, and time. It's irrelevant what it does in the brain; the issue is, is time and entropy universal or local? They might be local at the scale of the brain, but I'm sure they aren't universal.

----------


## nonsqtr

> well I always do. Nonsqtr tries to talk about human brains but thats irrelevant. I talk cosmology, 'cos the proposition is entropy, and time. Its irrelevant what it does in the brain,  the issue is, is time and entropy universal or local. They might be local at the scale of the brain, but im sure they arent universal.


Entropy is meaningless.

The people who try to define entropy locally are WRONG, provably so.

Entropy is a NON-LOCAL measure, not a physical observable.

----------


## nonsqtr

> well I always do. Nonsqtr tries to talk about human brains but thats irrelevant. I talk cosmology, 'cos the proposition is entropy, and time. Its irrelevant what it does in the brain,  the issue is, is time and entropy universal or local. They might be local at the scale of the brain, but im sure they arent universal.


Start here: S = k ln W

"W" is knowledge of the whole.
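In S = k ln W terms, a toy Python check (the W values here are arbitrary illustrations, not physical counts):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def S(W):
    """Boltzmann entropy S = k ln W for W equally likely microstates."""
    return k_B * math.log(W)

# When the accessible space grows, W grows, and entropy rises with it.
# Doubling W adds exactly k ln 2, whatever the configurations look like:
delta = S(2_000_000) - S(1_000_000)
print(delta, k_B * math.log(2))
```

The point the formula makes: entropy depends only on the count of the whole state space W, so you can't evaluate it without knowledge of the whole.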

----------


## nonsqtr

Boltzmann waved his hands over this issue. He tried to split entropy into "configuration entropy" and "other kinds". His ideal gas formulation dealt with his concept of configuration entropy, and he and Maxwell developed a nice little model of colliding particles.

But now consider this from an AI standpoint. "Configuration entropy" is nothing more than the arrangement of bits in a byte. It's just easier in a byte because we know the positions of the bits in advance, we don't have to deal with "unitary transformations that result in the same state".

----------


## nonsqtr

When the cue ball hits the triangle and the billiards go all over the place, the entropy doesn't increase because of the configuration, it increases because THE SPACE HAS GROWN. The little triangle confining the balls has become an entire pool table.

The same thing happens in the brain as happens in the universe: the space expands. The universe is expanding.

----------


## nonsqtr

> well I always do. Nonsqtr tries to talk about human brains but thats irrelevant. I talk cosmology, 'cos the proposition is entropy, and time. Its irrelevant what it does in the brain,  the issue is, is time and entropy universal or local. They might be local at the scale of the brain, but im sure they arent universal.


Entropy has nothing to do with time.

That relationship only exists in an ideal gas, which doesn't exist in real life.

----------


## nonsqtr

Here's a thought experiment that might help:

Let's say you have 8 bits in a byte.

Now rearrange the bits along the vertices of an octagon (which is symmetrical, so you lose order unless you choose to define a direction).

Which arrangement has the greater entropy?

----------


## nonsqtr

> well I always do. Nonsqtr tries to talk about human brains but thats irrelevant. I talk cosmology, 'cos the proposition is entropy, and time. Its irrelevant what it does in the brain,  the issue is, is time and entropy universal or local. They might be local at the scale of the brain, but im sure they arent universal.


Here, look - at the core of all randomness are the "distributions", the probability density functions that describe the nature of the randomness.

In the case of Boltzmann's ideal gas, the distribution is based on combinatorics. ("W"). But how would the behavior change if there were a different distribution?

"Entropy" is a measure of mutual information. You can define mathematically "how different" two distributions are. This is called the Renyi entropy, and it generalizes the Shannon entropy, the Gibbs entropy, collision entropy, min-entropy, and a whole bunch of other things.

The Renyi entropy is an index of diversity. In quantum land it's used as a measure of entanglement. 

Take a careful look at the formula defining the Renyi divergence. The special case of a = 1 can only be defined as a limit, yet it generates the Shannon entropy and the Kullback-Leibler formula.

The point is: entropy is MUTUAL information except in one very specific case that can only be defined as a limit. Entropy is INHERENTLY mutual, by definition. There is no such thing as "information that stands alone". A point cannot "have" information; if there is information, it requires an additional context, maybe spin or charge, some kind of "diversity", which Boltzmann equated with W.
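A minimal Python sketch of that a → 1 limit recovering the Shannon entropy (the example distribution is my own illustration):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha, in bits, for a discrete distribution p."""
    if abs(alpha - 1.0) < 1e-12:
        # alpha = 1 exists only as a limit; the limit is the Shannon entropy
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(p, 1.0))       # Shannon entropy: 1.75 bits
print(renyi_entropy(p, 1.000001))  # nudging alpha toward 1 reproduces it
print(renyi_entropy(p, 2.0))       # collision entropy, a smaller value
```

Evaluating near alpha = 1 lands arbitrarily close to the Shannon value even though the formula itself is undefined there, which is the "defined only as a limit" point.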

The link between Smarty and me, is that in both cases (cosmology and the brain), the diversity happens "by itself". In cosmology they call it "quantum fluctuations", in neuroscience we call it "spontaneous activity", but a rose by any other name...

So I'll drop a bomb here - if quantum existence is truly "fuzzy" and all possibilities happen simultaneously in parallel like the model suggests, then topology says the most likely configuration is a DUST, like a Cantor dust. If you work with dusts for a while you begin to realize that there is an intimate relationship with probability distributions.

Physicists don't know (much) about dusts. What we know though, is they are non-obvious solutions to "almost all" of the ordinary and partial diff eq's involved in physics. The dynamicists only found out about vortex solutions 100 years ago, and they haven't had time or reason to consider the dusts yet.

----------


## Fall River

> When the cue ball hits the triangle and the billiards go all over the place, the entropy doesn't increase because of the configuration, it increases because THE SPACE HAS GROWN. The little triangle confining the balls has become an entire pool table.
> 
> The same thing happens in the brain as happens in the universe: the space expands. The universe is expanding.



When you say entropy isn't local, what exactly do you mean?  You mention the brain, isn't that local?  What about the rest of the body?  No entropy there?


What is a seed?  Does a seed contain information?  Doesn't physics apply to everything?  Or does physics only apply to what you want it to apply to?

----------


## nonsqtr

> When you say entropy isn't local, what exactly do you mean?  You mention the brain, isn't that local?  What about the rest of the body?  No entropy there?
> 
> 
> What is a seed?  Does a seed contain information?  Doesn't physics apply to everything?  Or does physics only apply to what you want it to apply to.


This was kinda the point about Boltzmann's concept of configuration entropy, and also why I mentioned dusts.

We think of "configuration" as being non-local. For example, bits in a byte. Or, the number of spatial arrangements available to ideal molecules in an ideal gas.

But, a "point" is local. If you take the bit out of its byte, it only has two states, which could be similar to something like charge or spin in a particle. In the byte it still has two states, but they occur "in context", which gives them additional meaning.

So in Boltzmann's idea, when you put the bit into the byte you get some additional "configuration entropy" over and above the native two states.

Unfortunately, it's not that simple. In the case of quantum entanglement the wave functions merge and there is "mutual information", which is something bits don't have.

Consider: the quantum entity is "indistinct". It's not really a point; it's kind of a cloud. How would we deal with that concept in the context of the Boltzmann formulation? As the cloud changes size and shape, does the entropy change? After all, the cloud is undergoing configurational changes...

The "dust" concept addresses these issues, and it also suggests a very elegant mechanism for the structure of matter, that hints at why all possibilities seem to exist. A dust can be created with a "branching random walk" which is a phenomenon that is ubiquitous in nature, it determines for instance the shapes of trees and leaves.

A "dust" is non-local by definition. It can't be created with "points", only from intervals. Intervals are non-local, they "cover" a collection of points.

The eye-opener with dusts is they "inherit" information. Interesting stuff.  :Smile:
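The "can't be created with points, only from intervals" construction can be sketched in a few lines. This is a toy illustration of the standard ternary Cantor recursion (mine, not anything from the thread): start from one interval and keep only the outer thirds at each step.

```python
# Toy sketch of the ternary Cantor dust: start from one interval and
# recursively keep the outer thirds, discarding the open middle third.
def cantor_intervals(depth, interval=(0.0, 1.0)):
    """Intervals remaining after `depth` rounds of middle-third removal."""
    if depth == 0:
        return [interval]
    a, b = interval
    third = (b - a) / 3.0
    return (cantor_intervals(depth - 1, (a, a + third)) +
            cantor_intervals(depth - 1, (b - third, b)))

ivals = cantor_intervals(5)
print(len(ivals))                    # 2**5 = 32 surviving intervals
print(sum(b - a for a, b in ivals))  # total length (2/3)**5, about 0.132
```

Note the direction of the construction: the algorithm never touches a "point", it only subdivides intervals; the points are what's left over in the limit, which is the "backwards" character of a dust.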

----------


## UKSmartypants

> Entropy has nothing to do with time.
> 
> That relationship only exists in an ideal gas, which doesn't exist in real life.


Yes it does. Entropy is the pointer to the direction of the arrow of time, because at quantum levels everything is time symmetric.  You cannot determine the direction of time by observing quantum processes.

This implies entropy is a phenomenon of a medium macro scale, the sort of human brain size range.

It can't apply to the universe as a whole because there's no proof the Second Law actually applies

and it def doesn't apply to time symmetric quantum processes.

----------


## Fall River

In the book I've been reading, Brian Greene, a physics professor, writes about entropy as it applies to the human body.  In one section he starts with energy from the sun: entropy at work as energy is transferred from the sun to grow plants on Earth. We then eat the plant material (vegetables) and break it down to supply the energy we need to live, so energy is transferred to the human body. Then entropy takes place again when the human body gives off heat and waste.  

So entropy is going on everywhere in the universe and we are part of it.  Entropy is going on in the human body all our lives, and we might think in terms of disorder because the older we get, the less well our bodies tend to function. Entropy might have something to do with aging, because lower energy becomes more and more obvious for most people as they reach old age. Perhaps that could be called low-quality energy.  I threw in the last three sentences; that's not from the book.  That's the way I see it.  :Bom:

----------


## UKSmartypants

> In the book I've been reading, Brian Greene, a retired physics instructor, writes about entropy as it applies to the human body.  In one section he starts with energy from the sun: It's entropy by way of energy being transferred from the sun to grow plants on earth. We then eat the plant material (vegetable) and break it down to supply the energy we need to live.  So energy is transferred to the human body. Then entropy takes place again when the human body gives off heat and waste.  
> 
> So entropy is going on everywhere in the universe and we are part of it.  Entropy is going on in the human body all of our lives and we might think in terms of disorder because the older we get the less well our bodies tend to function. Entropy might have something to do with aging because less energy becomes more and more obvious for most people as they reach old age. Perhaps that could be called low quality energy.  I threw in the last 3 sentences, that's not from the book.  That's the way I see it.



no no no it isn't.  I explained previously.

If the universe is flat and infinite (in fact, if it's infinite in any geometry), that brings into question the validity of the Second Law of Thermodynamics, which is the bedrock of entropy. If the Second Law doesn't apply, then entropy can only apply on a medium macro scale. 

How does Greene account for the information lost into black holes if the amount of matter/energy/information, and thus entropy, in the Universe isn't fixed by its geometry?

----------


## Fall River

> no no no it isn't.  I explained previously.
> 
> If the universe is flat and infinite (in fact, if it's infinite in any geometry), that brings into question the validity of the Second Law of Thermodynamics, which is the bedrock of entropy. If the Second Law doesn't apply, then entropy can only apply on a medium macro scale. 
> 
> How does Greene account for the information lost into black holes if the amount of matter/energy/information, and thus entropy, in the Universe isn't fixed by its geometry?


Regardless of what's going on in the universe, he says entropy is energy transferring from one place to another.  He does devote a few pages to Thermodynamics and at the end says, "The nuclear force, in tandem with gravity, is a fount of life-giving low-entropy fuel."  Does that help?   Don't have time to read those pages again at this time. If you need more to understand his thinking, let me know and I'll be back.

He was referring to the Sun when he said, "The nuclear force, in tandem with gravity."

----------


## UKSmartypants

> Regardless of what's going on in the universe, he says entropy is energy transferring from one place to another.  He does devote a few pages to Thermodynamics and at the end says, "The nuclear force, in tandem with gravity, is a fount of life-giving low-entropy fuel."  Does that help?   Don't have time to read those pages again at this time. If you need more to understand his thinking, let me know and I'll be back.
> 
> He was referring to the Sun when he said, "The nuclear force, in tandem with gravity."



yeah, but the trouble is, he can't tell you what gravity actually is. No one can at the moment. And he has no proof entropy is universal, for the reasons I mentioned.

All physics is guesswork, until someone proves they are right  :Big Grin:

----------


## nonsqtr

> Yes it does. Entropy is the pointer to the direction of the arrow of time, because at quantum levels everything is time symmetric.


I challenge that statement.

Decoherence is NOT time symmetric; you cannot "recohere" an observation.




> You cannot determine the direction of time by observing quantum processes.


Horse feathers.




> This implies entropy is a phenomenon of a medium macro scale, the sort of human brain size range.


A dust has the same number of bits as the interval from whence it came.

Study dusts. A dust is a perfect set that is nowhere dense. A simple dust is a binary tree. (Not necessarily symmetric, there are stochastic dusts as well). Dusts are complete metric spaces, and compact by Heine-Borel.

Any compact metric space is a continuous image of a dust. Dusts have natural Haar measures, so functions can be integrated over them. Therefore, if you normalize so the measure of the set is 1, a dust becomes an infinite series of coin tosses. A Cantor set is a "universal probability space", because the Haar measure is an image of any probability.
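The "infinite series of coin tosses" picture can be made concrete. A small sketch (my illustration, assuming the usual ternary encoding, where each flip picks digit 0 for the left third or digit 2 for the right third):

```python
def coin_flips_to_cantor_point(flips):
    """Map a finite prefix of coin tosses to a point of the ternary Cantor
    set: each flip chooses ternary digit 0 (left third) or 2 (right third)."""
    x = 0.0
    scale = 1.0
    for f in flips:
        scale /= 3.0
        x += (2 if f else 0) * scale
    return x

print(coin_flips_to_cantor_point([0] * 20))  # all heads: exactly 0.0
print(coin_flips_to_cantor_point([1] * 20))  # all tails: just under 1
```

Under this map, fair coin flips induce the natural (Haar) measure on the dust, which is the sense in which a normalized Cantor set behaves like a universal probability space.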




> It can't apply to the universe as a whole because there's no proof the Second Law actually applies


You're making this too complicated.

Start here: selection from a set requires work.




> and it def doesn't apply to time symmetric quantum processes.


Describe how an observation is time symmetric.

Short answer: it isn't - and if your equations say it is, they must be wrong.

----------


## nonsqtr

> no no no it isn't.  I explained previously.
> 
> If the universe is flat and infinite (in fact, if it's infinite in any geometry), that brings into question the validity of the Second Law of Thermodynamics, which is the bedrock of entropy. If the Second Law doesn't apply, then entropy can only apply on a medium macro scale. 
> 
> How does Greene account for the information lost into black holes if the amount of matter/energy/information, and thus entropy, in the Universe isn't fixed by its geometry?


WTF are you on about?

You're drawing conclusions about the nature of the universe from someone's model of an "ideal" gas? (Which we know in advance doesn't even exist?)

In Boltzmann's model, the only reason "entropy must increase" is the system trends to thermodynamic equilibrium.

If all the processes in a thermodynamic system are reversible, THE ENTROPY IS CONSTANT.
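To make the "space has grown" point concrete, here's a toy calculation (my sketch) using Boltzmann's S = k ln W. For N ideal particles, the number of spatial configurations scales like V^N, so doubling the volume multiplies W by 2^N and adds N·k·ln 2 of entropy, regardless of which configuration the balls happen to be in:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """Boltzmann's configuration entropy, S = k ln W."""
    return K_B * math.log(W)

# Doubling the volume: W goes from W0 to W0 * 2**N, so the entropy
# change is independent of W0 and depends only on the growth of the space.
N = 100
dS = boltzmann_entropy(2.0 ** N) - boltzmann_entropy(1.0)
print(dS)  # N * K_B * ln 2, about 9.57e-22 J/K
```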

----------


## nonsqtr

> WTF are you on about?
> 
> You're drawing conclusions about the nature of the universe from someone's model of an "ideal" gas? (Which we know in advance doesn't even exist?)


Entropy is not thermodynamics, entropy is information. The only linkage between thermodynamics and information is W, the "number of available states". And for Boltzmann's model to work: states are required to be consistent (meaning any two observers see the same entropy), and states are required to be disjoint (redundancy equates with mutual information, since factors other than combinatorics likely determine the symmetries).

Study dusts. They solve a lot of these problems.

----------


## nonsqtr

> Yes it does. Entropy is the pointer to the direction of the arrow of time, because at quantum levels everything is time symmetric.  You cannot determine the direction of time by observing quantum processes.
> 
> This implies entropy is a phenomenon of a medium macro scale, the sort of human brain size range.
> 
> It can't apply to the universe as a whole because there's no proof the Second Law actually applies
> 
> and it def doesn't apply to time symmetric quantum processes.


I suggest a simpler explanation.

The universe is expanding.

Therefore entropy increases.

Just like the billiard ball example - the space expands.

----------


## Fall River

> I suggest a simpler explanation.
> 
> The universe is expanding.
> 
> Therefore entropy increases.
> 
> Just like the billiard ball example - the space expands.


And the universe is expanding because of antigravity, which started with the "big bang".  And the speed of expansion is increasing rather than decreasing as first expected.   :Smiley20:

----------


## Fall River

> How does Greene account for the  information lost into black holes if the amount of matter/energy/information, and thus entropy,  in the Universe isnt fixed by its geometry?



Not exactly sure how to answer that question. But I just read that he envisions all galaxies eventually being consumed by the black holes at their center.  That's how the universe goes dark.

----------


## nonsqtr

> And the universe is expanding because of antigravity, which started with the "big bang".  And the speed of expansion is increasing rather than decreasing as first expected.


Look, um... at the risk of being an arrogant prick (pardon the vernacular) and going up against the status quo and hundreds of eminent physicists - here's a point I feel obligated to make, in this discussion.

Physical phenomena are FUNDAMENTALLY random. It's not that we draw upon a probability distribution whenever we need to explain some combinatorics - the things physicists call "observables" are fundamentally RANDOM variables. So when you see an equation like p = mv it should immediately arouse your suspicion. Same for S = k ln W, because W is fundamentally a RANDOM variable. In real life W is random, because the entity in question is a "cloud".

Just like in quantum mechanics - you really can't say that anything ever has a "definite" position, momentum, or even size or shape. Because the quantum entity is a "cloud". In my opinion it's not a particle, it's not even a wave - it's a DUST. Which has both particle like and wave like properties, looks like a cloud, and is NATIVELY random and procedural. 

This construction explains many, many things. Like why all possibilities seem to be on the table in quantum-land. And why randomness exists and what it is exactly. And why the universe is expanding - 

Physicists need to reformulate things like Maxwell's equations at the stochastic level. Even though we may not "see" randomness there, we will if we look at a fine enough level. So these dusts and things, they are non-obvious solutions of the same physical equations, "when" they are formulated stochastically.

You see, a dust, for example a Cantor dust, is a perfect set and a compact topological space with measure. It maps bijectively onto the interval [0,1] (which coincidentally maps directly to a probability distribution), and you can do math on it just like you would do on the real number line. Its cardinality is the same - the only thing that's different is its Hausdorff dimension (which is fractional, because it doesn't fill "all" the available space).

But this is one of those Flatland things - if we were living "in" a dust there's no way we could tell, and the unfilled space would "look like" extra dimensionality to us.
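The fractional dimension mentioned above can be computed for self-similar dusts from the standard similarity-dimension formula D = ln N / ln(1/r), for N copies each scaled by ratio r (my sketch, quoting a textbook formula):

```python
import math

def similarity_dimension(copies, ratio):
    """Similarity (Hausdorff) dimension of a self-similar set made of
    `copies` pieces, each scaled down by `ratio`: D = ln N / ln(1/r)."""
    return math.log(copies) / math.log(1.0 / ratio)

# Ternary Cantor dust: 2 copies, each 1/3 the size.
print(similarity_dimension(2, 1 / 3))  # ln 2 / ln 3, about 0.6309
```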

----------


## Fall River

Thanks, I'll give that some thought.   :Thinking:

----------


## UKSmartypants

> And the universe is expanding because of antigravity, which started with the "big bang".  And the speed of expansion is increasing rather than decreasing as first expected.



Not antigravity, negative gravity, not the same thing.


Or alleged Dark energy


In fact, Teleparallel Gravity solves the problem.  https://arxiv.org/abs/gr-qc/0011087

----------


## nonsqtr

> Not exactly sure how to answer that question. But I just read that he envisions all galaxies eventually being consumed by the black holes at their center.  That's how the universe goes dark.


How about the creative part? Why is the universe expanding "everywhere"? I don't think that's a destructive process, rather I think it's quite the other way around.

Dusts - the simple formula for a dust is what the computer scientists call a "recursive" algorithm. It means you do the same simple thing over and over again. For example, in the best known version of a dust which is the ternary Cantor set, the algorithm is, "chop it into 3 parts and discard the middle third". So, everything you see, you do that same simple thing to. Recursion is well known in AI (symbolic and otherwise), and the subspace amplification being studied in ANN's is also a form of recursion.

So imagine a process that operates so fast it's at Planck scale. 10^-42 seconds per iteration or some such thing. The fastest scale I know of that we can currently see is in the fractions of a femtosecond which is 10^-15, and for comparison the time it takes for an electron to complete a circular orbit is in the nanoseconds which is 10^-9.

So we have almost THIRTY orders of magnitude within which to recursively create our dust using a simple branching random walk, which is nothing more than a coin flip. The algorithm used to create the dust is algebraically accessible - it's part of what they call a "modulo group" - so for instance in the ternary example where we divide things into thirds, we get sequences where the digit '2' has a specific meaning, stuff like that. It also has a relationship with which numbers are prime, and it has something to say about the existence of universal magic numbers like pi and e. (For instance, the series definition of pi discovered by Madhava of Sangamagrama in the 14th century is also the Dirichlet L-series underlying the beta function at b(1).)

So now, try to wrap your mind around the meaning of "information" in this context. At 30 orders of magnitude the Law of Large Numbers has long since come into play, and everything seems Gaussian and we have no way of determining what's underneath. Even though our construction may eventually result in "discrete" points, there are so many of them that for all practical purposes things seem continuous and we also have near-perfect wave solutions.

So now here we are at the macro level, 30 orders of magnitude up, and we see a blip and we call it a "quantum fluctuation" and we say it happens "randomly". Well, the interesting thing about dusts is they're bassackwards. Most of us think we can only build an interval from points. But the dust STARTS with the interval, and RESULTS in points. There are so many bizarre and mind-boggling things about dusts it's hard to list them all, but if we're talking about entropy, I submit for your consideration that DUSTS are the way to bridge the Renyi entropy with the von Neumann entropy. Dusts are naturals in Renyi-land, and we just have to translate them.
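The "branching random walk is nothing more than a coin flip" idea can be sketched as a stochastic dyadic dust: halve every surviving interval and let each half survive an independent coin flip. This is my toy illustration; the survival probability `p` and the seed are arbitrary choices, not anything from the thread:

```python
import random

def stochastic_dyadic_cantor(depth, p=0.75, seed=42):
    """Sketch of a stochastic dyadic Cantor set: at each level, every
    surviving interval is halved, and each half independently survives
    a coin flip with probability p."""
    rng = random.Random(seed)
    intervals = [(0.0, 1.0)]
    for _ in range(depth):
        nxt = []
        for a, b in intervals:
            mid = (a + b) / 2.0
            for half in ((a, mid), (mid, b)):
                if rng.random() < p:
                    nxt.append(half)
        intervals = nxt
    return intervals

ivals = stochastic_dyadic_cantor(8)
print(len(ivals), sum(b - a for a, b in ivals))
```

With p = 1 this degenerates to the full interval (every piece survives); for p < 1 the surviving measure shrinks on average by a factor p per level, which is the stochastic analogue of discarding the middle third.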

----------


## nonsqtr

Note to @UKSmartypants : this scenario can be generated geometrically using a "cusp catastrophe".

----------


## UKSmartypants

I fail to see how entropy can increase when the fabric of spacetime is expanding at 3.5c. What is happening is the vacuum energy density of the universe is dropping like a stone, (if the amount  of mass/energy/information is fixed) so the average entropy per  Planck volume cannot possibly be increasing faster than the rate the average vacuum energy density is falling.

IMHO the final fate of the universe will not be max entropy, but when the energy density of the universe drops below the point the underlying twistor network can sustain, and spacetime simply vanishes because the 4 uncompactified dimensions we live in curl back up.

----------


## nonsqtr

> I fail to see how entropy can increase when the fabric of spacetime is expanding at 3.5c. What is happening is the vacuum energy density of the universe is dropping like a stone, (if the amount  of mass/energy/information is fixed) so the average entropy per  Planck volume cannot possibly be increasing faster than the rate the average vacuum energy density is falling.
> 
> IMHO the final fate of the universe will not be max entropy, but when the energy density of the universe drops below the point the underlying twistor network can sustain, and spacetime simply vanishes because the 4 uncompactified dimensions we live in curl back up.


Well, let's see: the wavelength of the microwave background radiation is gradually increasing. Presumably that's "because" the universe is expanding, and the energy density conforms.

According to the scuttlebutt, that radiation represents a temperature of 2.725 K or whatever it is - which is "sufficient" to support quantum fluctuations in otherwise empty space. 

What is your view of the "entropy" in this situation?

----------


## nonsqtr

> I fail to see how entropy can increase when the fabric of spacetime is expanding at 3.5c. What is happening is the vacuum energy density of the universe is dropping like a stone, (if the amount  of mass/energy/information is fixed) so the average entropy per  Planck volume cannot possibly be increasing faster than the rate the average vacuum energy density is falling.
> 
> IMHO the final fate of the universe will not be max entropy, but when the energy density of the universe drops below the point the underlying twistor network can sustain, and spacetime simply vanishes because the 4 uncompactified dimensions we live in curl back up.


What makes sense from the Renyi standpoint is we can talk about information that is transferable and information that is not transferable.

This is why I asked about CMB, to distinguish it from ZPE. So for example the average kinetic energy from CMB is transferable, whereas that from ZPE is not. 

One of the interesting things about the field formulations (QED, QCD) is we get exclusionary relationships between the values and their first derivatives. This is the same thing we see in the brain ("for no apparent reason", to most biologists).

If there is "non transferable information", how does that play relative to the concept of entropy? It's "invisible information", hidden to the observer - does it still count?

----------


## nonsqtr

Here's another good one - explain how entropy behaves in the Unruh effect.

----------


## Fall River

> Look, um... at the risk of being an arrogant prick (pardon the vernacular) and going up against the status quo and hundreds of eminent physicists - here's a point I feel obligated to make, in this discussion.
> 
> Physical phenomena are FUNDAMENTALLY random. It's not that we draw upon a probability distribution whenever we need to explain some combinatorics - the things physicists call "observables" are fundamentally RANDOM variables. So when you see an equation like p = mv it should immediately arouse your suspicion. Same for S = k ln W, because W is fundamentally a RANDOM variable. In real life W is random, because the entity in question is a "cloud".
> 
> Just like in quantum mechanics - you really can't say that anything ever has a "definite" position, momentum, or even size or shape. Because the quantum entity is a "cloud". In my opinion it's not a particle, it's not even a wave - it's a DUST. Which has both particle like and wave like properties, looks like a cloud, and is NATIVELY random and procedural. 
> 
> This construction explains many, many things. Like why all possibilities seem to be on the table in quantum-land. And why randomness exists and what it is exactly. And why the universe is expanding - 
> 
> Physicists need to reformulate things like Maxwell's equations at the stochastic level. Even though we may not "see" randomness there, we will if we look at a fine enough level. So these dusts and things, they are non-obvious solutions of the same physical equations, "when" they are formulated stochastically.
> ...



Okay, I gave it some thought and here's my answer:  It sounds like Brian Greene's theory, for how the universe ends, has as good a chance of being correct as any other theory.  If he's in agreement with hundreds of eminent physicists, I don't see how that can be a drawback.  Those who are in the minority will have to work harder to prove otherwise.

----------


## nonsqtr

> Okay, I gave it some thought and here's my answer:  It sounds like Brian Greene's theory, for how the universe ends, has as good a chance of being correct as any other theory.  If he's in agreement with hundreds of eminent physicists, I don't see how that can be a drawback.  Those who are in the minority will have to work harder to prove otherwise.


I'm not so concerned with WHETHER (and/or how) the universe ends. The generative process is much more interesting.

We could, for example, consider the concept of the "big bang" a little more abstractly - as an instability.

One could postulate that "instability" equates with a quantum fluctuation, but what's to say that this instability might not be carried into every part of the space that was created?

All these theories of cosmology are very LINEAR, and if we're dealing with instability we're "most likely" dealing with nonlinearity - and if that's the case we have to look at "nonlinear" thermodynamics where new phases of matter get created through long range (but still entropic) interactions.

The problem with the "prevailing" cosmology is it's based on the Standard Model, and the Standard Model doesn't work! Ask Smarty, he'll tell you. If we're being polite we say it's "incomplete", behind closed doors we say it's missing half the picture.

Dusts really do solve a lot of the problems. The idea is that the quantum cloud varies from one moment to the next "because" there is an underlying creative process. There is no requirement that the points in a dust remain stable, and we know very little about the stability of the generative processes (most of it is still based on the original set-theoretic math of Georg Cantor; it's the basis of modern topology, but stable manifolds are hard enough, and most physicists never get as far as the nonlinearities).

----------

Fall River (10-26-2021)

----------


## nonsqtr

> Here's another good one - explain how entropy behaves in the Unruh effect.


Anyone?

The Unruh effect says, an accelerating observer will see a heat bath even when a static observer sees empty space.

So, what happened here? The number of available states magically increases with acceleration?
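For reference, the textbook Unruh temperature for an observer with proper acceleration a is (standard result, quoted here from memory):

```latex
% Unruh temperature: an accelerating observer sees a thermal bath
% even where an inertial observer sees vacuum.
T_{\mathrm{U}} = \frac{\hbar\, a}{2 \pi c\, k_B}
```

The effect is tiny at everyday accelerations - at a = 9.8 m/s² this works out to something like 4×10⁻²⁰ K - but the puzzle stands: the "number of available states" is observer-dependent.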

----------


## nonsqtr

The problem with entropy is it's not a physical quantity, it's a statistic.

Look at the definition: -Σ p ln p. Probability, which depends on observation.

The problem is not that the probabilities don't exist, the problem is we can't measure them! If you have 100 ideal gas particles there are something like 10^50 equivalent equilibria.

The further problem with entropy is it's an idealization. It assumes there is no sharing of information. In quantum land we have entanglement entropy, which is basically the idea that information disappears into the mixture. A pure ground state has zero entropy (up to a constant), but a mixed state has non-zero entropy.

In real life, any mixture results in the sharing of information, and any nonlinearity results in impurity. In theory the ground state of a perfect crystal has 0 entropy, but there's no such thing in real life. In theory a vacuum has 0 entropy but there's no such thing in real life, instead there is Zero Point Energy and the Casimir effect.

In thermodynamics we can look at "dissipative" systems which are far from equilibrium and exchange energy with the environment. Practically all such systems are highly nonlinear and exhibit long range entropic interactions. Multiple Nobel prizes have been issued for the work underlying non-equilibrium thermodynamics.

https://en.m.wikipedia.org/wiki/Dissipative_system

In this link, pay particular attention to the section called "quantum dissipative systems". It describes why time is fundamentally irreversible even though the quantum equations are time-independent.
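The pure-versus-mixed point above can be checked numerically. A minimal sketch (mine) of the von Neumann entropy S(ρ) = -Tr(ρ ln ρ), computed from the eigenvalues of the density matrix:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros; 0 * ln 0 -> 0
    return float(-np.sum(evals * np.log(evals)))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state |0><0|
mixed = np.eye(2) / 2.0                     # maximally mixed qubit

print(von_neumann_entropy(pure))   # zero for a pure state
print(von_neumann_entropy(mixed))  # ln 2 for the maximally mixed state
```

The mixed state carries ln 2 of entropy not because anything moved, but because information "disappeared into the mixture" - exactly the entanglement-entropy picture described above.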

----------


## Physics Hunter

> Look, um... at the risk of being an arrogant prick (pardon the vernacular) and going up against the status quo and hundreds of eminent physicists - here's a point I feel obligated to make, in this discussion.
> 
> Physical phenomena are FUNDAMENTALLY random. It's not that we draw upon a probability distribution whenever we need to explain some combinatorics - the things physicists call "observables" are fundamentally RANDOM variables. So when you see an equation like p = mv it should immediately arouse your suspicion. Same for S = k ln W, because W is fundamentally a RANDOM variable. In real life W is random, because the entity in question is a "cloud".
> 
> Just like in quantum mechanics - you really can't say that anything ever has a "definite" position, momentum, or even size or shape. Because the quantum entity is a "cloud". In my opinion it's not a particle, it's not even a wave - it's a DUST. Which has both particle like and wave like properties, looks like a cloud, and is NATIVELY random and procedural. 
> 
> This construction explains many, many things. Like why all possibilities seem to be on the table in quantum-land. And why randomness exists and what it is exactly. And why the universe is expanding - 
> 
> Physicists need to reformulate things like Maxwell's equations at the stochastic level. Even though we may not "see" randomness there, we will if we look at a fine enough level. So these dusts and things, they are non-obvious solutions of the same physical equations, "when" they are formulated stochastically.
> ...



You are living in your own private Idaho.

You are intelligent, but you are pushing a direction versus looking for the truth.

----------


## nonsqtr

The real problem with entropy is it depends on "thing"-ness. How many states does the "thing" have? Well... depends how you define the "thing", right?

For example, let's say according to Smarty we have an elementary particle (a 'thing') with charge and spin. The charge and spin are properties of the "thing", we can't separate them from the thing, we can't isolate them without destroying the thing.

But the definition of entropy changes when you put a bunch of things in a box. Now entropy is a property of the "system", more so than the things that make up the system. At this level we no longer care about the states of individual "things", we only care how many are in which state.

Abstractly then, information is a partition. If you take any space and divide it in two, you've created information. If you take any two spaces and combine them, you've destroyed information. Any symmetry is equivalent to a partition, therefore it's inherently associated with information. Any self similarity is inherently a partition therefore it is associated with information. Every distribution is a partition therefore it is associated with information. A bit is a partition, it has "two" states.
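The "partition creates information" claim can be quantified with Shannon entropy. A small sketch (my illustration, weighting each block of the partition by its size):

```python
import math

def partition_entropy(sizes):
    """Shannon entropy (in bits) of a partition into blocks of the given
    sizes, each block weighted by its share of the total."""
    total = sum(sizes)
    return sum(-(s / total) * math.log2(s / total) for s in sizes if s)

print(partition_entropy([1, 1]))        # split a space in two: 1.0 bit
print(partition_entropy([2]))           # no split at all: 0.0 bits
print(partition_entropy([1, 1, 1, 1]))  # four equal blocks: 2.0 bits
```

Dividing a space in two creates one bit; merging the two blocks back destroys it, which is the sense in which a bit "is" a partition.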

----------


## nonsqtr

> You are living in your own private Idaho.
> 
> You are intelligent, but you are pushing a direction versus looking for the truth.


No. I'm making sense of the parts of the miserable Standard Model that make no sense.

----------


## Physics Hunter

> Entropy is meaningless.
> 
> The people who try to define entropy locally are WRONG, provably so.
> 
> Entropy is a NON-LOCAL measure, not a physical observable.


You are conflating informatics with something more basic like heat exchange.

Drive that Prius up the mountain, and then down, you have less energy in the batteries and gas tank than you did when you started.  Any reasonably intelligent 18 year old knows these things.

----------


## nonsqtr

> You are living in your own private Idaho.
> 
> You are intelligent, but you are pushing a direction versus looking for the truth.


Answer Post #72.

Betcha can't, betcha can't.  :Tongue20: 

Nyah nyah, ha ha,  :Tongue20: 

Physics, shmysics.  :Tongue20:   :Tongue20:   :Tongue20: 

 :Smile: 

(-wink-)

----------


## nonsqtr

> You are conflating informatics with something more basic like heat exchange.
> 
> Drive that Prius up the mountain, and then down, you have less energy in the batteries and gas tank than you did when you started.  Any reasonably intelligent 18 year old knows these things.


Hey, either thermodynamics works or it doesn't.

Either dS = dQ/T or it doesn't. 

Take your pick.  :Smile:

----------


## nonsqtr

> You are living in your own private Idaho.
> 
> You are intelligent, but you are pushing a direction versus looking for the truth.


I'm pointing out that the Standard Model DOES NOT WORK !!!

It has more holes in it than Swiss cheese!

It's entirely circular math; it proves its own assumptions.

----------


## Physics Hunter

> Answer Post #72.
> 
> Betcha can't, betcha can't. 
> 
> Nyah nyah, ha ha, 
> 
> Physics, shmysics.   
> 
> 
> ...


Right after you answer my Prius question.  Thermodynamics is measurable.

Gravity is the same way.  Newtonian gravity formulations work just fine for short to medium range projectile calculations.

The extension to informatics is a metaphor having no purchase on the classic, and extremely useful energy/heat formulation.

----------


## nonsqtr

> You are living in your own private Idaho.
> 
> You are intelligent, but you are pushing a direction versus looking for the truth.


What's the "truth"?

Geometry is just a description, it's not the truth.

The Standard Model is based on STATISTICS, it can't possibly be the truth!

So what if there are automorphisms? BFD! It tells us absolutely nothing about underlying mechanisms.

A non-mechanistic description of reality is entirely insufficient.

A dust meets ALL of the requirements of modern statistical thermodynamics. It explains the clouds and the distributions, two things which geometry is entirely incapable of even addressing.

----------


## nonsqtr

> Right after you answer my Prius question.  Thermodynamics is measurable.
> 
> Gravity is the same way.  Newtonian gravity formulations work just fine for short to medium range projectile calculations.
> 
> The extension to informatics is a metaphor having no purchase on the classic and extremely useful energy/heat formulation.


In other words, you're admitting your model doesn't work.

Well, thank you. That's a start.

----------


## nonsqtr

> Right after you answer my Prius question.  Thermodynamics is measurable.


So what? Coin tosses are measurable too.




> Gravity is the same way.  Newtonian gravity formulations work just fine for short to medium range projectile calculations.


A model that only works some of the time under restricted conditions? No good.




> The extension to informatics is a metaphor having no purchase on the classic and extremely useful energy/heat formulation.


Boltzmann claimed otherwise.

----------


## nonsqtr

I'm not the only one thinking about this.

For instance -

https://www.researchgate.net/publica...superdiffusion

https://www.sciencedirect.com/scienc...7596011830135X



Google "stochastic dyadic Cantor set". There's quite a bit of investigation going on right now.

----------


## nonsqtr

And, I will also point out, that BIOLOGISTS first noticed, described, and studied Brownian motion. Physicists EVEN NOW don't know what it means.

Brownian motion was first described by the Roman philosopher/poet Lucretius in 60 BC. Fast forward to Stiles and Gleichen who studied pollen, and another while to the botanist Robert Brown.

It has been the biologists all along who promoted the importance and fundamental math and physics of fractional processes and particularly stochastic fractals. These processes are UBIQUITOUS in nature, they occur anywhere and everywhere.

I mean look, let's be real. Based on what we know at this point, attempting to formulate physics "without" random processes is boneheaded stupid. EVERYTHING in nature is fundamentally random, there's nothing that isn't.

----------


## nonsqtr

> You are conflating informatics with something more basic like heat exchange.
> 
> Drive that Prius up the mountain, and then down, you have less energy in the batteries and gas tank than you did when you started.  Any reasonably intelligent 18 year old knows these things.


Here's a trick:

In simulation we usually require the time step dt to be smaller than the smallest time constant in the system, so that our Monte Carlo runs converge.

However there is a different and much more clever approach:

Convert the difference equation to a RECURSIVE algebra.

This makes t merely a label, and removes the dependency of the solution on the time step.

It ALSO addresses (and eliminates) the difference between the Ito and Stratonovich interpretations of stochastic integrals.

Mathematicians are well on their way to showing that any time difference equation can be converted into a recursive algorithm.

And recursion is a very short step from the dust-making algorithm we were talking about earlier.
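To make that concrete, here's a minimal Python sketch of the idea (my own toy example, and the parameter names theta, mu, sigma are my own choices): the Ornstein-Uhlenbeck process has an exact recursive update, so the "time step" becomes merely a label on the sequence rather than a source of discretization error.

```python
import math
import random


def ou_exact_step(x, dt, theta=1.0, mu=0.0, sigma=0.5, rng=random):
    """One exact update of an Ornstein-Uhlenbeck process.

    Unlike a plain Euler step, this recursion is exact for ANY dt:
    the step size is just a label on the sequence, not a source of error.
    """
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta))
    return mu + a * (x - mu) + sd * rng.gauss(0.0, 1.0)


def simulate(x0, t_end, n_steps, seed=42):
    """Run the exact recursion from x0 to time t_end in n_steps labels."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        x = ou_exact_step(x, dt, rng=rng)
    return x
```

Run it with a coarse dt or a fine dt and the endpoint statistics come out the same, which is the whole point.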

----------


## Fall River

> I'm not so concerned with WHETHER (and/or how) the universe ends. The generative process is much more interesting.
> 
> We could, for example, consider the concept of the "big bang" a little more abstractly - as an instability.
> 
> One could postulate that "instability" equates with a quantum fluctuation, but what's to say that this instability might not be carried into every part of the space that was created?
> 
> All these theories of cosmology are very LINEAR, and if we're dealing with instability we're "most likely" dealing with nonlinearity - and if that's the case we have to look at "nonlinear" thermodynamics where new phases of matter get created through long range (but still entropic) interactions.
> 
> The problem with the "prevailing" cosmology is it's based on the Standard Model, and the Standard Model doesn't work! Ask Smarty, he'll tell you. If we're being polite we say it's "incomplete", behind closed doors we say it's missing half the picture.
> ...



I vaguely remember Greene had something to say about "dust".  This evening I'll take a look at the index and see what I can find.  It could be he took it into account.

----------


## nonsqtr

> I vaguely remember Greene had something to say about "dust".  This evening I'll take a look at the index and see what I can find.  It could be he took it into account.


Lookie: https://www.researchgate.net/publica...astrophe_Model

Here's kinda my rough-outline-of-a-model:

Geometry alone can generate what amounts to a "branching random walk", as long as the energy driving it is somehow replenished (so it's "ongoing").

I'm going to "ass"ume that the quantum cloud actually does fluctuate from one instant to the next, and more specifically, the claim is the "space" it actually occupies is different from one moment to the next.

A quantum walk is kinda-sorta what we're talking about, but it's more specific than that. We can say a lot about the walk just off the cuff. What's apparent is that it must be stable at the macroscopic level and unstable at the microscopic level, suggesting a dissipative structure.

One can envision the generator as a fountain, spewing out randomly directed bits of water. The droplets will appear as a "cloud" from afar, they will rise for a while, when they reach the perimeter they will be "seen" because of reflections from sunlight and such, and then eventually they'll fall back into the fountain. 

This is not "that" outlandish of an idea. The same thing basically happens when you replace the continuous space in the Schrodinger equation with a discrete set, then you get a continuous time random walk on a graph. A discrete time walk is a combination of a stochastic event and a conditional shift, applied repeatedly (recursively). And because of the (possibly) asymmetrical partitioning of information, the probability distribution for this walk is distinctly non-Gaussian. Not only that but it depends on the initial conditions (it has memory). All this arises from a so-called linear system.
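Since I can't draw the distribution here, a small self-contained Python sketch (taking the "coin" step to be the standard unitary Hadamard coin - an assumption on my part, since above I called it a stochastic event): repeated coin-then-shift on the integers, and the resulting distribution is distinctly non-Gaussian and depends on the initial coin state.

```python
import math


def hadamard_walk(steps, coin=(1.0, 0.0)):
    """Discrete-time quantum walk on the integers with a Hadamard coin.

    state[x] = (aL, aR): complex amplitudes for the two coin states at site x.
    Each step is 'coin rotation, then conditional shift', applied recursively.
    """
    s = 1.0 / math.sqrt(2.0)
    state = {0: (complex(coin[0]), complex(coin[1]))}
    for _ in range(steps):
        nxt = {}
        for x, (aL, aR) in state.items():
            bL, bR = s * (aL + aR), s * (aL - aR)  # Hadamard coin flip
            pL = nxt.get(x - 1, (0j, 0j))
            nxt[x - 1] = (pL[0] + bL, pL[1])       # 'left' amplitude shifts left
            pR = nxt.get(x + 1, (0j, 0j))
            nxt[x + 1] = (pR[0], pR[1] + bR)       # 'right' amplitude shifts right
        state = nxt
    # probability of finding the walker at each site
    return {x: abs(aL) ** 2 + abs(aR) ** 2 for x, (aL, aR) in state.items()}
```

With the coin started in (1, 0) the walk drifts to one side and spreads ballistically; with the balanced coin (1, i)/sqrt(2) it's exactly symmetric - same recursion, different memory of the initial condition.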

----------


## Physics Hunter

Nyquist.

And you are inventing the past.  I have been an expert LISP programmer since the 80's.

----------


## Authentic

Nyquist? He won the 2016 Kentucky Derby.

----------


## nonsqtr

> Nyquist.
> 
> And you are inventing the past.  I have been an expert LISP programmer since the 80's.


Lots of idiotic single parentheses.  :Grin: 

But since you mentioned Nyquist, we could go off on a whole other tangent with that.

But first let's make another observation about the human brain, relative to intelligence.

If you consider the motor and sensory processing along a timeline time-locked to "now" (which makes it a moving window whose "velocity" is exactly coincident with physical time), then FEELINGS are organized in the frontal part of the timeline. Everything that has to do with homeostasis and the state of the organism, is up front.

Furthermore - all of the attentional control mechanisms are encoded in terms of "meaning to the organism", which primarily relates to homeostasis, and "goals". For example - the connection between the hippocampus and the anterior cingulate cortex. The hippocampus is on the very back end of the sensory timeline; it is involved in object and event mapping and short term memory thereof. However the ACC is up front, anterior to "now" along the timeline. And there, neurons encode "meaning", whereas in the hippocampus they mainly encode sensory configurations.

"Goals", in evolution, are an advanced form of homeostasis. As distinct from evolving from an advancement of sensory/logical configurations.

The organization of homeostatic brain systems in "timeline" form is a fascinating study. For instance we are forced to consider things like the hormonal control of hunger in the arcuate nucleus, which then makes us study the organization of the lateral and medial hypothalamus.

These things are not necessarily "obvious" when considering the logic of ANN's, but are very important when we're using the word "intelligence" relative to human beings.

----------


## nonsqtr

> Nyquist.
> 
> And you are inventing the past.  I have been an expert LISP programmer since the 80's.


So, exactly. Nyquist. But... Nyquist "relative to Heisenberg". Ha ha. There is definitely a relationship. It's not as obvious as it looks, though.

----------


## nonsqtr

> Nyquist? He won the 2016 Kentucky Derby.


It was a random event.

----------

Authentic (10-27-2021)

----------


## nonsqtr

So, in "my" model, the quantum has gazillions of microstates, just like in the geometric model. (There's actually no difference between the two models, they can be reconciled, they're two different views to the same thing).

I mean, this seems kinda intuitive and obvious to me, there's two different definitions of "thingness" going on in the Standard Model. We say a "thing" has a property like spin, or charge - but then we also say the "thing" has an infinity of microstates, which implies there are things underneath the thing. Which takes us into QED and such, where those constituents are explained to be waves and universal fields. But that does NOT explain the compartmentation of information!

This is where programmers can be a little thick-headed sometimes. We think in terms of "bits", which are unitary entities. Look around - half of quantum research is still trying to make analog systems conform to digital thinking.

But, in reality, ALL information is defined in mutual terms. There's no such thing as a "bit", it doesn't exist, it's just a theoretical abstraction. The very definition of probability entails knowledge of the whole, and it can't be separated.

This is why the dissipative structures are so important. A cusp catastrophe is the universal unfolding of the singularity of y = x ^ 4, and fourth order equations are quite common in nature.

And, there are conservation laws associated with stochastic processes, there are stochastic versions of Noether's theorem and so on.

A "bit" is kind of an idealized "perfect information", but the reality is it takes lots and lots of energy to maintain it.
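The "mutual" point can be made precise with Shannon's quantities. A minimal sketch (standard textbook formulas, nothing exotic): the information shared between two variables is defined only through their joint distribution - you need knowledge of the whole table, not the parts.

```python
import math


def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution (list of probabilities)."""
    return -sum(q * math.log2(q) for q in p if q > 0)


def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint probability table joint[x][y].

    The marginals alone are NOT enough - the mutual information lives in the
    joint table, i.e. information is defined in mutual terms.
    """
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    pxy = [q for row in joint for q in row]
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy)
```

Two independent fair coins share 0 bits; two perfectly correlated coins share exactly 1 bit, even though the marginals are identical in both cases.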

----------


## nonsqtr

> Nyquist.
> 
> And you are inventing the past.  I have been an expert LISP programmer since the 80's.


Okay, so here's the challenge back to you:

The ground state of a perfect crystal is defined to have zero entropy.

Go look at a perfect crystal, and tell me there's no information in it.

That concept is completely nonsensical, would you agree?

Therefore - it should be quite obvious that "configuration entropy" is not all there is.

----------


## nonsqtr

AND FURTHERMORE (ha ha - I could go on forever but I won't, the point should be made by now) - 

There is information "in" the probability distribution long before there is an outcome.

This business of basing all of known physics on "outcomes" is completely inadequate.

Sure, outcomes have geometries and must respect certain symmetries and so on - but so what?

All that tells us is, the relationship between the outcomes. It tells us nothing about the stuff creating the outcomes, which is what we really want to know about.

----------


## Fall River

> Lookie: https://www.researchgate.net/publica...astrophe_Model
> 
> Here's kinda my rough-outline-of-a-model:
> 
> Geometry alone can generate what amounts to a "branching random walk", as long as the energy driving it is somehow replenished (so it's "ongoing").
> 
> I'm going to "ass"ume that the quantum cloud actually does fluctuate from one instant to the next, and more specifically, the claim is the "space" it actually occupies is different from one moment to the next.
> 
> A quantum walk is kinda-sorta what we're talking about, but it's more specific than that. We can say a lot about the walk just off the cuff. What's apparent is that it must be stable at the macroscopic level and unstable at the microscopic level, suggesting a dissipative structure.
> ...


It seems Brian Greene wrote his book for non-physicists like me who are curious, so he tries to keep it as simple as possible.  Even at that, I had to read some sections multiple times.  At first I looked in the index for the word "dust" but couldn't find anything. Then I found the section where he talks about uncertainty, but instead of dust he uses the words "foggy, hazy, and fuzzy".  

Here's some of what he had to say: Inflationary cosmology takes quantum mechanics into account, in particular what's known as the "quantum mechanical uncertainty principle". Rather than complete certainty, a quantum physicist sees quantum fuzziness - uncertainty.

He also said quantum mechanical predictions have been confirmed by "a century of experiments".   

So, evidently, he sees the value in it and is not ignoring it.

----------


## UKSmartypants

> Well let's see, the wavelength of the microwave background radiation is gradually increasing. Presumably that's "because" the universe is expanding therefore the energy density conforms.
> 
> According to the scuttlebutt that radiation represents a temperature of 2.725 K or whatever it is - which is "sufficient" to support quantum fluctuations in otherwise empty space. 
> 
> What is your view to the 'entropy' in this situation?



Well, there could be several reasons the wavelength of the CMB is increasing, apart from spacetime stretching. Actually, I don't buy that at all, because with spacetime expanding at 3.5c the Doppler shift would be massive and noticeable. It's increasing because it's cooling, but that doesn't necessarily mean entropy is increasing; we could just be losing information somewhere. The cooling could easily be because the vacuum energy density is dropping. At some point it'll drop to near zero kelvin, insufficient to maintain the spinor networks. After all, you can't have a spinor with no information.

----------


## nonsqtr

> Well, there could be several reasons the wavelength of the CMB is increasing, apart from spacetime stretching. Actually, I don't buy that at all, because with spacetime expanding at 3.5c the Doppler shift would be massive and noticeable. It's increasing because it's cooling, but that doesn't necessarily mean entropy is increasing; we could just be losing information somewhere. The cooling could easily be because the vacuum energy density is dropping. At some point it'll drop to near zero kelvin, insufficient to maintain the spinor networks. After all, you can't have a spinor with no information.


Ta-da! Here, present for you: https://www.slac.stanford.edu/econf/...pers/art38.pdf

A "stochastic loop space" is exactly what we have in the brain, as previously discussed.

Nature re-uses that which is successful. 

And, think: we have quantum levels based on particle-in-a-box or some such thing - does anything ever behave that way in real life? No! The regular behavior is constantly disrupted by random events. There are partial quanta exchanged in entangled collisions, giving rise to "virtual photons" and other madness like that... The wave is constantly being distorted, knocked around by "events".

It seems to me, the easy way to explain all this is with a stochastic optimization process, operating at a time scale so rapid (Planck scale?) that everything we see seems stable and regular (Gaussian).

I'm a little familiar with spinors vis-a-vis quaternion neural networks. Like: https://www.sciencedirect.com/scienc...93608008000592

So, let's focus for a moment. A 'qubit' is supposed to generalize a bit. But really the relationship is in the dimensionality of the Hilbert space, a "state" being a vector. A bit has two states, a qubit has two "dimensions". In both cases you can derive probability distributions  for all manner of oddball and complex situations, but they have the same behavior as physical observables when you measure them - a narrow variance in one means a wide variance in another. This tells us that the outcome and the geometry are closely linked.
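A tiny sketch of that last point, in plain Python (basic two-level quantum mechanics, no library): a qubit state is just a 2-vector of complex amplitudes, and squeezing the variance in one measurement basis spreads it in the other.

```python
import math


def probs_z(state):
    """Measurement probabilities in the computational (Z) basis."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2


def probs_x(state):
    """Measurement probabilities in the X basis, |+/-> = (|0> +/- |1>)/sqrt(2)."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return abs(s * (a + b)) ** 2, abs(s * (a - b)) ** 2
```

The state (1, 0) is certain in Z but 50/50 in X; the state (1, 1)/sqrt(2) is exactly the reverse.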

----------


## Authentic

> It was a random event.


It was like, totally stochastic!

----------


## Call_me_Ishmael

> It was like, totally stochastic!


"Here's my apartment. It's a tubularly stochastic manifold in 4 space, dude. It never looks the same. Here. Take a drink from my Klein bottle. It's wine made from the tears of mathematicians."

----------


## UKSmartypants

> "Here's my apartment. It's a tubularly stochastic manifold in 4 space, dude. It never looks the same. Here. Take a drink from my Klein bottle. It's wine made from the tears of mathematicians."



Although of small floor area, we have more space here because we went for the 10D Calabi-Yau manifold apartment. We only use seven of the dimensions; the other three we use for storage and as a dog kennel....

----------

Authentic (10-27-2021)

----------


## nonsqtr

Okay - if you take a particle in a box, initially in the ground state, and divide the box into two unequal halves, if the partition is done sufficiently slowly the particle will ALWAYS end up in the larger half.

Not "sometimes with a probability" - always.

So, if you have a box constrained to certain sizes ("quantized") and it tries to grow because of some weirdo collision or geometric anomaly (but then collapses right back again because of the quantization), the particle is guaranteed to vacate the part that splits off.

If you go the other way and split the box really really fast, you get things like long range spinor alignment in Bose-Einstein condensates. The particle will "reach out of the box" so to speak, and start engaging in long range interactions.

----------


## Physics Hunter

> Okay, so here's the challenge back to you:
> 
> The ground state of a perfect crystal is defined to have zero entropy.
> 
> Go look at a perfect crystal, and tell me there's no information in it.
> 
> That concept is completely nonsensical, would you agree?
> 
> Therefore - it should be quite obvious, that "configuration entropy" is not all there is


I could argue either side of that, and win.

But you are wasting your and my brains with a lot of words that lead to nothing.

----------


## Northern Rivers

I'm finding plenty of entropy. Nuthin' works as good as it did, once upon a time.  :Dontknow:

----------

Fall River (10-28-2021),nonsqtr (10-28-2021),Oceander (10-28-2021)

----------


## nonsqtr

Okay. So let's talk about something completely different. Programmers will love this.

"L-Systems".

Where the L stands for Lindenmayer.

L-Systems are basically grammars, except instead of instantiating one rule at a time like an ordinary grammar, they apply every applicable rule in parallel in a single iteration.

An L-System, it turns out, is fundamentally recursive and can be used to generate self-similar fractal shapes. Read through the section called Examples, which includes the Cantor dust.

https://en.m.wikipedia.org/wiki/L-system

Now scroll to the section called "Stochastic Grammars". The person who wrote this section was apparently unaware of the research in this area. A grammar is an algebra, and a stochastic grammar is a stochastic algebra. 

Now, let's return to the quantum cloud. "Replenishment" is among the best studied stochastic processes (since it applies to supply chains, which most businesses want to know about). But replenishment also describes cells growing in the form of a fern leaf, or the dendrite tree of a neuron. The locus of replenishment is not necessarily the same as the locus of growth, for example there is growth at the tip and replenishment at the base. 

In real life all combinations exist. So, pick one, think about the growth of a fern leaf based on an internal CHEMICAL process that guides the structure into a central branch with the characteristic bilateral leaf arrangement. Every branch looks the same, yet no two branches are alike.
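A minimal L-System interpreter in Python (the rules below are the standard Cantor-set pair from the Wikipedia page linked above):

```python
def lsystem(axiom, rules, iterations):
    """Rewrite the whole string in parallel each iteration -
    the defining difference from an ordinary one-rule-at-a-time grammar."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(c, c) for c in s)
    return s


# Cantor set: A = draw a segment, B = leave a gap
cantor_rules = {"A": "ABA", "B": "BBB"}
```

Three iterations of "A" gives a string of length 3^3 = 27 containing 2^3 = 8 drawn segments - the self-similar Cantor dust.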

----------


## nonsqtr

> I'm finding plenty of entropy. Nuthin' works as good as it did, once upon a time.


lol - that's a different concept though, us musicians quaintly call it "maintenance".  :Smile:

----------


## nonsqtr

> I could argue either side of that, and win.
> 
> But you are wasting your and my brains with a lot of words that lead to nothing.


It's not nothing. It's the definition of information, which in turn defines entropy, which is the topic.

Physicists think in terms of "degrees of freedom" - for instance, we have our ordinary 3 dimensions and time but there is also "spin" and the quantum computing types love "spin", right? So we think in terms of the degrees of freedom that a particle (a "thing") natively has. Spin happens to be useful because it's binary, and it maps neatly into our digital world.

So "some" people would leave it at that, and say the particle has 5 degrees of freedom instead of 4, and kinda the question becomes "how many bits can you cram into a particle".

But that's not at all what's happening, right? There may be considerably more bits than are visible or measurable. Hilbert space is an elementary example, it looks simple enough, you have a unit circle that maps a state. Well, so, this construction maps directly into probability-land. Where you discover, some of the less obvious degrees of freedom. You can make the wave/particle behave the exact same way at the visible level, while radically altering the underlying probability distributions - and in quantum-land the analogy would be the many different ways to achieve the same mixed state from multiple pure states.

----------


## nonsqtr

When two pure quantum states mix, information "disappears into" the mix.

The reduced density matrix then describes the state and the von Neumann entropy of this matrix is a measure of the entanglement.

For a pure state, the entropy is zero. For a mixed state, it's non-zero.

So now, we observe that entanglement behaves the opposite way from what we expect. Normally, we would have to put energy in, to pull a mixed state apart and create a more ordered situation from it. But instead, the mixed state gives up information at the moment of collapse. If we are theorizing that energy equals information, this can only mean that the particle is drawing energy (information) from the observing system. Which leads us to "weak measurement" and all the rest.

At this level then, if we impart even one quantum into the wave, it changes state. This is reminiscent of the Josephson effect and things at that level, which they can now do "some" of at room temperature, instead of requiring supercooled semiconductors.
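For the record, that entanglement entropy is easy to compute by hand for two qubits. A sketch in plain Python (state vector ordered |00>, |01>, |10>, |11>; the 2x2 eigenvalues are done analytically so no linear-algebra library is needed):

```python
import math


def reduced_density(psi):
    """Trace out qubit B from a 2-qubit pure state psi = [a00, a01, a10, a11]."""
    a00, a01, a10, a11 = psi
    # rho_A[i][j] = sum_k psi[i,k] * conj(psi[j,k])
    return [
        [a00 * a00.conjugate() + a01 * a01.conjugate(),
         a00 * a10.conjugate() + a01 * a11.conjugate()],
        [a10 * a00.conjugate() + a11 * a01.conjugate(),
         a10 * a10.conjugate() + a11 * a11.conjugate()],
    ]


def von_neumann_entropy(rho):
    """Entropy in bits of a 2x2 density matrix, via its two eigenvalues."""
    tr = (rho[0][0] + rho[1][1]).real
    det = (rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]).real
    disc = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    eigs = [(tr + disc) / 2.0, (tr - disc) / 2.0]
    return -sum(l * math.log2(l) for l in eigs if l > 1e-12)
```

A Bell state gives exactly 1 bit of entanglement entropy; a product state gives 0, matching "for a pure state, the entropy is zero".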

----------


## nonsqtr

So now we can consider "how much" information disappears into the mixed state, and the general answer is: less than a quantum.

Perhaps this is why, when we impart "one" quantum into the system, by way of an observation, we disrupt the stability of the mixed state.

Reichenbach's principle says: if there is a correlation between two events A and B, then one of three things must be true: either

1. A caused B, or
2. B caused A, or
3. A third event caused both A and B

(3) is the case when we create spin pairs, with a prism or through black body radiation or any other way. The requirement is that EVENT C BE ABSOLUTELY SIMULTANEOUS IN ALL REFERENCE FRAMES, otherwise we have different observers disagreeing on the ordering of the results.

And, this is exactly what we see. Any influence behind disentanglement (decoherence) would have to propagate at least 10,000 times faster than light.

----------


## nonsqtr

So, this is the back door that has to do with INFORMATION, that the physicists haven't found yet.

Physics (geometry) has to account for spin pair creation (information creation) that is ABSOLUTELY SIMULTANEOUS IN ALL REFERENCE FRAMES.

So far, such a thing doesn't exist. As close as we get, is the observation of quantum entanglement.

Every piece of our existence - atoms - are highly entangled. There's hardly any such thing as a pure state in real life.

But one of the reasons I believe in time, is this requirement for absolute synchronization.

----------


## nonsqtr

So, if you're following my explanation, we now have two completely different definitions of "information".

The one case, involves sending a bit through a channel, so, you put the information someplace, then you move it, then you recover it.

The other case, involves creating an entangled pair which is essentially "both states of the bit at the same time", and instead of storing the information in the polarization state it goes into the delta in the entropy of the mixed state.

That these are DIFFERENT mechanisms should be self-evident: the first example is communication and is therefore limited by the speed of light, whereas the second one involves spacetime separated events that occur simultaneously in all reference frames, in seeming violation of relativity.

----------


## nonsqtr

The big tell from the quantum standpoint is that different entropies can be associated with the same state.

This is because there is more than one density matrix that can represent the same state.

This forces us to define entropy as the MINIMUM value of all these predicted values, across the space of equivalent density matrices.

I submit for your consideration, that this right here is a perfectly adequate and sufficient source of "quantum fluctuations".

----------


## UKSmartypants

> I'm finding plenty of entropy. Nuthin' works as good as it did, once upon a time.


That's not entropy, that's politics.

----------


## Fall River

Entropy definition - Google Search

----------


## nonsqtr

> Entropy definition - Google Search


Well so, the Renyi entropy generalizes both the Shannon entropy and the von Neumann entropy.

It's actually a "family" of entropies, and if you look at the formula you'll observe by inspection that it has a singularity at a=1.

Which kind of entropy we use in any given context is important. But, over-arching all of that, is the realization that information is a MUTUAL definition. The thermo people talk about the boundaries of the system, and the quantum people talk about the interaction between the wave and the measuring device.
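A quick numerical sketch of that family (the a=1 singularity is removable - the limit there is exactly the Shannon entropy, which is why it gets handled as a special case):

```python
import math


def renyi_entropy(p, a):
    """Renyi entropy (in bits) of a discrete distribution p, order a >= 0.

    The general formula log2(sum(q^a)) / (1 - a) has a singularity at a = 1;
    its limit there is the Shannon entropy, handled separately.
    """
    if abs(a - 1.0) < 1e-9:
        return -sum(q * math.log2(q) for q in p if q > 0)
    return math.log2(sum(q ** a for q in p)) / (1.0 - a)
```

On a uniform distribution every order agrees; on a skewed one the family is decreasing in a, with the Shannon/von Neumann value sitting at a=1.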

----------


## nonsqtr

> Entropy definition - Google Search


Allow me to introduce you to a model of an oscillating chemical reaction, called the Brusselator.

The real reaction it idealizes involves three ingredients: potassium bromate, malonic acid, and manganese sulfate.

In this particular reaction, the entropy that we see "at first" changes into something completely different over time - "because of" long range nonlinear entropic interactions that aren't accounted for in the combinatorial model.

Look at this behavior:



https://en.m.wikipedia.org/wiki/Brusselator

Periodic behavior of the stochastic Brusselator in the mean-field limit | SpringerLink
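The model itself is just two coupled rate equations, dX/dt = A + X^2·Y - (B+1)·X and dY/dt = B·X - X^2·Y, and for B > 1 + A^2 the steady state goes unstable and the concentrations oscillate forever. A crude Euler integration in Python shows the limit cycle (the parameter values here are my own arbitrary choices):

```python
def brusselator(a=1.0, b=3.0, x=1.0, y=1.0, dt=0.001, t_end=50.0):
    """Euler integration of the Brusselator rate equations.

    For b > 1 + a**2 the fixed point (a, b/a) is unstable and the
    concentrations settle onto a limit cycle (sustained oscillation).
    Returns the trajectory of x.
    """
    xs = []
    for _ in range(int(t_end / dt)):
        dx = a + x * x * y - (b + 1.0) * x
        dy = b * x - x * x * y
        x, y = x + dt * dx, y + dt * dy
        xs.append(x)
    return xs
```

Started near the fixed point, X settles into a sustained oscillation rather than decaying to equilibrium - the "changes into something completely different over time" behavior mentioned above.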

----------


## nonsqtr

The bottom line here is we DO NOT KNOW HOW to define information.

The physical requirement is very simple: collapse of a mixed state must be absolutely simultaneous in all reference frames, even if the particles are separated by thousands of miles!

Once Mr. Greene and the rest of the physics community solves that one, we'll be closer to a useful understanding.

----------


## nonsqtr

To get a stochastic model to work, it has to explain periodicity and it has to explain quantization.

Periodicity is easy, that part has been well studied.

We wish the quantization to occur naturally as a consequence of the numeric structure, as distinct from positing its existence by fiat.

----------


## nonsqtr

> To get a stochastic model to work, it has to explain periodicity and it has to explain quantization.
> 
> Periodicity is easy, that part has been well studied.
> 
> We wish the quantization to occur naturally as a consequence of the numeric structure, as distinct from positing its existence by fiat.


For example, Loop Quantum Gravity is one such formulation.

----------


## nonsqtr

So, we come to realize that entropy has nothing to do with chaos and disorder (and those who say it does are blindly regurgitating words they don't understand). Entropy has to do with information, and information is a Heisenberg thing: there is a lower bound beyond which we cannot see.

In a qubit based on "spin", which is a binary concept, how many degrees of freedom are there exactly? Well, in an ordinary spatial dimension (call it x), we can consider that the position along the x axis is one degree of freedom - or, we can assign a value (a "scalar", call it y) to every point in space, and therefore consider that the relationship has infinite degrees of freedom. It is apparent that the "amount" of information has a lot to do with the use we put it to.

If we relax the requirement that kW=W for the Hilbert-space vector describing a state, we discover that the projection back to 1-space looks exactly like a Riemann sphere. A state vector in Hilbert space is a compactification into a projective geometry. There is a RICH literature on stochastic compactification, and one of the most important features is we lose the origin. Things end up in an "affine" space because the actual origin turns out to be a singularity in the projection. The Hilbert space is at best a convenient computational device; things don't work that way in real life.

----------


## UKSmartypants

M-theory demands 11D, so that's how many degrees of freedom there must be.


Entropy was a concept never designed to be applied to anything other than steam engines.  Over the years they kept pasting bits onto it willy-nilly. The entire thing needs a rewrite.

And since entropy is used as the arrow of time, if there's more than one sort then my proposition that there is no such thing looks more realistic.  Time is merely movement along a 1D axis, one Planck frame at a time.  

Also, since information disappears into a black hole, it shows the universe isn't a closed system with a fixed amount of mass/energy/information, which undermines the 2nd law.


I'm really coming round to the idea that "dark matter" is just higher-dimensional information, and dark energy is connected to its entropy.

----------


## nonsqtr

> M-theory demands 11D, so that's how many degrees of freedom there must be.


lol - see? Told ya. "Degrees of freedom".  :Grin: 

You're one of the people who believes in "intrinsic information", or some call it "self-information".

So here's a question: if you create a spin pair, have you created information?

The Shannon/von Neumann approach works only for local systems, that is, pure states without mutual expectation.

Here: New method of quantum entanglement vastly increases how much information can be carried in a photon | UCLA

This is UCLA, mind you... and here we have an example of a stochastic quantum replenishment process involving entanglement:

https://www.livescience.com/physicis...hot-atoms.html

----------


## nonsqtr

> Entropy definition - Google Search


The definition of entropy apparently being used in the OP is this:

The Shannon entropy is the amount of classical information we gain, on average, when we learn the value of a random variable.

----------


## UKSmartypants

> So here's a question: if you create a spin pair, have you created information?


You must have done. Even entanglement must create information, and so must decoherence, but in 11D.

----------


## nonsqtr

> You must have done. even entanglement must create information, and decoherence, but in 11D.


Well then, what do you think of the idea of black holes becoming white holes?

You've heard the proposal that an electron is equivalent to a positron moving backward in time, yes? In biology we have "positron emission tomography" (PET scans):

Witnessing Entanglement In Compton Scattering Processes Via Mutually Unbiased Bases | Scientific Reports

If we were to take seriously the idea that time is undefined at the Planck level, the concept of "frequency" would have no meaning either and therefore most of the maths would break down completely.

----------


## UKSmartypants

> Well then, what do you think of the idea of black holes becoming white holes?
> 
> You've heard the proposal that an electron is equivalent to a positron moving backward in time, yes? In biology we have "positron emission tomography" (PET scans):
> 
> Witnessing Entanglement In Compton Scattering Processes Via Mutually Unbiased Bases | Scientific Reports
> 
> If we were to take seriously the idea that time is undefined at the Planck level, the concept of "frequency" would have no meaning either and therefore most of the maths would break down completely.



Well, I can't see white holes being the opposite of black holes, but Hawking says a black hole must turn into a white hole after several trillion trillion years. I just don't see how it can work. I mean, I don't see how it can tunnel through a compactified manifold to appear elsewhere in the same universe. It just doesn't make sense.

Well, frequency and wavelength are related. Therefore there must be a maximum frequency, when the wavelength equals the Planck length, and it works out to be the reciprocal of the Planck time, which is *~1.9 x 10^43 Hz*. That also implies there's a maximum temperature, *142 nonillion kelvin (1.4 x 10^32 K)*, or an energy of about *2 x 10^9 joules*, which makes the Schwarzschild radius the same as the Planck length.
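Those magnitudes can be re-derived from the fundamental constants. A sketch using CODATA values (note the Planck energy comes out near 2 x 10^9 J):

```python
import math

# CODATA values (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s
k_B  = 1.380649e-23      # Boltzmann constant, J/K

t_P   = math.sqrt(hbar * G / c**5)   # Planck time, s
f_max = 1.0 / t_P                    # reciprocal of the Planck time, Hz
E_P   = math.sqrt(hbar * c**5 / G)   # Planck energy, J
T_P   = E_P / k_B                    # Planck temperature, K

print(f"max frequency ~ {f_max:.3e} Hz")  # ~1.855e43 Hz
print(f"Planck energy ~ {E_P:.3e} J")     # ~1.956e9 J
print(f"Planck temp   ~ {T_P:.3e} K")     # ~1.417e32 K
```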

----------


## nonsqtr

Hm. Another dup. Slow site today..

----------


## nonsqtr

Okay.

So, I continue to insist that spin pair generation AND annihilation are delocalized events. And I think that with your knowledge of geometry you can address this and figure it out.

Let's begin with what we "think" we know, in the traditional view. We can draw a Feynman diagram and a light cone showing the generation of a spin pair, and then its decoherence through measurement.

The annihilation/collapse part is straightforward and has been well studied, and the event is clearly and obviously delocalized. What we learn from the observable collapses is that the decoherence has to be absolutely simultaneous in all reference frames. This is an immediate and obvious consequence of the observation, and one which theoretical physicists have never addressed. "I personally" don't know what this is, but if it's a connection on a Calabi-Yau manifold this HAS TO be taken into account as one of the required constraints.

On the generation side, there is so far some limited evidence that the generation event can be spatially delocalized. For example, download the original article for this, the download button is on the link page.

https://www.researchgate.net/figure/...fig1_319259434

The longer I study this the more I come to the conclusion that continuity makes no sense, and something resembling Penrose's "spin foam" makes a lot more sense.

Because if you look at this non-locality in continuous terms it requires you to form the basis from intervals. "Points" don't work, and if things were truly continuous they would.

----------


## nonsqtr

-- dup --

----------


## UKSmartypants

> https://www.researchgate.net/figure/...fig1_319259434
> 
> The longer I study this the more I come to the conclusion that continuity makes no sense, and something resembling Penrose's "spin foam" makes a lot more sense.



Ahaha, you're slowly catching up with me  :Stick Out Tongue:   it's all about twistors, in fact

----------


## nonsqtr

Oy - so back to entropy (we're still on entropy).

A "field" of any kind is basically an aether, it's used to communicate non-locally. And it's more than that, it adds information.

If we have a spatial dimension (call it R for real numbers), an object (point) on it has one degree of freedom, which is its position. The real numbers can be mapped bijectively to the interval (0,1), so the situation much resembles a probability distribution: we have "one" outcome, which is the position.
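One such bijection is the logistic function and its inverse, the logit. A minimal sketch (any strictly monotone sigmoid would do):

```python
import math

def to_unit(x):
    """Bijection R -> (0, 1): the logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

def to_reals(u):
    """Inverse bijection (0, 1) -> R: the logit."""
    return math.log(u / (1.0 - u))

# Round trip: every real position has a unique image in (0, 1) and back.
for x in (-5.0, 0.0, 3.25):
    assert abs(to_reals(to_unit(x)) - x) < 1e-9
print(to_unit(0.0))  # 0.5
```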

But if we have a field over that dimension, we gain as many additional degrees of freedom as the field has. For example, with a scalar field, each position in R now has a value attached to it, which is the field strength (or influence). The whole idea of a field in the first place is that it interacts with "everything", so there's no such thing as a non-interacting particle. So in a way, the field "endows" each particle with a variable "amount" of information. It's variable because information is (always) shared between the field and the particle, and "how much" information is in the particle at any given instant depends on how much has been absorbed into the field.

So if you measure the information (entropy) in the particle, you'll find it's not only variable, but if you put it in a field you can change it. Which is exactly what we see. We can so far put up to 32 times the information into a photon, than it can natively absorb, by putting it into the right combination of fields.

This business of fields, these are not simple relationships. But the idea that we can "overload" a photon by putting it in the right combination of fields is important. For instance - in the Boltzmann model of entropy in an idealized gas with a perfect boundary, we get completely different behavior if it's an ionized gas (has an electric charge, which means field interactions), and it gets even wilder when we put the whole experiment in a microwave cavity, then we even get new phases of matter like spin glasses and such. 

One must consider the time course of the experiment. Boltzmann's is an equilibrium model, and any system approaching equilibrium will take time to do so. Boltzmann's model depends on the collisions, which have velocity and momentum and take time. There's no such thing as an "instantaneous" transition to an equilibrium state. Ultimately the equilibrium condition is best described as a stochastic optimization over the available space.

----------


## nonsqtr

> Ahaha, you're slowly catching up with me   it's all about twistors, in fact


Okay, so let's talk about the first of the two requirements.

The first requirement is, that decoherence of two entangled particles be viewed as simultaneous by all observers in all reference frames - even if the particles are spatially separated by thousands of miles! (I'm not yet sure about accelerating reference frames, so for now we'll just say observers at rest).

In essence this amounts to "folding space" (even though it doesn't really), or equivalently changing the definition of time. 

Which is your opportunity to prove that time doesn't really exist, but in so doing you have to address this constraint.

And, if you use twistor theory, I'm guessing the maths will apply to the creation part as well as the destruction part.

----------


## UKSmartypants

Well, as I see it, the tricky bit is how you represent information entropy in a compactified dimension. Just because it's compactified doesn't mean it can't hold energy or information.


Compactified means the single dimensional axis has no length, i.e. null lines (light rays) have no length, and since they have no length, they have no 'time'. In fact null lines in any dimension must have zero length anyway, because E = sqrt((mc^2)^2 + (pc)^2). What happened just before the Big Bang was that 4 dimensions expanded out along their own single axes, creating 4D spacetime. And maybe all the compactified entropy or information then poured out as the Big Bang.

So, where did all the energy in the 10D manifold come from? Was it accrued information, or maybe entropy, from previous universes? Maybe that's what the dark energy is: the total of the accrued information on a 2D world sheet projecting at right angles into our manifold.


See, as I see it, a compactified dimension is basically a black hole: it has a zero-length axis. So a black hole here is a rupture, a point where our uncompactified 4D spacetime has a tear, where a single point of the remaining six compactified dimensions pokes through, i.e. becomes observable. That's why they have such massive gravity and warp space: at the very centre is a singularity, one Planck length wide, that exposes our universe to all the information inside the compactified 6D. The more matter that pours in, the bigger the tear gets, until you have a supermassive black hole.

Consequently, black holes do not go anywhere; there's no white hole at the other side, black holes just point to the metaverse. Whereas we have observed black holes, no one has ever come across any suggestion or example of a white hole, when there should be millions all over the place (as many white holes as black holes, in fact). Unless you are going to argue that none of the white holes will appear for trillions of years.

So, back to the original point: entropy in this universe isn't fixed, because it's pouring back into the originating manifold via black holes. It's like an oscillating chemical reaction. When this universe empties of information entropy, it collapses and a new universe erupts out of all the energy that previously poured into the originating manifold from us. So it's cyclic, but not by Big Crunch; it's because the energy density drops to nothing, the fabric of spacetime collapses, and then re-explodes.



It also explains what happens to the information that falls into a black hole, and thus why there is no Information Paradox. And it allows the universe to be infinite with no awkward questions: the twistor space it's built on extends to the moment of the Big Bang, but the underlying compactified 2D manifold it projects from is effectively infinite.
This might be one of the most revolutionary insights into this subject ever, or I'm totally on the wrong track and talking bollox. Since I'm a nobody in science, we'll never know.....

----------


## nonsqtr

Interesting view.

"Cosmological" view.

I'm new to compactifications - what I know so far is there are two of them that are convenient for R, and the part that matters to the physicists is how many infinities there are.

But as you say, the part that should really matter more, is the nature of the intersection.

Which brings us right back to dusts. (Think about it... it's kinda shocking when we first do...)

The biggest compactification is Stone–Čech, which I don't understand yet; it seems difficult/scary.

From the probability standpoint, there is no general way to extend a distribution into higher dimensions, unless there are additional constraints. For example you can take a Gaussian and "rotate it" in a unitary manner, etc.

I thought the whole idea behind twistors was to address the nonlinearities, is that not so?

Four complex dimensions is a lot, you can do a lot with that.

----------


## nonsqtr

Here's a thought experiment.

Take any probability distribution, defined on [0,1]. Map it to R. Now compactify. 

Do we have the same "amount of information" in the compactified representation?

Let's keep it simple, let's say we use Alexandroff and compactify to S(1), we have "added a point", right?
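For the discrete analogue, at least, the answer is yes: adjoining a single outcome of probability zero (the "added point") leaves the Shannon entropy unchanged, because the 0·log 0 term is taken as zero. A sketch:

```python
import math

def H(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

dist = [0.5, 0.25, 0.25]
compactified = dist + [0.0]   # adjoin the point at infinity, with measure zero

print(H(dist), H(compactified))  # identical: 1.5 1.5
```

For continuous distributions the situation is subtler, since differential entropy is not invariant under the remapping itself, which is where the question of adjusting the distributions comes in.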

----------


## UKSmartypants

> Interesting view.
> 
> "Cosmological" view.
> 
> I'm new to compactifications - what I know so far is there are two of them that are convenient for R, and the part that matters to the physicists is how many infinities there are.
> 
> But as you say, the part that should really matter more, is the nature of the intersection.
> 
> Which brings us right back to dusts. (Think about it... it's kinda shocking when we first do...)
> ...


Indeed. But if one dimension is complex, all ten must be. Imagine the capacity to store information in that.



Here: ASD manifolds and their twistor spaces. The solution is in expanding this to 10D, but I can't, the maths is beyond me. Otherwise I'd be having tea with Roger Penrose.

http://insti.physics.sunysb.edu/conf...lks/LeBrun.pdf

----------


## nonsqtr

Seems to me we have to be very careful with these compactifications.

We can see the problem immediately if we look at the quaternions with and without the reals. There are no algebraically coherent triplexes, but we can have octonions.

Ordering is another issue, when we change dimensionality.

----------


## nonsqtr

> Here's a thought experiment.
> 
> Take any probability distribution, defined on [0,1]. Map it to R. Now compactify. 
> 
> Do we have the same "amount of information" in the compactified representation?
> 
> Let's keep it simple, let's say we use Alexandroff and compactify to S(1), we have "added a point", right?


Let's say we do this, what happens to our probability distributions?

Well... something like this: Average Discrete Energies of Spectra of Gaussian Random Matrices | SpringerLink

So the short answer is, it depends if the point at infinity has zero probability or not.

If the point at infinity has non-zero probability, we can't compactify without adjusting our distributions.

This constraint would be "in addition to" the consideration of which functions we can actually extend into the compactification (like which have the same limit from the left and right).

----------


## nonsqtr

Okay so, at this point it should be clear that entropy has nothing to do with disorder. Instead, it has to do with the organization of information.

Yes, if you pack all the gas into one side of the box you get more collisions. Duh. So what?

What seems to matter is that the final ("preferred") state is an equilibrium, and a stochastic process is the path for how it gets there.

But is it really the "highest entropy state"? No way! By making a few simple adjustments we can double the amount of information or cut it in half. We can easily get our gas to organize itself into a plasma or half a dozen other "states of matter".

And, when we look down as low as we can look, at the quantum level, we find that information disappears into mixed states, and it varies in amount, and it causes strange things to happen in spacetime.

We can not DEFINE what is the most ordered or disordered state, until we understand the way information behaves at the microscopic level. We already know there are long range interactions that can sometimes take precedence over local organization. 

If you believe in non-zero vacuum energy then a state of zero information is impossible. What we are left with then is a "minimum", kind of like an energy well. Physical systems find the minimums of energy wells through gradient descent (mostly). The Boltzmann gas achieving thermodynamic equilibrium is an example. But if you take all the molecules in the gas and put them on one side of the box, you still have the same amount of information, the same number of bits, the same number of molecules. Boltzmann's definition of entropy is completely inadequate.
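This contrast can be made concrete with Boltzmann's own counting, S/k = ln W, where W is the number of microstates in the macrostate. A toy sketch (illustrative only): describing N molecules microscopically always takes the same N "which half?" bits, yet the macrostate entropy collapses to zero when everything is packed on one side.

```python
import math

def boltzmann_entropy(n_total, n_left):
    """S/k = ln W for n_total distinguishable molecules with exactly
    n_left of them in the left half of the box (binomial count)."""
    return (math.lgamma(n_total + 1)
            - math.lgamma(n_left + 1)
            - math.lgamma(n_total - n_left + 1))

N = 100
# Microscopic description: always N bits, one per molecule.
# Macrostate entropy: maximal for an even split, zero when packed.
print(boltzmann_entropy(N, 50))  # ~66.8 (ln of ~1e29 microstates)
print(boltzmann_entropy(N, 0))   # 0.0  (a single microstate)
```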

And if I were going to be direct about it, I'd put Boltzmann into the same category as Marvin Minsky, the guy who said Perceptrons would never work, and thereby set back the science of machine learning by at least forty years, because people believed his bullshit, and kept regurgitating it.

Look - the classical definition of entropy is what we call "anthropomorphic". It's based on what we "perceive", it's based on probability.

Which is a very, VERY different concept from putting a bit somewhere and retrieving it later. The bit that you're putting somewhere, has nothing to do with your perception.

These two different definitions of information are so far entirely contradictory. Because we don't know what happens at the micro-level. The confusion in this part of the science is most apparent in the area of "quantum memory", if you Google on that it seems a lot of people think it has to do with error correction (because you can't coerce a quantum state, although you can "guide" one). The idea that one wishes to store a classical bit as a qubit is kind of nonsensical, wouldn't you say? Yet there's hundreds of papers about that very thing!

Information, ultimately, has to do with symmetry and the breaking thereof. That's my view.

We are just now beginning to understand the deep relationship between stochastic processes and algebraic topology. Papers start maybe 2017 or so. We can "synthesize" actual real physical dimensions using random numbers, that has been the astounding discovery. We can add degrees of freedom to a system using random numbers, just as easily as we can with geometry or fields or any other way.

Edit: you can see something interesting if you look at the Cayley-Dickson construction. As we go from real numbers to complex numbers to quaternions to octonions, we lose first order, then commutativity, then associativity - these are "symmetries" of the real number field.
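The loss of commutativity at the quaternion step can be checked directly. A minimal sketch of the Hamilton product, with quaternions written as (w, x, y, z) tuples:

```python
def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1)  = k
print(qmul(j, i))  # (0, 0, 0, -1) = -k, so ij != ji
```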

----------


## nonsqtr

Hm. This sounds wacky at first read, but it's actually an interesting idea.

Probability as a Field Theory

"Gaussian multiplicative chaos".  :Grin: 

Along a completely different tack: in reality there are "impossible events" that have zero probability; however, in the brain these events must have non-zero probability, because we can imagine them. Therefore we need a way of assigning non-zero probabilities to infinitely unlikely events, and here is such a way: Infinitesimal Probabilities | The British Journal for the Philosophy of Science: Vol 69, No 2

And here's a pretty complete summary of Rényi conditional probabilities: Conditional Rényi Entropy and the Relationships between Rényi Capacities

This has a direct application in the area of vorticity, where we need a conditional probability field defined on an unbounded measure (eg velocity or vorticity fields on R(3)).

----------


## nonsqtr

Here's another idea. Quantum white noise. And analysis of reality therewith. To wit:

CHAPTER 4: QUANTUM WHITE NOISE CALCULUS AND APPLICATIONS | Real and Stochastic Analysis

----------


## Fall River

> It's not that simple.
> 
> Life acts in opposition to entropy. It's the exact opposite, life represents extreme concentrations of information and structure.



There's a section in Brian Greene's book with the heading "Thermodynamics and Life"

Under that heading he says, "Life, like all physical systems, abides by the dictates of entropy."

----------


## nonsqtr

> There's a section in Brian Greene's book with the heading "Thermodynamics and Life"
> 
> Under that heading he says, "Life, like all physical systems, abides by the dictates of entropy."


 :Smile: 

Here's how weird entropy is - which I doubt Mr Greene understands (cause I don't either).

There is something called the Unruh effect. Let me describe it for you.

There are many definitions of "vacuum" (as in the vacuum of space), but in the quantum world when things are at 0 temperature they're said to be in their "ground state", which means the lowest possible energy they can attain.

But, the ground state is not 'exactly' zero, it's some very small teeny-tiny amount that's so small it's on the level of quantum fluctuations.

When we "look at" the vacuum, what we "see" is it's not empty. We see pairs of particles seemingly popping into existence from nowhere, then popping out again. (They're usually short-lived, they call 'em virtual particles). So the vacuum energy, is non-zero. And, physics tells us all energy is related to frequency, and wavelength according to de Broglie's relations.

So in the Unruh effect (which is derived from quantum field theory), a static or uniformly moving (at fixed velocity) observer will see a teeny-tiny background temperature (which in real life is maybe 2-3 degrees Kelvin). But an ACCELERATING observer will see a heat bath. (A "higher temperature").

The magnitude of this effect can be significant: for instance, a particle accelerating at around 10^26 m/s/s (the scale reached in collisions at CERN) would see a vacuum temperature of almost 400,000 kelvin.
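The Unruh temperature is linear in the acceleration, T = ħa/(2πck_B), so the magnitudes are easy to check. A sketch using CODATA constants (it takes roughly 2.5 x 10^20 m/s² of acceleration per kelvin, which is why everyday accelerations see nothing):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c    = 2.99792458e8     # speed of light, m/s
k_B  = 1.380649e-23     # Boltzmann constant, J/K

def unruh_temperature(a):
    """Temperature (kelvin) seen by a uniformly accelerating observer."""
    return hbar * a / (2 * math.pi * c * k_B)

print(unruh_temperature(9.81))  # Earth gravity: ~4e-20 K, utterly negligible
print(unruh_temperature(1e26))  # ~4e5 K, a macroscopic heat bath
```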

It seems like all the frequencies in the background energy are being red-shifted by the acceleration. But if we think about this in terms of entropy (and the spontaneous symmetry-breaking leading to "background information"), what it means is the accelerating observer sees either more emissions or they occur more frequently. (Which might be the same thing).

The accelerating observer "sees" more information than the inertial observer.

----------

Fall River (11-04-2021)

----------


## Fall River

Entropy may be weird and not totally understood but does that mean it doesn't pertain to life as it does to all other physical systems?

----------


## UKSmartypants

In the end it's a consequence of the flaws in general relativity, along with time. Relativity is almost certainly a manifestation of a deeper theory of quantum gravity, which will require neither time nor entropy; both will appear at a higher level as emergent fields.

----------


## nonsqtr

> Entropy may be weird and not totally understood but does that mean it doesn't pertain to life as it does to all other physical systems?


Well so, based on this thread, "what kind of entropy are you asking about?"

Thermodynamic entropy? Or information entropy? Or... ?

I believe I've demonstrated quite clearly that there is no such thing in the real world, as a state of "zero information".

Yet, hundreds of quantum theorists still say there is, and they're all WRONG, every single one of them is completely wrong.

And, I believe I've shown in this thread, that thermodynamic entropy is mostly configuration entropy, which is a tiny subset of the information space. 

Are you asking about life at the thermodynamic level? Cause like, who cares how much heat a person dissipates, it's not important. If we start talking about brains though, and information entropy, we're in a whole different universe.

----------


## Fall River

I guess, based on Brian Greene's book, I'm asking about thermodynamic entropy - life at the thermodynamic level.  

Who cares about how much heat a person dissipates?  Anyone who's interested in how life functions. 

If life couldn't function in some way, there wouldn't be a brain.

And, I assume, without thermodynamic entropy, he wouldn't be able to develop a theory of how the universe ends.

----------


## nonsqtr

> I guess, based on Brian Greene's book, I'm asking about thermodynamic entropy - life at the thermodynamic level.  
> 
> Who cares about how much heat a person dissipates?  Anyone who's interested in how life functions. 
> 
> If life couldn't function in some way, there wouldn't be a brain.
> 
> And, I assume, without thermodynamic entropy, he wouldn't be able to develop a theory of how the universe ends.


Okay.

Well, so let's say we have a particle that's in one of these Boltzmann gases, and we'd like to know where it'll go next (in other words, how the system will evolve in time).

Boltzmann used a "collision" model; he treated the gas particles like billiard balls. And in a billiard ball collision, we can calculate all the angles, momenta, and so on.

In real life though, gas particles are in a "quantum" state, which means they don't really obey the neat math of perfect collisions.

What we're dealing with is something called "scattering amplitude", which means the probability of the particle going this way or that way, upon a quantum interaction with some other particle.

Now, @UKSmartypants will love this, because it comes out of the Twistor Theory of Sir Roger Penrose. In real life, scattering amplitudes are most accurately described by something called an "amplituhedron", which is a volume in twistor space.

https://en.m.wikipedia.org/wiki/Amplituhedron

In a perfect world, the results from the collision model and the Twistor model would be the same, but they're not. The amplituhedron is a polytope in 3d complex projective space, which is actually four complex dimensions. It accounts for the "spin" that one of these billiard balls may have upon collision, which Boltzmann's model doesn't account for.

Boltzmann's version of entropy is a good first approximation, it works in most cases. But it's kinda like Newton's law of gravity, it breaks down the minute things start getting weird.

I'm glad you're interested in these things. From a physiological standpoint, thermoregulation in the human body is very impressive.

----------


## UKSmartypants

Can Constructor Theory crack this issue?

----------


## nonsqtr

I dunno, that's pretty random.  :Grin:

----------


## UKSmartypants

> I dunno, that's pretty random.



It seems reasonable to me. I suspect your issue is the video doesn't contain the word 'neuron' in it anywhere.   :Smiley ROFLMAO:

----------


## fmw

> One small addition: even though Einstein did say "Energy cannot be created or destroyed," that had to be prior to 1944, when in fact mass was converted to energy. So it should be, "Energy cannot be created or destroyed, except in a nuclear reaction."


I remember from school that there is a law of conservation of mass/energy.  It is this combination that is fixed.  It is possible to convert mass to energy or energy to mass but it is impossible to increase or decrease the total of mass and energy combined.  I think it went something like that.

----------


## nonsqtr

> I remember from school that there is a law of conservation of mass/energy.  It is this combination that is fixed.  It is possible to convert mass to energy or energy to mass but it is impossible to increase or decrease the total of mass and energy combined.  I think it went something like that.


Yes. In real life it's much more complicated than that. But, even if we stick with the simple models, it's still more complicated.

The Gibbs free energy is

G = H - TS

where S is the entropy and H is the enthalpy

H = U + PV

where U is the "internal energy".

So you can actually draw energy from entropy, and do useful work with it.

Osmosis is an example.

So you can see, in the expression for energy, there are terms for the unit, how many units, and how the units are arranged.
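The sign of the T·S term is what lets entropy do useful work. A sketch with made-up illustrative numbers (the 10 kJ enthalpy and 50 J/K entropy values are hypothetical):

```python
def gibbs_free_energy(H, T, S):
    """G = H - T*S: the work obtainable at constant temperature and
    pressure, in joules."""
    return H - T * S

# Hypothetical process: enthalpy change +10 kJ, entropy change +50 J/K.
H, S = 10_000.0, 50.0
# Below 200 K the process is non-spontaneous (G > 0); above it, the
# T*S term dominates and entropy itself drives the work (G < 0).
print(gibbs_free_energy(H, 150.0, S))  # 2500.0
print(gibbs_free_energy(H, 300.0, S))  # -5000.0
```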

----------

