

The Physics of Life, Episode 3: Inside and Outside: a Deep Dive into Entropy, Its History and Deeper Meaning
The nature of the origin of life from the standpoint of a specific shared set of definitions. We recap the Episode 2 definitions of complexity, chaos, order, and entropy, plus a deep dive into the history of entropy and its importance.
Thermodynamic systems
Kolmogorov complexity measure
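Kolmogorov complexity (the length of the shortest program that outputs a given string) is uncomputable in general, but compressed length is a standard proxy for it. A minimal Python sketch of that idea (an illustration, not from the episode):

```python
import zlib
import random

def compressed_size(s: str) -> int:
    """Proxy for Kolmogorov complexity: the zlib-compressed length of a string.
    True Kolmogorov complexity is uncomputable; compression gives an upper bound."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

ordered = "ab" * 500  # highly regular: admits a very short description
random.seed(0)
irregular = "".join(random.choice("ab") for _ in range(1000))  # no obvious pattern

# The regular string compresses far better than the random-looking one.
print(compressed_size(ordered) < compressed_size(irregular))  # True
```

The ordered string and the random one have the same length and alphabet; only their describability differs, which is exactly what this complexity measure tracks.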
Crossing the boundary to the outside, complexity is greatly reduced in terms of how you interact with it; once the boundary is crossed, complexity transforms. Chaos can be looked at in various ways: from an archetypal context, to an emotional state, to the strict definition it has within physics.
In disorder, there is a reference point you can go back to, but in chaos, there is no reference point.
The matrix of predictability and determinism. Deterministic but not predictable: chaotic systems.
Unpredictable and non-deterministic: a true random number generator.
Predictable but non-deterministic: a computer running a program (there's some intelligence out there that we are responding to).
Information - Julian Barbour's essay, "Bit from It". In the Shannon sense: how many bits. In the sense of telling you something useful (e.g. tall or short, which can encompass many different states; yes/no questions don't necessarily map to the number of states). In the sense of black holes and the holographic principle. In the sense of semantic information.
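Information in the Shannon sense can be computed directly: entropy in bits is the average number of fair yes/no questions needed to pin down the state. A small illustration using the standard formula (not from the episode):

```python
from math import log2

def shannon_bits(probs):
    # Shannon entropy H = -sum(p * log2(p)): average yes/no questions needed.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_bits([0.5, 0.5]))   # 1.0 bit: one fair yes/no question
print(shannon_bits([0.25] * 4))   # 2.0 bits: four equally likely states
print(shannon_bits([0.9, 0.1]))   # ≈ 0.469: a biased question tells you less on average
```

The last case shows why yes/no questions don't map one-to-one onto states: a lopsided distribution over two states carries less than one bit.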
Chaos - strange attractors are acting in phase space. Order coming out of chaotic systems (unpredictable, but bounded).
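"Unpredictable, but bounded" can be seen in the logistic map, a textbook chaotic system (an illustration, not from the episode): the update rule is fully deterministic, yet two starting points differing by 10^-10 diverge completely, while both trajectories stay inside the unit interval.

```python
def logistic(x, r=4.0):
    # Fully deterministic update rule; r = 4.0 puts the map in its chaotic regime.
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-10   # two almost identical initial conditions
max_gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The tiny initial difference is amplified enormously (unpredictable),
# yet both trajectories remain inside [0, 1] (bounded).
print(max_gap)
print(0.0 <= a <= 1.0 and 0.0 <= b <= 1.0)  # True
```

Deterministic but not predictable, in a few lines: knowing the rule exactly does not let you forecast far ahead, because any measurement error in the starting point blows up.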
Shapes in multiple dimensions. The brain doesn't see the world the way Mother Nature sees it. Entropy definitions are incomplete, e.g. a measure of disorder, or a measure of the number of states.
Boltzmann, Joule, Carnot, Clausius
Entropy is a bookkeeping device to make sure that you don't violate Carnot's principle. How much free energy is available depends on the difference between hot and cold. Entropy is the mileage indicator for the disorganization, not the disorganization itself.
Carnot: the maximum efficiency is (T_hot - T_cold) / T_hot. Kelvin: no perfect engine is possible. Clausius: no perfect refrigerator is possible.
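Carnot's formula in code (a trivial sketch; temperatures must be absolute, in kelvin):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    # Maximum efficiency of any heat engine between two reservoirs:
    # eta = (T_hot - T_cold) / T_hot, with temperatures in kelvin.
    if not 0 < t_cold_k < t_hot_k:
        raise ValueError("need 0 < T_cold < T_hot (kelvin)")
    return (t_hot_k - t_cold_k) / t_hot_k

print(carnot_efficiency(500.0, 300.0))  # 0.4: even a perfect engine wastes 60% of the heat
```

Note the efficiency depends only on the temperature difference relative to the hot reservoir, which is the point about free energy above: no gradient, no work.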
Entropy is a qualitative metaphysical statement: you can't go backwards. Does the arrow of time only move forwards because of entropy? Or does the second law hold (does entropy always increase) because the arrow of time already has a direction? People assume science is a step-by-step process - but you stare at a problem for a long time, and then one day the answer comes.
Qualitative boundary between the inside and the outside. Inside: thermodynamic entropy (states/probabilities/energies) tends to maximum disorder, and only works at thermal equilibrium; the probability of going back is vanishingly small. Outside: out of equilibrium, a system will tend towards a new equilibrium and towards maximum forgetfulness. You can't go back because you have no idea where you came from (self-organized criticality & complexity economics). Any living system will degrade over time.
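"Vanishingly small" can be made concrete with Boltzmann's S = k_B ln W applied to a toy system of 100 two-state particles (an illustrative sketch, not from the episode):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

N = 100                    # two-state particles (think: coins)
W_ordered = comb(N, 0)     # all tails: exactly one microstate
W_mixed = comb(N, N // 2)  # half heads: astronomically many microstates

S_ordered = k_B * log(W_ordered)  # = 0 for the unique ordered state
S_mixed = k_B * log(W_mixed)      # the mixed macrostate carries far more entropy

# Probability of fluctuating back to the fully ordered state by chance:
print(1 / 2**N)  # ≈ 7.9e-31: "vanishingly small" made concrete
```

The system isn't forbidden from going back; there are just overwhelmingly more disordered microstates than ordered ones, so the return is never observed.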
When complexity on the inside is hidden, you're dealing with a simple object (emergence). Weak emergence: the boundary is artificial (cylinders, pistons, etc.). Strong emergence: the system is making its own boundary (self-organizing, maintaining its own boundary).
Playlist for the whole 6 part series: https://www.youtube.com/playlist?list=PLoARw9zo4EUZhxqfaqYU5yjy0-Tr8RlB4