

The Physics of Life, Episode 4: The Necessity of Connectedness, Entropy as a Process rather than a Thing
Glen, the physicist, and Karen, relentlessly curious, continue to tackle the Nature of Life. Apologies in advance for the sound problems with the imported video. Mercifully, it's less than half a minute, so persevere and you will be rewarded.
Scroll down for video links plus playlist to entire series of 6 episodes.
Going forward, we will open each episode with a short intro that tackles another view of entropy, and then move on to the fundamental nature of computation. In this video, we discuss thermodynamics and entropy - the process of converting one form of energy to another, and the traffic laws for that process. Next time, we will discuss counting states (from Boltzmann). The mathematics of the free energy principle is nearly impenetrable, but the principle is nevertheless useful as a paradigm: in systems doing work internally, free energy tends to a minimum and entropy tends to a maximum. In physics, finding equivalent equations of state allows you to compute a solution.
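For reference, here are the textbook relations behind "free energy tends to a minimum, entropy tends to a maximum" - standard thermodynamics, not the variational form Friston uses:

```latex
S = k_B \ln W
% Boltzmann: entropy counts the microstates W compatible with a macrostate.

F = U - TS
% Helmholtz free energy: the energy available to do work at temperature T.

dS \ge 0 \ \text{(isolated system)}, \qquad dF \le 0 \ \text{(constant } T, V\text{)}
% The two tendencies are the same second law viewed under different constraints.
```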
We are looking for a non-biological definition of life. Markov blankets and Markov boundaries provide a description of connectedness as opposed to dispersion. Boundaries can be functional, not just physical. Boundaries are also open to some degree - they are permeable. In addition, there needs to be something like an immune system that keeps the group from breaking up. Persistence over time is another hallmark of life.
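The formal content of a Markov blanket is a conditional-independence statement: once you know the blanket states b, the internal states carry no extra information about the external states. In the standard notation (not used in the episode itself), with internal states \mu and external states \eta:

```latex
p(\mu \mid \eta, b) = p(\mu \mid b)
% Given the blanket b, internal and external states are conditionally independent.
% The blanket itself splits into sensory states (driven by the outside)
% and active states (driving the outside).
```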
The boundary persists over time - also a hallmark of strong emergence. Sensory states are the system asking questions, making a measurement of its environment. Active states are the system making a choice in response to that information. What sits in between are the inferences: the if/else/then, the decision matrix. Pop the hood, and what you find underneath is computation. Your senses are always looking for anomalies - for what has changed.
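A minimal sketch of that sense/infer/act loop, just to make the structure concrete (the sensor values, threshold, and actions here are hypothetical):

```python
def step(sensor_reading, predicted_reading, threshold=0.1):
    # Sensory state: measure the environment and compare against expectation.
    anomaly = abs(sensor_reading - predicted_reading)
    # Inference: the if/else/then in between - has anything changed?
    if anomaly > threshold:
        # Active state: respond to the new information.
        return "act"    # e.g. move, update the model, sound an alarm
    return "rest"       # nothing changed; no response needed

print(step(sensor_reading=0.9, predicted_reading=0.5))   # -> act
print(step(sensor_reading=0.52, predicted_reading=0.5))  # -> rest
```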
The free energy principle as applied to AI plays the role of a cost function, a reward function, a potential energy function. Macrostates are defined by thermodynamic variables. The fundamental thing underneath it all is the second law of thermodynamics.
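In that AI setting, "free energy" usually means the variational free energy, which upper-bounds surprise (negative log evidence). This is the standard form, stated here without the derivation the episode skips over:

```latex
F[q] = \mathbb{E}_{q(x)}\left[\ln q(x) - \ln p(o, x)\right]
     = -\ln p(o) + D_{\mathrm{KL}}\left[q(x) \,\|\, p(x \mid o)\right]
     \ge -\ln p(o)
% The KL divergence is non-negative, so minimizing F as a cost function both
% fits the internal model q(x) to the world and minimizes surprise about
% the observations o.
```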
The concept of entropy came later, as a mathematical way of describing the process of change in things. (In other words, entropy is not disorder; it is defined only through change.)
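That process view is how Clausius originally defined entropy - not as a thing you can point at, but as a quantity that only appears when heat flows. The textbook definition:

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
% The change in entropy is the heat exchanged reversibly, divided by temperature.
% Only changes in S are defined this way; entropy is bookkeeping for a process.
```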
HOMEWORK: Contemplate a key in a lock as an example of an intelligent system. Consider that the lock mechanism is asking a question of the key: what shape are you? If the shape is correct, the lock opens. Intelligence is required, but the intelligence is built into the system itself. Inserting the key into the lock represents the sensory states. Latching or unlatching represents the active states. In between is the if/else/then built into the system by some intelligence.
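Here is the homework sketched in code, to underline that the "intelligence" is frozen into the mechanism by the locksmith (the pin heights are invented for illustration):

```python
LOCK_PINS = (3, 1, 4, 1, 5)  # the question the lock asks, built in by the locksmith

def try_key(key_cuts):
    # Sensory state: the lock "measures" the key's shape on insertion.
    shape_matches = tuple(key_cuts) == LOCK_PINS  # the built-in if/else/then
    # Active state: latch or unlatch based on that inference.
    return "unlatched" if shape_matches else "latched"

print(try_key((3, 1, 4, 1, 5)))  # -> unlatched
print(try_key((2, 2, 2, 2, 2)))  # -> latched
```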
Next time, we will dive more deeply into entropy, then move on to computation, analyzing Paul Davies' idea of life as information. Glen would say that computation is more fundamental than information. We tried to watch 7:50 to 8:15 of this video:
Entropy is not Disorder: https://youtu.be/vX_WLrcgikc
Free Energy Principle with Karl Friston: https://youtu.be/NIu_dJGyIQI
Whole 6-part series playlist: https://www.youtube.com/playlist?list=PLoARw9zo4EUZhxqfaqYU5yjy0-Tr8RlB4