The podcast episode discusses the intriguing concept of RNA-based memory transfer: extracting RNA from trained animals and injecting it into recipients to transfer learned information. The possibility of such transfer challenges the traditional understanding of memory storage in the brain and has been explored in experiments with planaria and rats. However, how complex the transferred memories can be, and how the recipient decodes them, remain open questions. A shared, evolutionarily conserved decoding scheme is plausible for adaptive memories such as fear of the dark, but decoding arbitrary memories poses a significant puzzle. Investigations in this field hold promise for expanding our understanding of memory and information processing in the brain.
The episode delves into the significance of memory transfer for non-adaptive or contrived information. Existing experiments have focused on fear-based memories or specific tasks; the harder challenge lies in transferring complex, novel memories. The ability to transfer and decode such memories raises fundamental questions about neural encoding and decoding mechanisms. One experiment could involve training animals on arbitrary associations, such as a light flash indicating a specific action. Whether the recipient brain could analyze and retrieve this encoded information without pre-existing, evolutionarily shared coding remains a puzzle. Further experimentation is crucial to establish whether memory transfer works beyond adaptive contexts.
The podcast episode challenges traditional views of memory storage by exploring the transfer of memories through different mediums, such as RNA in planaria and rats. This calls for a reevaluation of the prevailing emphasis on synaptic connections as the primary locus of memory storage, and for considering what role other dense connection structures may play in information processing. The episode also emphasizes the importance of interdisciplinary research: it critiques the limited coverage of experimental findings like RNA-based memory transfer in neuroscience education and calls for greater exploration of unconventional ideas that could revolutionize existing paradigms.
The podcast episode draws attention to research gaps and the lack of exploration of certain innovative concepts, such as RNA-based memory transfer, highlighting the need to fund and support work in these less-explored areas that challenge existing paradigms. It argues for open-mindedness and for uncovering new uncertainties and questions rather than solely pursuing established answers. This approach could lead to breakthroughs, bridge disciplinary boundaries, and push the limits of our current understanding of memory and information processing.
Living things are self-constructed from the beginning, allowing them to adapt to diverse and unpredictable conditions. They determine their own boundaries, solve metabolic and anatomical problems, and act in various problem spaces without predefined structures. This self-constructive nature contributes to their intelligence and plasticity, enabling them to mix and match with other living and non-living entities across different kinds of interactions.
Living systems exhibit competencies in navigating problem spaces and achieving goals. They can optimize anatomical configurations and recover from perturbations through homeostatic feedback loops among cells, tissues, and organs. Such competency shields the actual genome from selection: evolution sees the competent outcome rather than genetic fitness alone. This can produce a trade-off in which genetic fitness plateaus at a suboptimal level while competency continues to increase to compensate for the suboptimal genome.
Living systems, from multicellular organisms to social networks, can exhibit agency and adapt to changing environments. The ability to build models of the future, counterfactual simulations, and anticipate rewards through computation is fundamental for agency. Self-interested agents that can learn and distribute rewards among each other, forming emergent governance structures, are essential for the emergence of intelligent behavior and decision-making.
Insights from studying living systems, such as self-construction, problem-solving competencies, and emergent agency, can provide valuable perspectives for the development of AGI. Understanding the interplay between individual agents, their connections, and rewards, along with the ability to self-organize and adapt to different problem spaces, could inform the design and implementation of AGI systems that possess flexibility, intelligence, and adaptability.
YouTube link: https://youtu.be/kgMFnfB5E_A
This theolocution has been released early in an ad-free audio version for TOE members at http://theoriesofeverything.org.
Sponsors:
- Drink Trade: https://www.drinktrade.com/everything for 30% off
- Uncommon Goods: https://uncommongoods.com/everything for 15% off your first gift!
- Roman: https://ro.co/curt for 20% off first order
- Masterworks: https://www.masterworks.art/toe (promo code: theoriesofeverything)
Average net returns/Net IRR reflects annualized return on investment, net of all fees and expenses. Past performance is not indicative of future results.
*New* TOE Website (early access to episodes): https://theoriesofeverything.org/
Patreon: https://patreon.com/curtjaimungal
Crypto: https://tinyurl.com/cryptoTOE
PayPal: https://tinyurl.com/paypalTOE
Twitter: https://twitter.com/TOEwithCurt
Discord Invite: https://discord.com/invite/kBcnfNVwqs
iTunes: https://podcasts.apple.com/ca/podcast/better-left-unsaid-with-curt-jaimungal/id1521758802
Pandora: https://pdora.co/33b9lfP
Spotify: https://open.spotify.com/show/4gL14b92xAErofYQA7bU4e
Subreddit r/TheoriesOfEverything: https://reddit.com/r/theoriesofeverything
LINKS MENTIONED:
- Michael Levin (Solo TOE podcast): https://youtu.be/Z0TNfysTazc
- Michael Levin Theolocution With Chris Fields & Karl Friston: https://youtu.be/J6eJ44Jq_pw
- Joscha Bach Theolocution With John Vervaeke: https://youtu.be/rK7ux_JhHM4
- Joscha Bach Theolocution With Donald Hoffman: https://youtu.be/bhSlYfVtgww
- Joscha Bach (Solo TOE podcast): https://youtu.be/3MNBxfrmfmI
TIMESTAMPS:
00:00:00 Introduction
00:01:55 Bach and Levin speak about each other's work
00:03:34 The cell functions as a neuron
00:07:15 Software as a control pattern
00:10:55 Disciplinary boundaries in academia
00:14:38 The perceptron is a "toy model" of the brain
00:18:44 How do you identify yourself as a researcher?
00:20:10 The benefits of podcasts vs. academia
00:30:04 Beliefs of Bach's and Levin's that have drastically changed
00:38:54 Memory moves outside the brain structure
00:45:06 Engrams and memory storage
00:47:30 The implications of transferring memory between species
00:55:25 Weismann's barrier
00:59:25 The notion of "competence" (Bach's and Levin's largest insight)
01:12:10 Virtualization for unreliable hardware
01:16:02 Defining "competence"
01:22:35 Bach's issues with goals (for and against teleology)
01:27:34 Planarian goals and explicitly encoded instructions
01:34:55 Navigation in "Morphic Space"
01:36:11 One species' birth defect can be another's benefit
01:37:42 The "Intelligence Trap" and bias
01:39:05 Application of each other's work to their own
01:52:25 Necessities of general intelligence in cells