AI-powered
podcast player
Listen to all your favourite podcasts with AI-powered features
Turpentine is building a media outlet and network for people in tech, offering benefits like tactical advice, tech stack recommendations, hiring referrals, in-person events, an investor database, and exclusive perks.
Tim Duignan, an applied mathematician, applies cutting-edge AI methods in computational chemistry to deepen our understanding of electrolyte solutions. By using neural networks trained on quantum-mechanical data, he accelerates the prediction of atomic and molecular behavior, enabling faster and more accurate simulations.
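The core idea behind a neural network potential can be sketched in a few lines: a model maps atomic coordinates to a scalar energy, forces are the negative gradient of that energy, and those forces drive a molecular dynamics time step. The sketch below is hypothetical and not Duignan's actual model: it substitutes a simple Lennard-Jones pair function for the trained network and uses finite differences where a real implementation would use automatic differentiation.

```python
import numpy as np

def energy(positions):
    """Stand-in for a trained neural network potential: maps atomic
    coordinates to a scalar potential energy. Here, a toy
    Lennard-Jones pair potential in reduced units."""
    e = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            e += 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)
    return e

def forces(positions, h=1e-5):
    """Forces = negative gradient of the energy, here by central
    finite differences (a real NN potential would use autodiff)."""
    f = np.zeros_like(positions)
    for idx in np.ndindex(positions.shape):
        p = positions.copy(); p[idx] += h
        m = positions.copy(); m[idx] -= h
        f[idx] = -(energy(p) - energy(m)) / (2 * h)
    return f

# One velocity-Verlet time step: the "time step" is the small
# increment by which the simulation advances the atoms.
pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])
vel = np.zeros_like(pos)
dt, mass = 0.001, 1.0
f0 = forces(pos)
pos_new = pos + vel * dt + 0.5 * (f0 / mass) * dt ** 2
vel_new = vel + 0.5 * (f0 + forces(pos_new)) / mass * dt
```

The payoff of the approach discussed in the episode is that evaluating a trained network is vastly cheaper than solving the quantum mechanics directly at every one of these time steps.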
Duignan explains key concepts: coarse graining, which simplifies complex systems by abstracting away detail, and equivariance, a symmetry property that makes data representations more efficient. He delves into the technical workings of neural network potentials, showing how they capture surprising behaviors and integrate into broader AI systems.
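Equivariance means that transforming the input transforms the output in the corresponding way: rotate the atoms, and the predicted forces rotate with them. A minimal numerical check of that property, using a hypothetical pairwise force field as a stand-in for an equivariant network:

```python
import numpy as np

def pairwise_forces(positions):
    """Toy rotation-equivariant force field (hypothetical stand-in for
    an equivariant neural network): each atom is pulled toward every
    other atom with a 1/r^2 force along the connecting vector."""
    n = len(positions)
    f = np.zeros_like(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = positions[j] - positions[i]
            r = np.linalg.norm(d)
            f[i] += d / r ** 3  # magnitude 1/r^2, direction along the bond
    return f

# Equivariance check: rotating the inputs rotates the outputs identically.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
lhs = pairwise_forces(x @ R.T)   # rotate atoms, then compute forces
rhs = pairwise_forces(x) @ R.T   # compute forces, then rotate them
```

Architectures that build this symmetry in by construction do not have to learn it from data, which is part of why they are so sample-efficient.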
The discussion emphasizes how AI is propelling scientific breakthroughs, hinting at the potential for AI systems to surpass human capabilities in various fields. By streamlining computational chemistry processes and combining AI with fundamental physics, researchers are poised to uncover significant insights and drive transformative discoveries.
In the podcast, the speaker discusses how neural network potentials were used to study crystallization phenomena, which typically unfold over very long time scales. By examining the radial distribution function and inspecting trajectories, they could confirm when no crystal had formed. Coarse-graining the solvent made analyzing crystallization far quicker than all-atom simulations, highlighting the potential of neural network potentials for predicting phase transformations.
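The radial distribution function g(r) mentioned here is the standard diagnostic: it compares the observed density of neighbors at each distance to that of an ideal gas, so a disordered fluid gives g(r) near 1 while a crystal shows sharp peaks at lattice spacings. A minimal sketch, assuming a cubic periodic box (not the episode's actual analysis code):

```python
import numpy as np

def radial_distribution(positions, box, dr=0.1, r_max=2.5):
    """Histogram of pairwise distances under cubic periodic boundaries,
    normalized by the ideal-gas expectation for each spherical shell."""
    n = len(positions)
    edges = np.arange(0.0, r_max + dr, dr)
    counts = np.zeros(len(edges) - 1)
    for i in range(n):
        for j in range(i + 1, n):
            d = positions[i] - positions[j]
            d -= box * np.round(d / box)          # minimum-image convention
            r = np.linalg.norm(d)
            if r < r_max:
                counts[np.searchsorted(edges, r, side="right") - 1] += 1
    density = n / box ** 3
    shell = (4.0 / 3.0) * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal = 0.5 * n * density * shell             # expected ideal-gas pair counts
    return edges[:-1] + dr / 2, counts / ideal

# For uniformly random positions (no structure), g(r) fluctuates around 1;
# crystallization would instead produce sharp, persistent peaks.
rng = np.random.default_rng(1)
r_mid, g = radial_distribution(rng.uniform(0.0, 10.0, size=(400, 3)), box=10.0)
```

Inspecting g(r) over the course of a trajectory is how one establishes a negative result: if no peaks ever develop, no crystal has formed.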
The podcast describes a surprising self-ionization phenomenon observed in simulated water, where hydrogens hop off water molecules to form ionized states. Although such events were absent from the training data, the neural network model predicted them successfully. This self-ionization process underlies the acidity of water, and the predictions could be refined further by adding targeted quantum chemical training data.
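One simple geometric way to spot such events in a trajectory (a hypothetical sketch, not the method described in the episode) is to assign each hydrogen to its nearest oxygen and count hydrogens per oxygen: two means intact water, three a hydronium ion, one a hydroxide ion.

```python
import numpy as np

def count_water_species(o_pos, h_pos):
    """Assign each hydrogen to its nearest oxygen and classify each
    oxygen by hydrogen count: 2 -> H2O, 3 -> hydronium (H3O+),
    1 -> hydroxide (OH-). A crude detector for self-ionization."""
    counts = np.zeros(len(o_pos), dtype=int)
    for h in h_pos:
        nearest = np.argmin(np.linalg.norm(o_pos - h, axis=1))
        counts[nearest] += 1
    return {
        "water": int(np.sum(counts == 2)),
        "hydronium": int(np.sum(counts == 3)),
        "hydroxide": int(np.sum(counts == 1)),
    }

# Toy frame: two oxygens; a proton has hopped from the second to the first,
# leaving one H3O+ and one OH-.
oxygens = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
hydrogens = np.array([
    [0.8, 0.6, 0.0], [0.8, -0.6, 0.0], [0.0, 1.0, 0.0],  # three H near O1
    [3.8, 0.6, 0.0],                                      # one H near O2
])
species = count_water_species(oxygens, hydrogens)
```

Counting how often such ion pairs appear, relative to intact waters, is what connects the simulation to water's measured acidity.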
The episode delves into the scalability and generalization capabilities of AI in materials science. Universal machine learning potentials showcase the potential for applications well beyond their specific training data. Companies like ByteDance and Microsoft are scaling up these models to predict the behavior of diverse electrolytes and of domains outside the training distribution, suggesting a future of versatile scientific tools that could significantly impact research and innovation.
Chinese big tech companies and academic groups are producing high-quality work that rivals advancements seen elsewhere. This progress extends to areas like materials design and biology research, with significant economic and technological implications. While concerns exist regarding safety and dual-use applications, such as in designing harmful substances, the overall impact of these advancements on various fields is noteworthy.
The approach of borrowing tools and concepts across different disciplines is highlighted as a crucial aspect of scientific progress. Examples from history, such as physics contributing to biology through X-ray crystallography, emphasize the impact of interdisciplinary collaborations. The discussion also involves the potential of applying tools from statistical mechanics to understand machine learning processes and the future prospect of integrating multi-scale models within a single neural network for scientific simulations and research.
Explore the fusion of AI and computational chemistry with University of Queensland researcher Tim Duignan. Learn how neural networks are transforming our understanding of electrolytes and the art of simulating physical processes, and hear about the potential of AI in scientific breakthroughs. Discover Tim's journey building AI models that predict complex systems, and join the discussion on advancing AI's role in future scientific discoveries.
Apply to join over 400 founders and execs in the Turpentine Network: https://hmplogxqz0y.typeform.com/to/JCkphVqj
RECOMMENDED PODCAST:
Byrne Hobart, the writer of The Diff, is revered in Silicon Valley. You can get an hour with him each week. See for yourself how his thinking can upgrade yours.
Spotify: https://open.spotify.com/show/6rANlV54GCARLgMOtpkzKt
Apple: https://podcasts.apple.com/us/podcast/the-riff-with-byrne-hobart-and-erik-torenberg/id1716646486
Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds, offers one consistent price, and nobody does data better than Oracle. If you want to do more and spend less, take a free test drive of OCI at https://oracle.com/cognitive
The Brave Search API can be used to assemble a data set to train your AI models and to help with retrieval augmentation at inference time, all while remaining affordable with developer-first pricing. Integrating the Brave Search API into your workflow translates to more ethical data sourcing and more human-representative data sets. Try the Brave Search API free for up to 2,000 queries per month at https://bit.ly/BraveTCR
Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with the click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off: https://www.omneky.com/
Head to Squad to access global engineering without the headache and at a fraction of the cost: head to https://choosesquad.com/ and mention “Turpentine” to skip the waitlist.
(00:00:00) About the Show
(00:02:47) Introduction
(00:04:09) Why electrolyte solutions are important
(00:06:53) What properties are we trying to predict with electrolyte solutions?
(00:09:33) Battery fires
(00:13:38) Molecular dynamics
(00:16:40) Time step
(00:18:51) Scaling
(00:23:09) Sponsors: Oracle | Brave
(00:25:17) Decoherence
(00:27:23) Neural Network Potentials
(00:31:04) How big are the models?
(00:35:46) What architecture is used?
(00:38:12) Equivariance (Part 1)
(00:42:29) Sponsors: Omneky | Squad
(00:44:15) Equivariance (Part 2)
(00:44:16) AlphaFold3
(00:46:52) Zero-shot latent space communication
(00:48:21) What is coarse graining?
(00:54:31) How to know if there is no crystallization
(00:56:53) What is the role of water in the simulation?
(01:01:25) Crystallization
(01:05:26) Matching surfaces
(01:09:32) Self-ionization of water
(01:13:22) Active learning
(01:16:41) Temperature
(01:19:20) Ice and water
(01:21:12) Scaling up the model
(01:25:21) Big tech singularity
(01:27:54) China is not far behind
(01:29:15) Safety concerns
(01:31:16) KANs
(01:34:36) The future of scientific progress
(01:39:28) The art of discovery
(01:43:02) One model to rule them all
(01:47:46) AGI, its risks and benefits
(01:50:21) Power vs. Generality
(01:53:01) Generality for Utility
(01:53:40) Outro