"Age of Miracles" cover image

"Age of Miracles"

Latest episodes

Dec 20, 2022 • 44min

Not Boring Founders: Exponential

Exponential makes it easy to discover, assess, and invest in liquidity pools across chains. It wants to be the Coinbase of DeFi. We were joined by the company's three founders, Driss Benamour, Mehdi Lebbar, and Greg Jizmagian, to discuss the current crypto market, Exponential's risk-first approach, and how to assess the risk of a digital asset. You can learn more at exponential.fi. You can also watch the full video version of this conversation on our YouTube channel @notboringmedia. --- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
Dec 19, 2022 • 17min

Blank Page

LINK to essay Every new year is a blank page, a chance to start fresh with new ideas and new sources and new energy. Of course, January 1st isn’t actually any more different from December 31st than December 31st was from December 30th. But the collective time we all spend away from the daily routine during the holidays feels like a reset. And we’ve all tacitly agreed that the new year is a different thing than the old year. So it’s close enough to actually different. This particular upcoming new year feels even more different, like a decadal reset. Markets have crashed (optimistically) or are in the process of crashing (realistically). The wave of momentum that the tech industry has been riding since the early 2010s has crashed (optimistically) or is in the process of crashing (realistically). The past few years have been chaotic and weird, and that weirdness came to a boiling point in 2022. The things that made sense a couple of years ago, the sure things, don’t make as much sense, are no longer sure. You might be experiencing the weird as a general vibe or very specifically in your life and career. This post was sponsored by Masterworks. Learn more. --- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
Dec 5, 2022 • 24min

Four Seasons Total Tech

Audio Essay: Four Seasons Total Tech Link to full essay: https://www.notboring.co/p/four-seasons-total-tech Packy analogizes the Gartner Hype Cycle to the four seasons to explain why some technology winters feel so long and cold while all summers feel pretty much the same. We're coming out of a decades-long winter for a bunch of frontier technology spaces (AI, Space, Nuclear, etc.) seemingly all at once -- it's still spring, but Packy predicts that we're about to enter a long, hot summer. Today’s Not Boring is brought to you by… Oceans. Oceans matches busy executives and high-growth teams with top EA+ candidates from global talent pools. Oceans recruits, vets, trains, and employs top talent, and you get a high-quality, reliable EA+ for a fraction of the cost! --- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
Dec 5, 2022 • 1h 20min

Anton Teaches Packy AI | E3 | The First AI Winter

Packy and Anton break down one of the early, foundational artificial intelligence papers, "A logical calculus of the ideas immanent in nervous activity," which was first published in 1943. The researchers, Warren S. McCulloch and Walter Pitts, were trying to understand how the brain could produce such complex patterns by using basic, connected cells. Their work was foundational in understanding neurons, and it introduced the concept of the neural network, which has since become a key concept in artificial intelligence. This was the oldest paper that Anton and Packy have discussed, and its age naturally led to a lengthy conversation on the history of artificial intelligence. That history -- like the history of many technological fields -- is dotted with long winters, golden ages, broken timeline promises, and sudden developments. Today, it seems, we may be in the middle of a golden age for artificial intelligence.
LINKS:
YouTube Link: https://youtu.be/MpBdVJEx2Aw
A logical calculus of the ideas immanent in nervous activity: https://www.cs.cmu.edu/~./epxing/Class/10715/reading/McCulloch.and.Pitts.pdf
History of the First AI Winter: https://towardsdatascience.com/history-of-the-first-ai-winter-6f8c2186f80b
AI Winter: https://en.wikipedia.org/wiki/AI_winter#The_abandonment_of_connectionism_in_1969
Bitter Lesson: http://www.incompleteideas.net/IncIdeas/BitterLesson.html
Chroma: https://www.trychroma.com/
--- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
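For listeners who want a concrete feel for what McCulloch and Pitts proposed, here is a minimal sketch of a threshold neuron in Python. The weights and thresholds are illustrative choices rather than values from the 1943 paper, but they show how simple connected units can implement the kind of logical operations the "logical calculus" describes.

```python
# Minimal sketch of a McCulloch-Pitts neuron: a threshold unit over binary inputs.
# Weights and thresholds below are illustrative, not taken from the 1943 paper.

def mcculloch_pitts(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of binary inputs meets the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Basic logic gates built from the same unit:
AND = lambda a, b: mcculloch_pitts([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mcculloch_pitts([a, b], [1, 1], threshold=1)
NOT = lambda a:    mcculloch_pitts([a],    [-1],   threshold=0)

if __name__ == "__main__":
    print(AND(1, 1), AND(1, 0))  # 1 0
    print(OR(0, 1), OR(0, 0))    # 1 0
    print(NOT(0), NOT(1))        # 1 0
```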
Nov 27, 2022 • 1h 4min

Decentralization

The idea is this: the world oscillates between centralization and decentralization, with progress sloping upward through the turns. We’re approaching an era of decentralization. This shift from centralization to decentralization is popping up everywhere I look:
Energy: Fossil Fuels vs. Renewables
Manufacturing: Globalization vs. Reshoring
Manufacturing: Making vs. Growing
Science: Government Funded vs. Decentralized
Hard Tech: Government Agencies & Incumbents vs. Startups
AI: Closed vs. Open
Talent: Big Tech vs. Startups
Apps: Big vs. Small
Media: Substack vs. Journalism
Education: Factory vs. Personalized
Finance: Big Banks vs. Fintech
We go deep on all of that in this episode. But here's the wildest part. That's not my real voice you're listening to. It's my AI voice, created by my friends at play.ht by training a model on a bunch of other audio essays. It's wildly good -- give it a listen. Read the full essay at Not Boring: https://www.notboring.co/p/decentralization --- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
Nov 25, 2022 • 1h 3min

Anton Teaches Packy AI | Ep 2 | Chinchilla

We're back! In Episode 2, Anton Teaches Packy about DeepMind's March 2022 paper, Training Compute-Optimal Large Language Models, or as it's more commonly known, Chinchilla. Prior to Chinchilla, the best way to improve the performance of LLMs was thought to be scaling up the size of the model. As a result, the largest models now have over 500 billion parameters. But there are only so many GPUs in the world, and throwing compute at the problem is expensive and energy intensive. In this paper, DeepMind found that the optimal way to scale an LLM is actually to scale size (parameters) and training (data) proportionally. Given the race for size, today's models are plenty big but need a lot more data. In this conversation, we go deep on the paper itself, but we also zoom out to talk about the politics of AI, when AGI is going to hit, where to get more data, and why AI won't take our jobs. This one gets a lot more philosophical than our first episode as we explore the implications of Chinchilla and LLMs more generally. If you enjoyed this conversation, subscribe for more. We're going to try to release one episode per week, and we want to make this the best way to get a deeper understanding of the mind-blowing progress happening in AI and what it means for everything we do as humans.
LINKS:
Training Compute-Optimal Large Language Models: https://arxiv.org/abs/2203.15556
chinchilla's wild implications: https://www.lesswrong.com/posts/6Fpvc...
Scaling Laws for Neural Language Models (Kaplan et al.): https://arxiv.org/abs/2001.08361
--- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
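To make the headline result a bit more concrete, here is a rough back-of-the-envelope sketch rather than DeepMind's actual fitting procedure. It assumes the common approximation that training compute is about 6 · N · D FLOPs (N parameters, D training tokens) and the frequently cited compute-optimal ratio of roughly 20 tokens per parameter.

```python
# Back-of-the-envelope Chinchilla-style allocation (illustrative assumptions, not DeepMind's code).
# Assumes: training compute C ≈ 6 * N * D FLOPs, and D ≈ 20 * N at the compute-optimal point.
import math

TOKENS_PER_PARAM = 20  # commonly cited approximation of the Chinchilla-optimal ratio

def compute_optimal(compute_flops: float) -> tuple[float, float]:
    """Return (parameters, tokens) that roughly balance model size and data for a FLOP budget."""
    # C = 6 * N * D and D = TOKENS_PER_PARAM * N  =>  N = sqrt(C / (6 * TOKENS_PER_PARAM))
    n_params = math.sqrt(compute_flops / (6 * TOKENS_PER_PARAM))
    n_tokens = TOKENS_PER_PARAM * n_params
    return n_params, n_tokens

if __name__ == "__main__":
    # Roughly Chinchilla's own budget: 70B parameters * 1.4T tokens * 6 FLOPs per parameter-token.
    budget = 6 * 70e9 * 1.4e12
    params, tokens = compute_optimal(budget)
    print(f"~{params / 1e9:.0f}B parameters, ~{tokens / 1e12:.1f}T tokens")
```

Plugging in roughly Chinchilla's own budget recovers something close to its 70B parameters and 1.4T tokens; the paper's point is that at a fixed budget, a smaller model trained on more data can beat a much larger one trained on less.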
Nov 7, 2022 • 1h 9min

Anton Teaches Packy AI | E1

Anton Teaches Packy AI is Not Boring's attempt at making AI more accessible to our audience. It's become increasingly obvious that we're in the Golden Age of AI, so we think it's important to demystify what's going on and how it all actually works. Anton Troynikov is the founder of Chroma and a former Meta Reality Labs research engineer and roboticist. Packy McCormick is the author of the popular tech and business strategy newsletter, Not Boring. Anton Teaches Packy AI is exactly what it sounds like -- in each video, Anton breaks down AI to a level that Packy (your average above-average smart person) can understand. In Episode 1, Anton and Packy discuss the groundbreaking "Attention is All You Need" research paper, which kicked off the entire Transformer generative AI wave.
LINKS:
- Attention is All You Need: https://arxiv.org/abs/1706.03762
- The Bitter Lesson: http://www.incompleteideas.net/IncIdeas/BitterLesson.html
- Training Compute-Optimal Large Language Models: https://arxiv.org/abs/2203.15556
- Explain Paper: https://www.explainpaper.com/
--- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
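As a rough anchor before (or after) listening, here is a minimal NumPy sketch of the scaled dot-product attention operation at the core of the paper. It is illustrative only; a full Transformer adds learned projections, multiple heads, masking, and positional encodings.

```python
# Minimal sketch of scaled dot-product attention from "Attention is All You Need".
# Illustrative shapes and random inputs; real models use learned projections and many heads.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # how strongly each query attends to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # weighted sum of value vectors

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 8))  # 4 positions, dimension 8
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```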
Nov 1, 2022 • 56min

Not Boring Founders: Joe Connor, Odyssey

Odyssey (formerly Agora) fully believes in the power of technology to provide families with the opportunity to choose the educational environment and services that best meet their unique learning needs. To help parents do that, Odyssey has partnered with the best K-12 providers and vendors to create a robust marketplace for its families. Joe Connor is the founder and CEO of Odyssey and brings with him years of experience in education and law. Packy and Joe have been friends for 23 years, so it was especially special to have Joe come on the podcast to talk about his work at Odyssey. This episode was sponsored by Hyper. Hyper is a different kind of startup accelerator backed by a16z and Sequoia Capital. Hyper supports participating startups with access to deep networks, media presence, and now hands-on support from unicorn founders. You can learn more about Hyper and apply here. Tell them Not Boring sent you! --- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
Oct 31, 2022 • 37min

Capital & Taste

Capital & Taste audio essay. Capital is banking built for founders. If that sounds like you, sign up for a Capital account today, whether you’re just raising your first million or doing millions in ARR. Capital expands Party Round’s offering from fundraising to a full suite of financial services focused on startups. Today, that’s banking – so you can hold, spend, and send the money you’ve raised with its flagship fundraising tool (fka Party Round). In the future, it might be treasury management, payroll, credit, software, and services. Capital is a big name for a small company. It hints at big ambitions. There’s a very long way between here and there, and myriad well-funded competitors lurking on the path, but if Capital lives up to its name, it will be because of its taste. --- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
Oct 24, 2022 • 46min

Formic: Automating Abundance

Formic delivers Robots by the Hour to manufacturers. It’s making small and medium American businesses stronger, fighting inflation, and creating abundance. Link to full essay here. -------------------------- The goal is abundance. We want more and better things, more cheaply. We want those things to be made closer to home. We want less fragile supply chains, so that those things are always available when we need them. We want more good jobs, and more time unlocked to maximize our human potential. Over the long term, there are a lot of things we can do to create abundance. Generate clean, cheap, renewable electricity. Improve the education system. Build more houses. Change laws. In the immediate term, the answer is robots. Formic is creating the largest workforce of robots in the country -- and delivering Robots by the Hour to its customers, so that they can increase productivity and generate abundance. --- Send in a voice message: https://podcasters.spotify.com/pod/show/ageofmiracles/message
