
Last Week in AI

#151 - Copilot Pro, LLama.cpp, conversational diagnostic AI, secret AI diplomacy

Jan 21, 2024
Exciting advancements are taking center stage! Microsoft’s Copilot Pro is revolutionizing Office with AI features, while Amazon’s new tool reshapes online shopping. The competitive race heats up as China develops cutting-edge chips. Meanwhile, OpenAI is striking lucrative licensing deals with news publishers. Anthropic's unique $750 million fundraising strategy raises eyebrows in the tech community. Plus, a look at the implications of AI in clinical trials and the complexities of international AI diplomacy. This week is packed with innovation and intrigue!
01:36:10

Podcast summary created with Snipd AI

Quick takeaways

  • Deploying the Llama model on a laptop using the open-source library Llama.cpp enables fast generation speeds and highly efficient inference on a local machine.
  • Llama Pro, an expansion of the Llama model with added specialized blocks, achieves better performance on both general language understanding and specialized skills, making it a versatile and powerful language model (see the block-expansion sketch after this list).
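
To make the block-expansion idea concrete, here is a rough, hypothetical PyTorch sketch: copies of existing decoder blocks are zero-initialized so they start as identity maps, interleaved into the original stack, and the original blocks are frozen so only the added blocks are trained. `ToyBlock` and `expand_blocks` are illustrative names, not identifiers from the Llama Pro codebase, and the toy block omits attention for brevity.

```python
import copy
import torch
import torch.nn as nn

class ToyBlock(nn.Module):
    """Stand-in decoder block; attention is omitted to keep the sketch short."""
    def __init__(self, dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection: if out_proj outputs zeros, the block is an identity map.
        return x + self.out_proj(self.mlp(x))

def expand_blocks(blocks: nn.ModuleList, every: int = 4) -> nn.ModuleList:
    """Insert a zero-initialized copy after every `every` original blocks.

    Zeroing the final projection makes each new block start as an identity,
    so the expanded model initially behaves exactly like the original one.
    """
    expanded = []
    for i, block in enumerate(blocks):
        expanded.append(block)
        if (i + 1) % every == 0:
            new_block = copy.deepcopy(block)
            nn.init.zeros_(new_block.out_proj.weight)
            nn.init.zeros_(new_block.out_proj.bias)
            expanded.append(new_block)
    return nn.ModuleList(expanded)

# Freeze the original blocks and train only the inserted ones.
original = nn.ModuleList(ToyBlock(64) for _ in range(8))
model = expand_blocks(original, every=4)
for block in original:
    block.requires_grad_(False)
```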

Deep dives

Llama.cpp: Running the Llama Model on a MacBook

Llama.cpp is an open-source library that allows the Llama model to be deployed on a laptop with fast generation speeds. The library uses 4-bit integer quantization and supports GPU acceleration via CUDA. With Llama.cpp, the model can generate 1,400 tokens per second on a MacBook Pro, making local inference efficient and accessible.
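
As a minimal sketch of what running a 4-bit quantized model locally looks like, the snippet below uses the llama-cpp-python bindings (a Python wrapper around llama.cpp) rather than the C++ library directly; the model path is a placeholder for whatever quantized GGUF file you have on disk.

```python
from llama_cpp import Llama

# Load a 4-bit integer quantized GGUF model; the path is a placeholder.
llm = Llama(
    model_path="models/llama-2-7b.Q4_0.gguf",
    n_ctx=2048,        # context window size
    n_gpu_layers=-1,   # offload all layers to the GPU (Metal on a MacBook, CUDA on NVIDIA)
)

# Run a short completion entirely on the local machine.
output = llm(
    "Explain what llama.cpp does in one sentence.",
    max_tokens=64,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```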
