How much fact-checking and veracity matter varies with the use case of an artificial intelligence system.
Maintaining internal coherence is a challenge for AI systems when generating long narratives or works of fiction.
AI-generated text often lacks temporal reasoning and may introduce inconsistencies over the course of a narrative.
AI systems currently struggle to understand humor or judge the quality of jokes.
The technology is not yet advanced enough to generate full novels, though it may improve in the future.
Gary Marcus is an expert in artificial intelligence, a cognitive scientist and host of the podcast “Humans vs Machines with Gary Marcus.”
In this week’s conversation, Yascha Mounk and Gary Marcus discuss the shortcomings of the dominant large language model (LLM) approach to artificial intelligence; why Marcus feels that the AI industry is on the wrong path to developing superintelligent AI; and why he nonetheless believes that the eventual emergence of superior AI may pose a serious threat to humanity.