How Ghost Text Co-Pilot Is Fast
The ghost text co-pilot is using a smaller model, right? It's using Codex. Part of the trade-off is that with great power comes really long latency. The more powerful models also stream in their responses one token at a time. And we saw that completion acceptance rates in regions outside of the US were way lower.
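To make the latency trade-off concrete, here is a minimal sketch (not the actual Copilot implementation; all numbers and names are illustrative assumptions) of why token-by-token streaming from a larger model, plus extra round-trip time for users far from the serving region, pushes the perceived wait for a ghost-text suggestion well past what an inline completion can tolerate:

```python
# Hypothetical model of perceived ghost-text latency: the suggestion is only
# useful once enough tokens have streamed in, so both time-to-first-token and
# per-token streaming speed matter. All figures below are made-up examples.

def perceived_latency_ms(time_to_first_token_ms: float,
                         per_token_ms: float,
                         tokens_needed: int) -> float:
    """Total wait before a usable inline suggestion can be shown."""
    return time_to_first_token_ms + per_token_ms * tokens_needed

# Illustrative profiles: a small, fast completion model vs. a larger model.
small_model = perceived_latency_ms(time_to_first_token_ms=150, per_token_ms=10, tokens_needed=20)
large_model = perceived_latency_ms(time_to_first_token_ms=600, per_token_ms=40, tokens_needed=20)

# Assumed extra round-trip time for a user outside the serving region.
extra_rtt_ms = 200

print(f"small model: {small_model:.0f} ms, with extra RTT: {small_model + extra_rtt_ms:.0f} ms")
print(f"large model: {large_model:.0f} ms, with extra RTT: {large_model + extra_rtt_ms:.0f} ms")
```

Under these assumed numbers the smaller model stays in the few-hundred-millisecond range even with added network latency, while the larger streaming model lands over a second behind the cursor, which is consistent with the lower acceptance rates described for higher-latency regions.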