The Importance of Decentralization for Inference
I think this is a good point; it will be very valuable for inference. Instead of connecting to the OpenAI server in the US every time, if the model were open, you could have multiple groups or clusters of GPU power, and you would connect to the cluster closest to you. So instead of one bottleneck that everyone connects to, you would have multiple copies, and your latency to the server goes down. But today, large language models are not suitable for mobiles or for consumer-grade devices that don't have GPUs.
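The routing idea described above — send each request to whichever cluster has the lowest latency — can be sketched in a few lines of Python. The cluster names and latency figures below are invented for illustration; in a real deployment you would measure round-trip times to each cluster's endpoint.

```python
# Minimal sketch of nearest-cluster selection for decentralized inference.
# Cluster names and latencies are hypothetical example values.

def pick_nearest_cluster(latencies_ms: dict[str, float]) -> str:
    """Return the cluster with the smallest measured round-trip latency."""
    return min(latencies_ms, key=latencies_ms.__getitem__)

# In practice these would come from probing each cluster's endpoint.
measured = {
    "us-east": 140.0,
    "eu-west": 35.0,
    "ap-south": 210.0,
}

print(pick_nearest_cluster(measured))  # eu-west
```

This is the same trick CDNs use for static content, applied to model inference: replicate the open-weight model across regions and let each client resolve to the closest replica.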