This post explains the "scaling laws" that drive rapid AI progress: as AI models are made bigger and trained with more computing power, they get smarter at most tasks. The piece also introduces a second scaling law, under which AI performance improves when a model spends more time "thinking" before responding.
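As a rough illustration (my own sketch, not taken from the post itself): pre-training scaling laws are typically written as empirical power laws relating a model's test loss to compute, in the spirit of Kaplan et al. (2020):

$$
L(C) \approx \left(\frac{C_0}{C}\right)^{\alpha}
$$

Here $L$ is the test loss, $C$ is the training compute, and $C_0$ and $\alpha$ are constants fit to experimental data; as compute grows, loss falls smoothly, which is the quantitative sense in which bigger, longer-trained models "get smarter."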