📈 Data to start your week: The AI capacity trap

Exponential View

Cheaper AI was supposed to ease the compute crunch. Instead, it made it worse. The Jevons paradox, applied to intelligence, means that every time the price of a token falls, demand rises faster than supply can scale. But the labs can’t clear the queue by raising prices — their customers would defect to the next best alternative. So the compute crunch shows up where economists would least expect it. As we wrote yesterday, OpenAI is passing on opportunities, Anthropic is adjusting session limits for...
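The Jevons dynamic above can be made concrete with a toy demand curve. This is a hypothetical sketch, not Exponential View's model: assuming constant-elasticity demand Q = k·P^(−ε), a price cut *raises* total token spend whenever elasticity ε exceeds 1, which is exactly the "cheaper means more crunch" effect. The constants `k` and `eps` below are illustrative assumptions.

```python
# Toy Jevons-paradox illustration for AI tokens (illustrative numbers only).
# Constant-elasticity demand: Q = k * P^(-eps). With eps > 1, halving the
# price more than doubles demand, so total spend on compute goes UP.

def token_demand(price, k=1e12, eps=1.5):
    """Tokens demanded per day at a given price per million tokens."""
    return k * price ** -eps

for price in [10.0, 5.0, 2.5]:  # price per 1M tokens, halving twice
    q = token_demand(price)
    spend = price * q / 1e6  # total daily spend on tokens
    print(f"${price:>5.2f}/1M tok -> {q:.3e} tokens/day, spend ${spend:,.0f}")
```

With ε = 1.5, each halving of price multiplies demand by 2^1.5 ≈ 2.83, so spend rises by about 41% per halving; only if ε < 1 would cheaper tokens actually relieve the queue.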


Related Articles

Accelerating Gemma 4: faster inference with multi-token prediction drafters
amrrs · Hacker News · 3d ago
ProgramBench: Can language models rebuild programs from scratch?
jonbaer · Hacker News · 1d ago
ZAYA1-8B matches DeepSeek-R1 on math with less than 1B active parameters
steveharing1 · Hacker News · 1d ago
OpenAI’s o1 correctly diagnosed 67% of ER patients vs. 50-55% by triage doctors
donsupreme · Hacker News · 6d ago
A couple million lines of Haskell: Production engineering at Mercury
unignorant · Hacker News · 6d ago