On Information Theoretic Bounds for SGD

inFERENCe

A few days ago we had a talk by Gergely Neu, who presented his recent work: Information-Theoretic Generalization Bounds for Stochastic Gradient Descent. I'm writing this post mostly to annoy him, by presenting this work using super hand-wavy intuitions and cartoon figures. If this isn't enough, I will even find a way to mention GANs in this context. But truthfully, I'm just excited because for once, there is a little bit of learning theory that I half-understand, at least at…

Read full article →

Related Articles

Accelerating Gemma 4: faster inference with multi-token prediction drafters
amrrs · Hacker News · 4d ago
OpenAI’s o1 correctly diagnosed 67% of ER patients vs. 50-55% by triage doctors
donsupreme · Hacker News · 6d ago
ProgramBench: Can language models rebuild programs from scratch?
jonbaer · Hacker News · 2d ago
ZAYA1-8B matches DeepSeek-R1 on math with less than 1B active parameters
steveharing1 · Hacker News · 2d ago
A couple million lines of Haskell: Production engineering at Mercury
unignorant · Hacker News · 6d ago