GitHub Copilot, The Perceiver, Beyond the Transformer, Data augmentation, NL augmenter 🦎 → 🐍, Research communication

Sebastian Ruder

Hi all,

This newsletter is a bit delayed. I had to skip the last one, as I needed a break after a busy period at the end of May (EMNLP and NeurIPS deadlines). At the same time, there's so much happening that I've found it hard to catch up. Now I'm back, feeling more energized, and updating myself (and you) on what's new. I'll discuss the biggest advances over the last months, including GitHub Copilot, the Perceiver, and non-self-attention models. I'll also talk about something that is challenging...

Read full article →

Related Articles

Accelerating Gemma 4: faster inference with multi-token prediction drafters
amrrs · Hacker News · 4d ago
OpenAI’s o1 correctly diagnosed 67% of ER patients vs. 50-55% by triage doctors
donsupreme · Hacker News · 6d ago
ProgramBench: Can language models rebuild programs from scratch?
jonbaer · Hacker News · 2d ago
ZAYA1-8B matches DeepSeek-R1 on math with less than 1B active parameters
steveharing1 · Hacker News · 2d ago
A couple million lines of Haskell: Production engineering at Mercury
unignorant · Hacker News · 6d ago