The 27 papers
List of 27 papers (supposedly) given to John Carmack by Ilya Sutskever: “If you really learn all of these, you’ll know 90% of what matters today.”
The list has been floating around on Twitter/LinkedIn for a few weeks. I figure some might have missed it, so here you go.
Regardless of the veracity of the tale, I'm still finding it valuable.
https://punkx.org/jackdoe/30.html
The First Law of Complexodynamics (scottaaronson.blog)
Keeping Neural Networks Simple by Minimizing the Description Length of the Weights (cs.toronto.edu)
ImageNet Classification with Deep CNNs (proceedings.neurips.cc)
GPipe: Efficient Training of Giant Neural Networks using Pipeline Parallelism (arxiv.org)
Multi-Scale Context Aggregation by Dilated Convolutions (arxiv.org)
Neural Machine Translation by Jointly Learning to Align and Translate (arxiv.org)
Quantifying the Rise and Fall of Complexity in Closed Systems: The Coffee Automaton (arxiv.org)
Deep Speech 2: End-to-End Speech Recognition in English and Mandarin (arxiv.org)
A Tutorial Introduction to the Minimum Description Length Principle (arxiv.org)
PAGE 434 onwards: Kolmogorov Complexity (lirmm.fr)
CS231n Convolutional Neural Networks for Visual Recognition (cs231n.github.io)
It might be good to estimate the date of the recommendation: the interview where Carmack mentioned this was in 2023, so a rough guess might be 2021/22.
I like this format and the framing of "90% of what matters"; someone should try doing the same for other subjects.