dkirmani (Karma: 1,129)
how 2 tell if ur input is out of distribution given only model weights
dkirmani · Aug 5, 2023, 10:45 PM · 48 points · 10 comments · 1 min read · LW link

The Sequences Highlights on YouTube
dkirmani · Feb 15, 2023, 7:36 PM · 23 points · 2 comments · 2 min read · LW link (youtube.com)

I Converted Book I of The Sequences Into A Zoomer-Readable Format
dkirmani · Nov 10, 2022, 2:59 AM · 200 points · 32 comments · 2 min read · LW link

dkirmani’s Shortform
dkirmani · Nov 2, 2022, 9:49 PM · 2 points · 3 comments · 1 min read · LW link

OpenAI’s Alignment Plans
dkirmani · Aug 24, 2022, 7:39 PM · 60 points · 17 comments · 5 min read · LW link (openai.com)

20 Critiques of AI Safety That I Found on Twitter
dkirmani · Jun 23, 2022, 7:23 PM · 21 points · 16 comments · 1 min read · LW link

AI Risk, as Seen on Snapchat
dkirmani · Jun 16, 2022, 7:31 PM · 23 points · 8 comments · 1 min read · LW link

No Abstraction Without a Goal
dkirmani · Jan 10, 2022, 8:24 PM · 28 points · 27 comments · 1 min read · LW link

Regularization Causes Modularity Causes Generalization
dkirmani · Jan 1, 2022, 11:34 PM · 50 points · 7 comments · 3 min read · LW link