Alex Lawsen
Karma: 483
AI grantmaking at Open Philanthropy. I used to give careers advice for 80,000 Hours.
Posts
AI x-risk, approximately ordered by embarrassment
Alex Lawsen · 12 Apr 2023 23:01 UTC · 151 points · 7 comments · 19 min read · LW link
ELCK might require nontrivial scalable alignment progress, and seems tractable enough to try
Alex Lawsen · 8 Apr 2023 21:49 UTC · 17 points · 0 comments · 2 min read · LW link
A tension between two prosaic alignment subgoals
Alex Lawsen · 19 Mar 2023 14:07 UTC · 31 points · 8 comments · 1 min read · LW link
Deceptive failures short of full catastrophe.
Alex Lawsen · 15 Jan 2023 19:28 UTC · 33 points · 5 comments · 9 min read · LW link
alexrjl’s Shortform
Alex Lawsen · 29 Aug 2022 14:23 UTC · 3 points · 14 comments · 1 min read · LW link
Thoughts on ‘List of Lethalities’
Alex Lawsen · 17 Aug 2022 18:33 UTC · 27 points · 0 comments · 10 min read · LW link
An easy win for hard decisions
Alex Lawsen · 5 May 2022 7:47 UTC · 25 points · 0 comments · 3 min read · LW link
Incentive Problems With Current Forecasting Competitions.
NunoSempere and Alex Lawsen · 9 Nov 2020 16:20 UTC · 44 points · 21 comments · 5 min read · LW link