johnswentworth (Karma: 57,169)
Posts
Follow-up to “My Empathy Is Rarely Kind” (johnswentworth, Jul 31, 2025, 5:21 PM) · 67 points · 25 comments · 2 min read · LW link
My Empathy Is Rarely Kind (johnswentworth, Jul 30, 2025, 3:49 AM) · 48 points · 186 comments · 4 min read · LW link
Generalized Hangriness: A Standard Rationalist Stance Toward Emotions (johnswentworth, Jul 10, 2025, 6:22 PM) · 345 points · 67 comments · 7 min read · LW link
Fictional Thinking and Real Thinking (johnswentworth, Jun 17, 2025, 7:13 PM) · 56 points · 11 comments · 4 min read · LW link
The Value Proposition of Romantic Relationships (johnswentworth, Jun 2, 2025, 1:51 PM) · 194 points · 39 comments · 13 min read · LW link
That’s Not How Epigenetic Modifications Work (johnswentworth, May 24, 2025, 12:15 AM) · 67 points · 12 comments · 2 min read · LW link
Orienting Toward Wizard Power (johnswentworth, May 8, 2025, 5:23 AM) · 550 points · 146 comments · 5 min read · LW link
$500 + $500 Bounty Problem: Does An (Approximately) Deterministic Maximal Redund Always Exist? (johnswentworth and David Lorell, May 6, 2025, 11:05 PM) · 73 points · 16 comments · 3 min read · LW link
Misrepresentation as a Barrier for Interp (Part I) (johnswentworth and Steve Petersen, Apr 29, 2025, 5:07 PM) · 113 points · 12 comments · 7 min read · LW link
$500 Bounty Problem: Are (Approximately) Deterministic Natural Latents All You Need? (johnswentworth and David Lorell, Apr 21, 2025, 8:19 PM) · 92 points · 24 comments · 3 min read · LW link
So You Want To Make Marginal Progress... (johnswentworth, Feb 7, 2025, 11:22 PM) · 296 points · 42 comments · 4 min read · LW link
Instrumental Goals Are A Different And Friendlier Kind Of Thing Than Terminal Goals (johnswentworth and David Lorell, Jan 24, 2025, 8:20 PM) · 181 points · 61 comments · 5 min read · LW link
The Case Against AI Control Research (johnswentworth, Jan 21, 2025, 4:03 PM) · 356 points · 81 comments · 6 min read · LW link
What Is The Alignment Problem? (johnswentworth, Jan 16, 2025, 1:20 AM) · 180 points · 49 comments · 25 min read · LW link
The Plan − 2024 Update (johnswentworth, Dec 31, 2024, 1:29 PM) · 118 points · 28 comments · 4 min read · LW link
The Field of AI Alignment: A Postmortem, and What To Do About It (johnswentworth, Dec 26, 2024, 6:48 PM) · 304 points · 160 comments · 8 min read · LW link
[Question] What Have Been Your Most Valuable Casual Conversations At Conferences? (johnswentworth, Dec 25, 2024, 5:49 AM) · 54 points · 21 comments · 1 min read · LW link
The Median Researcher Problem (johnswentworth, Nov 2, 2024, 8:16 PM) · 157 points · 70 comments · 1 min read · LW link
Three Notions of “Power” (johnswentworth, Oct 30, 2024, 6:10 AM) · 95 points · 44 comments · 4 min read · LW link
Information vs Assurance (johnswentworth, Oct 20, 2024, 11:16 PM) · 187 points · 18 comments · 2 min read · LW link