lumenwrites (Karma: 74)

Posts
[Question] I want to donate some money (not much, just what I can afford) to AGI Alignment research, to whatever organization has the best chance of making sure that AGI goes well and doesn't kill us all. What are my best options, where can I make the most difference per dollar?
lumenwrites · 2 Aug 2022 12:08 UTC · 15 points · 9 comments · 1 min read · LW link
What are these "outside of the Overton window" approaches to preventing AI apocalypse that Eliezer was talking about in his post?
lumenwrites · 14 Jun 2022 21:18 UTC · 2 points · 0 comments · 1 min read · LW link
[Question] How would you explain Bayesian thinking to a ten year old?
lumenwrites · 5 Jan 2022 17:25 UTC · 7 points · 3 comments · 1 min read · LW link
[Question] How do you write original rationalist essays?
lumenwrites · 1 Dec 2021 8:08 UTC · 24 points · 19 comments · 1 min read · LW link