StrivingForLegibility
Karma: 376
Posts (Page 2)
The Gears of Argmax
StrivingForLegibility · Jan 4, 2024, 11:30 PM · 11 points · 0 comments · 3 min read · LW link
When Can Optimization Be Done Safely?
StrivingForLegibility · Dec 30, 2023, 1:24 AM · 12 points · 0 comments · 3 min read · LW link
Optimization Markets
StrivingForLegibility · Dec 30, 2023, 1:24 AM · 13 points · 2 comments · 2 min read · LW link
Social Choice Theory and Logical Handshakes
StrivingForLegibility · Dec 29, 2023, 3:49 AM · 17 points · 0 comments · 4 min read · LW link
Distributed Strategic Epistemology
StrivingForLegibility · Dec 28, 2023, 10:12 PM · 11 points · 0 comments · 3 min read · LW link
Building Trust in Strategic Settings
StrivingForLegibility · Dec 28, 2023, 10:12 PM · 24 points · 0 comments · 7 min read · LW link
An Ontology for Strategic Epistemology
StrivingForLegibility · Dec 28, 2023, 10:11 PM · 9 points · 0 comments · 5 min read · LW link
How Emergency Medicine Solves the Alignment Problem
StrivingForLegibility · Dec 26, 2023, 5:24 AM · 41 points · 4 comments · 6 min read · LW link
A Decision Theory Can Be Rational or Computable, but Not Both
StrivingForLegibility · Dec 21, 2023, 9:02 PM · 9 points · 4 comments · 1 min read · LW link