This could happen through any number of mechanisms. A story I’m worried about goes something like this:
1. LW correctly comes to believe that for an AI to be aligned, its cognitive turboencabulator needs a base plate of prefabulated amulite.
2. The leader of an AI project tries to make the base plate out of unprefabulated amulite.
3. Another member of the project mentions off-hand, one time, that some people think it should be prefabulated.
4. The project leader thinks, “prefabulation, wasn’t that one of the pet issues of those Bell Curve bros? well, whatever, let’s just go ahead.”
5. The AI is built as planned and attains superhuman intelligence, but its cognitive turboencabulator fails, causing human extinction.