Black Marble

A Black Marble is a technology that by default destroys the civilization that invents it. It’s one type of Existential Risk. AGI may be such an invention, but isn’t the only one.

The name comes from a thought experiment by Nick Bostrom, in which he described inventions as marbles pulled out of an urn. Most are white (beneficial), some are dangerous or a mixed blessing (usually described as gray or red), and some are black (fatal).
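
As a toy illustration of the urn model (a sketch with made-up numbers, not estimates from Bostrom's paper), a short simulation shows why even a rare black marble guarantees eventual ruin if the drawing never stops:

```python
import random

def draws_until_black(p_black: float, rng: random.Random) -> int:
    """Count how many inventions ("marbles") get pulled before the
    first black one. p_black is an assumed, illustrative probability
    that any given invention is a black marble."""
    draws = 0
    while True:
        draws += 1
        if rng.random() < p_black:
            return draws

rng = random.Random(0)
p_black = 0.001  # made-up rate: one invention in a thousand is fatal
samples = [draws_until_black(p_black, rng) for _ in range(10_000)]
print(f"mean inventions before catastrophe: {sum(samples) / len(samples):.0f}")
# Draws until the first black marble follow a geometric distribution with
# mean 1/p_black, so even a 0.1% rate means ruin around draw 1000 on
# average, and ruin with certainty if the drawing never stops.
```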

The experiment is also sometimes phrased in terms of a “black ball,” including by Bostrom himself, but that term already means something else when used by itself.

As a hypothetical example, Bostrom asked what would happen to civilization if a weapon of mass destruction comparable to an atomic bomb were much easier to make, say, on the level of cooking sand in a microwave. The natural answer: once knowledge of the technique spreads, any random psychopath can build one, some do, and they soon bomb us back into the stone age. Civilization then can’t rebuild past the point of microwaves without being destroyed again, as long as the knowledge persists (making black marbles an extreme type of information hazard).

But this scenario is path dependent. One could imagine a different civilization, with different capabilities, that could survive the same knowledge. Perhaps one with a world government (so no wars) and universal screening for psychopathy. Perhaps a dystopian world panopticon could prevent use. Or consider a space-faring civilization that mostly lives in small independent orbital colonies: everybody there already has similarly destructive kinetic attack capabilities (and civilization is somehow surviving them), so the sand-nukes might not change much.
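
To make the path dependence concrete, here is a minimal sketch (with invented parameters, not figures from any source) that treats survival after the knowledge exists as depending on a per-period attempt rate and the civilization’s chance of preventing each attempt:

```python
def survival_probability(attempt_rate: float, prevention: float, periods: int) -> float:
    """Chance a civilization survives `periods` time steps once a
    black-marble technique is public knowledge. `attempt_rate` is the
    assumed per-period chance someone tries to use it; `prevention` is
    the assumed chance the civilization stops any given attempt
    (screening, a panopticon, dispersed colonies, etc.). Both numbers
    are invented for illustration."""
    per_period_ruin = attempt_rate * (1.0 - prevention)
    return (1.0 - per_period_ruin) ** periods

for prevention in (0.0, 0.9, 0.999):
    p = survival_probability(attempt_rate=0.05, prevention=prevention, periods=100)
    print(f"prevention={prevention:5} -> P(survive 100 periods) = {p:.3f}")
# Identical knowledge, very different outcomes: only the prevention term
# differs between these runs, which is the path dependence described above.
```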

Absent coordination, future technology will cause human extinction
Jeffrey Ladish · Feb 3, 2020, 9:52 PM · 21 points · 12 comments · 5 min read · LW link

The Vulnerable World Hypothesis (by Bostrom)
Ben Pace · Nov 6, 2018, 8:05 PM · 50 points · 17 comments · 4 min read · LW link (nickbostrom.com)

Enlightenment Values in a Vulnerable World
Maxwell Tabarrok · Jul 20, 2022, 7:52 PM · 15 points · 6 comments · 31 min read · LW link (maximumprogress.substack.com)

The Transparent Society: A radical transformation that we should probably undergo
mako yass · Sep 3, 2019, 2:27 AM · 14 points · 25 comments · 8 min read · LW link

misc raw responses to a tract of Critical Rationalism
mako yass · Aug 14, 2020, 11:53 AM · 21 points · 52 comments · 3 min read · LW link

The Dumbest Possible Gets There First
Artaxerxes · Aug 13, 2022, 10:20 AM · 44 points · 7 comments · 2 min read · LW link

Nuclear Strategy in a Semi-Vulnerable World
Jackson Wagner · Jun 27, 2021, 8:17 AM · 7 points · 1 comment · 17 min read · LW link

My thoughts on nanotechnology strategy research as an EA cause area
Ben_Snodin · May 2, 2022, 5:57 PM · 34 points · 0 comments · 42 min read · LW link

A Pin and a Balloon: Anthropic Fragility Increases Chances of Runaway Global Warming
avturchin · Sep 11, 2022, 10:25 AM · 33 points · 23 comments · 52 min read · LW link

Take 6: CAIS is actually Orwellian.
Charlie Steiner · Dec 7, 2022, 1:50 PM · 14 points · 8 comments · 2 min read · LW link