
Occam’s Razor

Last edit: Jan 4, 2022, 3:35 PM by Gordon Seidoh Worley

Occam’s razor (more formally known as the principle of parsimony) is a principle commonly stated as “Entities must not be multiplied beyond necessity”. When several theories can explain the same observations, Occam’s razor says the simpler one is preferable. Note that Occam’s razor is a requirement on the simplicity of theories, not on the size of the systems those theories describe; the immensity of the Universe, for example, is not at odds with the principle.

Occam’s razor follows from the conjunction rule of probability theory: the conjunction of A and B is necessarily less probable than A alone (or equally probable, in the case of logical equivalence); every detail you tack onto your story drives the probability down.
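In symbols, this is just a restatement of the conjunction rule, assuming nothing beyond the standard probability axioms:

$$P(A \wedge B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A),$$

with equality exactly when $P(B \mid A) = 1$, as in the case of logical equivalence.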

Occam’s razor has been formalized as Minimum Description Length or Minimum Message Length, in which the total size of the theory is the length of the message required to describe the theory, plus the length of the message required to describe the evidence using the theory. Solomonoff induction is the ultimate case of minimum message length in which the code for messages can describe all computable hypotheses. This has jokingly been referred to as “Solomonoff’s lightsaber”.
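As a toy sketch of the two-part code idea (the hypotheses, bit counts, and data below are invented for illustration, not taken from any of the posts or formalisms cited here), one can compare hypotheses by the bits needed to state the theory plus the bits needed to encode the evidence using the theory:

```python
import math

def total_message_length(theory_bits: float, data_probability: float) -> float:
    """Two-part code: bits to describe the theory,
    plus bits to describe the evidence using the theory (-log2 likelihood)."""
    return theory_bits + -math.log2(data_probability)

# Hypothetical data: 1000 coin flips with 600 heads.
n, heads = 1000, 600

def sequence_probability(p_heads: float) -> float:
    # Probability of one particular sequence with the observed counts.
    return p_heads ** heads * (1 - p_heads) ** (n - heads)

# H1: "the coin is fair" -- very short to state, fits the data less well.
# H2: "p(heads) = 0.6" -- costs extra bits to specify the parameter,
#     but compresses the observed flips better.
fair = total_message_length(theory_bits=1, data_probability=sequence_probability(0.5))
biased = total_message_length(theory_bits=12, data_probability=sequence_probability(0.6))

print(f"fair coin:   {fair:.1f} bits")    # ~1001 bits
print(f"biased coin: {biased:.1f} bits")  # ~983 bits; the shorter total message wins
```

The hypothesis with the shorter total message is preferred: a more complex theory is accepted only when it buys enough compression of the evidence to pay for its own description length.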

Solomonoff induction still works if the universe is uncomputable, and its usefulness doesn’t require knowing Occam’s razor

Christopher King · Jun 18, 2023, 1:52 AM
38 points
28 comments · 4 min read · LW link

Message Length

Zack_M_Davis · Oct 20, 2020, 5:52 AM
134 points
25 comments · 12 min read · LW link

Belief in the Implied Invisible

Eliezer Yudkowsky · Apr 8, 2008, 7:40 AM
66 points
34 comments · 6 min read · LW link

Msg Len

Zack_M_Davis · Oct 12, 2020, 3:35 AM
66 points
4 comments · 1 min read · LW link

Very Short Introduction to Bayesian Model Comparison

johnswentworth · Jul 16, 2019, 7:48 PM
32 points
5 comments · 1 min read · LW link

[Question] What is an “anti-Occamian prior”?

Zane · Oct 23, 2023, 2:26 AM
35 points
22 comments · 1 min read · LW link

[Question] Instrumental Occam?

abramdemski · Jan 31, 2020, 7:27 PM
30 points
15 comments · 1 min read · LW link

Taking Occam Seriously

steven0461 · May 29, 2009, 5:31 PM
32 points
51 comments · 1 min read · LW link

A Semitechnical Introductory Dialogue on Solomonoff Induction

Eliezer Yudkowsky · Mar 4, 2021, 5:27 PM
143 points
33 comments · 54 min read · LW link

Occam’s Razor and the Universal Prior

Peter Chatain · Oct 3, 2021, 3:23 AM
29 points
5 comments · 21 min read · LW link

Mathematics as a lossy compression algorithm gone wild

Shmi · Jun 6, 2014, 11:53 PM
53 points
82 comments · 5 min read · LW link

Bias towards simple functions; application to alignment?

DavidHolmes · Aug 18, 2022, 4:15 PM
5 points
8 comments · 2 min read · LW link

Induction; or, the rules and etiquette of reference class tennis

paulfchristiano · Mar 3, 2013, 11:27 PM
11 points
8 comments · 9 min read · LW link

Occam’s Razor

Eliezer Yudkowsky · Sep 26, 2007, 6:36 AM
135 points
55 comments · 5 min read · LW link

Notes on Simplicity

David Gross · Dec 2, 2020, 11:14 PM
9 points
0 comments · 7 min read · LW link

Dissolving the Problem of Induction

Liron · Dec 27, 2020, 5:58 PM
40 points
32 comments · 7 min read · LW link

A Proof of Occam’s Razor

Unknowns · Aug 10, 2010, 2:20 PM
0 points
139 comments · 3 min read · LW link

[Question] Why would code/English or low-abstraction/high-abstraction simplicity or brevity correspond?

curi · Sep 4, 2020, 7:46 PM
2 points
15 comments · 1 min read · LW link

Psychic Powers

Eliezer Yudkowsky · Sep 12, 2008, 7:28 PM
47 points
89 comments · 3 min read · LW link

Probability theory implies Occam’s razor

Maxwell Peterson · Dec 18, 2020, 7:48 AM
8 points
4 comments · 6 min read · LW link

State, Art, Identity

musq · Jan 25, 2021, 8:22 PM
1 point
0 comments · 2 min read · LW link

[Question] What qualities does an AGI need to have to realize the risk of false vacuum, without hardcoding physics theories into it?

RationalSieve · Feb 3, 2023, 4:00 PM
1 point
4 comments · 1 min read · LW link

What kind of place is this?

Jim Pivarski · Feb 25, 2023, 2:14 AM
24 points
24 comments · 8 min read · LW link

Explanations as Hard to Vary Assertions

Alexander · Sep 24, 2021, 11:33 AM
17 points
24 comments · 8 min read · LW link

Not using a priori information for Russian propaganda

EniScien · Apr 24, 2023, 1:14 AM
−5 points
4 comments · 1 min read · LW link

Blunt Razor

fryolysis · Oct 24, 2023, 5:27 PM
3 points
0 comments · 2 min read · LW link

The Strong Occam’s Razor

cousin_it · Nov 11, 2010, 5:28 PM
17 points
74 comments · 3 min read · LW link

Occam’s Razor May Be Sufficient to Infer the Preferences of Irrational Agents: A reply to Armstrong & Mindermann

Daniel Kokotajlo · Oct 7, 2019, 7:52 PM
47 points
39 comments · 7 min read · LW link

Against Occam’s Razor

zulupineapple · Apr 5, 2018, 5:59 PM
4 points
20 comments · 1 min read · LW link

If Many-Worlds Had Come First

Eliezer Yudkowsky · May 10, 2008, 7:43 AM
95 points
189 comments · 9 min read · LW link

Where Recursive Justification Hits Bottom

Eliezer Yudkowsky · Jul 8, 2008, 10:16 AM
125 points
81 comments · 10 min read · LW link

A Priori

Eliezer Yudkowsky · Oct 8, 2007, 9:02 PM
86 points
133 comments · 4 min read · LW link

Decoherence is Simple

Eliezer Yudkowsky · May 6, 2008, 7:44 AM
72 points
62 comments · 11 min read · LW link

Kevin T. Kelly’s Ockham Efficiency Theorem

Johnicholas · Aug 16, 2010, 4:46 AM
43 points
82 comments · 5 min read · LW link

The prior of a hypothesis does not depend on its complexity

cousin_it · Aug 26, 2010, 1:20 PM
34 points
69 comments · 1 min read · LW link

How do low level hypotheses constrain high level ones? The mystery of the disappearing diamond.

Christopher King · Jul 11, 2023, 7:27 PM
17 points
11 comments · 2 min read · LW link