Well, UDASSA is false (https://joecarlsmith.com/2021/11/28/anthropics-and-the-universal-distribution). As I argue elsewhere, any view other than SIA implies the doomsday argument. The number of possible beings isn’t equal to the number of “physically limited beings in our universe,” and the possible arrangements of the continuum points outnumber the points themselves.
The argument for Beth 2 possible people is that it’s the cardinality of the powerset of the continuum points. SIA gives you reason to assign a uniform prior across possible people. There could be a God-less universe with Beth 2 people, but I don’t know how that would work, and even if there’s some coherent model one can make work without sacrificing simplicity, P(Beth 2 people | theism) >>> P(Beth 2 people | atheism). You need to fill in the details beyond just saying “there are Beth 2 people,” and that will cost simplicity.
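For reference, here is the cardinal arithmetic behind that powerset step (standard beth-number definitions; nothing here is specific to this debate):

% Beth numbers: \beth_0 = \aleph_0 and \beth_{n+1} = 2^{\beth_n}.
% The continuum has \beth_1 = 2^{\aleph_0} points, so the set of all
% arrangements (subsets) of continuum-many points has the cardinality
% of the powerset of the continuum:
\[
\beth_2 \;=\; 2^{\beth_1} \;=\; 2^{2^{\aleph_0}}
\]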
Remember, this is just part of a lengthy cumulative case.
If theism is true, then all possible people exist, but they’re not all here. SIA gives you reason to think many people exist but says nothing about where they’d be. Theism predicts a vast multiverse.
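For concreteness, a minimal sketch of the SIA update being relied on here, in its standard form (the notation N_w for the number of observers in world w is mine):

% SIA: given that you exist, your credence in a world w is its prior
% P(w) weighted by the number of observers N_w that w contains.
\[
P(w \mid \text{I exist}) \;\propto\; N_w \cdot P(w)
\]
% This favors worlds with many observers, but is silent on where
% those observers would be located.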
The cases are non-symmetrical because a big universe makes my existence more likely, but it doesn’t make me more likely to get HTTTTTTTHTTHHTTTHTTTHTHTHTTHHTTTTTTHHHTHTTHTTTHHTTTTHTHTHHHHHTTTTHTHHHHTHHHHHHHTTTTHHTHHHTHTTTTTHTTTHTTHHHTHHHTHHTHTHTHTHTHHTHTHTTHTHHTTHTHTTHHHHHTTTTTTHHTHTTTTTHHTHHTTHTTHHTTTHTTHTHTTHHHTTHHHTHTTHHTTHTTTHTHHHTHHTHHHHTHHTHHHTHHHHTTHTTHTHHTHTTHTHHTTHHTTHHTH. The most specific version of the evidence is that I get that exact sequence of coin flips, which is unaffected by the number of people, rather than that someone gets it. My view follows trivially from the widely adopted SIA, which I argued for in the piece—it doesn’t rely on some basic math error.
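To make the asymmetry explicit, a worked sketch (the symbols n for the number of fair flips and N for the number of flippers are mine, for illustration):

% The most specific evidence is that *I* get this exact sequence s of
% n flips; its probability is independent of how many people exist:
\[
P(\text{I get } s) \;=\; 2^{-n}
\]
% By contrast, the probability that *someone* gets s does grow with
% the number of flippers N:
\[
P(\text{someone gets } s) \;=\; 1 - \left(1 - 2^{-n}\right)^{N}
\]
% A big universe raises the probability that I exist at all, not the
% probability of my particular sequence; hence the two cases are not
% symmetric.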
I didn’t attack his character; I said he was wrong about lots of things.
//If you add to the physical laws code that says “behave like with Casper”, you have re-implemented Casper with one additional layer of indirection. It is then not fair to say this other world does not contain Casper in an equivalent way.//
No, you haven’t reimplemented Casper, you’ve just copied his physical effects. There is no Casper, and Casper’s consciousness doesn’t exist.
Your description of the FDT stuff isn’t what I argued.
//I’ve just skimmed this part, but it seems to me that you provide arguments and evidence about consciousness as wakefulness or similar, while Yudkowsky is talking about the more restricted and elusive concept of self-awareness. //
Both Yudkowsky and I are talking about having experiences, as he’s been explicit about in various places.
//Your situation is symmetric: if you find yourself repeatedly being very confident about someone not knowing what they are saying, while this person is a highly regarded intellectual, maybe you are overconfident and wrong! I consider this a difficult dilemma to be in. Yudkowsky wrote a book about this problem, Inadequate Equilibria, so it’s one step ahead of you on the meta.//
I don’t talk about the huge range of topics Yudkowsky does. I don’t have super confident views on any topic that is controversial among the experts—but Yudkowsky’s confident views aren’t merely controversial; they mostly rest on basic errors.
I think this comment is entirely right until the very end. I don’t think I really attack him as a person—I don’t say he’s evil or malicious or anything in the vicinity, I just say he’s often wrong. Seems hard to argue that without arguing against his points.
I never claimed Eliezer says consciousness is nonphysical—I said exactly the opposite.
Look at philosophers with Ph.D.s who study decision theory for a living and have a huge incentive to produce original work: none of them endorse FDT.
Yeah, I was just kidding!
About three quarters of academic decision theorists two-box on Newcomb’s problem; only about 20% one-box. So this standard seems nuts. https://survey2020.philpeople.org/survey/results/4886?aos=1399
My goal was to get people to not defer to Eliezer. I explicitly say he’s an interesting thinker who is worth reading.
I dispute that . . .
I didn’t say Eliezer was a liar and a fraud. I said he was often overconfident and egregiously wrong, and I explicitly described him as an interesting thinker who was worth reading.
//It presents each disagreement as though Eliezer were going against an expert consensus, when in fact each position mentioned is one where he sided with a camp in an extant expert divide.//
Nope, false. There are no academic decision theorists I know of who endorse FDT, no philosophers of mind who agree with Eliezer’s assessment that epiphenomenalism is the term for those who accept zombies, and no relevant experts on consciousness who share Eliezer’s confidence that animals aren’t conscious—at least none that I know of.
The examples just show that sometimes you lose by being rational.
Unrelated, but I really liked your recent post on Eliezer’s bizarre claim that “character attacks last” is an epistemic standard.
What’s your explanation of why virtually no published papers defend it and no published decision theorists endorse it? You really think none of them have thought of it or anything in the vicinity?
I mean, like, I can give you some names. My friend Ethan, who’s getting a Ph.D., was one. Schwarz knows a lot about decision theory and finds the view crazy—MacAskill doesn’t like it either.
I wouldn’t call a view crazy just for being disbelieved by many people. But if a view is both rejected by all relevant experts and extremely implausible, then I think it’s worth calling crazy!
I didn’t call people crazy; I called the view crazy. I think it’s crazy for the reasons I’ve explained, at length, both in my original article and over the course of the debate. It’s not about my particular decision-theory friends—it’s that the fact that virtually no relevant experts agree with an idea is relevant to an assessment of it.
I’m sure Soares is a smart guy! As are a lot of defenders of FDT. Lesswrong selects disproportionately for smart, curious, interesting people. But smart people can believe crazy things—I’m sure I have some crazy beliefs; crazy in the sense of being unreasonable such that pretty much all rational people would give them up upon sufficient ideal reflection and discussion with people who know what they’re talking about.
It may be imaginable, but if it’s false, who cares? Like, suppose I argue that fundamental reality has to meet constraint X and that view Y is the only plausible view that does so. Listing off a bunch of random views that meet constraint X but are false doesn’t help you.