This is exactly what I was looking for! Thank you kindly; I’ll look through it as soon as I find time.
A teacher in a geology class, who is decidedly non-rationalist, mentioned that 800-year thing without a source. Something about the thickness of a line.
This is the first topic I’ve found where I have no idea how to dissect it and figure out what’s going on. It appears that there are incredibly powerful arguments for both sides, and mountains of strong evidence both for and against human-caused climate change… Which shouldn’t be possible. A lot of the skeptics seem to have strong arguments countering many of the “alarmist” ideas...
I’m not a good enough rationalist for this, yet. If it weren’t for this community’s famous support of global warming, there’s no way I’d believe in it, given the data I have. Strange.
I’m not sure it’s worth posting sources and the like; counter-counter-arguments become difficult to follow, and it could easily cause a kerfuffle that I would rather avoid.
Thank you all greatly!
Why do LWers believe in global warming? The community’s belief has changed my posterior odds significantly, but it’s the only argument I have for global warming at the moment. I saw the CO2 vs. temperature graphs, and that seemed to sell it for me… Then I heard that the temperature increases preceded the CO2 increases by about 800 years...
So why does the community at large believe in it?
Thanks!
It’s weekly, but on Mondays, not Tuesdays. Apologies for any inconvenience.
It should actually be December 3rd.
Serious issue: I made a typo. This should be December 3rd. Do not show up today; we meet weekly on Mondays.
Oops. Sorry about that.
“We can imagine any number of universes; that does not always lead to a good argument. In this case, the main issue with the argument is that while we can imagine that universe, it doesn’t look like ours. There’s no talk of consciousness and no self-reflection. Those are things that, in reality, are clearly caused by a link between our thoughts and our brains, one that goes in both directions.
Imagining a world in which people act exactly like people do now, but without consciousness, strays so far outside the bounds of Occam’s Razor that there doesn’t seem to be any point in thinking about it. Adding in a mysterious ‘zombie master’ to make the zombies act as though they had consciousness… Well, at this point, we’re not talking about anything remotely resembling reality. This entire thought experiment in no way gives us any truths about reality whatsoever. It is completely meaningless.”
As someone with some experience dealing with this, and having learned how difficult it is to fix, I would reply with something like: “You are wrong. If you want to learn WHY you’re wrong, tell me and we can work on this together. Otherwise, I’m going to go now.”
Playing the game a bit: “Okay, bear with me a moment; this is going to sound a little odd.
I’m not sure what you mean by “outside the realm of causal processes”. Does that mean it happens on its own, with no outside influence at all? Nothing causes it, it just… Happens? Even if it’s a ‘magic’ skill, shouldn’t he be the one to activate it? I mean, worst-case scenario, it’s caused by someone drawing a card from a deck. It doesn’t happen completely independently of reality; it’s CAUSED by something. If your cousin is the only one with this power, I’m sure scientists could study him and figure out what lights up in his brain as he does it.
A minor note: I was once a card magician, and there are very specific ways to either force people to choose the card you want, or to figure out what card they’ve chosen. I can show you a few, if you want.
Next, ‘communing with the entire universe’ is a pretty arrogant thing to say, isn’t it? I never got any communication, anyway. Question for you: how would it feel to look deeply inwards, ask ‘the universe’ questions, and receive answers from your own mind? Would it feel much different from what you feel now? Usually it’s better to assume that confusing or ‘unexplained’ things are happening in your mind, not in reality. You FEEL like the universe has told you that he loves you, but that would look exactly the same as if it were just your unconscious mind telling you. How often have people said that they were deeply, permanently in love, but then it didn’t work out? Do you really think you’re that much better than everyone else?”
Ah, okay. This makes sense to me, but I found the wording rather confusing. I’ll have to warn people I suggest this article to, I suppose.
Thank you kindly!
I don’t post here much (yet), and normally I feel fairly confident in my understanding of basic probability...
But I’m slightly lost here. “if the Sidewalk is Slippery then it is probably Wet and this can be explained by either the Sprinkler or the Rain but probably not both, i.e. if we’re told that it’s Raining we conclude that it’s less likely that the Sprinkler was on.” This sentence seems… Wrong. If we’re told that it’s Raining, we conclude that the chance of the Sprinkler is… Exactly the same as it was before we learned that the sidewalk was wet.
This seems especially clear when there was an alarm and we learn there was a burglar: p(B|A) = 0.9, so shouldn’t our current p(E) go up to 0.9 * p(E|A,B) + 0.1 * p(E|A,~B)? Burglars burgling doesn’t reduce the chance of earthquakes… Adding an alarm shouldn’t change that.
What am I missing?
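In case it helps, here’s a toy script I put together to check this by brute force. Every prior and alarm probability in it is a number I made up purely for illustration; the point is just to compare p(E), p(E|A), and p(E|A,B) under some concrete numbers:

```python
# Brute-force enumeration of the Burglar/Earthquake/Alarm network (B -> A <- E).
# All priors and CPT entries below are made-up numbers, purely for illustration.
from itertools import product

p_b, p_e = 0.01, 0.01   # assumed priors: P(Burglar), P(Earthquake)
p_alarm = {             # assumed P(Alarm=True | Burglar, Earthquake)
    (True, True): 0.95,
    (True, False): 0.90,
    (False, True): 0.30,
    (False, False): 0.001,
}

def joint(b, e, a):
    """P(B=b, E=e, A=a): B and E are independent a priori; A depends on both."""
    p = (p_b if b else 1 - p_b) * (p_e if e else 1 - p_e)
    pa = p_alarm[(b, e)]
    return p * (pa if a else 1 - pa)

def prob_e(**given):
    """P(E=True | given), where given may fix b and/or a."""
    def match(b, a):
        return all(val == {'b': b, 'a': a}[key] for key, val in given.items())
    num = sum(joint(b, True, a) for b, a in product((True, False), repeat=2) if match(b, a))
    den = sum(joint(b, e, a) for b, e, a in product((True, False), repeat=3) if match(b, a))
    return num / den

print(f"p(E)     = {prob_e():.4f}")                 # prior
print(f"p(E|A)   = {prob_e(a=True):.4f}")           # alarm observed
print(f"p(E|A,B) = {prob_e(a=True, b=True):.4f}")   # alarm observed, burglar confirmed
```

(With these made-up numbers I get p(E|A) ≈ 0.24 but p(E|A,B) ≈ 0.01, so the drop the article describes really does show up; I just don’t see why, intuitively.)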
Hi! My name is Paul, and I’ve been an aspiring rationalist for years. A long time ago, I realized implicitly that reality exists, and that there is only one. I think “rationality” is the only reasonable next step. I pretty much started “training” on TV Tropes, reading about fallacies and the like there, as well as seeing ways to analyze things in fiction. The rules there apply to real life fairly well.
From there, I discovered Harry Potter and the Methods of Rationality, and from there, this site. I’ve been reading quite a bit on and off over the past little while, and decided to become a bit more active.
I just visited a meetup group in Ottawa (about a 2-hour drive), and I no longer feel like the only sane man in the world. Meeting a group of Bayesian rationalists was incredibly enlightening. I still have a lot to learn.
I’m a university student trying to decide between the Torbjorn and an Aeron. Normally I’d just go with the cheaper option, but I’d like to know if there’s enough of a difference to justify spending ten times as much on the Aeron. I’ve worked in an Aeron before, and while they’re very comfortable, I don’t want to drop that kind of money on comfort without a long-term benefit.
Does anyone have any numbers or anecdotal evidence to help sway my decision in either direction? Thanks!