A friend of mine has offered to lend me the Kushiel series on a number of occasions. I’m starting to think I should take her up on that.
swestrup
Well, as an additional data point on how folks find less wrong, I found it through Overcoming Bias. I found that site via a link from some extropian or transhumanist blog, although I’m not sure which.
And I found the current set of my extropian and/or transhumanist blogs by actively looking for articles on cutting-edge science, which turn out to often be referenced by transhumanist blogs.
If we assume I’m rational, then I’m not going to assume anything about Omega. I’ll base my decisions on the given evidence. So far, that appears to be described as being no more and no less than what Omega cares to tell us.
I never knew I had an inbox. Thanks for telling us about that, but I wonder if we might not want to redesign the home page to make some things like that a bit more obvious.
This touches on something that I’ve been thinking about, but am not sure how to put into words. My wife is the most rational woman that I know, and it’s one of the things that I love about her. She’s been reading Overcoming Bias, but I’ve never been completely sure if it’s due to the material, or because she’s a fan of Eliezer. It’s probably a combination of the two. In either case, she’s shown no interest in this particular group, and I’m not sure why.
I also have a friend who is the smartest person and the best thinker that I’ve ever met. He’s a practicing rationalist, but of the sort who uses rationality as a means to an end. In his case it’s the design of computer systems of all kinds. Now, I haven’t even bothered to point out the Overcoming Bias and Less Wrong communities to him, as I can’t imagine he’d have any interest in them, although I’m sure he’d provide useful insights if one could get him interested.
So, of the three most likely candidates to participate in this group that I know of, only one does. This may well be partly due to my own biases in which groups of people I select to tell about which blogs I read, but I think some of it has to be due to this site somehow appealing to a narrower segment of the population than the one to which it might be most valuable.
I have no proposed solution. This is simply an observation.
That, of course, is your opinion and you’re welcome to it. But I thought that I was (perhaps too verbosely to be clear) pointing out that the original article was yet another post on Less Wrong that seemed to be saying:
“Do X. It’s the rational thing to do. If you don’t do X, you aren’t rational.”
I was trying to point out that there may be many rational reasons for not doing X.
Ah, interesting. That was not considered important enough to get into the RSS feed, so I never saw it.
I find it ‘interesting’ that we’ve both had our posts voted down to zero. Could it be that someone objects to pointing out that the game is a money sink and therefore one might have perfectly rational reasons to avoid it?
I have a Magic deck, but I don’t often play. That’s because Magic is not only an interesting game, it’s been carefully designed to continually suck more money out of your pocket.
Ever since it was first introduced (I happen to own a first-generation deck), the game has been slowly increasing the power levels of the cards, so that older cards become less and less valuable and one needs to buy ever more new cards just to stay competitive.
Add to this the fact that they regularly bring out new types of cards that radically shift the power balances in the game, and one finds that it becomes a very expensive hobby to keep up with if you want to play with a random assortment of your friends.
So, like Warhammer 40K (another game known for being designed as a money sink), it’s one I’ve deliberately avoided getting competitive at. Oh, I have a few decks from when the game was launched, and was recently gifted a few more by a friend who wanted to play, and I really do enjoy playing, but I’m not going to let myself get sucked in.
My first thought was to assume it was part of the whole alpha-male dominance thing. Any male that wants to achieve the status of alpha male starts out as an underdog, facing an entrenched opposition with all the advantages of resources.
But, of course, alpha males outperform when it comes to breeding success, and so most genes are descended from males that confronted this situation, strove against “impossible” odds, and ultimately won.
Of course, if this is the explanation, then one would expect there to be a strong difference in how males and females react to the appearance of an underdog.
Montreal, Canada
Well, that’s just me. I’ve never been afraid of leaping feet-first into a paradox and seeing where that takes me. Which reminds me, maybe there’s a post in that.
These are both good points. Frankly, I wasn’t trying to rock the boat with my post; I was trying to find out whether there was a group of disgruntled rationalists who hadn’t liked the community posts and had kept silent. Had that been the case, this post would (I’m assuming) have helped draw them out.
As for what I WOULD like to see, that’s a tricky problem, in that I am interested in rationality topics that I know little to nothing about. The trouble is, right now I don’t know what it is that I don’t know.
Comments vs. Upvoting.
I’ve been wondering if the number of comments that a post (or comment) gets should have an effect on its karma score. I say this because there are some 1-point comments that have many replies attached to them. Clearly folks thought the comment had some value, or they wouldn’t have replied to it. Maybe we need to have each comment count as a vote, with the commenter having to explicitly choose +, −, or neutral in order to post?
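To make the suggestion concrete, here’s a minimal sketch in Python of how “every reply carries a vote” might work. The class and method names are my own invention for illustration, not anything the site actually uses:

```python
# Hypothetical sketch: posting a reply requires an explicit vote on the parent.
# Valid votes: +1 (up), 0 (neutral), -1 (down).
VALID_VOTES = {+1, 0, -1}

class Comment:
    def __init__(self, text):
        self.text = text
        self.karma = 0
        self.replies = []

    def reply(self, text, vote):
        """Attach a reply; the reply's mandatory vote adjusts the parent's karma."""
        if vote not in VALID_VOTES:
            raise ValueError("reply must include an explicit vote: +1, 0, or -1")
        self.karma += vote
        child = Comment(text)
        self.replies.append(child)
        return child

# A much-replied-to comment can no longer sit at a karma
# that ignores all that engagement:
root = Comment("original comment")
root.reply("I agree", +1)
root.reply("Interesting point", +1)
root.reply("Not sure about this", 0)
print(root.karma)  # 2
```

The neutral option matters: it lets people reply to a comment they find merely discussion-worthy without being forced to judge it up or down.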
I’m only now replying to this, since I’ve only just figured out what it was that I was groping for in the above.
The important thing is not compression, but integration of new knowledge so that it affects future cognition and future behaviour. The ability to change one’s methodologies and approaches based on new knowledge would seem to be key to rationality. The more subtle the influence (e.g., a new bit of math changes how you approach buying meat at the supermarket), the better the evidence for deep integration of new knowledge.
You are stating that. But as far as I can tell, Omega is telling me it’s a capricious omnipotent being. If there is a distinction, I’m not seeing it. Let me break it down for you:
1) Capricious → I am completely unable to predict its actions. Yes.
2) Omnipotent → Can do the seemingly impossible. Yes.
So, what’s the difference?
When I look at my question there, the only answer that seems appropriate is ‘Introspection’ as that’s at least a step towards an answer.
And if Omega comes up to me and says “I was going to kill you if you gave me $100. But since I’ve worked out that you won’t, I’ll leave you alone,” then I’ll be damn glad I wouldn’t have agreed.
This really does seem like pointless speculation.
Of course, I live in a world where there is no being like Omega that I know of. If I knew otherwise, and knew something of their properties, I might govern myself differently.
I think my answer would be: “I would have agreed, had you asked me when the coin’s chances were .5 and .5. Now that they’re 1 and 0, I have no reason to agree.”
Seriously, why stick with an agreement you never made? Besides, if Omega can predict me this well, he knows how the coin will come up and how I’ll react. Why, then, should I try to act otherwise? Somehow, I think I just don’t get it.
I have to admit, I’ve never understood Hanson’s Near-Far distinction either. As described it just doesn’t seem to mesh at all with how I think about thinking. I keep hoping someone else will post their interpretation of it from a sufficiently different viewpoint that I can at least understand it well enough to know if I agree with it or not.