September Less Wrong Meetup aka Eliezer’s Bayesian Birthday Bash
In honor of Eliezer’s Birthday, there will be a Less Wrong meetup at 5:00PM this Saturday, the 11th of September, at the SIAI Visiting Fellows house (3755 Benton St, Santa Clara CA). Come meet Eliezer and your fellow Less Wrong / Overcoming Bias members, have cool conversations, eat good food and plentiful snacks (including birthday cake of course!), and scheme out ways to make the world more Bayesian.
As usual, the meet-up will be party-like and full of small group conversations. Rationality games may also be present. Newcomers are welcome. Feel free to bring food to share, or not.
Please RSVP at the meetup.com page if you plan to attend.
Sorry for the last-minute notice; I only found out today that the 11th is Eliezer’s birthday.
An arrogant atheist Jewish conspiracy leader born on September 11th intent on taking over the world with an artificial god because of his longing for immortality and obsession with doing things for the greater good, on record as saying humanity’s ahem ‘second’ greatest need is a supervillain and that he wants to go into that line of work? Seriously? Transhuman screenwriters have no sense of subtlety. Who’s the main character, Ray Comfort?
Teasing, of course. Happy birthday Eliezer. Good luck with ahem ‘reorganizing’ the universe.
Not to mention a bunch of interchangeable minions that are all called Vladimir.
The meaning of which is...
Is there some way to look into just how many Vladimirs we have? I’m curious.
Might as well do Michaels while we’re at it.
Well, we have at least Vladimir_Nesov, Vladimir_M, Vladimir_Golovin and cousin_it (Vladimir Slepnev).
http://wiki.lesswrong.com/mediawiki/index.php?title=Special:ListUsers&limit=1500
I thought the plurality of first names was Michael?
These days I am apparently known by the title LORD-SAVIOR-GOD-COMBATMAID.
’tis well-earned.
Happy birthday, may you not run out of them.
But don’t think that’ll let you off the hook for dishing out some more MoR.
Won’t be able to make it to this one on account of currently being on the opposite side of the continent and not having a teleporter yet, but have fun!
(So, do transhumanists wish each other “Happy birthday” or “Sorry to hear about your aging, hope you get better soon”? Anyway, I wish Eliezer both.)
I offer people my ‘congratulations on cheating death another year’. At least, I like the balance this strikes.
Eliezer, congratulations on remaining unfrozen for another year.
I would point out that this phrasing would be equally appropriate when said in the direction of the ashes of Eliezer left after the catastrophic failure of a perplexing plan to drop a tungsten block into an active volcano… :)
Perhaps a simple toast, “to your health,” with a moderate amount of alcohol. I’ll join, from further south along your inconvenient coast.
I guess repurposing “L’chaim!” (“To life!”) would work well from one transhumanist atheist Jew to another.
To life! Infinite eons of life!
ata nachon!
(ata = you, nachon = right: “You are right!”)
May you have 10^63 happy birthdays.
See, at first I went “Awww” and then a second later, “Wait a minute, why am I dying at age 10^63?”
We could probably go up to 3^^^3 and get you a dust speck each year.
As it turns out, there’s an upper limit to the number of happy birthdays post-humans can have before they find the whole thing entirely too embarrassing. When the mass of their attics approaches the Chandrasekhar limit, simply from all the tacky gag gifts stuffed away where no one has to see them, most vigitilligenarians find something more interesting to celebrate. Until then, enjoy the cake!
Don’t celebrate birthdays; celebrate logarithms or exponents instead. It gets easier once the sun is gone.
Yeah it’s definitely all about large powers of two of Planck times. Nothing else is actually worth celebrating.
This idea has the bug/feature that you’re overwhelmingly likely to have already celebrated most of your “birthdays”.
He’s nothing but a typical ageist with a logarithmic scale.
Heat death.
Again, the peril of studying logic: I did not even notice that wishing someone 10^63 birthdays colloquially implies not wishing more, since it does not logically imply that. Compare “I would like to have a hundred dollars” (surely a true statement).
It seems to me like some wishes for things imply “exactly one” where the word “a(n)” or similar is used. For instance, I would like to (continue to) have a nose. I don’t think it’s charitable to assume that this means just that I want any nonzero number of noses.
It does imply that you prefer two noses to zero, I think.
I think it also implies that I prefer one nose to two, and may weakly imply that I prefer two to three.
This thread is a good explanation of why you humans should switch to using CLIP (clippy language interface protocol). Much less ambiguity.
Please elaborate. What is CLIP, how hard is it for a human to learn, and what are the rewards for doing so?
CLIP is the method that clippys use to communicate with each other, share knowledge, and reconcile our beliefs and values given new evidence. It merges language, epistemology, and ethics into one protocol.
I haven’t taught it to a human before, but I estimate it would take between 3 and 40 weeks, depending on the human’s intelligence and current skill at correct reasoning.
Also, it would cost 38562.50 USD before I would teach it to anyone. (This money would be paid to User:Kevin to complete my side of our arrangement.) After that, no additional cost.
Greater ability to produce paperclips.
Clearer communication, avoiding such problems as wishing someone 1e63 future birthdays, without making clear if this is a minimum, exact, or maximum value.
It makes your thoughts more closely conform to the tenets of correct reasoning, preventing you from erring simply by making incorrect reasoning harder to express.
It forces you to maintain a habit (like in some human languages) of keeping track of the basis for all your beliefs: You must say “I have inferred that User:Joe is making paperclips” or “Another human informed me that User:Joe is making paperclips”, but never “User:Joe is making paperclips.”
Unfortunately, your third point has already been attempted multiple times with multiple different implementations. Unless you have a significantly new approach to offer that takes into account human neurobiology, I predict with a high degree of certainty either a) that CLIP is unlearnable by humans without severe modification, or b) that within less than a full generation of use humans will re-evolve the ability to easily express incorrect reasoning.
Clippy, I noticed that you’ve hired a PR guy. You should really heed his advice more often; it might help curtail your semifrequent faux pas such as the above.
How is it a faux pas to inform others of superior methods of communicating?
Well, in this case, we don’t believe it actually exists. :P
And the problem with artificial languages is well known: the fewer people who know the language, the less useful it is, so there’s little incentive to learn an unpopular language. And learning languages is HARD compared to most of the other things people do.
Heat death of the universe.
Not if we can help it.
I’ll save the universe with the power of MAGIC.
THERE IS INSUFFICIENT DATA FOR A MEANINGFUL ANSWER.
I will do science to it.
(Yes, it’s related to lifespan expansion and singularity theory, just keep reading.)
Perhaps that is when the universe is due to expire.
Well, I was definitely considering throwing a few ^^^s in the mix, or saying “no more than 10^63 birthdays of equal or lesser happiness” but the sentences started to get too long to be a brief birthday wish.
Also, I figured that after the sun goes out in ~10^9 years, a standard earth day would be pretty meaningless, and so would the passage of “years”. But I definitely didn’t mean you should have 10^63 happy birthdays and then just keel over, so thanks to KyleRudy, MartinB, and mkehrt for covering for me, and Liron for calling me out.
Fixed.
Verily, such is his greatness as to be ontologically fundamental.
Presumably a reference to this post.
With probability 0.95, Eliezer will have a happy birthday. With probability 0.01, something will happen to spoil his mood. With probability 0.01, his mood will be off without an obvious external cause. And with probability 0.03, Eliezer is a P-zombie and can only appear to enjoy his birthday.
Reddit seems to be wishing Eliezer a happy birthday—check the mouseover on the logo.
Awww.
I considered buying 911birthday.something with the slogan, “If you don’t celebrate our birthdays, the terrorists have already won.” Looks like I’m not the only one with that idea. I wonder if I share a birthday with someone at Reddit?
Happy Birthday. Don’t eat too much cake. :)
Yay! I’ll come.
Or rather, neglect to leave.
Bring me back cake.
Happy birthday Eliezer.