I have ten bucks for the first AI that defeats a gatekeeper (while risking some dough) and posts a link to the transcript here.
How about this one:
My guess is that Eliezer Yudkowsky feels that nobody can convince him to publish the transcripts.
How about, with the same protocols as the original experiment, someone wagers $10 over IRC chat that they can convince him to publish the transcripts? Somebody plays the AI and Eliezer plays the gatekeeper.
Any takers?
-Erik
Edit: Never mind.
I would like to play an AI.
Is this still true? I want to be the gatekeeper; message me.
Are you offering to prove a point, or just for fun?
I’m sublimating my urge to get into fights and hurt people.
Doesn’t sound healthy. I was going to offer to be an AI, but forget it.
I’m laughing so hard at this exchange right now (as a former AI who’s played against MixedNuts).
What was the result?
http://lesswrong.com/lw/gej/i_attempted_the_ai_box_experiment_and_lost/
And yet I did not find this post until this very day; how unfortunate...