Super Sapiens! …I mean sapience.
Adroit Acumen
Elevated Erudition
Superb Sagacity
Crack Contemplation
In this case it’s redirecting minds. That’s the ultimate goal, isn’t it?
Now that would be completely unacceptable indeed. Is, say, being on the business end of the mental health system in the worst way possible something like that? For myself, I don’t consider a life with something like that to be worth living.
So, the only reason you’re still alive is that you haven’t bothered (or been able) to verify whether you’ve forgotten thoughts you don’t remember having had? My sympathies.
Born and raised in Price Hill on the edge of Delhi. I have no recent close-up photos of myself, but you can probably find an old one by googling my username. Otherwise I’ll be the pale, nearsighted ginger with a ponytail, and some pi on his shirt.
Holding off on proposing locations. I am not familiar with the northern half of Cincinnati.
Ooh! This excites me. I’ll start looking at possible venues here in Cincy when I get off work today. I can also ping the local skeptic and atheist meet-up groups to see if there are any LessWrong readers among them who missed the poll and this posting (as I almost did) and have them reply.
Elena Huston—Future In My Hands: An anthem against status quo bias, the sunk cost fallacy, and appeals to authority (interpreting each even quatrain as a denigration of the prior odd quatrain).
I don’t know what donating my time to SI would entail other than writing, so I find it difficult to imagine in a positive frame. I may be able to get around this by training myself on the five-second level to instead mentally contrast a charity’s desired future outcomes with the present (or your favorite charity’s desired future outcomes, when tempted to switch) when asked, but how many others in my position will do so?
So where can I find anecdotes about how awesome and fun it is to be saving the world through FAI research and how rewarding it is to see your work have a direct impact, so I have something vicariously available to imagine when you ask me to donate my time?
If you have three arbiters and require at least two of them to be party to any transactions and the creation of new arbiters, one can be a trusted or paid third party without risking theft, account freeze or unauthorized arbiter creation and you can safely recover from losing a single device.
I am ignorant of the details necessary to implement this and how difficult it might be.
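For what it’s worth, the 2-of-3 scheme above can be sketched in a few lines. This is only a toy model: the names `Arbiter` and `authorize` are my own invention, and a real multisignature wallet would verify cryptographic signatures rather than checking list membership.

```python
# Toy sketch of the 2-of-3 arbiter scheme described above.
# All names here are hypothetical illustrations, not a real wallet API.

THRESHOLD = 2  # at least two of the three arbiters must approve

class Arbiter:
    def __init__(self, name):
        self.name = name

def authorize(action, approvals, arbiters):
    """An action (a transaction, or creating a new arbiter) succeeds
    only if at least THRESHOLD of the known arbiters approve it."""
    valid = [a for a in approvals if a in arbiters]
    return len(set(valid)) >= THRESHOLD

arbiters = [Arbiter("my phone"), Arbiter("my laptop"),
            Arbiter("trusted third party")]

# Losing one device is recoverable: the other two still meet the threshold.
print(authorize("spend", arbiters[1:], arbiters))   # two approvals -> True
# A single compromised arbiter (e.g. the third party) cannot act alone.
print(authorize("spend", arbiters[2:], arbiters))   # one approval -> False
```

The point the sketch makes concrete: no single party can spend, freeze, or mint arbiters alone, and any single loss is survivable, because every outcome needs two of the three keys.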
There is no problem with “Munchkinism.” The problem is that in old RPGs the rules imply a poorly designed tactical battle simulation game (witness the lack of challenge once the system is fully understood) with some elements of strategy, while the advertising implies a social interaction and storytelling game without providing the rules needed to support it. Thus different people think they’re playing different games together, and social interaction devolves into what people imagine they would do in a hypothetical situation without consequences (at least until the consequences are made explicit, violating their expectations as you note in your example).
Yes, that would be fair. Are you aware of any good methods for learning and practicing to be more concise?
On top of that, I expect there are already plenty of non-native, dedicated translators and interpreters for a given language gap. Oops, thank you both.
Oops. I realize now that I was confusing the definition of belief used here with the definition used for the game (a principled to-do list), so the idea isn’t as applicable as I originally thought, but I’ll try to answer you anyway.
As a player you can change your character’s beliefs almost as often as you like and the game rewards you for tailoring them to the context of each scene you enact, with different rewards depending on whether you act in accordance with them or undermine them (this encourages you to have conflicting beliefs, which increases the drama of the shared story). Then, between game sessions, all players involved nominate those beliefs you appear never to undermine for promotion to trait-hood (indicating you’ve fulfilled your character’s goals and they no longer need testing), and those you appear always to undermine for changing. Traits often give game mechanical bonuses and penalties, but can take almost a full story arc of deliberate undermining before being nominated for change.
Conflict in the game is handled in a very specific way. You describe your intent (what you want your character to achieve in the story) and how it is achieved; the GM declares the skill rolls or other game mechanics required and sets the stakes (consequences for failure). If neither the GM nor any of the players can think of an interesting direction a failed roll could take the story, then no roll is made, you get what you wanted, and the group moves on to the next, more interesting conflict. Otherwise, the stakes are negotiated and you choose whether to roll or change your mind. Once the roll is made, its results are irreversible within the fiction.
To a large degree it is up to the GM to create interesting and painful stakes with which to challenge your beliefs, so your mileage will vary.
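The procedure in the previous two paragraphs can be sketched as a small function. To be clear, the function name, dice, and numbers below are stand-ins I made up for illustration; they are not the actual Burning Wheel mechanics.

```python
import random

# Toy model of the intent/stakes procedure described above. The dice
# and numbers are invented for illustration, not actual game rules.

def resolve_conflict(intent, interesting_failure, skill, difficulty,
                     accept_stakes=True):
    """Return the outcome of a declared intent within the fiction."""
    if interesting_failure is None:
        # No one can imagine an interesting failure direction:
        # no roll is made, and you get what you wanted.
        return intent
    if not accept_stakes:
        # After the stakes are negotiated, the player may change
        # their mind instead of rolling.
        return "intent withdrawn"
    # Once the dice hit the table, the result stands in the fiction.
    roll = random.randint(1, 6) + skill
    return intent if roll >= difficulty else interesting_failure

# No interesting failure direction -> automatic success, no roll.
print(resolve_conflict("pick the lock", None, skill=2, difficulty=4))
```

The “skip the roll when failure is boring” branch is what keeps play moving toward conflicts that actually test the character’s beliefs.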
for just about any language there are huge numbers of native speakers who speak professional-level English
Exception: Sign Languages, though they have relatively small populations.
Re-reading this post reminded me of Burning Wheel, a tabletop role-playing game whose reward system actively encourages questioning, testing, and acting upon the goals and beliefs of a fictional character created by the player, but simultaneously and subversively places the character in complex situations that force the player to change those beliefs over time as a result of the conflicts they cause (and somewhat according to chance). The player has to accept that his character may become something completely alien to how it started over the course of play, yet must continue to empathize with it in order to be rewarded for acting out its actions in the fiction.
Would (re)designing such a game around further encouraging elements of rationality be too close to Dark Arts? (Luke Crane, the game’s creator, sometimes speaks about game design as a form of mind control at the gaming conventions he frequents.)
As would I.
I like how SIAI’s name references both the event you’re working toward and the method of achieving it. Is there a single word that describes a watershed event that would indicate the rationality institute’s direct success, the way “Singularity” does an intelligence explosion? One that supporters could rally around and label themselves by (singularitarian)? A word for approximating the ideal Bayesian updater, for felling akrasia, for actually changing one’s mind? Can we create or annex one?
Exaltation, Transcendence, Apotheosis, Enlightenment, Upload, Elevation, Laudation, Upgrade, Epiphanic, and Ideate come to mind, but what I’m looking for is something more like “the act (event) of becoming your best self” in a word. Too many of these have strong religious connotations for me.