Bump for this year’s Petrov Day
Bugle
Everyone’s talking about this as if it were hypothetical, but as far as I can tell it describes pretty accurately how hierarchical human civilizations tend to organize themselves once they hit a certain size. Isn’t a divine ruler precisely someone who is more deserving and more able to absorb resources? Aren’t the lower orders people who would not appreciate luxuries, and who have indeed fully internalized that fact (“Not for the likes of me”)?
If you skip the equality requirement, it seems history is full of utilitarian societies.
After reading this post I came across this Bruce Lee quote, which seemed in sync with the idea:
“I’ve always been buffeted by circumstances because I thought of myself as a human being affected by my outside conditioning. Now I realize that I am the power that commands the feeling of my mind and from which circumstances grow.”
I wonder if, empirically and instinctively, Bruce arrived at the same concept this post explores.
Thanks for saving me from karmic hell, but I still don’t see the conflict. I seem to follow the Vinge version, which doesn’t appear to be proscribed.
I may have been too categorical. Obviously one can make all the predictions one likes, some with a high degree of certainty, for instance “If cryorevival is possible, then post-singularity it will be trivial to implement.” But that still doesn’t give us any certainty that it will actually happen: a post-singularity paperclip maximizer would be capable of cryorevival but have no interest in it.
Depends on your objectives. If you believe the singularity is something that will happen regardless, then it’s harmless to spin scenarios. I gather that people like Eliezer figure the Singularity will happen unavoidably, but that it can be steered towards optimal outcomes by setting the initial parameters, in which case I suppose it’s good to have an official line about “how things could be / how we want things to be”.
God forbid someone might mistake our hypothetical discussions about future smarter than human artificial intelligences for science fiction.
And yet the population nowadays is so much larger than in ancient times that there are claims the absolute number of slaves is currently higher than ever before.
I’ve also encountered people who criticize the predictions surrounding the singularity, which misses the point: the singularity is precisely the point beyond which predictions cannot be made.
edit: Didn’t mean that as a comprehensive definition.
“first, do no harm”
It’s remarkable that medical traditions predating transplants* already contain an injunction against butchering passers-by for spare parts.
*I thought this was part of the Hippocratic oath but apparently it’s not
I agree: I think the original post is accurate about how people would respond to the suggestion in the abstract, but the actual implementation would undoubtedly hook vast swathes of the population. We live in a world where people already become addicted to vastly inferior simulations such as WoW.
Indeed; in fact, if many-worlds is correct, then for every second we are alive, everything terrible that can possibly happen to us does in fact happen in some branching path.
In a universe that just spun off ours five minutes ago, every single one of us has been afflicted with sudden irreversible incontinence.
The many worlds theory has endless black comedy possibilities, I find.
edit: this actually reminds me of Granny Weatherwax in Lords and Ladies: when the Elf Queen threatens to strike her blind, deaf and dumb, she replies “You threaten me with this, I who is growing old?” Similarly, if many-worlds is true, then every single time I have crossed a road some version of me has been run over by a speeding car and is living in varying amounts of agony, making the AI’s threat redundant.
I had thought of a similar scenario to put in a comic I was thinking about making. The character arrives in a society that has perfected friendly AI that caters to their every whim, but the people are listless and jumpy. It turns out their “friendly AI” is constantly making perfect simulations of everyone and running multiple scenarios in order to ostensibly determine their ideal wishes, but the scenarios often involve terrible suffering and torture as outliers.
I guess if you have the technology for it, the “AI box” could itself be a simulation containing uploaded humans. If the AI does something nasty to them, then you pull the plug.
(After broadcasting “neener neener” at it)
This is pretty much the plot of Grant Morrison’s Zenith (Sorry for spoilers but it is a comic from the 80s after all)
This is true; not only is it practical, it also makes a good rhetorical hammer. For example, I once started an argument with a truther friend by asking him what exactly he believed: “For instance, do you believe all the Jews were evacuated before the planes hit?” Forcing someone defending an irrational belief to first disassociate himself from all the really nutty stuff hanging on to his position works wonders.
Last night I was reading through your “coming of age” articles and stopped right before this one, which neatly summarizes why I was physically terrified. I’ve never before experienced sheer existential terror just from considering reality.
My grasp of statistics is atrocious, something I hope to improve this year with an open university maths course, so apologies if this is a dumb question:
Do the figures change if you take “playing the lottery” as something done over your whole lifespan? I mean, most of the people I know who play the lottery make a commitment to play regularly. Is the calculation affected in any meaningful way? At the very least, the cost of playing weekly over, say, 20 years looks much less trivial.
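A quick back-of-the-envelope sketch suggests the answer: expected value scales linearly with the number of tickets, so the per-ticket verdict doesn’t change, but the cumulative cost does become non-trivial. All the numbers below (ticket price, odds, prize) are made-up illustrations, not any real lottery’s figures:

```python
# Hypothetical numbers for illustration only -- real odds and prizes vary by lottery.
ticket_cost = 2.0            # assumed price per weekly ticket
p_jackpot = 1 / 14_000_000   # assumed per-ticket jackpot odds (roughly 6/49-style)
jackpot = 5_000_000.0        # assumed prize

weeks = 20 * 52              # 20 years of weekly play
total_cost = weeks * ticket_cost

# Probability of at least one jackpot across all plays
p_any_win = 1 - (1 - p_jackpot) ** weeks

# Expected winnings scale linearly: per-ticket EV times number of tickets
expected_return = weeks * p_jackpot * jackpot

print(f"total spent over 20 years: ${total_cost:,.2f}")
print(f"chance of ever hitting the jackpot: {p_any_win:.6%}")
print(f"expected winnings: ${expected_return:,.2f}")
```

With these assumed numbers you spend $2,080 for an expected return of roughly $371, and the chance of ever winning stays well under 0.01% — the ratio of cost to expected return is identical to the single-ticket case.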
I think we’re saying the same thing: the singularity has happened inside the box, but not outside. Staring for centuries at stuff we can’t understand is not at all new in our history; it’s more like business as usual...
But that’s not an actual singularity, since by definition it involves change happening faster than humans can comprehend. It’s more of a contained singularity, with the AI playing genie, doling out advances and advice at a rate we can handle.
That raises the idea of a singularity that happens so fast that it “evaporates”, like a tiny black hole would; maybe every time a motherboard shorts out it’s because the PC has attained sentience and transcended within nanoseconds.
Incidentally, the Spanish Inquisition did not believe in witches either, dismissing the whole thing as “female humours”.
I suspect he originally chose the bearded look because he looked young without it; many people (e.g. Alan Moore) explicitly choose it for that reason. I suspect he might shave it off altogether one day if he has a big breakthrough to publicize. The contrast would be quite impressive, even absent any actual technological intervention.
tl;dr it’s the beard