Maybe some Homo sapiens would survive; humanity wouldn’t. Are the human animals in 1984 “people”? After Winston Smith dies, is there any humanity left?
I can envision a time when less freedom and more authority are necessary for our survival. But a god-like totalitarian pretty much comes out where extinction does in my utility function.
Oh. My mistake. When you wrote, “Plus wishing for all people to be under the rule of a god-like totalitarian sounds to me like the best way to destroy humanity,” I read:
[Totalitarian rule …] … [is] … the best way to destroy humanity (as in cause and effect).
OR maybe you meant: wishing … [is] … the best way to destroy humanity.
It just never occurred to me you meant, “a god-like totalitarian pretty much comes out where extinction does in my utility function”.
Are you willing to consider that totalitarian rule by a machine might be a whole new thing, and quite unlike totalitarian rule by people?
Jack wrote on 09 September 2009 05:54:25PM: “Plus wishing for all people to be under the rule of a god-like totalitarian sounds to me like the best way to destroy humanity.”
I don’t wish for it. That part was inside parentheses with a question mark. I merely suspect it MAY be needed.
Please explain to me how the destruction follows from the rule of a god-like totalitarian.
Thank you for your time and attention.
With respect and high regard,
Rick Schwall, Ph.D.
Saving Humanity from Homo Sapiens (seizing responsibility, even if I NEVER get on the field)
Maybe some Homo sapiens would survive; humanity wouldn’t. Are the human animals in 1984 “people”? After Winston Smith dies, is there any humanity left?
IIRC, Winston Smith doesn’t die; by the end, his spirit is completely broken and he’s practically a living ghost, but alive.