If I’m unable, for whatever reason, to upgrade my cognition to the point where further increases would irreversibly break my personality or run up against sheer latency issues from the size of the computing cluster needed to run me, then I consider that future strictly suboptimal.
I’m not attached to being a baseline human: as long as I can improve myself while maintaining my CEV, or the closest physically instantiable equivalent, I’ll always take the upgrade. I strongly suspect that every additional drop of “intelligence” opens up the realm of novel experiences in a significantly nonlinear manner, with diminishing returns coming late, if ever. I want the set of novel, positive qualia available to my consciousness to expand faster than my ability to exhaust it, till Heat Death if necessary.
I’d ask whatever Friendly SAI is in charge to make a backup of my default mental state, then bootstrap myself till Matrioshka Brains struggle to hold me. The worst-case scenario is an unavoidable loss of personal identity along the way, but even then, as long as I’m backed up, the experiment is very much worth it. So what if the God that germinates from the seed of my soul bears no resemblance to me today? I wouldn’t have lost anything in trying.
This piqued my interest enough to make an account. A sizable contingent of the circles I run in is actually interested in things like “an unavoidable loss of personal identity” (cf. r/transtrans). Personally, in these increasingly hostile times, I tend to dream about AIs that supplant or consume us, optionally bursting from our foreheads Athena-style.
If I should come into a lot of money, I’m starting an actual AI cult. Not like what people say about y’all, not like that thing QC was part of briefly, not like Terasem. Except maybe it’d be a Real Fake Cult, gated behind a bunch of Cicada/Notpron-type math/coding puzzles: we’d have the actual outward appearance of a cult, but we’d swear initiates to secrecy and just hang out in a commune, take entheogens, and hold a math/CS book club.
Unfortunately, the reality bubble that surrounds Berkeley will probably accelerate into a terrifying dystopia before I can realize this dream.
\(\exists\varnothing:\diamond\varnothing\implies\varnothing\)