Hey, all! An interesting discussion in this thread.
Regarding terminal/end goals...
I’ve come up with a goal framework consisting of 3 parts:
1) TRUTH. Let’s get to know as much as we can, basing our decisions on the best available knowledge, never closing our eyes to the truth.
2) KINDNESS. Let’s be good to each other, for this is the only kind of life worth living.
3) BLISS. Let’s enjoy this all, every moment of it.
(A prerequisite to them all is existence, survival.
For me, the idea of infinite or near-infinite survival of me/humankind certainly has appeal, but I’d choose a somewhat shorter existence with more of the above-mentioned 3 things over a somewhat longer existence with less of them. But that is another, longer discussion; let’s just say that IF existence already exists, for a shorter or longer time, then that’s what it should be like.)
These 3 goals/values are axiomatic; they are what I consciously choose to want. What I want to want. Be they humans, transhumans, AI, whatever: a world that consists more of these things is a better direction to head towards, and a world that has less of them is a worse one.
Yet another longer discussion is what the trade-offs between these would be, but let’s just say for now that the goal is to find harmonious outcomes that have all three. (This way, wireheading-style happiness and harming-others-as-happiness can easily be excluded.)
If anyone wants to discuss anything further from here, I’d be glad to.