we can somewhat comfortably map our previous assumptions onto a new domain.
I’m not sure how comfortably.
I saw a bit of the movie Her, about the love affair between a guy and his operating system. It was horrifying to me, but I think for a different reason than for everyone else in the room. I was thinking, “he might be falling in love with an automaton. How do we know whether he is in a relationship with another mind, or just with an unthinking mechanism of gears and levers that looks like another mind from the outside?” The idea of being emotionally invested in an emotional void bothers me. I want my relationships to be with other minds.
Some here see this as a meaningless distinction: the being acts the same as a mind, so for all intents and purposes it is a mind. What difference does it make to your utility if the relationship is with an empty chasm shaped like a person? The inputs and outputs are the same.
Perhaps. I’m still working through this, and perhaps my discomfort is a facet of an outdated worldview. I’ll note, however, that this reduces charity to fuzzy-seeking: it makes no difference whether you actually help, as long as you feel like you help. Presented with the choice, you should then be indifferent between saving one life and saving every life, so long as the feeling is the same.
In any case, I feel safe in presuming the consciousness of other humans, not because they resemble me in outputs, but because we were both produced by the same process of evolution, and it would be strange if evolution made me conscious but not the beings that are genetically near-identical to me. I do not so readily make that assumption for a non-human, even a human-brain emulation running on hardware other than a brain.