You hold not having your consciousness altered or manipulated or otherwise tinkered with as an extremely high value.
Insert “without my conscious, deliberate, informed consent, and ideally agency”.
You think you’ll probably be miserable in the future
Replace “you’ll probably” with “you are reasonably likely to”.
and you find it hard to believe that the FAI will find you a friend comparable to your current friends.
Add “with whom I could become sufficiently close within a brief and critical time period”.
You won’t want to accept any type of brain modification or enhancement that would make you not miserable.
See first adjustment. N.B.: without my already having been modified, the “informed” part would probably take longer than the brief, critical time period.
If you’re sufficiently miserable, it’s likely that a FAI could change you without your consent
Yes. Or, perhaps not change me, but prevent me from acting to end my misery in a non-brain-tinkery way.
and you prefer death to the chance of that happening.
For certain subvalues of “that”, yes.