I’d love it if people could try the basic precautions and see how harmless they are! Especially because they might be the minimum ask in order to avoid getting your brain and motivation/values hacked.
Wouldn’t this be useful only if one knows for certain their ‘brain and motivation/values’ are not already ‘hacked’ beforehand?
Otherwise it would just strengthen the pre-existing ‘hacks’.
The exploits I’m aware of that could make someone chain themselves into remaining vulnerable or compromised are those that drive people to keep exposing themselves to high-risk environments that facilitate further exploits. For example, continuing to use social media or leaving webcams uncovered.
This is why the EV of ceasing social media use and covering up webcams is so high; they facilitate further manipulation to keep you and your friends vulnerable.
EDIT: I also think it’s worthwhile to think about things like planting ideas in people’s heads, or setting them up to react a certain way if specific conditions are met. For example, persuading them that caring exclusively about their friends and family, rather than the future, is part of the maturation process, or insinuating that Yudkowsky is evil, thereby preventing them from reading the Sequences or contributing to AI safety.
I tend to think such manipulation will primarily chain people back to social media, because that’s where the magic happens (especially because humans on the smarter end will inevitably become OOD over time, and hopefully become truer to themselves).