tFUS could be one of the best techniques for improving rationality, esp b/c [AT THE VERY MINIMUM] it is so new and variance-increasing. If the default outcome is not one we want (as was the case with Biden vs Trump, where Biden dropping out was the desirable variance-increasing move) [and as is the case now among LWers who believe in AI doom], we should be increasing variance rather than decreasing it. tFUS may be an avenue for better aligning people’s thoughts with their actions, especially when a hyperactive DMN or rumination gets in the way of their ability to align with themselves (tFUS being a way to shut down this unhelpful self-talk).
Even Michael Vassar has said that “eliezer becoming CEO of openwater would meaningfully increase humanity’s survival 100x” and that “MIRI should be the one buying openwater early devices trying to use them to optimize for rationality.”
[btw if anyone knows of tFUS I could try out, I’m totally willing to volunteer]