My government name is Mack Gallagher. Crocker’s Rules. I am an “underfunded” “alignment” “researcher”. DM me if you’d like to fund my posts or my project.
I post some of my less-varnished opinions on my personal blog. In the past they went on my Substack.
If you like arguing with me on LessWrong, I’m currently free more or less round the clock to continue interesting arguments in my Discord.
Yudkowsky’s sequences [/Rationality: AI to Zombies] provide both of these things. People did not read Yudkowsky’s sequences and internalize the load-bearing conclusions thoroughly enough to prevent the current poor state of AI theory discourse, though they could have. If you want your posts to have a net odds-of-humanity’s-survival-improving impact on the public discourse, on top of Yudkowsky’s, I would advise you to condense your points and make the applications to concrete corporate actors, social contexts, and Python tools as clear as possible.
[ typo: ‘Merman’ → ‘Mermin’ ]