I also personally do straightforwardly think that most of the efforts of the extended EA-Alignment ecosystem are bad
Do you have a diagnosis of the root cause of this?
I have definitely taken actions, within the bounds of what seems reasonable, aimed at getting the EA community to shut down or disappear (and will probably continue to do so).
Why not try to reform EA instead? (This is related to my previous question. If we could diagnose what’s causing EA to be harmful, maybe we can fix it?)
I have spent roughly 40% of the last 1.5 years trying to reform EA. I think I had a small positive effect, but it has also been extremely tiring and painful, and I consider my duty with regard to this done. Buy-in for reform among leadership is very low, and people seem primarily interested in short-term power-seeking and ass-covering.
The memo I mentioned in another comment has a bunch of analysis; I'll send it to you tomorrow when I am at my laptop.
For some more fundamental analysis, I also have this post, though it's only a small part of the picture: https://www.lesswrong.com/posts/HCAyiuZe9wz8tG6EF/my-tentative-best-guess-on-how-eas-and-rationalists