Maybe the lowest-hanging fruit was already picked. If someone tried to write Sequences 2.0, what would it be about? Cognitive biases that Eliezer skipped?
Something I feel Yudkowsky doesn’t really talk about enough in the Sequences is how to be rational in a group, as part of a group, and as a group. There is some material on this in the Sequences, and HPMOR offers some as well, but very little of it is as formalized as the ideas around “Politics is the Mindkiller/Spiders/Hard Mode” or “the Typical Mind Fallacy.”
Yudkowsky also mentions that what he writes about rationality is his path. Some things generalize (most people have the same cognitive biases, just in different amounts), but from reading the final parts of the Sequences and the final moments of HPMOR I get the vibe that Yudkowsky really wants people to develop their own paths. Alicorn did this, Yvain also did/does it to some extent (and, reading the early non-Sequence posts, I think MBlume did a bit of this too), but it’s something that could be written about more. Now, I agree that this is hard, that the lowest fruit has probably already been picked, and that it’s not something everyone can do. But I find it hard to believe that there are only 3 or 4 people who can actually do this. The bonobo rationalists on tumblr are, in their own weird way, trying to find a good way to exist in the world in relation to other people. Some of this is formalized, but most of it exists in conversations on tumblr (which is an incredibly annoying medium, both to read and to share). Other people/places from the Map probably do stuff like this as well. I take this as evidence that there is still fruit hanging low enough to pick without a ladder.
Slime mold can be used to map subway routes.
Edit: Markets can also be seen as non-human optimizing actors, even if their smallest parts are human.