I’m currently writing a book on the Singularity and have consequently become very familiar with the organization’s work. I have gone through most of EY’s writings and have an extremely high opinion of them; his research on AI plays a big part in my book. I have also been ending my game theory classes with “rationality shorts” in which I present some of EY’s material from the Sequences.
I also have a high opinion of the writings of Carl Shulman (an SI employee), including “How Hard is Artificial Intelligence? Evolutionary Arguments and Selection Effects” (co-authored with Bostrom) and his paper on AGI and arms races.