There’s a 2009 interview with a transhumanist Australian academic where Egan hints at some of his problems with transhumanism (even while stating elsewhere that human nature is not forever, that he expects conscious AI in his lifetime, that “universal immortality” might be a nice thing, and so forth). Evidently some of it is pure intellectual disagreement, and some of it is about not liking the psychological attitudes or subcultural politics that he sees.