Do you still maintain that statement in 2015, with the ISIL attacks?
Postman said this in the context of television and new-age media, where even “news” and other relevant information is shown for its entertainment value, not because it can help us make better decisions.
Facts push other facts into and then out of consciousness at speeds that neither permit nor require evaluation.
Neil Postman, Amusing Ourselves to Death, p. 70
It has been more than three years since you commented. What is the status of the book? Is it in print now?
So basically, are you saying that Eliezer, gjm, and others are falling for the fallacy fallacy?
Hi, I’m Harsh Gupta, an undergraduate student studying Mathematics and Computing at IIT Kharagpur, India. I became interested in rationality when I came across the Wikipedia article on confirmation bias around two years ago. It was pretty intriguing, so I searched for more and read Dan Ariely’s book Predictably Irrational. I then read his other book, The Upside of Irrationality, and now I’m reading HPMOR and Kahneman’s Thinking, Fast and Slow. I also read The Art of Strategy around the same time as Ariely’s books, and that was a life changer too. The basic background in game theory that I got from The Art of Strategy helped me learn to analyze complex real-life situations from a mathematical perspective. I came to know about LessWrong from gwern.net, which was suggested by a friend who is learning functional programming. I want to get more involved with the community, and I would like to contribute some articles in the future. BTW, is there any community to-do list?
Eliezer wrote somewhere about what in HPMOR can and cannot be taken as the author’s own views. I forget the exact criterion, but I’m sure it did not include “everything said by HP”.
This is mentioned at the beginning of the book:
“…please keep in mind that, beyond the realm of science, the views of the characters may not be those of the author. Not everything the protagonist does is a lesson in wisdom, and advice offered by darker characters may be untrustworthy or dangerously double-edged.”
Fascinating article. My conclusion is that trying to create a perfectly aligned LLM will make it easier to break it into an anti-aligned LLM. I would say: alignment folks, don’t bother. You are accelerating the timelines.