“Perhaps it is the fear of being too late that is causing you distress. Perhaps you fear that humanity is going to be destroyed because you didn’t build an FAI soon enough. Perhaps you fear that your life will end some 10,000 years sooner than you’d like.”
Humanity’s alleged demise is not the only possible way he could be too late. I wonder where Eliezer would turn his attention if someone (or some group) solved the problems of FAI before him.
Eliezer has written a number of times about how comparing your intelligence and rationality to those around you is pointless (being good in comparison is never good enough). That philosophy has so far been directed at comparing oneself to lower levels of cognition, but I don’t see why it shouldn’t work bottom-up as well. Learn from the levels above you, but do not lionize them. Just as we all aspire to embody the higher levels, I’m sure Jaynes did too (an old vampire, but not old enough).
Eliezer: I don’t think we should worry about our particular positions on the bell curve, or set goals for where we want to be. Don’t fret over the possible limitations of your brain; doing so will not change them. Just work hard and try your best, and always attempt to advance and push the limitations. Jaynes was struggling against his meat-brain too. It’s human: you both surpassed the village idiots and the college professors, and the difference in levels becomes more and more negligible with each step toward the limit. Everybody is working with meat designed by the “idiot god”. Push it to the limit, hate the limit, but don’t be self-conscious about it.
We all wish we had gotten an earlier start on things. Their importance is perhaps something you have to learn as you grow.