In re FAI vs. snoozing: What I’d hope from an FAI is that it would know how much rest I needed. Assuming that you don’t need that snoozing time at all strikes me as an instance of the cultural assumption that theories (in this case, possibly about willpower, productivity, and virtue) should always trump instincts.
A little about hunter-gatherer sleep. What I’ve read elsewhere is that with an average of 12 hours of darkness and an average need for 8 hours of sleep, hunter-gatherers not only had varying circadian rhythms (teenagers tend to run late, old people tend to run early), but a common pattern was to spend some hours awake in the middle of the night for talk, sex, and/or contemplation. To put it mildly, this pattern is not available to the vast majority of modern people, and we don’t know what, if anything, this is costing us.
I think of FAI as being like gorillas trying to invent a human—a human which will be safe for gorillas, but I may be unduly pessimistic.
I’m inclined to think that raising the sanity waterline is more valuable than you think it is for such a long-range project. FAI is so dependent on a small number of people, and I think it will continue to be so; improved general conditions improve the odds that someone who would be really valuable doesn’t have their life screwed up early.
On the other hand, this is a “by feel” argument, and I’m not sure what I might be missing.
Leave out “artificial”: what would constitute a “human-friendly intelligence”? Humans don’t qualify. Even at our present intelligence, we’re a danger to ourselves.
I’m not sure “human-friendly intelligence” is a coherent concept, in terms of being sufficiently well-defined (as yet) to say things about, in the same way that “God” isn’t really a coherent concept.
There was also the pattern of 4 + 4 hours of sleep in relatively recent history.