Replace “threshold” with “critical point.” I’m using this terminology because EY himself uses it to frame his arguments. See Cascades, Cycles, Insight, where Eliezer draws an analogy between a fission reaction going critical and an AI FOOMing.
It thinks way faster than a human, remembers more, can make complex plans … but is it smarter than a human?
This seems tangential, but I'm going to say no, as long as we assume the rat brain doesn't spontaneously acquire language or human-level abstract reasoning skills.