Why is the word “obviously” in quotes?
Because I’m not just saying it’s not obvious an AI would recursively self-improve; I’m also referring to Eliezer’s earlier claims that such recursive self-improvement (aka FOOM) is what we’d expect given our shared assumptions about intelligence. I’m sort-of quoting Eliezer as saying FOOM obviously falls out of those assumptions.
I’m worried about the “sort-of quoting” part. I get nervous when people put quote marks around things that aren’t actually quotations of specific claims.
Noted, and thanks for asking. I’m also somewhat over-fond of scare quotes for marking a term I’m not totally sure is appropriate. Still, I believe the clarification above leaves no ambiguity about what I meant.