For a while I got into the habit of talking about systems that optimize their environment for certain values, rather than talking about intelligences (whether NI, AI, or AGI). I haven’t found that it significantly alters the conversations I’m in, but I find it gives me more of a sense that I know what I’m talking about. (Which might be a bad thing, if I don’t.)