AI systems developed today are instead created by machine learning. This means that the computer learns to produce certain desired outputs, but humans do not tell the system how it should produce the outputs. We often have no idea how or why an AI behaves in the way that it does. [...]
The AI systems made in 2024 are different. Instead of being carefully built piece by piece, they’re created by repeatedly tweaking random systems until they do what we want. This means the people who make these AIs don’t fully understand how they work on the inside.
I think Claude’s version of this point is better, mainly because it avoids the word “output”; that’s a programming/computer science term that I expect the average person not to understand (at least not in this kind of context). “Do what we want” is much clearer.
Great post!