When you say “They do, however, have the potential to form simulacra that are themselves optimizers, such as GPT modelling humans (with pretty low fidelity right now) when making predictions”
do you mean things like “write like Ernest Hemingway”?
Yep. I think it also happens at a much smaller scale in the background—like if you prompt GPT with something like the occurrence of an earthquake, it might write about what reporters have to say about it, simulating various aspects of the world, possibly including agents, without our conscious direction.
Thanks!