The interpretations are usually far from trivial, and most aspire to inspire a testable model some day. Some even have, and have been falsified. That’s quite different from Last Thursdayism.
Well, I probably don’t know enough about QM to judge if they’re correct; but it’s certainly a claim made fairly regularly.
Why would it? A paperclip maximizer is already instrumental: it has one goal in mind, maximizing the number of paperclips in the universe (which it presumably can measure with some sensors). It may have to develop advanced scientific concepts, like General Relativity, to be assured that the paperclips disappearing behind the cosmological horizon can still be counted toward the total, given some mild assumptions, like the Copernican principle.
Let’s say it simplifies the equations not to model the paperclips as paperclips—it might be sufficient to treat them as a homogeneous mass of metal, for example. Does this mean that they do not, in fact, exist? Should a paperclipper avoid this at all costs, because it’s equivalent to them disappearing?
Removing the territory/map distinction means something that wants to change the territory could end up changing the map … doesn’t it?
I’m wondering because I care about people, but it’s often simpler to model people without treating them as, well, sentient.
Anyway, I’m quite skeptical that we are getting anywhere in this discussion.
Well, I’ve been optimistic that I’d clarified myself with pretty much every comment now, so I have to admit I’m updating downwards on that.