What you are describing is an accidental Clippy, just like humans are accidental CO2 maximizers. Which is a fair point: if we meet what looks like an alien Clippy, we should not jump to conclusions that paperclip maximizing is its terminal value.
Also, just to nitpick: if you have a lot of mass available, it would make sense to lump all this iron together and make a black hole, as you can extract a lot more energy by throwing stuff into it than from nuclear fusion proper. Or you can use fusion first, then throw the leftover iron bricks into the accreting furnace.
So the accidental Clippy would likely present as a black hole maximizer.
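To put rough numbers on the fusion-versus-accretion point above, here is a minimal Python sketch. It assumes textbook efficiency figures (fractions of rest-mass energy released): roughly 0.9% for fusing hydrogen all the way to iron-56, about 5.7% for accretion onto a non-spinning (Schwarzschild) black hole, and up to roughly 42% for a maximally spinning (extremal Kerr) one.

C = 299_792_458.0  # speed of light, m/s

# Fraction of rest-mass energy released by each process
# (standard order-of-magnitude figures, not precise values).
FUSION_H_TO_FE = 0.009           # hydrogen fused all the way to iron-56
ACCRETION_SCHWARZSCHILD = 0.057  # disk around a non-spinning black hole
ACCRETION_EXTREMAL_KERR = 0.42   # disk around a maximally spinning one

def energy_per_kg(efficiency: float) -> float:
    """Joules released per kilogram of matter processed."""
    return efficiency * C**2

for label, eff in [("fusion H -> Fe", FUSION_H_TO_FE),
                   ("accretion, Schwarzschild", ACCRETION_SCHWARZSCHILD),
                   ("accretion, extremal Kerr", ACCRETION_EXTREMAL_KERR)]:
    print(f"{label}: {energy_per_kg(eff):.2e} J/kg")

# Accretion beats fusion by a factor of roughly 6 to 45, and iron, being
# a dead end for fusion, still pays full price when dropped into the hole.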
“humans are accidental CO2 maximizers”
You’re abusing words. There’s a big difference between a producer of X and a maximiser of X.
You have a point, even if it is expressed in a hostile manner.
However, from the outside it is often hard to tell whether something is a goal or a side effect. It certainly looks like whatever afflicted this planet intends to produce as much CO2 as possible, even though fossil fuel burning sometimes has to be delayed to produce enough tech to get access to more fossil fuels. Or you can pick some other artifact of human development, and in the right light it would look as if maximizing it were a terminal goal, like the number of heavy objects in the air at any given time.
Not sure about that.
I’m sorry, I’m not trying to be hostile. But words have meanings. If you could equate “Y has been observed to produce some X among a myriad of other effects” with “Y is an X maximiser”, what’s the point in having a word like “maximiser”? Hell, even remove the “myriad of other effects”—a paperclip-making machine or paperclip factory isn’t a paperclip maximiser either.
“It certainly looks like whatever afflicted this planet intends to produce as much CO2 as possible”

It certainly does not.
Well, maybe to a very contrived observer: you’d have to have all the knowledge about our planet necessary to realise that a CO2 increase is happening (not trivial) and that it’s not a natural effect of whatever changes the planet naturally undergoes (even less trivial), and somehow magically be ignorant of any other details, to even entertain such a notion. Any more knowledge and you’d immediately begin noticing that our civilisation produces a myriad of effects that it would not bother producing if it were a CO2 maximiser, and that for all the effort and ingenuity that it puts into its works, the effects in terms of CO2 increase are actually rather pathetic.
You’re closer to a reasonable use of the term by calling the paperclip-producing advanced civilisation an incidental paperclip maximiser, because the end result will be the same—all matter eventually converted into paperclips. It’s still a stretch, though, because a maximiser would take the shortest route towards tiling the universe with paperclips, while the advanced civilisation will be actively trying to minimise paperclipping in proportion to its actual goals—it will try to extract as much usefulness out of every bit of matter converted as it can. So it’s still an incidental producer, not maximiser. Would an outside observer be able to tell the difference? I don’t know, but I suggest the way this civilisation would be doing a myriad of interesting things instead of simply focusing on the most efficient way to produce paperclips would be an easy giveaway.
Of course if we only look at end results to decide if we call something a “maximiser”, then any agent actively pursuing any goals is an “entropy maximiser”. At this point I stop feeling like language conveys useful meaning.
“if we meet what looks like an alien Clippy, we should not jump to conclusions that paperclip maximizing is its terminal value.”

Yes, certainly. It seems to me that the thought you were expressing is actually the opposite of the words I’ve been objecting to: if something looks like a maximiser, it’s possible it isn’t one.
I think I’d call it an incidental Clippy. It’s not creating paperclips accidentally; it’s just that the paperclips are only incidental to its true goal.
Right, incidental fits better.
Yes, good point that I hadn’t thought of, thanks. It’s very easy to imagine far-future technology in one respect and forget about it entirely in another.
To rescue my scenario a little: there’ll be an energy cost in bringing the iron together, and the cheapest way is to move it very slowly. So maybe there’ll be paperclips left for a period of time between the first pass of the harvesters and the matter ending up at the local black hole harvester.
And of course you can throw black holes into black holes as well, and extract even more energy. The end game is when you have just one big black hole and nothing left to throw into it. At that point you have to change strategy and wait for the black hole to give off Hawking radiation until it completely evaporates.
But all these things can happen later—there’s no reason not to go through a paperclip-maximization step first, if you’re that way inclined...
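On the “move it very slowly” point in the comment above, a back-of-the-envelope sketch of why slow transport is cheap. This is a deliberately crude toy model of my own (impulsive burns, no gravity assists, braking energy not recovered, non-relativistic speeds), and the shipment mass, distance, and transit times are made-up illustrative numbers: covering a distance d in time t means cruising at v ≈ d/t, and the kinetic energy you must supply scales as v², so every doubling of the transit time cuts the energy bill by a factor of four.

LIGHT_YEAR = 9.461e15  # metres
YEAR = 3.156e7         # seconds

def transport_energy_joules(mass_kg: float, distance_m: float,
                            time_s: float) -> float:
    """Energy to accelerate to cruise speed once and brake once."""
    v = distance_m / time_s  # cruise speed needed to arrive on time
    return mass_kg * v**2    # two impulses of (1/2) m v^2 each

m = 1e9  # a billion-kilogram iron shipment (illustrative)
d = 1.0 * LIGHT_YEAR
for t_years in (1e3, 1e4, 1e5):
    e = transport_energy_joules(m, d, t_years * YEAR)
    print(f"{t_years:.0e} yr transit: {e:.2e} J")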
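And for the patience required at the very end of that game, a minimal sketch using the standard Hawking evaporation formula, t = 5120πG²M³/(ħc⁴), assuming the hole absorbs nothing further while it evaporates (the masses chosen are illustrative):

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
YEAR = 3.156e7     # seconds per year
M_SUN = 1.989e30   # solar mass, kg

def evaporation_time_years(mass_kg: float) -> float:
    """Hawking evaporation time: t = 5120 * pi * G^2 * M^3 / (hbar * c^4)."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_seconds / YEAR

print(f"1 solar mass:      {evaporation_time_years(M_SUN):.1e} years")        # ~2e67
print(f"1e11 solar masses: {evaporation_time_years(1e11 * M_SUN):.1e} years")  # ~2e100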
Except if you’re really a paperclip maximizer.