Evolution, Civilization, Intent. Yeah.
I think you meant Side Effect, not Civilization.
Ah! I’m an idiot!
Aren’t they all side effects, though?
I’d say no.
The strength of Sid’s wall and the beauty of In Tent’s wall were the things they were optimizing for. See: http://lesswrong.com/lw/tx/optimization/
The height and beauty of Sid’s wall were side effects. They were not what he was optimizing for, but they were consequences of the actions he took to maximize strength, and he may or may not have been aware of them.
The strength and height of In Tent’s wall might be labeled side effects, but I think a distinction should be made, because they were specifically chosen. Sid’s wall became beautiful haphazardly: he had no bias toward beauty in either direction, nor a bias toward ‘reasonably tall’. In Tent’s wall grew tall in a precisely chosen way; every inch of height was required to maximize beauty. (And had he not been strictly maximizing beauty, he could have chosen any other balance.)
I don’t think an In-Tent-style no-side effects model is possible outside of mathematical systems like certain puzzles or programs.
And of course evolution doesn’t intend anything, so you could say all the design was a side effect of the process, or that ‘side effect’ doesn’t apply because there was no goal.
But I’m looking for a more technically explicit definition so we could look at the process leading up to a specific ‘design choice’ and say it has an intentionality of .0002 or .96.
But I don’t think I gave good examples of processes with moderate levels of intentionality; this story was very either-or. A spectrum of genetic algorithms might do the trick.
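As a rough illustration of what such a spectrum might look like (this is my own toy sketch, not anything from the story): a mutate-and-select loop with an `intentionality` knob in [0, 1]. With probability `intentionality`, a mutation is kept only if it improves the chosen goal (beauty); otherwise it is kept blindly, and any beauty that accumulates is a side effect. At 0.0 you get pure drift, at 1.0 pure In-Tent-style optimization, and everything in between is a process with fractional intentionality.

```python
import random

def evolve(intentionality, steps=1000, seed=0):
    """Toy 'wall-building' process with a tunable degree of intent.

    intentionality in [0, 1]: probability that a mutation is retained
    only when it improves beauty (the chosen goal); otherwise the
    mutation is retained blindly, like drift.
    """
    rng = random.Random(seed)
    beauty = 0.0
    for _ in range(steps):
        delta = rng.uniform(-1.0, 1.0)
        if rng.random() < intentionality:
            # Intentional step: keep the change only if it helps the goal.
            if delta > 0:
                beauty += delta
        else:
            # Blind step: keep the change regardless of its effect.
            beauty += delta
    return beauty

# Sweep the spectrum from pure drift (0.0) to pure intent (1.0).
for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"intentionality={p:.2f} -> beauty={evolve(p):.1f}")
```

Under this framing, the "intentionality of .0002 or .96" for a design choice could be read off as the fraction of selection steps that were actually filtered by the goal, rather than accepted blindly.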