Thank you for this link and also for your thorough response, "the gears to ascension"—I have some reading to do! Including "Overcoming Bias," which I am interested in as the basis for the answer in "Value is Fragile."
One of the first points Eliezer makes in "Value is Fragile" is that we almost certainly create paperclip generators if we take our hands off the steering wheel. One of the things that is special about humans is that some claim the human brain is the most complex structure in the universe, i.e. the opposite of entropy. Is the pursuit of complexity itself a goal (an "alignment"?) that by definition protects against entropy?
I grant that this may be a naïve thought, but I wonder: if things that are not paperclip generators are so hard to come by, how did humans and all of the other complex structures that we know of in the universe arise at all?