Now that these advanced systems exist, they’ve been observed to compete with each other for scarce resources, and — especially at high frequencies — they appear to have become somewhat apathetic to human economies. They’ve decoupled themselves from the human economy because events that happen on slower human time scales — what might be called market “fundamentals” — have little to no relevance to their own success.
I’m curious about this, and specifically what’s meant by this “decoupling”. Anyone have a link to research about that?
It sounds somewhat like “financial AIs are paperclipping the economy” or possibly “financial AIs are wireheading themselves”, or both. If either is true, that means my previous worries about unfriendly profit-optimizers were crediting the financial AIs with too much concern for their owners’ interests.
Computer programs that maximize entropy — more precisely, the entropy of their possible future histories — exhibit intelligent-looking behavior.
Kevin Kelly linked to it, which means it might make sense, but I’m not sure.
It sounds like Prigogine's dissipative structures (energy flowing through a system drives local self-organization), but I'm not sure about Prigogine, either.
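To make the claim concrete, here is a minimal toy sketch of the idea behind "causal entropic forces": an agent that greedily picks whichever move leaves it the largest number of distinct reachable futures within a fixed horizon. This is not the algorithm from the linked paper — the grid, the horizon, and every name below (`reachable_states`, `entropic_step`) are invented for illustration; the entropy is just the log of a uniform count over reachable states.

```python
import math

# Hypothetical toy world: a walled-in room. The agent starts in a corner
# and, with no goal at all, drifts toward open space simply because more
# distinct futures remain reachable from there.
GRID = [
    "#########",
    "#       #",
    "#       #",
    "#########",
]
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]  # right, left, down, up, stay

def free(pos):
    r, c = pos
    return 0 <= r < len(GRID) and 0 <= c < len(GRID[0]) and GRID[r][c] == " "

def reachable_states(pos, horizon):
    """Set of cells reachable from pos within `horizon` moves."""
    frontier, seen = {pos}, {pos}
    for _ in range(horizon):
        frontier = {(r + dr, c + dc)
                    for (r, c) in frontier for (dr, dc) in MOVES
                    if free((r + dr, c + dc))}
        seen |= frontier
    return seen

def entropic_step(pos, horizon=4):
    """Pick the legal move whose successor maximizes log(# reachable futures),
    i.e. the entropy of a uniform distribution over those futures."""
    candidates = [(pos[0] + dr, pos[1] + dc) for dr, dc in MOVES]
    return max((p for p in candidates if free(p)),
               key=lambda p: math.log(len(reachable_states(p, horizon))))

pos = (1, 1)          # corner of the room
for _ in range(5):
    pos = entropic_step(pos)
# The agent ends up near the middle of the room, not stuck in the corner.
```

The "intelligence" on display is very thin — it is just greedy future-option maximization — but it shows how goal-like behavior (move to open space, avoid dead ends) can fall out of entropy maximization alone.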
Previously:
http://lesswrong.com/lw/h96/link_causal_entropic_forces/
http://lesswrong.com/lw/h7r/open_thread_april_1530_2013/8th1