It would have been straightforward to have the AI just... buy back stocks, or do other things to invest in the long-term future that hurt the stock price now.
“We’re doing a $1 million Kickstarter to release all our code, open source. We are committed to ensuring security for our users and clients, and this is the best way forward in light of all these zero-day exploits (of major companies such as …) in the wild.
We will, of course, be experimenting with different protocols for handling ‘responsible disclosure’.”
The second model-splintering is the morality of creating the teddies. For most of us, this will be a new situation, which we will judge by connecting it to previous values or analogies.
You can also judge by...interacting with the new reality.
One such analogy: the morality of the master-servant relationship that this resembles.
Who’s to say they don’t take over after those people die? Throwing away the company in favor of influencing the world (their new advisors say AIs are a great investment, or that the existing one is)...
(Given the ability for them to manage social media accounts)
This obviously calls for Ship of Theseus-ing this until we end up with:
Well, they’re people brought back from the dead, but they are not human.