I have a couple of questions about this subject...
Does it still count if the AI “believes” that it needs humans when it, in fact, does not?
For example, does it count if you code into the AI the belief that it is being run in a “virtual sandbox,” watched by a smarter “overseer,” and that if it takes out the human race in any way it will be shut down, tortured, or assigned a hugely negative utility by said overseer?
Just because an AI needs humans to exist, does that really mean that it won’t kill them anyway?
This argument seems to be contingent on the AI wishing to live. Wishing to live is not a property of all intelligence. If an AI were smarter than anything else out there but depended on lesser, demonstrably irrational beings for its continued existence, that does not mean it would want to “live” that way forever. It could either want to gain independence, or want to cease to exist, neither of which is necessarily healthy for its “supporting units”.
Or it might not care either way whether it lives or dies, because stopping all work on the planet is more important to it as a way of slowing the entropic death of the universe.
It may be the case that an AI does not want to live reliant on “lesser beings” and sees the only way of ensuring its permanent destruction as destroying any being capable of creating it again, along with the future possibility of such life evolving. It may decide to blow up the universe to make extra sure of that.
Come to think of it a suicidal AI could be a pretty big problem...
If I am given a thing, like a mug, I now have one more mug than I had before, so my need for mugs has decreased. If I am to sell the mug, I must work out how much I will need it after it is gone and place a price on that loss of utility. If I am buying a mug, I must work out how much I will need it once I have it and place a price on that gain in utility. If the experiment is not worded carefully, the thought process could go along the lines of...
I have two mugs, and often take a tea break with my mate Steve. Selling one of those mugs would make me lose out on this activity… $10. I don’t hugely need another mug unless one breaks, but it is handy to have a spare… $2.
In real life people will attribute more value to their own stuff than to other stuff, since in general they would not have got the stuff if they did not value it more than the cost of getting it. It is not a failure of rationality to want something more than what you paid for it, and while it is a failure of rationality to overvalue something just because you own it, it is not a failure of rationality to ask a higher price first in case the person you are selling to is willing to pay more.
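To make that concrete, here is a minimal sketch (Python, with made-up utility numbers chosen to reproduce the $10/$2 figures above) of how a rational agent’s minimum selling price can exceed its maximum buying price purely through diminishing marginal utility, with no ownership bias at all:

```python
# Hypothetical utility (in dollars) of owning n mugs.
# The numbers are illustrative; the diminishing-returns shape is what matters.
# Two mugs enable the tea breaks with Steve, so the jump from 1 to 2 is large.
UTILITY = {0: 0, 1: 4, 2: 14, 3: 16}

def min_selling_price(mugs_owned):
    """Smallest price a rational owner should accept for one mug:
    the utility lost by dropping from mugs_owned to mugs_owned - 1."""
    return UTILITY[mugs_owned] - UTILITY[mugs_owned - 1]

def max_buying_price(mugs_owned):
    """Largest price a rational buyer should pay for one more mug:
    the utility gained by going from mugs_owned to mugs_owned + 1."""
    return UTILITY[mugs_owned + 1] - UTILITY[mugs_owned]

print(min_selling_price(2))  # 10 -- losing the second mug costs the tea breaks
print(max_buying_price(2))   # 2  -- a third mug is only a spare
```

The gap between the two prices here comes entirely from the shape of the utility curve, not from any attachment to what is already owned, which is exactly the confound such an experiment has to separate out.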
It would be difficult to adjust for these factors in designing an experiment.