Good points. I’d imagine the Powers chose the “drop knowledge into Grant’s mind” approach because he wouldn’t object to it; they may have other means for other people.
As for the general point, I don’t know, but the Powers can be exceedingly manipulative when they need to be, and respecting baseline human autonomy, even when made more rigorous, is not much of an obstacle for a superintelligence. Standard economic forces, without any direction, have been very effective even in our world...
See, we know the Powers didn’t drop knowledge into Grant’s mind just because he’d be okay with it. The whole reason he found out about it was that someone else brought up the fact that the Powers presumably do this to everybody by default. It just wouldn’t make sense for them to bring it up unless it was the default.
Also, the Powers clearly seem to care about human autonomy; otherwise they would just use super-persuasion to get everybody to agree to live the best possible human life. Plus, how exactly is economics in a post-scarcity world going to stop religious fanaticism?
That is the default, because most people are OK with that :-)
The point about economics is that economic pressure already undermines current religious cults; so there are tools, broadly compatible with human autonomy, that can undermine this way of thinking.
On the question of autonomy, it seems they generally respect human autonomy, but do override it when they judge it necessary (e.g. resetting Grant). But when they do, they try to violate autonomy as little as they can. If they judge that self-reinforcing, unhappy, socially conservative communities, without forks, are detrimental to human flourishing, then they can subtly undermine them (until, at least, the possibility of forks is accepted). The question is whether the situation is sufficiently dire to require such interventions.
See, that makes it kind of a weird thing for that character to have brought up, then. After all, why bring something up if it only applies to people who, by definition, wouldn’t care?
Also, you don’t really explain how religious fanaticism is going to be effectively suppressed via economics. You still haven’t given any plausible way for that to happen. What are you going to do, try to starve them to death if they don’t deconvert? That wouldn’t work, for a number of reasons.
Also, I’m not just talking about a small number of cultists. Barring unprecedented cultural changes between now and 2064, there will still be millions if not billions of people interested in maintaining oppressive religious cultures by the time the singularity rolls around. As for them dwindling out, that seems unlikely given 1. their high birth rates and 2. the fact that, since the outside world is so alien, it will be easy to demonize. Plus, they will likely become more extreme, both because many people will perceive this as the end times and because of the previously mentioned isolation that such an alien external world would cause. Not to mention they would see how religious communities that had contact with the outside invariably fell to sin, making the need for isolation even more important as far as they’re concerned.
Caveat: I hadn’t thought all these things through as much at the time, so there are ideas I’m developing during this conversation.
So I’d see this world as maintaining the possibility of surprise and objections. The Powers could have informed Grant of how they would drop info into him. Instead, they expected it to come up at some point (which it did), and, depending on his reaction at the time (which could vary with the manner in which it was brought up), they would change their interaction with him. This also gives people the possibility of manipulating the interaction with him, as the lady talking with him did. People still have choices, and those choices have consequences and are not all pre-ordained.
And I’m not arguing that the world will become perfectly enlightened and in agreement with my values (or the extended cluster of my values) by 2064, just that the AIs will have tools to achieve their goals. Religious ideologies change all the time, and economic power is one strong force that changes them (note how the truly stable religious communities only work by maintaining themselves at the Dunbar number). When it becomes exceedingly expensive not to be an upload, and all the uploads run cognitive circles around you while boasting of properly stable religious communities within them, the temptation to join will be large. And the possibility of anonymous forks can be offered then, or later.
That works fine, though my problem with it is that it makes that whole interaction weird, since there’s no reasonable way the person bringing up that information could have known it. It does make things slightly odd in retrospect.

I also have some issue with the whole idea of people still having “choice” in any meaningful way in the world of the Powers. It reminds me of the predictor in Newcomb’s problem: if humans are utterly predictable to the Powers, then any appearance of a choice that wasn’t made for you by the AI seems impossible, with the possible exception of sufficiently strong precommitments made pre-singularity. When you understand perfectly how every action will affect a human’s causal mental process, there’s no such thing as choosing not to interfere; you are forced to choose exactly which final outcome you want. Still, I could easily see this being the case in your scenario, but I’m not sure it is.
As for oppressive cultures, making their lives expensive isn’t likely to do much good, starving them out wouldn’t be a good idea for really obvious reasons (mainly PR), and denying them luxuries is going to be of limited use as well. When they probably view the Powers as demons, using subtle methods to convince them to join you is going to be pretty ineffective, it would seem.

Also, again, while these cultures would normally change over time, the fact that the outside world is totally evil as far as they’re concerned is going to make them extremely insular, and may make them more extreme. As for the uploads making them jealous, that’s kind of laughable, given that they probably view uploads as demons or puppets of the devil anyway. Any kind of negotiation with a culture that views you as literally the devil is probably doomed to failure.

Also keep in mind that even if you can get some people out of these communities, the rest of the community will only get worse via evaporative cooling: http://lesswrong.com/lw/lr/evaporative_cooling_of_group_beliefs/

And keep in mind that they will in all likelihood have a pretty high birthrate, so they can replace some losses.