How do you define “being fair” to the potential of linear regression software?
That’s a big question. How much of the galaxy (or even the universe) does humanity ‘deserve’ to control, compared to any other species that might be out there, or any other species that we create?
I don’t know how many answers lie somewhere between “Grab it all for ourselves, if we’re able!” and “Foolishly give away what we could have grabbed, endangering ourselves.” But I’m pretty sure those two endpoints are not the only options.
Luckily for me, in this discussion, I don’t have to pick a precise option and say “This! This is the fair one.” I just have to demonstrate the plausibility of there being at least one option that is unfair, OR that might be seen as unfair by some group who, on that basis, would be willing and able to take action influencing the course of humanity’s future.
Because if I can demonstrate that, then how ‘fair’ the constraint is does become a factor that should be taken into account.