I don’t think that you get a lot of motivation from surface thoughts like the one you listed.
What motivates you at the moment to do the thing you are doing? What are your greatest fears? Explore those questions and see how the answers relate to your agenda.
Some motivations: I’m negotiating to order a custom hiking stick, and I want to be able to actually use the thing properly without ending up dragging it behind me after so many kilometers because it’s too heavy. I want to be the sort of person who could be described as a PC instead of a mere NPC; someone who is at least vaguely capable in a wide variety of situations. (E.g., I’m trying to figure out if it’s worth the effort to try to swing the tuition costs for St. John Ambulance first-aid training.) I want to be able to pull my own weight—literally—if I have to.
Some fears: I don’t trust my understanding of quantum theory and the MWI to rely on Everett Immortality keeping me alive; if I’m not ready to deal with whatever comes close to killing me next, then I could very well end up permanently dead, in all branches of the future leading from this point. There are few enough people who even think about x-risks; how can I consider myself competent to even start thinking about ways to avoid those if I’m not competent enough to do a few push-ups? If ancestor simulations turn out to be feasible, then if this is the original version of history, won’t all my future copies be rather annoyed at me for leaving them as un-exercised weaklings (and won’t the simulators laugh at them)? And if this is a simulation, then if I can’t do better than my original, what’s the point of me?
(… Okay, so not all of those are actual /fears/ per se, but if an extremely hypothetical stick is enough of one to kick me in the rear to keep going, I’m willing to work with it.)
I don’t trust my understanding of quantum theory and the MWI to rely on Everett Immortality keeping me alive; if I’m not ready to deal with whatever comes close to killing me next, then I could very well end up permanently dead, in all branches of the future leading from this point.
Fear of death is a strong one. You could go associate that fear more and use it to push you to take action. Where do you feel that fear in your body?
Fear can also “freeze” people.
Dissociated fear freezes people. Associating with emotions generally produces action. Alternatively, people can also find a way to dissociate from them to escape them.
The kind of fear that can’t be felt as movement in some part of the body is dissociated; that’s why I asked where he can feel it in his body.
Er… Nowhere, at least that I notice?
That suggests the feeling is either dissociated, or that you’re just trying to construct something intellectually that isn’t there. In that form it won’t help much with motivation.
Now that I think about it, trying to get strangers on the internet whom I don’t really know to associate a strong fear of death in order to motivate themselves might be an infohazard.
I don’t think that you get a lot of motivation from surface thoughts like the one you listed.
Yeah, really.
Besides, exercise is fun. You get a nice high off exercising. I start feeling awful whenever I skip it, these days. The improved health and the nice thought that I’m officially Fixing My Life are just bonuses.
To fix your life, fix your emotions, at a deep level. You need to train your lizard brain to prefer what your ape brain prefers, and train your ape brain to prefer what your conscious human brain prefers.
Correction; /you/ get a nice high off exercising. /I/ have yet to get a runner’s high, or anything of the sort. (Imagine trying to keep up an exercise program if you /didn’t/ get that nice high; how would you motivate yourself?)
To fix your life, fix your emotions, at a deep level. You need to train your lizard brain to prefer what your ape brain prefers, and train your ape brain to prefer what your conscious human brain prefers.
I’m willing to give it a shot. Got any specific suggestions on the how?
Correction; /you/ get a nice high off exercising. /I/ have yet to get a runner’s high, or anything of the sort. (Imagine trying to keep up an exercise program if you /didn’t/ get that nice high; how would you motivate yourself?)
Hmm… I get general, overall improvements to mood and focus from exercising, as much from weight-lifting as from cardio. Have you tried weight-lifting? Running on an elliptical machine or biking as opposed to just going out and hitting the pavement?
I know people say it’s very athletic and cool to just go out and run, but that always clobbered the hell out of my knees—especially because I was out of shape in the first place. It took me months of elliptical training (which is easier on the knees, so you can go longer without utterly exhausting yourself even at a high heart rate) before I could enjoy just running.
I’m willing to give it a shot. Got any specific suggestions on the how?
What mostly works for me is blatant rationalization, and training the damn thing like a dog. Reward it when it’s good and punish it when it’s bad. Manage its diet. Blah blah blah.
It is a much larger problem than I gave credit for when I posed it, though.
EDIT: Also, I motivate myself during exercise by strapping on some headphones and listening to all my favorite music. Currently: the Kill la Kill and Gatchaman soundtracks, with some Shingeki no Kyojin thrown in. (I’m a nerd, deal with it.)
Also, since I notice you are apparently motivated to improve your health by wanting to gain eternal life, I would figure that eternal life would be enough of a motivation. Are you really failing to find immortality emotionally moving? I can see why one would, but you claimed that was your conscious motivation. Need something else, perhaps?
Have you tried weight-lifting? Running on an elliptical machine or biking as opposed to just going out and hitting the pavement?
I’ve started this whole thing within the past month, and as of yet, have bought no equipment (and due to wintery weather, have barely managed my daily hikes, let alone increasing the pace to jogging or running). I am planning on picking up a bicycle come the spring, though.
My mind kinda processes exercise this way.
I tend to prefer GURPS, myself, but the thread’s title should point out I’m thinking at least roughly on the same lines. :)
blatant rationalization
Hrm. For some time now, I’ve been picking a deliberately flimsy excuse as my reason to go out for the daily walk/hike, such as “I’ll go check the price of basil at distant store X”, to the point of it now being a running joke, so that part seems easy enough. And I do recall theorizing about the ‘mahout theory of consciousness’, where most of the brain is just this clumsy elephant that wants to trundle along in its own way, and needs some sharp whacks for the consciousness-mahout to get it going in the right direction… I’m going to have to see if I can dig up my notes on how well that worked out at the time.
What sort of rewards and punishments seem to work for you?
Also, since I notice you are apparently motivated to improve your health by wanting to gain eternal life, I would figure that eternal life would be enough of a motivation. Are you really failing to find immortality emotionally moving? I can see why one would, but you claimed that was your conscious motivation. Need something else, perhaps?
There’s a somewhat involved chain of causality between ‘increasing the odds of immortality’ and ‘do another one of this painful thing’, said chain involving all sorts of probabilities and estimations and outright guesses. Sometimes, that chain alone is enough to keep me keeping on. Sometimes, the odds that the effort will result in the reward seem low enough that it doesn’t seem worth the cost. Thus, I’m quite open to whatever other motivations could help get over the hump.
Healthy exercise is easily one of the most effective everyday measures you can take to decrease your chances of mortality.
Yes, you could get killed by a mutated pandemic, an asteroid strike, a car crash, a mugger, or a UFAI. But really, probabilistically speaking, what will probably kill you is either cancer or heart disease. Exercise prevents and fights heart disease, thus decreasing the largest single preventable mortality factor.
Eternal life is gained one year at a time! You want to make it as far as trying to tile the Solar System in copies of yourself and end up in an epic war with my army of mecha? Survive long enough for the technology to exist!
It may please you to hear that I read your post shortly before arriving at the stores that were the theoretical destination for my walk today… and after some consideration, I bought a set of adjustable wrist/ankle weights to help improve the effectiveness of my exercising.
(Even if I were to entirely drop the actually-painful parts of my routine—the push-ups and planks—I seem to be doing well with the gradual increase of the other exercises, and such weights seem reasonably likely to improve those exercises’ effectiveness.)
PS: I only want to tile the Solar System with myself to the degree necessary to mine He-3, or whatever other fuel source is most viable to start launching myself elsewhere. As long as you don’t want to hoard so many resources for your mecha that you keep me from sporing, I’m sure we can get along, and that our epic wars will involve boffer nukes. ;)
For just a few minutes, I’ve earned the privilege of wearing Kamina-sama’s flag as a cape. Thank you.
So, you’re just aiming for a bit of lonely star sailing, eh? No need to tile, then. We’ll have plenty of infrastructure to give you a lift. Seems a pity, going out into the black all alone, though. You should take a friend.
Seems a pity, going out into the black all alone, though. You should take a friend.
If anyone else wants to come with, I don’t particularly mind. I also don’t mind going it alone. (I’ve seriously looked into whether or not I’d be qualified as a lighthouse keeper or fire-tower lookout.) But at least at the moment, the number of people who are willing to even agree that “yeah, that sounds like it’d be fun”, let alone seriously consider it as a future possibility… are quite small, eclectic, and tend to be downright weirdos.
I wonder how many folk would be willing to accept handing over root access for the duration of the voyage, to ensure any interpersonal conflicts don’t get so out of hand that survival starts getting threatened? What sort of trust would that require? … What sort of person should I have already started being by now, in order to help gain that level of trust?
But at least at the moment, the number of people who are willing to even agree that “yeah, that sounds like it’d be fun”, let alone seriously consider it as a future possibility… are quite small, eclectic, and tend to be downright weirdos.
Excuse me, sorry, which website are we on? Oh, right, the quite small website full of eclectic downright weirdos, many of whom actually think emming is a good idea (mind, admittedly, if you’re already living in a world where that sort of thing is safe, secure, and commonly available, it’s certainly one valid way to live among many).
I don’t have anything planned for after turning 120 other than “being dead”. Why not? I should talk to my fiancee about moving out of the Solar System one of these days.
On the other hand, even the nearest star at very high speed is generation-ship territory (for non-ems). Are you just planning to sleep the whole way there, or will there be party games?
I wonder how many folk would be willing to accept handing over root access for the duration of the voyage, to ensure any interpersonal conflicts don’t get so out of hand that survival starts getting threatened?
Why are you having the piloting done by a person rather than by software? I figure once you’ve got the hardware for a spaceship, it’s really mostly the same sort of software task done by self-driving cars. Some well-made narrow AI ought to take care of it.
Also, root access? Unix? Bro, do you even cryptographically-signed capability security?
What sort of trust would that require?
Quite a lot, if you’re really proposing Unix-level root access.
… What sort of person should I have already started being by now, in order to help gain that level of trust?
Well, obviously: a trustworthy one, who makes good on his obligations and his words almost all the time. Also, a likeable one, since you’re proposing to spend quite possibly hundreds of years at a time with someone. Are you married, or seeing someone, perhaps?
Are you just planning to sleep the whole way there, or will there be party games?
Depends on the power generation compared to power requirements per em. A really low-cost trip, such as to spam spores to every star in reach, is one option; something more along the lines of an STL version of a Culture GSV is another.
Why are you having the piloting done by a person rather than by software?
Because plans can change mid-voyage.
Also, root access? Unix? Bro, do you even cryptographically-signed capability security?
… Not yet? (I haven’t even started lifting, yet...)
Who said it would be just some/one/?
Because you have noticeable loner tendencies in this thread. You don’t seem like the “many” type.
Because plans can change mid-voyage.
Ah. But if you have to run a person, you should let the software run the ship, and let the person enjoy themselves.
I mean, it could be more complicated than a self-driving car in a way that requires real cognitive work to handle, but I can’t imagine you swerve to avoid debris on a minute-by-minute basis in interstellar space.
Because you have noticeable loner tendencies in this thread. You don’t seem like the “many” type.
Maybe not—but even with a straight brain-to-emulated-brain conversion, living in VR means that if I want, I could spend a decade in a cabin in an enormous forest without a single other emulated-brain entering that VR, while still keeping up with email. So it would still be possible for me to enjoy my preferred hermiting lifestyle even in a starship containing a great many emulated minds. :)
you should let the software run the ship
The point I was trying to focus on—who /tells/ the software that runs the ship that the ship should change course? Who has the authority? If the ship’s emulations get into a conflict and start trying to throw virii at each other, who has the power to limit the violent minds’ access to dangerous software, or even to the processing power needed to run at full speed, in order to prevent those software weapons from posing a risk to the ship’s low-level software?
Even if the OS is formally mathematically verified, there are still going to be opportunities for unexpected side-channel attacks...
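To make the “cryptographically-signed capability security” quip above a little more concrete, here is a minimal, illustrative Python sketch of capability-style authorization for a ship’s command channel. It is only a toy: it uses a standard-library HMAC as a stand-in for real public-key signatures, and every name in it (SHIP_SECRET, change_course, pause_em, the captain role) is a hypothetical chosen for the example, not anything proposed in the thread.

```python
import hashlib
import hmac
import json

# Hypothetical signing key held by the ship's control software.
SHIP_SECRET = b"replace-with-a-real-key"


def issue_capability(holder: str, right: str) -> dict:
    """Grant one narrowly scoped right (not blanket root access) to one holder."""
    claim = {"holder": holder, "right": right}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(SHIP_SECRET, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}


def authorize(token: dict, requested_right: str) -> bool:
    """Honour a command only if it carries an untampered token for that exact right."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(SHIP_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False  # forged or tampered token
    return token["claim"]["right"] == requested_right


# The owner-aboard keeps course-change authority; a passenger em gets none of it.
captain_token = issue_capability("captain", "change_course")
print(authorize(captain_token, "change_course"))  # True
print(authorize(captain_token, "pause_em"))       # False: no ambient root access
```

The point of such a design is that there is no ambient “root”: every command has to present a token scoped to exactly one right, so a compromised mind can at worst misuse the narrow rights it was explicitly granted.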
Maybe not—but even with a straight brain-to-emulated-brain conversion, living in VR means that if I want, I could spend a decade in a cabin in an enormous forest without a single other emulated-brain entering that VR, while still keeping up with email. So it would still be possible for me to enjoy my preferred hermiting lifestyle even in a starship containing a great many emulated minds. :)
As to the matter of finding yourself a crew, you could try answering my PM on that other website! There’s a whole forum full of interesting freaks and geeks over there who would hop on your em-ship in an instant.
The point I was trying to focus on—who /tells/ the software that runs the ship that the ship should change course? Who has the authority? If the ship’s emulations get into a conflict and start trying to throw virii at each other, who has the power to limit the violent minds’ access to dangerous software, or even to the processing power needed to run at full speed, in order to prevent those software weapons from posing a risk to the ship’s low-level software?
This problem is called politics. It remains unsolved. The general problem is that in order to keep peace among intelligent agents who have a rather utility-idiotic tendency to break out fighting over petty spats… you would generally need an intelligent agent who wants to keep the peace, or some very unbreakable fences to separate the humans from each other.
Our current nation-state system is one of very-difficult-to-break fences, with layers of mutually-beneficial-and-necessary cooperation built on top, all designed to prevent major wars from occurring. It is currently failing, due largely to some borders being too close together, and to trade and financial policies that take away the incentives against war by ruining economies, etc.
So if you want to do better than that, you need to either think of something better, or get someone to do your thinking for you. Since we’re talking software, having the ship’s core systems run by an AI sounds convenient, but of course we’re on LessWrong so we all know how easy it is to get that just plain wrong. Of course, if you’re at this level of technology already, perhaps we already have Friendly AIs that can easily manufacture Highly Intelligent Utility AIs with Narrow Domains to do this sort of thing.
Or maybe people will just have to get along for once, which is really the simplest but most difficult solution, knowing people.
I didn’t realize that was you—the user-names are rather different.
This problem is called politics. It remains unsolved.
In the general case, yes; but certain extremely limited subsets do seem amenable to game theory. For example—say that future-me builds and/or buys a bunch of spacefaring spores, capable of carrying my emmed consciousness to other stars (and, upon arrival, can build the infrastructure to build more of them, etc, etc). I’d suggest that ‘personal property’ is a reasonably solved problem, in that few people would argue that the spores’ computers are mine, and I have the right to choose what software runs on them. (There may be some quibbling about what ‘I’ actually means once I start splitting into multiple copies, but I’m already working on how to handle that issue. :) )
If any other ems want to come along, then would it really be such a big issue if I make it clear ahead of time that the spores will remain under my control, and that if the passengers behave in such a way that I deem them a threat to the voyage, I reserve the right to limit their various privileges and accesses to each other, up to and including putting them on ‘pause’ for the remainder of the trip? Or, put another way—that I’m claiming the traditional rights of both owner-aboard and captain of a vessel?
(… And might we gain a few more people contributing to this conversational thread if we started a new topic?)
If any other ems want to come along, then would it really be such a big issue if I make it clear ahead of time that the spores will remain under my control, and that if the passengers behave in such a way that I deem them a threat to the voyage, I reserve the right to limit their various privileges and accesses to each other, up to and including putting them on ‘pause’ for the remainder of the trip? Or, put another way—that I’m claiming the traditional rights of both owner-aboard and captain of a vessel?
I’m sure you would still find people who agree to that.
(… And might we gain a few more people contributing to this conversational thread if we started a new topic?)
You really take this that seriously? I dunno.
Hey, I’ve taken ems copying themselves seriously enough to try to figure out a workable system for them to divvy up their property and debts—and dropped a few details of those into the Orion’s Arm SF setting.
FWIW, I found ellipticals much harder to use than treadmills when I was first starting out. Don’t know why.
I get the high only from strenuous exercise that lasts about an hour or more, like soccer for example. Half-hour runs or weight-lifting don’t have such an effect, and I don’t find the reward worth the pain in those activities, which means I do them at a more reasonable pace.
This suggests you might have to reach a certain level of fitness to be able to strain yourself enough to get the high, and that this level varies between activities and people. There are activities, like swimming, that don’t give me the high at all no matter how hard I try, but oddly enough swimming is my favorite form of exercise.