Ah, thank you. In that case the question of whether comrades could be easily liberated from prisons is basically too trivial to even contemplate. Even without their physical capabilities, the intellectual ability of vampires makes such things simple: with absurdly fast processing speed and a flawless memory, a would-be liberator could walk in with overwhelming and specialised technology.
This leads my speculation to a barely related tangent: if a vampire is rational and has something important to achieve, then timeframes beyond a couple of years would be out of the question. They would obviously dedicate a small amount of time to a quick read of Wikipedia, plus the references provided for any topics that seem important. A coven of individuals of slightly above-average intelligence (any of the big players) that had something to protect would create an FAI in short order.
That isn’t an idiot ball that an author could be expected to avoid. The only options would be to throw in extreme time constraints, or to artificially handicap the vampires’ learning ability in some areas to below human level in a way that isn’t apparent in either canon or Twilight.
Vampires can learn lots of stuff without error, but it doesn’t necessarily follow that they are inhumanly smart or rational as well.
I never suggested they were, and nor does my observation require it. They do have ridiculously inhuman processing speed and memory; inhuman intelligence beyond that is in no way required. Only if vampires were outright inferior thinkers to humans in other ways (which is not in evidence in the books) would those traits fail to default to rapid FAI development, if desired.
Bella, for example, would be sufficient. She has more than enough rationality and does not seem particularly unintelligent. She has access to peers of sufficient intelligence too, even if some of them seem to be getting dead. She also has access to the rest of the world, including an awful lot of intelligent, rational humans who would jump at the opportunity to become immortal, have few qualms about the stigma around ‘vampire’, and already have an interest in creating FAIs.
It’s a trivial task unless the author introduces an idiot ball or some artificial weakness of the kind that is often used to thwart the protagonists (and powerful allies) in the early parts of fantasy stories.
I agree, with two modifications: (1) rationality of the “taking ideas seriously” kind is critical; without it one can spend any amount of time without getting anywhere; and (2) FAI is not a likely outcome: random AGI could just as easily come out of this, given that you are only assuming fast processing speed and not necessarily self-reinforcing rationality, i.e. essentially future ems.
I concur with both. I add the caveat that a ‘taking ideas seriously’ rational vampire would clearly be best served by vamping willing FAI researchers. That could be expected to raise p(FAI | AGI) to well above real-world values. I say ‘above’ because it eliminates some significant contributors to error: memory failure, fatigue, time pressure, and cognitive decline from aging.
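The direction of that caveat can be illustrated with a toy model. This is purely a sketch with made-up numbers (the error rates and the independence assumption are invented for illustration, not taken from anywhere): treat p(FAI | AGI) as the probability that no critical error source derails an otherwise-competent project, so removing error sources can only raise the product.

```python
# Toy model: p(FAI | AGI) as the chance that none of several independent
# error sources fires. All numbers below are hypothetical.

def p_fai_given_agi(error_rates):
    """Probability that none of the independent error sources fires."""
    p = 1.0
    for rate in error_rates:
        p *= (1.0 - rate)
    return p

# Hypothetical per-project chances of each error source proving fatal.
human = {"memory failure": 0.20, "fatigue": 0.15,
         "time pressure": 0.25, "cognitive decline": 0.10}
vampire = {"time pressure": 0.25}  # perfect memory, no fatigue, no aging

print(round(p_fai_given_agi(human.values()), 3))    # 0.459
print(round(p_fai_given_agi(vampire.values()), 3))  # 0.75
```

Whatever the true numbers, striking terms from the product moves it toward 1, which is all the caveat claims.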
If the policy is to introduce LW-related conclusions to Luminosity on the grounds of plausibility in the real world, then there’s an easy fix to keep the plot constrained:
Luminosity takes place in a vampire!ancestor-simulation; if it approaches AGI (which might figure out how to hack the simulation and then cause trouble for the parent universe), it is shut down and reset with different random seeds so that something else happens.