A rant about the effects of serious untreated ADHD
Thanks Nancy, I’ve made a note of that just so I can reference the diagrams.
ETA: And taking a glance at the article he references, I now share his outrage. She lists many of the benefits of treatment and the consequences of going untreated, and then goes on to explain that she denies her children access to treatment. Letting that woman reproduce was a crime against humanity. There are very few things I call unmitigated evil, but for some reason this is one of them.
I suspect you’re ranting, but I’ll bring up some practical issues.
I doubt it’s possible (except, perhaps in some extreme cases) to tell years in advance what people’s child-raising policies will be.
I’d be extremely cautious about giving an authority permission to say who will reproduce and who won’t.
And I’m tempted to reread her earlier novels (as Megan Lindholm) to see whether there was a weird authoritarianism in them (she’s also come out strongly against most fanfiction).
It would be more reasonable to read (or reread) her more recent work, except that I got bored by it after the first trilogy, while I liked the earlier stuff.
I wouldn’t call it ‘ranting’, but I certainly don’t expect “should not be allowed to reproduce” to be taken literally, nor do I often (ever?) observe cases where people mean such claims as anything other than “I disapprove of that behavior and the type of genetic or cultural heritage that produces it”.
But following up on the topic of eugenics: any authority that considered it had the right to say who will reproduce and who will not is unlikely to pass my ‘kill test’. That is to say, I would (if convenient) kill them, and kill anyone who tried to stop me from killing them if necessary. The means by which they gained the power in question would not necessarily matter (i.e. it would not pass the kill test just because people voted on it).
Mind you, there are situations in which I would approve of eugenics. Most of them do not involve ‘authority’ in any conventional human sense. For example… bizarre situations in which:
FAI is not possible (or available in time)
I personally have access to advanced nanotechnology (e.g. I have an Asgard core)
There is something which provokes the need for me to take overwhelming unilateral action.
If reproduction is not limited it will contribute to an existential threat. Perhaps one of:
Unconstrained breeding will produce people who are likely to create a uFAI before an FAI is possible.
We are progressing along the inevitable competitive equilibrium of a hardscrabble frontier.
Unconstrained breeding will result in humans devolving and losing that which is valuable about our species (with current selection pressure it probably would, not that it matters.)
Without breeding constraints (either in number or in quality) humanity will not even survive to reach for the stars or use the universe in some sort of eudaimonic manner.
Basically, I consider the ability to dictate reproduction over the course of several generations to be equivalent to seizing absolute control and forming a stable singularity, and I act accordingly.