What about fish? I’m pretty sure many fish are significantly more functional than one-month-old humans, possibly even than two- or three-month-olds. (Younger than that, I don’t think babies exhibit the ability to anticipate things. I haven’t actually looked this up anywhere reputable, though.)
Frequently. It’s scary. But if I were in a body in which intelligence was not easy to express, and I were killed by someone who didn’t think I was sufficiently functional to be a person, that would be a tragic accident, not a moral wrong.
The legal definition of an accident is an unforeseeable event. I don’t agree with that entirely because, well, everything’s foreseeable to an arbitrary degree of probability given the right assumptions. However, do you think that people have a duty to avoid accidents from which they foresee a high probability-adjusted harm? (i.e. the potential harm weighted by the probability they foresee of the event occurring.)
The thought here being that, if there’s much room for doubt, there’s so much suffering involved in killing and eating animals that we shouldn’t do it even if we only argue ourselves to some low probability of their being people.
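To make the “probability-adjusted harm” notion above concrete, here is a minimal sketch; the function name and the numbers are illustrative assumptions, not figures from the discussion.

```python
# Hedged sketch of "probability-adjusted harm": the harm you would assign to
# an outcome, weighted by the probability you foresee of it occurring.
# The numbers are illustrative assumptions, not figures from the discussion.

def probability_adjusted_harm(foreseen_probability: float, harm_if_it_occurs: float) -> float:
    return foreseen_probability * harm_if_it_occurs

# A foreseen 1-in-100 chance of a harm valued at 1000 units carries the same
# adjusted weight as a certain harm of 10 units.
print(probability_adjusted_harm(0.01, 1000))  # 10.0
```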
About age four, possibly a year or two earlier. I’m reasonably confident I had introspection at age four; I don’t think I did much before that. I find myself completely unable to empathize with a ‘me’ lacking introspection.
Do you think that the use of language and play to portray and discuss fantasy worlds is a sign of introspection?
OK. So the point of this analogy is that newborns seem a lot like the script described, at the compilation step. Yes, they’re going to develop advanced, functioning behaviors eventually, but no, they don’t have them yet. They’re just developing the infrastructure that will eventually support those behaviors.
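Since the “script described” earlier in the thread isn’t quoted in this excerpt, the following is only a minimal sketch of one way to read the compile-time analogy: at construction time the machinery is being set up, but the behaviors it will support don’t exist until it actually develops and runs. All the names here are hypothetical.

```python
# Minimal sketch of the compile-time analogy (the "script described" earlier
# in the thread isn't quoted here, so the details are assumptions).
# At construction time only the infrastructure exists; the behaviors it will
# eventually support have to develop before they can be exercised.

class DevelopingMind:
    def __init__(self):
        # Infrastructure only: capacities are declared, none are functional yet.
        self.capacities = {"anticipation": False, "introspection": False}

    def develop(self, capacity: str) -> None:
        # Behaviors appear later, as the system runs and matures.
        self.capacities[capacity] = True

    def is_functional(self, capacity: str) -> bool:
        return self.capacities.get(capacity, False)


mind = DevelopingMind()
print(mind.is_functional("anticipation"))  # False: "compiled", not yet running
mind.develop("anticipation")
print(mind.is_functional("anticipation"))  # True: the behavior now exists
```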
I agree: if it doesn’t have the capabilities that will make it a person, there’s no harm in stopping it before it gets there. If you prevent an egg and a sperm combining and implanting, you haven’t killed a human.
I know the question I actually want to ask: do you think behaviors are immoral if and only if they’re maladaptive?
No, fitness is too complex a phenomenon for our relatively inefficient ways of thinking and feeling to update on it very well. If we fix immediate lethal response from the majority as one end of the moral spectrum, and enthusiastic endorsement as the other, then maladaptive behaviour tends to move you further towards the lethal-response end of things. But we’re not rational fitness maximisers; we just tend that way on the more readily apparent issues.
I don’t know enough about them. Given they’re so different from us in terms of gross biology, I imagine it’s often going to be quite difficult to distinguish between functioning and instinct. But there is this:
http://news.bbc.co.uk/1/hi/england/west_yorkshire/3189941.stm
It says that scientists observed some of them using tools, and that definitely seems person-like, though.
Yes.
Shared attention, recognition, prediction, bonding -