Meta: You appear to have received several negative responses; I am not entirely clear why.
I found this idea useful to discover; while I can’t really see how it would modify the way I interact with the real world, it certainly does raise some interesting ethical questions.
I immediately thought of a person X that I know in relation to the idea of ethics and consciousness. X (is real and) does not have the ethics model commonly found in people. They value themselves over and above other humans, both near and far (this is true of many people, but it is particularly pronounced in their life). A classic label for this is “having a big ego” or “narcissism”. If consciousness is reduced to “nothing but brain chemicals”, the value of other entities is considerably lower than the value an entity might put on itself (because it can). Although this does seem like an application of the fundamental attribution error (and kind of a reverse typical-mind fallacy [i.e. everyone else does not have a mind like mine]), in that the value one places internally is higher than that which one places on other, external entities.
When we add the idea that “not much makes up consciousness”, potentially unethical narcissistic actions turn into boring, single-entity self-maximisation actions.
An entity which lacks the capacity to reflect outwardly to the same degree that it reflects inwardly would have a narcissism problem (if it is indeed a problem).
Should we value outside entities as much as we do ourselves? Why?
Nate Soares recently wrote about problems with using the word “should” that I think are relevant here, if we assume meta-ethical relativism (i.e. that there are no objective moral shoulds). If you accept that premise, I think his post “Caring about something larger than yourself” could be valuable in providing a personal answer to the question.
Again a negative response, and again I am not clear why.