However important and valid your actual concerns,
DO NOT let a state of fear and anxiety, cultivated in you by a community, group of people, or general zeitgeist, lead you to cut off and discard relationships with people outside that community who are not hurting you and who share your values.
Also, do not try to enforce your beliefs by holding someone’s important relationships hostage to their buy-in to the group.
Read up on the BITE model (Behavior, Information, Thought, and Emotion control) of authoritarian control, assess the extent to which the actions you’re considering are cultlike, and turn down the dial. Shared fears are not shared values, and prioritizing shared fears over shared values is concerning.
The world will not end because there are smart people you’re close to who don’t happen to be afraid of the same things you are. Separate out fears and values. You are conflating the fear of a particular xRisk with the value of “concern for humanity’s long-term future”. Nothing you’ve said suggests to me that she lacks concern for humanity’s long-term future; just that, after perusing a few articles, she doesn’t buy into an xRisk you’ve had a much longer time frame to arrive at. I know that genuinely shared values are important. But fear of AI is not, in fact, a value. I say this while sharing a certain amount of that fear myself.
On a larger scale than one relationship:
Don’t isolate yourself.