The *real* problem is the huge number of prompts clearly designed to create CSAM images.
So, people whose tastes are harmful and deviate from the social norm, instead of causing problems in the real world, try to isolate themselves in digital fantasies, and that is a problem... exactly how?

I mean, obviously, it's a coping mechanism rather than an attempt to fix the underlying problem, but our society also isn't known for being very understanding toward people who come forward with these kinds of deviations when they want to fix them.
I am taking as a given people’s revealed and often very strongly stated preference that CSAM images are Very Not Okay even if they are fully AI generated and not based on any individual, to the point of criminality, and that society is going to treat it that way.
I agree that we don’t know that it is actually net harmful—e.g. the studies on video game use and access to adult pornography tend to not show the negative impacts people assume.