Yeah, this seems like it fundamentally springs from "people don't always want what's good for them or for society." It's hard to design a system that enforces epistemic rigor on an unwilling user base.