You didn’t mention the policy implications, which I think are one of, if not the, most impactful reasons to care about misuse. Government regulation seems super important long-term to prevent people from deploying dangerous models publicly, and the only way to get that is by demonstrating that models are actually scary.
Agreed. However, in this case, building countermeasures to prevent misuse doesn’t particularly help; it’s the evaluations for potentially dangerous capabilities that are highly relevant.