They commit to not using your data to train their models without explicit permission.
I’ve just registered on their website because of this article. During registration, I was told that conversations marked by their automated system that overlooks if you are following their terms of use are regularly overlooked by humans and used to train their models.

In Anthropic’s support page for “I want to opt out of my prompts and results being used for training” they say:

We will not use your Inputs or Outputs to train our models, unless: (1) your conversations are flagged for Trust & Safety review (in which case we may use or analyze them to improve our ability to detect and enforce our Usage Policy, including training models for use by our Trust and Safety team, consistent with Anthropic’s safety mission), or (2) you’ve explicitly reported the materials to us (for example via our feedback mechanisms), or (3) by otherwise explicitly opting in to training.
Notably, this doesn’t provide an opt-out method, and the same messaging is repeated across similar articles and questions. The closest thing to an opt-out seems to be “you have the right to request a copy of your data, and object to our usage of it”.
I think I’ve figured out what you meant, but for your information, in standard English usage, to “overlook” something means to not see it. The metaphor is that you are looking “over” where the thing is, into the distance, not noticing the thing close to you. Your sentence would be better phrased as “conversations marked by their automated system that looks at whether you are following their terms of use are regularly looked at by humans”.