OpenAI’s cybersecurity is probably regulated by the NIS Regulations


The EU and UK’s Network and Information Systems (NIS) Regulations aim to improve the cybersecurity of essential services and important digital providers. They set requirements around network security, information system security, physical security, incident handling, business continuity and security auditing.

I think OpenAI needs to comply with these regulations. This article sets out why, and what the implications are.

NIS scope

Disclaimer: this is commentary from a non-lawyer on the internet, not legal advice!

NIS covers “relevant digital service providers”, which are companies that:

  1. Provide an online search engine, an online marketplace, or a cloud computing service;

  2. Have more than 50 staff, or an annual turnover or balance sheet total of €10 million or more; and

  3. Operate in the EU or UK[1]

OpenAI clearly meets condition 2: their revenue is measured in billions, not millions!

They also meet condition 3, by operating in the EU and UK.

So, that brings us to condition 1. I claim that OpenAI is operating an online search engine, at least as defined in the regulations.

NIS section 1 defines an online search engine as “a digital service that allows users to perform searches of, in principle, all websites or websites in a particular language on the basis of a query on any subject in the form of a keyword, phrase or other input, and returns links in which information related to the requested content can be found.”

If you ask it to, ChatGPT will search the web based on a prompt. This looks very similar to allowing searches on the basis of a query in the form of a phrase. It then provides a list of links and references where people can find out more about the requested content.

Other providers

Google, Microsoft and Amazon are already considered relevant digital service providers given their cloud offerings. In addition, Google Gemini and Bing Chat operate in the same way as described above, so those companies are doubly likely to be in scope.

Perplexity, another AI service, probably meets the bar for being a search engine, as well as the employee count (a Bloomberg article put them at 55 people in April). Inflection is more borderline on employee count.

Given Anthropic’s Claude does not currently have browsing capabilities, I don’t think it would be considered a search engine. The API might be considered a cloud computing service, and the ICO has confirmed that some SaaS or PaaS type services do count here. But I think it’s less clear cut than OpenAI or Perplexity.

There are also updates planned to the NIS regulations, in the form of the Cyber Security and Resilience Bill. This proposes “Expanding the remit of the regulation to protect more digital services and supply chains”, so perhaps more AI providers will be in scope in future.

Implications

If OpenAI were to fall under NIS regulations, it would face new obligations relevant to AI safety. These include:

  1. Reporting significant incidents to the Information Commissioner’s Office (ICO). These are any incidents with a substantial impact on the provision of its services, including incidents with non-cyber causes.

  2. Implementing appropriate cybersecurity measures. This includes network security, information system security, physical security, incident handling, business continuity and security auditing.

  3. Being subject to inspection by the ICO or a third party auditor. The ICO can also require them to provide information about their security.

  4. Having the above enforced, through a combination of information notices, enforcement notices and penalties.

The implications could be significant for AI safety and cybersecurity, particularly while there is a lack of other AI legislation in the UK.

  1. ^

    Technically, this requirement is about having a head office in the EU/UK or a nominated EU/UK representative (part 1, section 3ei in the UK). Given that another part of the regulation (part 1, section 14A in the UK) requires digital service providers meeting condition 2 above to have representatives, this is the result.