The ICO needs to do more than pay lip service to data processing complaints against search engines and the press
The Information Commissioner’s Office (ICO) is the UK’s ‘supervisory authority’ on all matters concerning personal data, as required by the Data Protection Directive (Directive 95/46/EC) (since superseded by the EU General Data Protection Regulation and, later, the UK General Data Protection Regulation).
Amongst its roles, the ICO is obliged to consider complaints from members of the public about how organisations handle their data, and investigate those complaints. This obligation can be found in section 165 of the Data Protection Act 2018. Section 165(4)(a) states that the ICO must “take appropriate steps to respond to the complaint”, which includes (pursuant to section 165(5)(a)) “investigating the subject matter of the complaint, to the extent appropriate”.
In this article we look at two distinct areas where the ICO seems to be neglecting its duty to properly investigate complaints: firstly, against search engines and secondly against the press. We also raise concerns about its lack of transparency and accountability.
The ICO has, since Google Spain SL, Google Inc. v Agencia Espanola de Proteccion de Datos (AEPD) and Mario Costeja Gonzalez (Case C-131/12) (‘Google Spain’), considered claims from data subjects about the lawfulness of the processing of their personal data by search engines. In this article we use Google as an example throughout, given it accounts for 93% of UK searches, but the principles apply equally to all search engines. ‘Right to be forgotten’ (‘RTF’) requests to search engines take many forms. Some are submitted by individuals simply using a search engine’s contact form to ask that URLs be delisted, for any reason or none; others are submitted by specialist lawyers who prepare substantive representations addressing the nuanced balance between the public interest and freedom of expression on the one hand and an individual’s right to private and family life on the other. There are also various non-lawyer online outfits which offer assistance in making RTF submissions (although our advice is to use specialist lawyers where funds permit).
If Google refuses to delist following an RTF request, then a complaint can be made to the ICO, which can be asked to carry out an assessment of whether the personal data is being processed lawfully. While Google describes this as an appeal, it is in fact a freestanding right data subjects have against most data controllers. If the ICO decides that data is probably being processed unlawfully (it expresses its initial decision in non-committal terms), it will provide Google with informal advice to delist. Google can accept this advice and delist, or make counter-submissions. In the event of the latter, the ICO will review the position again and reach a formal decision. If an agreement cannot then be reached, the ICO can ultimately serve an enforcement notice on Google requiring it to cease processing personal data in the manner complained of. If the enforcement notice is ignored, the ICO can impose a fine for the breach. The ICO lost some of its teeth in this respect in May 2018, when failure to comply with an enforcement notice ceased to be a criminal offence. However, in the nine years since Google Spain, the ICO has only served one enforcement notice on Google in this context, back in 2015, in a case where, remarkably, Google refused to delist content that reported on its delisting of the complainant's personal data (thus undermining its own original decision). We suspect this isn’t because of a shortage of complaints against Google, but we can’t be categorical about this as, oddly, it appears that the ICO stopped publishing statistics relating to complaints against Google in 2015 (its latest Annual Report (here) is much more ‘top level’).
When considering complaints against Google, the ICO applies a bespoke and narrow test about what search terms must be used to retrieve the search results complained of. The ICO will only consider URLs that are returned on a search for the individual’s name alone (e.g. “Joe Bloggs”). The ICO will not consider search terms containing an additional identifier or modifier that is required to bring up the offending search results; for example, “Joe Bloggs Winchester”, “Joe Bloggs Accountant”, or, importantly given the increasing and casual transmission and onward publication of intimate images, “Joe Bloggs Nude” (except where such identifiers are ‘auto-suggested’ by Google). This test makes no sense in the face of the ICO’s own definition of ‘personal data’.
The ICO’s only explanation (that we are aware of) as to why it does not (or should not have to) consider such additional identifiers or modifiers, is that Google Spain only concerned search results for the claimant’s name. This means that people with popular names but reputations only in specific industries or locales are in some difficulty. A search for their name alone may not return anything about them (instead returning results for a more famous Joe Bloggs). However, a person wanting to find information about Joe Bloggs, the accountant in Winchester, would, it should be uncontroversial to say, use such modifiers to find relevant search results. This is certainly true for employment vetting purposes. There is no logical reason to suppose that just because Google Spain happened to refer only to the name, a search term that includes the name along with other personal data should be outside the scope of consideration. Indeed, “Joe Bloggs, the accountant in Winchester” is personal data in the purest sense.
This stance from the ICO is hypocritical in the extreme. The ICO itself says that any information that can lead to the identification of an individual can be personal data and, as a result, can be subject to the laws governing the processing of personal data. For example, in March 2023 the ICO posted on LinkedIn that:-
“You don’t have to know someone’s name for them to be directly identifiable, a combination of other identifiers may be sufficient to identify the individual. If an individual is directly identifiable from the information, this may constitute personal data.”
The ICO has also published guidance about what constitutes personal data, on its page ‘What is personal data?’, which quotes from the UK GDPR. The UK GDPR expressly provides that “location data” or one or more factors specific to the “physical, physiological, genetic, mental, economic, cultural, or social identity of that natural person” can constitute personal data.
The law, therefore, plainly anticipates information about an individual other than their name being important personal data. The ICO also sets out in straightforward terms that a combination of identifying factors that allow the identification of the data subject can be personal data. So why does it refuse to accept a combination of the data subject’s name and other identifying personal data about them when considering whether search results returned from those searches constitute ‘personal data’? Why should some personal data (the search terms) not count, when other personal data (the search results) otherwise would?
There is a logical failing here, and it creates a two-tiered level of access to legal process as a result. People with significant public profiles (such as celebrities), and those with unusual names, combinations of names, or spellings of names, are able to have decisions of search engines interrogated by the ICO. People with common names, who are not household names, simply cannot. Similarly, search engines often hide explicit results by default and do not flag the existence of explicit results (for example Google’s SafeSearch filter). However, if someone searches “Joe Bloggs nude” (i.e. using the ‘nude’ modifier), Google’s SafeSearch could (if enabled) prompt the user that SafeSearch had filtered out some results, alerting them to the existence of those results and a way in which to access them (by turning off the filter). The addition of the word “nude” to the search term, though, would prevent the ICO from even considering a complaint.
Beyond the logic-devoid decision to exclude search results that are returned when a modifier or identifier is used, the outcomes of challenges to accepted search terms suggest that the ICO objects to this part of its job. It seems to be increasingly rare for it to intervene in Google’s decisions, even in cases where those decisions are obviously without reason or legal merit. By way of anecdotal example, we are aware of a retired businessman, no longer in the workforce at all, whose request to a search engine was rejected because the information was relevant to his “current and future clients”. He had no current clients and, given his retirement, would have no future clients. This had been explained to Google. The reasoning was obviously flawed and was potentially even a copy/paste error from Google’s content review team. The ICO upheld Google’s decision and provided no explanation as to why, saying only that it agreed with Google’s response for the reasons it gave. This stock response from the ICO seems to be becoming increasingly standard (a cynic might suggest it means that complaints are not actually being considered at all). Moreover, we have seen the ICO reach a decision which flatly contradicts its only RTF enforcement notice against Google (referred to above), despite a copy of the notice being provided.
Worse even than that, the ICO is almost immune to repercussions from individuals. While the ICO can be subject to judicial review proceedings, this is an unattractive proposition for most people. It carries the risk of a significant adverse costs order (on top of significant upfront costs) and of drawing further attention to the very material that is, on the data subject’s case, being processed unlawfully. One would have a reasonable argument for seeking anonymity from the Court initially (especially where it was claimed the data was out of date or inherently private), but this would likely be stripped if the challenge failed. The threshold for a judicial review succeeding is very high, and the remedy in a successful claim might simply be an order that the ICO carry out an assessment properly (with a risk the ICO would reject the complaint again, albeit with more considered reasons). But more significantly, any judicial review against the ICO is liable to fail because of the ‘alternative remedy’ principle. Judicial review is, by its nature, supposed to be an individual’s last resort. A litigant risks being told by the Court to sue Google instead. In 2015 in R (Khashaba) v Information Commissioner (CO/2399/2015) (unreported, but summarised by Anya Proops KC here), the ICO successfully argued that an alternative remedy was available to Mr Khashaba: that he could bring proceedings against Google. We understand that there was a similar outcome in R (AK) v Information Commissioner (also unreported). The ICO pre-empts this in its responses by advising complainants of their right to sue Google. However, suing Google is not a practical remedy for the overwhelming majority of people. The number of people who can afford to litigate against one of the most well-resourced companies in the world is extremely small, and a competent regulator should do its job properly in the first place, not direct people to unrealistic alternative options.
Such a stance is similar to the police telling a victim of serious and sustained harassment or threats of violence to “go and take out an injunction”, ignorant of the cost implications of such advice.
In Delo, R (On the Application Of) v Information Commissioner & Anor [2022] EWHC 3046 (Admin), a dissatisfied complainant took a slightly different approach, seeking judicial review on the basis that the ICO had failed to discharge its obligations to investigate properly under section 165 of the Data Protection Act 2018. The Court was clear that under the UK GDPR the ICO was not obliged to investigate and reach a final conclusion on each and every complaint made to it, effectively endorsing streamlined and selective regulation. Whilst the legal basis for the finding was set out in detail, one cannot help but suspect the decision is in part policy-driven. As Mostyn J said, “The claim is certainly not academic so far as the Commissioner is concerned. If it succeeds it will have huge ramifications as the workload of the Information Commissioner's Office ("ICO") will be vastly increased. The resources of the ICO are presently stretched to the limit in dealing with the present workload.” On a positive note for data subjects, it is understood that permission to appeal the judgment has been granted.
The ICO publishes guidance for journalists about how to legally process personal data (here, here, and here), and information about how to complain about the conduct of the media for data subjects (here).
Following a Freedom of Information request made by this firm, the ICO confirmed to us that in the 12 months from 1 April 2021 to 31 March 2022:-
- 93 complaints had been made in relation to ‘right to be forgotten’ requests made against newspapers’ websites; and
- Of those 93, not a single one resulted in the ICO directing the removal of an article from the newspaper’s website.
This figure includes any complaints made about newspapers’ inaccurate or outdated processing of personal data.
An optimist might say that this means newspapers are processing the personal data of the subjects of their publications in a manner that is compliant with data protection legislation.
A sceptic might say, on the other hand, that the ICO is simply not interested in addressing the systemic failings of the traditional media to properly and carefully process personal data, despite it publishing extensive guidance on the importance of compliance within the sector.
A critical issue that is almost universally ignored by the media (and seemingly the ICO) is that any news website (whether a newspaper or television channel) which publishes historic content (i.e. news stories going back months or years) is processing data on an ongoing basis. Unlike defamation, there is no single publication rule in data protection. Properly interpreted, data protection legislation requires periodic data protection compliance assessments to be carried out (and recorded). An overriding public interest and/or exemption that might have been applicable at the point of publication can cease to be relevant six years later. A similar approach to that taken in Google Spain regarding search engines should be adopted. The media’s overreliance on the journalistic exemption (typically advanced as no more than a bald assertion) needs to be properly addressed by the regulator. The journalistic exemption is not absolute and on proper analysis often falls away.
Much of the ICO’s energy appears to be spent on imposing and publicising fines for data leaks, effectively the res ipsa loquitur of data protection breaches. More complex work has included investigating companies infringing the data rights of a class of people (e.g. TikTok’s recent fine for the unlawful processing of children’s data). While that is obviously an extremely important endeavour, an individual unable to exercise his or her data protection rights may face extreme prejudice and commercial and social disadvantages far more debilitating and severe than those faced by any individual in that larger class. A competent authority ought not to be disregarding one group of people in favour of enforcing the rights of another; if it cannot handle both elements, it is not a ‘competent’ authority.
In the first year following Google Spain, the ICO received 472 requests to review Google’s decision not to filter search results. In 40% of cases the request was deemed to be ineligible, in 40% the ICO agreed with Google, and in 20% the ICO asked Google to filter the search results. The article the ICO published on this has since been removed, but is reported here on the Inforrm blog.
It does not appear as though the ICO has published these statistics in the eight years since then (or, at the very least, has not publicised them).
The ICO has not explained why these statistics are no longer made public, and importantly why the public should not be able to see them. Given the technology available to it, and presumably the accurate records it must keep anyway, these statistics should not be difficult or time-consuming to publish. Google itself keeps live statistics: as at 1 January 2023, Google states that it had received requests from within the UK in relation to 740,000 webpages and had agreed to delist 49.4% of these.
Plainly, the ICO has limited resources and the scope of data protection law is such that it has an enormous and unenviable task. That said, it cannot eschew certain of its obligations in order to focus on others.
If this all sounds a bit familiar, we have been here before. In The Law Society & Ors v Kordowski [2011] EWHC 3185 (QB) (in which this firm acted for the Claimants), Mr Justice Tugendhat had to correct a gross misstatement by the then Information Commissioner to the effect that the ICO had no role in regulating inaccurate information published online. Following this judgment, the ICO had to introduce guidance on social networking, online forums and the applicability of data protection legislation. As Tugendhat J said at the time, one can be sympathetic to the regulator, but its obligations are clear.
With that sympathy in mind (nobody is questioning the scale of the job the ICO must do) it is reasonable to ask whether it is doing significant parts of that job satisfactorily. And if it is not, who is responsible? The ICO is an independent public body, and reports directly to Parliament. It was funded by the Department for Culture, Media and Sport until February 2023, when its sponsoring department became the Department for Science, Innovation and Technology: does that make the Secretary of State – presently Michelle Donelan MP – responsible, with the associated Parliamentary checks and balances applying? Is Parliament directly responsible? These are important questions of accountability, especially when the ICO has itself made the argument that its decisions about individuals’ rights are not susceptible to judicial review because there is an alternative remedy. Where should an individual not of sufficient means to sue Google or the press turn when the data regulator does not do its job?
Articles are intended as an introduction to the topic and do not constitute legal advice.