UK: Statewatch submission to consultation on reform of the Data Protection Act 2018

Submission by Statewatch to the Department for Digital, Culture, Media and Sport's consultation on reforms to the UK's Data Protection Act 2018.


Statewatch's answer to question 1.5.17 is adapted from the Open Rights Group's briefing on the consultation.

Statewatch is a non-profit-making voluntary group founded in 1991, composed of lawyers, academics, journalists, researchers and community activists. Our European network of contributors is drawn from 18 countries. We undertake and encourage the publication of investigative journalism and critical research in Europe in the fields of the state, justice and home affairs, civil liberties, accountability and openness.

We welcome the opportunity to provide input to the consultation "Data: a new direction". We have prepared answers to the following questions from the Department for Digital, Culture, Media and Sport's consultation document:

Q1.5.17 - the Taskforce on Innovation, Growth and Regulatory Reform's recommendation that Article 22 of UK GDPR should be removed

Q2.3.4 - the proposal to re-introduce a small nominal fee for processing subject access requests

Q4.4.8 - ‘There is an opportunity to streamline and clarify rules on police collection, use and retention of data for biometrics in order to improve transparency and public safety’

Q4.5.1 - the proposal to standardise the terminology and definitions used across UK GDPR, Part 3 (Law Enforcement processing) and Part 4 (Intelligence Services processing) of the Data Protection Act 2018

Q5.2.11 - the proposal for the Secretary of State for DCMS to periodically prepare a statement of strategic priorities for the ICO

Q5.3.5 - the proposal that the salary for the Information Commissioner (i.e. the proposed chair of the ICO in the future governance model) should not require Parliamentary approval


Q1.5.17. To what extent do you agree with the Taskforce on Innovation, Growth and Regulatory Reform’s recommendation that Article 22 of UK GDPR should be removed and solely automated decision making permitted where it meets a lawful ground in Article 6(1) (and Article 9-10 (as supplemented by Schedule 1 to the Data Protection Act 2018) where relevant) and subject to compliance with the rest of the data protection legislation?

We strongly disagree with “the Taskforce on Innovation, Growth and Regulatory Reform’s recommendation that Article 22 of UK GDPR should be removed”.

It is particularly important that this provision is not removed: abolishing the right to human review would shift the burden of ensuring that automated decisions are lawful and fair from organisations onto individuals. This is fundamentally contradictory, as individuals would be asked to actively monitor and scrutinise life-changing decisions taken about them by systems that are beyond their control or understanding, and which are often opaque.

Article 22 has proven to be an effective safeguard that has protected individuals against some of the most egregious abuses. While it may not provide a comprehensive remedy for harms related to AI, this does not justify its removal. Rather, specific legislation in the field of AI could be implemented to expand, rather than reduce, the protections offered by the UK GDPR.

Q1.5.17a. Please explain your answer, and provide supporting evidence where possible, including: (a) The benefits and risks of the Taskforce’s proposal to remove Article 22 and permit solely automated decision making where (i) it meets a lawful ground in Article 6(1) (and, Articles 9 and 10, as supplemented by Schedule 1 to the Data Protection Act 2018) in relation to sensitive personal data, where relevant) and subject to compliance with the rest of the data protection legislation. (b) Any additional safeguards that should be in place for solely automated processing of personal data, given that removal of Article 22 would remove the safeguards currently listed in Article 22 (3) and (4)

Scrapping Article 22 of the UK GDPR would impose an unbearable burden on individuals. The transparency, accountability and fairness of automated systems depend on the organisations that implement them. Removing the right to human review would shift this burden from organisations to individuals. In turn, it would mark a fundamental departure from the principle set out in the GDPR that data protection should uphold human dignity and be “designed to serve mankind”.

While we recognise that Article 22 provides neither comprehensive nor sufficient protection from the harms associated with algorithmic decision-making, this is not a reason to remove it. On the contrary, it calls for a dedicated legal regime that introduces further and stronger protections against the harms associated with AI, thus complementing the existing standards of the UK GDPR.

Furthermore, Article 22 of the UK GDPR has proven invaluable in providing remedies for individuals harmed by automated systems:

1. Workers:

Article 22 has been used by workers to stand up to abusive practices, including the racially discriminatory facial recognition system used by Uber, which led to the unfair dismissal of a driver and a courier after it failed to recognise them.[1]

2. Immigration:

2.1. Visa applications

If the UK government scrapped Article 22, automated decisions on sorting visa applications would be taken opaquely, without human review. Scrapping DPIA requirements would further allow these automated systems to be deployed without a proper assessment of the risks they pose to individuals. The Home Office has already set a precedent by processing visa applications with a "racist" algorithm, which sorted applications on biased criteria, including nationality. A visa applicant allocated by the algorithm to the "red" category because of their nationality had much lower prospects of a successful application than an otherwise equivalent applicant of a different nationality allocated to the "green" category. The ability to find out how these decisions were made enabled a judicial review to be launched, which led to the Home Office agreeing to suspend use of the algorithm. Without the transparency afforded by subject access requests and the protections offered by Article 22 GDPR, this would not have been possible.[2]

Plans to introduce fees for data access requests would create significant barriers for migrants who wish to exercise their rights but may lack the resources, or even a bank account, to pay. In essence, this would replicate the restrictions of the immigration exemption in the Data Protection Act 2018.

2.2. Immigration exemption

The government implemented an "immigration exemption" in the Data Protection Act 2018 which prevents migrants from exercising their right to access their personal data. This has prevented migrants from asking public or private organisations whether and how their personal data is used to determine their eligibility for public benefits, credit, employment, housing and other life necessities.

Open Rights Group and the3million challenged the immigration exemption in court. The Court of Appeal found that the exemption was incompatible with the GDPR.

However, the UK government now proposes to impose a fee for data access requests, making them expensive and dissuading migrants from exercising their rights. Even if they paid, migrants' requests could still be refused by an organisation that considers them "too onerous". In essence, this would reintroduce the barriers to accessing migrants' data that were first erected by the unlawful immigration exemption.

Q2.3.4. To what extent do you agree with the following statement: ‘There is a case for re-introducing a small nominal fee for processing subject access requests (akin to the approach in the Data Protection Act 1998)’?

We strongly disagree that there is a case for re-introducing a small nominal fee for processing subject access requests.

Q2.3.4a. Please explain your answer, and provide supporting evidence where possible, including what a reasonable level of the fee would be, and which safeguards should apply.

Please see our response to Q1.5.17.

Q4.4.8. To what extent do you agree with the following statement: ‘There is an opportunity to streamline and clarify rules on police collection, use and retention of data for biometrics in order to improve transparency and public safety’?

We strongly disagree with this statement, on the presumption that "streamlined" and "clarified" rules would seek to make it easier for the police to collect, use and retain biometric data. It has been established that the police are already unlawfully retaining millions of "mugshots" of individuals, a matter which has still not been rectified. Police forces are also testing and using both live and retrospective facial recognition systems that raise serious questions regarding the rights to privacy, freedom from discrimination and freedom of assembly and association, among numerous others. There is no evidence that making it easier to collect and use such data will make the public any safer; in fact, it may well do the opposite, by increasing the risk of breaches and misuse of sensitive personal data and the opportunities for pervasive surveillance of particular social groups, such as ethnic minorities and protesters.

Q4.5.1. To what extent do you agree with the proposal to standardise the terminology and definitions used across UK GDPR, Part 3 (Law Enforcement processing) and Part 4 (Intelligence Services processing) of the Data Protection Act 2018?

We strongly disagree with the proposal to standardise the terminology and definitions used across the UK GDPR and Parts 3 and 4 of the Data Protection Act 2018.

Q4.5.1a. Please explain your answer, and provide supporting evidence where possible.

Standardising the terminology and definitions used across the public and private sectors would facilitate greater cooperation on data processing between the police, the intelligence agencies and what are referred to as "national security partners".

The UK government's consultation paper does not state directly whether the aim of the proposed reforms is to facilitate the increased use of public-private partnerships, nor does it provide any other significant detail on why such a reform would be necessary or proportionate. However, given the intention to remove many of the protections offered by the existing data protection regime – in particular, the plan to eliminate human review of automated decision-making by ending the application of Article 22 of the GDPR – there is significant cause for concern about what these changes may mean.

Not only does the government plan to standardise terminology and definitions; it also intends to amend the provisions on joint controllership, to enable controllers operating under Parts 3 and 4 of the Data Protection Act 2018 to increase collaboration. Currently, entities designated as data controllers under different parts of the Act cannot act as joint controllers.

If they were able to do so, it would become possible for the police, intelligence agencies and "national security partners" to jointly determine where, how and by whom personal data should be processed. This raises significant risks for individual rights, particularly given that it may be more difficult for individuals to exercise their rights to access, rectification or erasure in the context of policing and national security than in, for example, a commercial context.

Q5.2.11. To what extent do you agree with the proposal for the Secretary of State for DCMS to periodically prepare a statement of strategic priorities which the ICO must have regard to when discharging its functions?

We strongly disagree with the proposal "to introduce a new power for the Secretary of State for DCMS to periodically prepare a statement of strategic priorities to which the ICO must have regard".

Q5.2.11a. Please explain your answer, and provide supporting evidence where possible.

The ICO is a watchdog whose function is to monitor and enforce the law against public and private bodies, including the government. Giving such a power to the Secretary of State for DCMS would create a glaring conflict of interest and undermine the independence of the Information Commissioner.

Q5.3.5. To what extent do you agree that the salary for the Information Commissioner (i.e. the proposed chair of the ICO in the future governance model) should not require Parliamentary approval?

We strongly disagree that "the salary for the Information Commissioner should not require Parliamentary approval".

Q5.3.5a. Please explain your answer, and provide supporting evidence where possible.

The ICO is a watchdog whose function is to monitor and enforce the law against public and private bodies, including the government. Giving the government control over the Commissioner's salary would create a situation in which it could retaliate against Commissioners who do not guarantee it impunity or otherwise condone its actions.

[1] https://www.adcu.org.uk/news-posts/adcu-initiates-legal-action-against-ubers-workplace-use-of-racially-discriminatory-facial-recognition-systems

[2] https://www.jcwi.org.uk/news/we-won-home-office-to-stop-using-racist-visa-algorithm

