Although the Artificial Intelligence Act is not yet in force, we have seen examples of how the use of AI is and will be heavily influenced by data protection law.
Join us as we dive into one of the examples with this case about the City of Copenhagen.
The municipality asked the Danish DPA to assess the legal basis for developing, operating, and re-training an AI solution for identifying citizens in need of training and rehabilitation.
Here’s what you can look forward to reading about:
- The case in brief
- The Danish DPA’s assessment: Split in two
- The Danish DPA’s overall assessment
- Our 3 remarks
The case in brief
The City of Copenhagen wanted to develop, deploy, and re-train an AI solution based on their own data.
The purpose was to identify citizens in need of maintenance training and provide decision support for the municipality’s health and care administration.
The municipality wanted to minimize or delay the need for help from the administration by maintaining citizens’ functional level through training. The AI solution would support employees’ assessment of which citizens could benefit from training programs.
The municipality would process personal data, coming from three primary sources:
- Municipal nursing cases
- Referrals for services under the Act on Social Services
- Personal data that was necessary to perform assigned services
The processing would include the citizen’s social security number, health information, training programs, and use of assistive devices.
Did you know that…
70% of consumers are either very concerned or somewhat concerned about businesses’ use of AI tools?
(Source: Forbes)
The Danish DPA’s assessment: Split in two
The Danish DPA only considered the legal basis for the processing described in the municipality’s request.
For that reason, they split their assessment into two parts: 1) the development of the AI solution and 2) its operation.
The development, including re-training, of the AI solution
The Danish Data Protection Agency assessed that developing the solution did not directly affect citizens. On that basis, the DPA found that the City of Copenhagen could design and develop the AI solution.
However, the municipality should conduct an overall assessment of the AI solution’s entire lifecycle to ensure there is a legal basis for operating it.
Operation of the AI solution
For the operation of the solution, the DPA emphasized the following:
- Personal data would be processed to predict citizens’ need for training and rehabilitation according to the Danish Act on Social Services.
- The use of AI solutions in administrative case management was potentially intrusive for citizens, as it could affect their health situation through the healthcare services they were offered.
- There was a risk that employees would place more weight on the solution’s assessment than on their own, which would in turn increase the risk for the citizen.
- Processing large amounts of personal data, including sensitive information, made the processing particularly intrusive, especially when an AI solution was used. This required a clear, additional national legal basis.
The DPA concluded that the provisions of the Danish Act on Social Services do not provide a sufficient legal basis for the type of processing that an AI solution would entail.
The Danish DPA's overall assessment
- The development and re-training of an AI solution to predict citizens’ need for and benefit from rehabilitation to avoid disability was compliant with the GDPR.
- Articles 6(1)(e) and 9(2)(g) of the GDPR allowed the processing of personal data and special categories of personal data. However, such processing also required an additional national legal basis.
- The Danish Act on Social Services contains provisions that require the municipality to decide on and provide maintenance training and rehabilitative efforts. However, these provisions could not form the legal basis for the operation of the AI solution, as the solution posed a higher risk to the data subjects and the legal basis for the intended processing was not clear enough.
Our 3 remarks
From our point of view, there are three interesting perspectives on the case:
- Different phases of the development and use of AI involve different risks and require different legal bases. In this case, the DPA set higher requirements for the actual operation of an AI tool than for its development.
- If the legal basis for processing is a national legal basis, the clarity of the legal basis must be assessed based on how direct and intrusive the processing is for citizens.
- When using AI, a DPIA must be conducted if the processing poses a high risk to the rights of data subjects, cf. the Danish DPA’s guidelines on authorities’ use of artificial intelligence. This should be done early and continuously.
Several ways to get ready for the AI Act
If you want to get ready for the AI Act in time, the example of the City of Copenhagen gives you an idea of what to prepare for.
But it’s just as important to get the basics right, so you know what an AI solution is, what risk it poses, and when the different compliance deadlines come into effect.
You can get all that knowledge in this blog post.
Get the basics of the EU's AI Act in place
Are you stuck on AI Act questions like what, how, when, who and why?