GDPR365 Automated Decision Making and Profiling

SUMMARY
Article 22 – Automated individual decision-making, including profiling
(1) The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

‘profiling’ means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements

From the above we should observe that:

Automated processing is processing carried out by machine, e.g. by computers
An automated decision is a decision taken without human intervention
Automated decisions can be made on any type of data, not just personal data
'Profiling' under the GDPR comprises three elements:

it has to be an automated form of processing;
it has to be carried out on personal data; AND
the objective of the profiling must be to evaluate personal aspects about a natural person

Note that there are exceptions to the prohibition

…if the decision is…
(a) necessary for entering into, or the performance of, a contract;
(b) authorised by Union or Member State law to which the controller is subject; OR
(c) based on the data subject’s explicit consent
Note further that the prohibition in Article 22(1) will only apply when a decision based solely on automated processing, including profiling, has a legal effect on, or similarly significantly affects, someone

Extremely important
Even where one of these exceptions does apply, the GDPR provides a further layer of protection for data subjects
Automated decision-making that involves special categories of personal data (sensitive personal data) is only allowed under certain conditions
A data protection impact assessment shall be required in the case of such profiling
Various rights of the data subject must be protected
Therefore, certain safeguards must be established
Read the DETAIL section below

DETAIL

Be aware - A data protection impact assessment shall be required in the case of '…a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person' (Article 35(3)(a)) - i.e. this kind of profiling

Profiling and automated decision-making are used in an increasing number of sectors, both private and public. Banking and finance, healthcare, taxation, insurance, marketing and advertising are just a few examples of the fields where profiling is being carried out more regularly to aid decision-making. Profiling and automated decision-making can be useful for individuals and organisations as well as for the economy and society as a whole, delivering benefits such as increased efficiency and resource savings.

However, profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms which require appropriate safeguards. These processes can be opaque. Individuals might not know that they are being profiled or understand what is involved.

Profiling

The GDPR says that profiling is automated processing of personal data for evaluating personal aspects, in particular to analyse or make predictions about individuals. Therefore, simply assessing or classifying individuals based on characteristics such as their age, sex, and height could be considered profiling, regardless of any predictive purpose.

Profiling comprises three elements:

it has to be an automated form of processing;
it has to be carried out on personal data***; AND
the objective of the profiling must be to evaluate personal aspects about a natural person

*** As opposed to automated decisions, which can be made on any type of data

Broadly speaking, profiling means gathering information about an individual (or group of individuals) and analysing their characteristics or behaviour patterns in order to place them into a certain category or group, and/or to make predictions or assessments about, for example, their:

ability to perform a task;
interests; or
likely behaviour.

Example
A data broker collects data from different public and private sources, either on behalf of its clients or for its own purposes. The data broker compiles the data to develop profiles on the individuals and places them into segments. It sells this information to companies who wish to improve the targeting of their goods and services. The data broker is in fact carrying out profiling by placing a person into a certain category according to their interests. (Whether or not there is automated decision-making as defined in Article 22(1) will depend upon the circumstances)
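To make the mechanics of this example concrete, here is a minimal Python sketch of the kind of interest-based segmentation a data broker might perform. It is illustrative only: the field names, rules and segment labels are invented, and real systems are far more elaborate. Note that all three elements of profiling are present: the processing is automated, it runs on personal data, and it evaluates personal aspects (here, likely interests).

    # Hypothetical rule set - every field name and threshold is invented.
    def assign_segment(person):
        """Place one individual into a marketing segment by evaluating
        their interests, as inferred from behavioural data."""
        if person["travel_page_views"] > 10:
            return "frequent traveller"
        if "sports" in person["purchase_categories"]:
            return "fitness enthusiast"
        return "general audience"

    customers = [
        {"id": 1, "travel_page_views": 14, "purchase_categories": ["books"]},
        {"id": 2, "travel_page_views": 2, "purchase_categories": ["sports"]},
    ]
    print({c["id"]: assign_segment(c) for c in customers})
    # {1: 'frequent traveller', 2: 'fitness enthusiast'}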

Automated Decision Making

Automated decision-making has a different scope and may partially overlap with profiling. Solely automated decision-making is the ability to make decisions by technological means without human involvement. Automated decisions can be based on any type of data, for example:

data provided directly by the individuals concerned (such as responses to a questionnaire);
data observed about the individuals (such as location data collected via an application);
derived or inferred data such as a profile of the individual that has already been created (e.g. a credit score)

Example
Imposing speeding fines purely on the basis of evidence from speed cameras is an automated decision-making process that does not necessarily involve profiling.
It would, however, become a decision based on profiling if the driving habits of the individual were monitored over time, and, for example, the amount of the fine imposed is the outcome of an assessment involving other factors, such as whether the speeding is a repeat offence or whether the driver has had other recent traffic violations.
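The distinction can be sketched in a few lines of Python. The speed limit, tariff and repeat-offence uplift below are invented figures, used only to show the difference between a decision based on a single observed measurement and one based on an evaluation of behaviour over time.

    SPEED_LIMIT_KMH = 100
    FINE_PER_KMH_OVER = 15  # hypothetical tariff

    def camera_fine(measured_speed_kmh):
        # Solely automated decision without profiling: the outcome depends
        # only on one observed fact, not on any evaluation of the driver.
        excess = measured_speed_kmh - SPEED_LIMIT_KMH
        return excess * FINE_PER_KMH_OVER if excess > 0 else 0

    def profiling_based_fine(measured_speed_kmh, prior_offences):
        # Here the amount is the outcome of assessing the driver's conduct
        # over time (repeat offences), so the decision involves profiling.
        return camera_fine(measured_speed_kmh) * (1 + 0.5 * prior_offences)

    print(camera_fine(120))              # 300
    print(profiling_based_fine(120, 2))  # 600.0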

Decisions that are not wholly automated might also include profiling. For example, before granting a mortgage, a bank may consider the credit score of the borrower, with additional meaningful intervention carried out by humans before any decision is applied to an individual.

How are these addressed in the GDPR?

Article 22

The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

In summary, Article 22 provides that:
(i) as a rule, there is a prohibition on fully automated individual decision-making, including profiling, that has a legal or similarly significant effect;
(ii) there are exceptions to the rule;
(iii) there should be measures in place to safeguard the data subject’s rights and freedoms and legitimate interests

Note that the prohibition in Article 22(1) will only apply when a decision based solely on automated processing, including profiling, has a legal effect on, or similarly significantly affects, someone

…based solely on automated processing…
This means that there is no human involvement in the decision process.

Example
An automated process produces what is in effect a recommendation concerning a data subject. If a human being reviews and takes account of other factors in making the final decision, that decision would not be ‘based solely’ on automated processing.

Note that the controller cannot avoid the Article 22 provisions by fabricating human involvement. For example, if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, this would still be a decision based solely on automated processing.

Also, to qualify as human intervention, the controller must ensure that any oversight of the decision is meaningful, rather than just a token gesture. It should be carried out by someone who has the authority and competence to change the decision.
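Read together, these criteria amount to a simple test, sketched below in Python. The three boolean inputs are our own shorthand for the conditions described above, not terms taken from the GDPR itself.

    def is_solely_automated(reviewed_by_human,
                            reviewer_can_change_outcome,
                            reviewer_weighed_other_factors):
        # A token review does not count: the reviewer must have the
        # authority and competence to change the decision, and must
        # actually weigh additional factors rather than rubber-stamping
        # the automatically generated result.
        meaningful = (reviewed_by_human
                      and reviewer_can_change_outcome
                      and reviewer_weighed_other_factors)
        return not meaningful

    # A routine rubber stamp is still 'based solely on automated processing':
    print(is_solely_automated(True, False, False))  # True
    # A genuine review by someone empowered to change the result is not:
    print(is_solely_automated(True, True, True))    # False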

…legal effects…
A legal effect suggests a processing activity that has an impact on someone’s legal rights, such as the freedom to associate with others, vote in an election, or take legal action. A legal effect may also be something that affects a person’s legal status or their rights under a contract. For example, automated decisions that mean someone is:

entitled to or denied a particular social benefit granted by law, such as child or housing benefit;
refused entry at the border;
subjected to increased security measures or surveillance by the competent authorities; or
automatically disconnected from their mobile phone service for breach of contract because they forgot to pay their bill before going on holiday.

…similarly significantly affects him or her…
Even if a decision-making process does not have an effect on people’s legal rights it could still fall within the scope of Article 22 if it produces an effect that is equivalent or similarly significant in its impact. For data processing to significantly affect someone the effects of the processing must be more than trivial and must be sufficiently great or important to be worthy of attention. At its most extreme, the decision may lead to the exclusion or discrimination of individuals.

Recital 71 provides the following typical examples: 'automatic refusal of an online credit application' and 'e-recruiting practices without any human intervention'. Beyond such examples, however, it is difficult to be precise about what would be considered sufficiently significant to meet the threshold.

Online advertising relies increasingly on automated tools and involves solely automated individual decision-making. In many typical cases targeted advertising does not have a significant effect on individuals, for example an advertisement for a mainstream online fashion outlet based on a simple demographic profile: ‘women in the Brussels region’. However, it is possible that it may do, depending upon the particular characteristics of the case, including:

the intrusiveness of the profiling process;
the expectations and wishes of the individuals concerned;
the way the advert is delivered; or
the particular vulnerabilities of the data subjects targeted

Processing that might have little impact on individuals generally may in fact have a significant effect on certain groups of society, such as minority groups or vulnerable adults. For example, someone in financial difficulties who is regularly shown adverts for online gambling may sign up for these offers and potentially incur further debt. Automated decision-making that results in differential pricing could also have a significant effect if, for example, prohibitively high prices effectively bar someone from certain goods or services.

Similarly significant effects may be positive or negative. They may also be triggered by the actions of individuals other than the one to whom the automated decision relates – for example:

Hypothetically, a credit card company might reduce a customer’s card limit, based not on that customer’s own repayment history, but on non-traditional credit criteria, such as an analysis of other customers living in the same area who shop at the same stores.

This could mean that someone is deprived of opportunities based on the actions of others. In a different context, however, using these types of characteristics might have the advantage of extending credit to people without a conventional credit history who would otherwise have been denied it.

Exceptions to the prohibition

The controller should not undertake the processing described in Article 22(1) unless one of the following Article 22(2) exceptions applies.
…if the decision is…
(a) necessary for entering into, or the performance of, a contract;
(b) authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; OR
(c) based on the data subject’s explicit consent

Special category data – Article 22(4) (sensitive personal data)

Automated decision-making (described in Article 22(1)) that involves special categories of personal data is only allowed under certain conditions provided for in the GDPR or by Union or Member State law (Article 22(4), referring to Article 9(2)(a) or (g)), namely:

9(2)(a) - the explicit consent of the data subject; or
9(2)(g) - processing necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and interests of the data subject.

Data subject’s right to be informed

The data subject must be informed of the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and must, at least in those cases, be given meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

If the controller is making automated decisions, they must:

tell the data subject that they are engaging in this type of activity;
provide meaningful information about the logic involved; and
explain the significance and envisaged consequences of the processing

The controller should find simple ways to tell the data subject about the rationale behind, or the criteria relied on in reaching, the decision, without necessarily attempting a complex explanation of the algorithms used or disclosing the full algorithm. The information provided should, however, be meaningful to the data subject.
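One practical approach, sketched below in Python, is for the decision system to emit plain-language reason codes alongside each outcome rather than exposing the model itself. The thresholds and field names are invented for illustration and are not drawn from the GDPR or any real credit model.

    def credit_decision(applicant):
        # Return the decision together with the main criteria relied on,
        # phrased for the data subject, without disclosing the algorithm.
        reasons = []
        if applicant["missed_payments"] > 2:
            reasons.append("more than two missed payments in the last year")
        if applicant["declared_income"] < 20000:
            reasons.append("declared income below the qualifying threshold")
        return {"decision": "refused" if reasons else "approved",
                "main_criteria": reasons}

    print(credit_decision({"missed_payments": 3, "declared_income": 25000}))
    # {'decision': 'refused',
    #  'main_criteria': ['more than two missed payments in the last year']}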

‘Significance’ and ‘envisaged consequences’ - These terms suggest that information must be provided about intended or future processing, and about how the automated decision-making might affect the data subject. To make this information meaningful and understandable, real, tangible examples of the type of possible effects should be given.

Example
An insurance company uses an automated decision-making process to set motor insurance premiums based on monitoring customers’ driving behaviour. To illustrate the significance and envisaged consequences of the processing it explains that dangerous driving may result in higher insurance payments and provides an app comparing fictional drivers, including one with dangerous driving habits such as fast acceleration and last-minute braking. It uses graphics to give tips on how to improve these habits and consequently how to lower insurance premiums.

Data subject’s right of access

Article 15(1)(h) entitles data subjects to have the same information about solely automated decision-making, including profiling, as required under Articles 13(2)(f) and 14(2)(g), namely:

the existence of automated decision-making, including profiling;
meaningful information about the logic involved; and
the significance and envisaged consequences of such processing for the data subject.

(The controller should have already given the data subject this information in line with their Article 13 obligations)

Data subject’s right not to be subject to a decision based solely on automated processing

Article 22(1) acts as a prohibition on solely automated individual decision-making, including profiling with legal or similarly significant effects. Instead of the data subject having to actively object to the processing, the controller can only carry out the processing if one of the three exceptions covered in Article 22(2) applies.

Even where one of these exceptions does apply, the GDPR provides a further layer of protection for data subjects in Article 22(3) ‘at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision’. The controller must provide a simple way for the data subject to exercise these rights.

Human intervention is a key element. Any review must be carried out by someone who has the appropriate authority and capability to change the decision. The reviewer should undertake a thorough assessment of all the relevant data, including any additional information provided by the data subject.

Establishing appropriate safeguards

If the basis for processing is Article 22(2)(a) – ‘contract’ – or Article 22(2)(c) – ‘explicit consent’, Article 22(3) requires controllers to implement suitable measures to safeguard data subjects’ rights, freedoms and legitimate interests. Such measures should include, as a minimum, a way for the data subject to obtain human intervention, express their point of view, and contest the decision.

This emphasises the need for transparency about the processing. The data subject will only be able to challenge a decision or express their view if they fully understand how it has been made and on what basis. Errors or bias in collected or shared data, or an error or bias in the automated decision-making process itself, can result in:

incorrect classifications; and
assessments based on imprecise projections

that impact negatively on individuals.

Controllers should carry out frequent assessments of the data sets they process to check for any bias, and develop ways to address any prejudicial elements, including any over-reliance on correlations. Systems that audit algorithms, and regular reviews of the accuracy and relevance of automated decision-making, including profiling, are other useful measures. Controllers should also introduce appropriate procedures and measures to prevent errors, inaccuracies or discrimination on the basis of special category data.
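As one illustration of such an assessment, the Python sketch below compares outcome rates across groups in a decision log and flags large disparities. The 20-percentage-point tolerance is an arbitrary placeholder, and a real bias audit would go far beyond this single check.

    from collections import defaultdict

    def approval_rates(decisions):
        # decisions: iterable of (group_label, approved) pairs
        totals, approved = defaultdict(int), defaultdict(int)
        for group, ok in decisions:
            totals[group] += 1
            approved[group] += ok
        return {g: approved[g] / totals[g] for g in totals}

    log = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False)]
    rates = approval_rates(log)
    print(rates)  # {'A': 0.666..., 'B': 0.333...}

    if max(rates.values()) - min(rates.values()) > 0.20:
        print("Large disparity between groups: investigate data and model.")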

If you would like to know how our service might enable your organisation's GDPR compliance journey, please visit us here


The content herein is provided for your convenience and does not constitute legal advice.
Compliance Technology Solutions B.V. 2018
