The law across Europe now says a DPIA is required when processing is “likely to result in a high risk to the rights and freedoms of natural persons” (GDPR Article 35(1)); the key term here is “high risk”. To be clear, not all processing requires a DPIA: it is mandatory only for the subset of processing activities that meets the high-risk threshold. Since the GDPR is a new law and there has been very little litigation so far, it is impossible to be definitive about what is or is not high risk. If you need specific legal advice, please talk to your advisors.
A DPIA is a useful tool for any sensible organisation or investor. The concept behind a DPIA is to let you make informed decisions about the acceptability of data protection risks and communicate effectively with the individuals affected. In the real world (and DigiTorc are nothing if not realists) we know that risks can never all be eliminated; we need to figure out how to measure, manage and mitigate them so we can all stay in business. A DPIA gives us a way to identify and mitigate data protection risks, plan for the implementation of any solutions to those risks, and assess the viability of a project at an early stage. Below are five warning signs organisations should watch out for:
- Automated profiling is often picked out as a classic example of processing that needs a DPIA (Recitals 71 and 91). Under this heading you might find a startup that does CX analytics or a major bank screening for money laundering.
- Automated decision-making that can impact people (Article 35(3)(a)) is another indicator. Under this heading you might think of an artificial intelligence system screening CVs or a bank processing loan applications.
- A third heading is systematic monitoring (Article 35(3)(c)). Here you might want to watch out for CCTV installations in public places: organisations are increasingly seduced by the apparent low cost of CCTV equipment without considering how to handle the personal data it generates. Also under this heading are IoT devices, wearables and the like. A smart meter can be used to detect when people wake up and when they get home from work; fitness watches know your location with pinpoint precision.
- Special category data is defined in Article 9, covering items such as racial or ethnic origin, health and sexuality, as well as trade union membership and religious beliefs. There might be excellent reasons for an organisation to process this kind of data, but the law says the risk must also be considered.
- The last element we want to point to is scale. The legislation talks about “large scale” but doesn’t explain what that is. Recital 91 is a useful starting point, and WP29 recommends that the following factors, in particular, be considered when determining whether you meet this criterion:
- the number of data subjects concerned, either as a specific number or as a proportion of the relevant population;
- the volume of data and/or the range of different data items being processed;
- the duration, or permanence, of the data processing activity;
- the geographical extent of the processing activity.
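As a rough illustration (and emphatically not legal advice), the five warning signs above could be captured in a simple screening checklist. The flag names and the decision rule below are our own illustrative choices for demonstration, not terms drawn from the Regulation:

```python
# Illustrative DPIA screening sketch. The flags and the "any warning sign
# triggers a review" rule are hypothetical choices, not legal rules.
from dataclasses import dataclass

@dataclass
class ProcessingActivity:
    profiling: bool = False              # automated profiling (Recitals 71, 91)
    automated_decisions: bool = False    # decisions impacting people (Art. 35(3)(a))
    systematic_monitoring: bool = False  # e.g. CCTV, IoT, wearables (Art. 35(3)(c))
    special_category_data: bool = False  # Article 9 data
    large_scale: bool = False            # see Recital 91 / WP29 factors

def dpia_recommended(activity: ProcessingActivity) -> bool:
    """Return True if the activity trips any of the five warning signs."""
    return any([
        activity.profiling,
        activity.automated_decisions,
        activity.systematic_monitoring,
        activity.special_category_data,
        activity.large_scale,
    ])

# Example: an AI system screening CVs profiles people and makes
# decisions that affect them, so it should be reviewed for a DPIA.
cv_screening = ProcessingActivity(profiling=True, automated_decisions=True)
print(dpia_recommended(cv_screening))  # True
```

In practice the decision is a matter of judgement across all the factors together; a checklist like this is only a prompt to look closer, never a substitute for proper assessment.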
Failure to carry out a DPIA when required (Article 35(1), (3) and (4)), carrying out a DPIA in an incorrect way (Article 35(2), (7) and (9)), or failing to consult your SA if required (Article 36(3)(e)) are all now offences. The DPC or another national SA can impose a fine of up to €10M, or up to 2% of total worldwide annual turnover, whichever is higher. In addition, the SA can order an organisation to stop processing personal data, which would put most organisations out of business.
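To make the “whichever is higher” arithmetic concrete, here is a minimal sketch; the turnover figures are made up for illustration:

```python
def max_dpia_fine(annual_turnover_eur: float) -> float:
    """Upper bound of the fine: EUR 10M or 2% of total worldwide
    annual turnover, whichever is higher."""
    return max(10_000_000, 0.02 * annual_turnover_eur)

# A group turning over EUR 2bn: 2% = EUR 40M, which exceeds EUR 10M.
print(max_dpia_fine(2_000_000_000))  # 40000000.0

# A group turning over EUR 100M: 2% = EUR 2M, so the EUR 10M floor applies.
print(max_dpia_fine(100_000_000))    # 10000000
```

Note this is only the statutory ceiling; the actual fine in any case is set by the SA with regard to the factors in Article 83.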
As a matter of good practice, a DPIA should be continuously reviewed and regularly re-assessed. Even where a DPIA was not required when processing began, changes to the nature, scope, context or purposes of the processing may mean the controller now needs to conduct one as part of its legal obligations. If you have any questions about DPIAs, please contact us!