GDPR: encryption, pseudonymisation and anonymisation – security as a Russian doll

The deadline of May 25th 2018 has passed, and even though the GDPR legislation has been public for well over a year, most businesses and organisations are only now beginning to realise that it applies to them too.

The private sector is playing catch-up, and brand-owners arriving late are struggling to get their heads around concepts as abstruse as data encryption, pseudonymisation and anonymisation. Most understand that personally identifiable information (PII), e.g. a person’s name, mobile phone number or email address, now enjoys special protection under GDPR, and that organisations in possession of it would do well to ensure it is safe not just from prying eyes but from identity thieves. A line has been drawn, and from here on breaches of PII data will be severely dealt with. In summary, GDPR assigns ownership of PII to the people while putting organisations in the frame to protect it.

ISO 27001: building a layered defence

The measures organisations take to protect PII may be likened to the nested layers of a Matryoshka Russian doll. Best security practice (ISO 27001:2013) requires that the outermost layer has an alarmed perimeter fence. External access points should be video-monitored and fitted with fingerprint or swipe-card readers. Inside, access to IT and communications rooms is restricted to one or two individuals. Role-based access controls need to be in place to ensure that network users are granted access, and permission to view or make changes, according to the principle of least privilege (sketched below). Office PCs in each department can reside on their own subnet and VPN, with inter-department communications only possible through policy-controlled firewall bridges. Security can be further increased by enforcing a frequent password-change policy and ensuring that screen-saver timeouts are set to a minimum on all desktops. Remote access to departmental VPNs might be prohibited entirely, or controlled using a scheme of password-generating dongles that allow only authorised users to tunnel through the Internet perimeter firewall to access files and Intranet services.
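To make the least-privilege idea concrete, here is a minimal, hypothetical sketch in Python: a role only ever carries the permissions explicitly granted to it, and anything not granted is denied. The role names, permissions and example checks are assumptions for illustration, not part of ISO 27001 itself.

```python
# Minimal role-based access control (RBAC) sketch illustrating least privilege.
# Role names and permissions are hypothetical.

ROLE_PERMISSIONS = {
    "hr_clerk":   {"view_employee_record"},
    "hr_manager": {"view_employee_record", "edit_employee_record"},
    "it_admin":   {"view_employee_record", "edit_employee_record", "delete_employee_record"},
}

def is_allowed(role: str, action: str) -> bool:
    """Allow an action only if the role explicitly grants it; the default is deny."""
    return action in ROLE_PERMISSIONS.get(role, set())

# A clerk can view records but cannot edit or delete them.
assert is_allowed("hr_clerk", "view_employee_record")
assert not is_allowed("hr_clerk", "edit_employee_record")
```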

How safe is your personal data?

PII can be likened to the priceless museum artefact in a crime thriller movie. If it were not valuable, who would steal or attack it? Despite the best efforts of museums and security vendors, artefacts, and indeed PII, go missing all the time. Data breaches are a crime. This is why the innermost Babushka, the baby-most Matryoshka layer, deserves special attention. Here we position all files and databases containing PII, and two additional security measures can be applied to secure this data to the point that it will be useless to anyone who steals it. Firstly, PII entering or leaving the innermost layer can be encrypted, pseudonymised or anonymised while it is in motion. Secondly, PII at rest in files or databases can also be encrypted, pseudonymised or anonymised, either using software embedded in the file system or by placing a hardware security appliance in front of the hard disk drives (HDDs).

GDPR &  encryption, pseudonymisation and anonymisation

So what exactly are encryption, pseudonymisation and anonymisation, and how are these concepts linked to GDPR?

Encryption requires a key (typically a string of 16, 32 or 64 random bytes) to convert PII payloads into unintelligible ciphertext. The PII can only be deciphered when the same key (symmetric encryption) or the matching key of a key pair (asymmetric encryption) is applied. Encryption is therefore a reversible process, but it has its downsides: it is costly in terms of the time and CPU resources needed to encipher and decipher PII. GDPR states that if encrypted data is breached (and the keys remain safe), the organisation’s Data Controller must report the incident to the Supervisory Authority, i.e. the Data Protection Commissioner (DPC) in Ireland. However, the Data Controller is not obliged to inform each Data Subject that a breach has occurred, as the PII is considered safe.
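As a concrete illustration, here is a minimal sketch of symmetric encryption in Python using the cryptography package’s Fernet recipe (the library choice and the sample PII are assumptions for illustration, not something the article specifies). The same secret key both encrypts and decrypts, so a stolen ciphertext is useless while the key remains safe.

```python
# Symmetric encryption/decryption sketch using the "cryptography" package
# (pip install cryptography). Fernet derives its security from a random 32-byte key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # random key: must be stored and guarded separately
cipher = Fernet(key)

pii = b"Jane Doe, +353 87 123 4567, jane@example.com"   # fictitious example PII
token = cipher.encrypt(pii)      # unintelligible ciphertext
print(token)

# Reversible: whoever holds the key can recover the original PII.
assert cipher.decrypt(token) == pii
```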

Pseudonymisation uses fast hash functions to make PII unreadable and unintelligible to prying human eyes, and visually indistinguishable from the more secure encrypted form. The downside is that it yields easily to computational recovery methods, for example brute-force or dictionary attacks against the hash, so the PII can be restored, sometimes to unauthorised viewers. GDPR states that if pseudonymised data is breached, the Data Controller must notify the DPC as well as the Data Subjects that their PII has been breached. It is interesting that pseudonymised data is not considered any more secure than human-readable data under GDPR breach-reporting rules. Organisations can expect to be fined after such data breaches.
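The sketch below illustrates the point with SHA-256 (the choice of hash and the sample phone number are assumptions for illustration). The digest looks just as opaque as ciphertext, yet because the space of valid mobile numbers is small, a simple dictionary attack recovers the original value without needing any key.

```python
# Pseudonymisation sketch: replacing PII with a hash digest (hashlib is in the standard library).
import hashlib

def pseudonymise(value: str) -> str:
    """Return the hex-encoded SHA-256 digest of a PII value."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

phone = "0871234567"             # fictitious mobile number
token = pseudonymise(phone)
print(token)                     # visually indistinguishable from encrypted data

# Downside: hash every candidate number and compare digests -- no key required.
candidates = (f"087{n:07d}" for n in range(10_000_000))
recovered = next(c for c in candidates if pseudonymise(c) == token)
assert recovered == phone
```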

Anonymisation is different. It is a one-way process: once data is anonymised, there is no way to recover it, even if infinite computing resources were brought to bear on the problem. So how useful can anonymised data be? One common use is that it facilitates the sharing of large datasets amongst the Data Science community to train and improve Machine Learning algorithms without having to obtain the consent of Data Subjects in advance. GDPR states that if anonymised data is breached, Data Controllers do not need to notify the DPC or inform Data Subjects of the event. Anonymised data is considered safe.
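As an illustration of what a one-way transformation can look like in practice, the hypothetical sketch below drops direct identifiers and generalises the remaining fields into coarse bands; the field names and banding rules are assumptions for the example. No key or amount of computation can turn the output back into the original record, although real-world anonymisation must also consider re-identification risk across the whole dataset (k-anonymity and similar measures), not just individual fields.

```python
# Anonymisation sketch: drop direct identifiers, generalise quasi-identifiers.
# Field names and banding rules are hypothetical.

def anonymise(record: dict) -> dict:
    """Remove identifying fields and coarsen the rest so the person cannot be recovered."""
    decade = (record["age"] // 10) * 10
    return {
        "age_band": f"{decade}-{decade + 9}",
        "county": record["address"].split(",")[-1].strip(),  # keep only the coarsest locality
        "diagnosis": record["diagnosis"],                     # the value useful for analysis
    }

patient = {
    "name": "Jane Doe",                        # fictitious example record
    "email": "jane@example.com",
    "age": 34,
    "address": "12 Main St, Ennis, Co. Clare",
    "diagnosis": "asthma",
}

print(anonymise(patient))   # {'age_band': '30-39', 'county': 'Co. Clare', 'diagnosis': 'asthma'}
```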

In conclusion, best practice in data security suggests that the risk of a data breach occurring can be reduced by applying a multi-layered approach to protecting valuable PII. Under GDPR, organisations would do well to consider PII as being in the same category of objects as the priceless museum artefact, or indeed the living Babushka at the heart of a Russian nested doll.
