Biometric processing – Facial recognition / fingerprint scanning
How Can You Embrace Biometric Technology and Remain GDPR Compliant?
Biometric processing – in particular, the use of facial recognition and fingerprint scanning to identify individuals – is the subject of much debate. Many see it as an infringement of privacy and ethically wrong. Others are not against it: they view it as a technological advance that can be used for mass surveillance to protect us from terrorist threats, or to make business environments operate more efficiently – but acknowledge it needs some degree of control.
The use of biometric processing may appear appealing to organisations that need a robust system to monitor the activity of individuals in a specific area, or for a specific purpose. However, installing such technology carries serious implications – and many organisations are unaware of them.
“Biometrics” is defined as the recognition of an individual based on their biological and behavioural characteristics. It includes:
- Retina recognition
- Facial recognition
- Hand and finger geometry
- Voice recognition
- Vein recognition
- DNA matching
Here, we will look at the use of biometrics in the workplace and private organisations – and cover the areas data controllers should consider ahead of installing such technologies. Failure to do so could land you with a higher-tier GDPR fine – 4% of your company’s annual turnover or €20m, whichever is greater – and you don’t even need to have suffered a breach: that fine could come from someone reporting you to the ICO if they believe you are unlawfully processing their special category data.
Facial recognition
Facial recognition is the processing of data relating to the facial features of an individual. Anyone coming into an area covered by a facial recognition camera will have their face scanned as the system searches for a match against its internal database. When the system alerts to a match, the organisation running the system can react appropriately. If there is no match, the captured image is deleted.
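The match-or-delete flow described above can be sketched as follows. This is a minimal illustration only – the scalar "encodings", the threshold value and the alert hook are assumptions for the sketch, not any vendor's actual API; real systems compare high-dimensional face embeddings with a tuned threshold.

```python
THRESHOLD = 0.9
alerts = []  # stands in for the operator's alerting channel


def similarity(a, b):
    # Toy similarity on scalar "encodings"; real systems use
    # distance metrics over high-dimensional embeddings.
    return 1.0 - abs(a - b)


def process_capture(encoding, watchlist):
    """Return the matched subject id, or None after discarding the capture."""
    for subject_id, known in watchlist.items():
        if similarity(encoding, known) >= THRESHOLD:
            alerts.append(subject_id)  # operator reacts appropriately
            return subject_id
    # No match: the captured image must be deleted, not retained
    return None
```

The key data protection point is in the final branch: a capture that does not match must leave no trace in the system.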
Why Is There So Much Fuss Around Facial Recognition Technology?
Much of the controversy around this technology is due to the camera scanning every face that comes into its view – irrespective of whether your image is on the database. By default, your biometric data is being processed by the organisation operating that camera. If you are to follow the GDPR precisely, organisations running such a system need to obtain explicit consent from each person affected by the processing – and if the camera is in a place of high footfall, or a public environment, explicit consent is almost impossible to achieve. There are exemptions detailed in Article 9.2 of the GDPR; however, in areas of high footfall, these are difficult to rely on.
This presents data controllers who want to use the system with a legal and ethical debate as to how they can install this technology yet still comply with data protection legislation – or at least build an argument to demonstrate they have considered the factors associated with the processing. By this, we mean conducting a detailed data protection impact assessment (DPIA) and exploring all possible alternatives to biometric processing – and having the evidence to prove this if a data subject or the ICO requests it.
Biometric data is classified as special category data under the GDPR and, unless there are extreme circumstances (which must be clearly demonstrated), under Article 9 you are required to gain explicit consent from those individuals.
Accuracy of Facial Recognition
There is much debate about the accuracy of facial recognition systems. Evidence suggests that unless a data subject is a white male or female (ie, for those who are BAME or transgender), there is a greater chance of the system registering an inaccurate match. This could lead to an individual being apprehended or denied access because they were misidentified. Depending on the situation, this could be highly embarrassing – and a breach of their rights and freedoms as an individual, the protection of which is a fundamental reason data protection legislation exists.
Should you decide to engage a specialist consultant for privacy guidance on your system, they will advise on ways to identify any inaccuracies in the system without having to rely on the supplier’s sales pitch. If your system turns out to be biased against certain minorities, you will need to explain this to the ICO.
Fingerprint scanning
Many companies are also turning to fingerprint scanning in place of older-style clocking-in systems, as it is more reliable and prevents the system from being cheated – especially by employees trying to clock in on behalf of colleagues who are running late. Several companies we have spoken to say the system improves punctuality and reduces absenteeism. It also cuts the cost of replacing lost swipe cards.
However, the system does come with its challenges – and companies suddenly realise that this processing activity carries responsibilities. As data controllers, if consent is not appropriate, can they demonstrate an appropriate legal basis for the activity? In many cases, this is where specialist guidance is needed to help build a case supporting its use – money well spent, because in the event of a complaint by a data subject, that case will be essential to demonstrate the accountability principle of the GDPR. If you are found to be unlawfully processing biometric data, you are likely to receive a substantial fine from the ICO – you don’t even need to have suffered a data breach.
Explicit consent
The reason explicit consent is so hard to achieve is the level of detail that the data subject needs to be informed of… and agree to. The data subject must be asked for consent in line with Article 7 of the GDPR – a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of his or her data. Silence, pre-ticked boxes or inactivity do not constitute consent. There must also be the opportunity to revoke consent at any time, without detriment to the data subject.
Going a stage further, explicit consent requires you to inform the data subject exactly how their data will be processed, who runs the system (internally, or a third-party supplier), how the data will be stored, how long you intend to keep it and what the inherent risks (if any) are – ie, if the data is breached, what are the implications for the individual? You must also be able to demonstrate this is the most appropriate processing activity: could the same outcome have been achieved by an activity of lesser risk?
You are advised to obtain a dated signature, or other confirmation, from the data subject that they have agreed to this processing activity and understand they can change their mind at any time.
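The record-keeping described above can be sketched as a minimal consent log. The class and field names here are illustrative assumptions for the sketch, not any specific compliance product's API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    subject_id: str            # who gave consent
    purpose: str               # exactly what processing was agreed to
    granted_at: datetime       # the dated confirmation
    revoked_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        """Consent counts only while it has not been revoked."""
        return self.revoked_at is None

    def revoke(self) -> None:
        # The data subject can change their mind at any time,
        # without detriment
        self.revoked_at = datetime.now(timezone.utc)
```

Whatever form the record takes, the essentials are the same: a dated confirmation tied to a specific, stated purpose, and a revocation path that is always available.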
Why does this technology risk a huge fine?
As the name suggests, biometric processing is based on the use of a person’s biometrics. If you use facial recognition, the system matches your facial features to a photograph or other image of your face held on the database. If you use fingerprint technology to sign in/out of a workplace, then the system holds a copy of this fingerprint.
This personal data is classified as “special category data” under the GDPR – and if this data is breached, it is likely to have a significant impact on the rights and freedoms of that individual. While it is an inconvenience to have your bank account or credit card details breached – these can be changed, and banks offer reasonable protection against theft of your money – you cannot change your facial features, voice or fingerprint. The risk of the individual becoming a victim of fraud at any point in their life greatly increases – all because you suffered a breach in your processing activity. Quite simply, it is a huge risk and, as the data controller, you bear it.
The GDPR therefore requires organisations that wish to use this technology and process biometric data to do so under special conditions – from extra protection when the data is stored and conducting a DPIA (data protection impact assessment), to gaining explicit consent from the data subjects (unless you can demonstrate a specific exemption).
Failure to demonstrate any of these considerations – whether you have a breach or not – is likely to result in an ICO investigation and a potential top-tier fine of 4% of your turnover or €20m, whichever is higher. Aside from the investigation and fine, you need to consider the fall-out among data subjects, an increase in subject access requests and loss of reputation.
Embrace technology
Technology is there to be embraced, and we openly encourage it. However, if that technology involves the processing of biometric (special category) personal data, we urge you to do appropriate research first.
The best advice is to engage a data protection consultant with specialist experience in biometric processing. This independent specialist will listen to your reasons for employing the technology – and how it has improved your business or helped you comply with industry-specific legislation you are legally obliged to adhere to. The specialist will build your case, weigh the risks to the data subject against the benefits and help you make an informed decision – this argument can then be presented to the ICO if it is deemed they should be aware of the processing.
Contact us now for this guidance – our team have specialist experience in biometric processing and will provide expert, independent guidance you can rely on.