Updated: Dec 19, 2019
The latest information from the CCTV User Group & NASCAM, setting out our views on the use of Facial Recognition systems, is now available for our Members to download.
Question – Can Automatic Face Recognition (AFR) be used in the UK today?
In short, yes: AFR can be used, but we would suggest that many factors be considered before any investment in this technology is made.
There is little doubt that in today’s regulatory framework and political climate, the risks to the end-user organisation can be substantial (both financial and reputational).
We would suggest that investment in automatic face recognition should be put on hold until we have clear guidelines from the Government and Regulators.
Technically, however, no current legislation prohibits the use of AFR, although legislators (and regulators) are looking very closely at the technology and how it is being used.
Requirements to be considered (documentary evidence should be available) are:
Deployment for a narrowly defined purpose
Watchlists are to be limited in size, in line with the data protection principles requiring personal data to be adequate, relevant and not excessive in relation to the intended purpose, and must use images that are accurate, verifiable and lawfully held.
The end-user organisation's DPO must be involved, as required by s70 and s71 DPA 2018, and it is very important that the DPO assists the data controller to:
Monitor internal compliance;
Inform and advise on data protection obligations;
Provide advice regarding DPIAs and Equality Impact Assessments; and
Act as a contact point for data subjects and the ICO.
The current guidelines (mandated by the DPA 2018) state that when using AFR:
It involves the processing of personal data and therefore data protection law applies, whether it is for a trial or routine operational deployment.
Such sensitive processing relates to all facial images captured and analysed by the software, and particular attention must be paid to the requirements of s35, s42 and s64 Data Protection Act 2018.
As such, a Data Protection Impact Assessment (DPIA) and an 'appropriate policy document' must be in place. Sensitive processing occurs irrespective of whether an image yields a match to a person on a watchlist, or whether the biometric data of unmatched persons is subsequently deleted within a short space of time.
Data protection law applies to the whole process of live facial recognition (LFR): from consideration of the necessity and proportionality of deployment, through the compilation of watchlists and the processing of the biometric data, to the retention and deletion of that data.
Controllers must identify a lawful basis for the use of LFR. This should be identified and appropriately applied in conjunction with other available legislative instruments, such as codes of practice. At the time of processing, the controller must have in place an 'appropriate policy document' (as described in s35(4)(b) or s35(5)(c), as well as s42, DPA 2018).
More information is available to our Members only.
Click here to download it,
and the ICO's very first 'Opinion' document is available here.