Below are the key extracts from the Biometrics and Forensics Ethics Group (BFEG) briefing note of January 2021.





This is exactly the approach we have been calling for: not an attempt to stop the use of Automatic Facial Recognition (AFR), but a way to ensure that it is carried out in an ethical and proportionate manner and that, when it is used, it is authorised by a senior police officer.


Overview

The Biometrics and Forensics Ethics Group (BFEG) was commissioned to investigate the ethical issues raised by the collaborative use of live (real-time) biometric facial recognition technology (LFR) by public (police) and private organisations. This briefing note provides a summary of the evidence gathered by the working group. It focuses on the use of LFR in a range of privately-owned spaces where people are gathered or are passing through (for example, shops and shopping centres), including those with clearly defined transit points where people are ‘channelled’ past the cameras (for example, within airports).


Summary

In gathering evidence, it was clear that the use of biometric recognition technologies (including LFR) in public–private collaborations (P–PCs) is likely to increase. The BFEG working group highlighted several ethical concerns generated by the collaborative use of LFR, including:

  • sharing data and technology;

  • the development of behavioural biometrics for use in LFR;

  • discrimination and bias in the use of LFR;

  • the construction of watchlists; and

  • the effect of using LFR in private spaces used by the public.

In the absence of regulation, the working group outlined a number of issues that should be addressed prior to the setting up of P–PCs using LFR. The working group also made a number of recommendations that should be followed by those involved in P–PCs, including that an independent ethics group should have oversight of the use of LFR by police forces and in P–PCs.


Ethical concerns

As public and private organisations increasingly collaborate in the development and deployment of live facial recognition (LFR) technologies, a number of key issues need to be addressed. Many of these involve general questions about data use – for example, how data are generated, who can access and share them, what the purposes of data sharing are, and what the ethical benefits and risks are. There are also some more specific ethical concerns, such as:

The sharing of data and technology: A number of public–private collaborations (P–PCs) in the use of LFR have involved the police supplying a ‘watchlist’ of facial images (i.e. data) to private organisations (for example, the Metropolitan Police and British Transport Police to King’s Cross Estate (Argent); South Yorkshire Police to Meadowhall shopping centre (British Land); and Greater Manchester Police to the Trafford Centre (INTU)).


Issues that should be addressed prior to the use of LFR in public–private collaborations

The Biometrics and Forensics Ethics Group (BFEG) believes that the use of live facial recognition (LFR) in public–private collaborations (P–PCs) raises a number of issues beyond those outlined in its earlier report (BFEG, 2019); these are briefly outlined below. In the absence of any specific regulation governing the use of biometric recognition technologies for law enforcement purposes, specifically those involving P–PCs, the BFEG suggests that the following should be addressed prior to the collaborative use of LFR:


Recommendations

Public–private collaborations (P–PCs) in the use of biometric recognition technologies, including live facial recognition (LFR), are predicted to grow over the next few years. The sharing of data/technology between the private and public sectors raises a number of ethical issues over and above those generated by public (police) sector use of LFR.

In the absence of a legislative framework governing the use of LFR by the police or in P–PCs, the Biometrics and Forensics Ethics Group (BFEG) makes the following recommendations.


1. Police should only share data with trustworthy organisations that have been vetted. The police should only share data with trustworthy private organisations. Members of private organisations who will have access to police data should be vetted for their trustworthiness.


2. Data should be shared with, or accessed by, the minimum number of people. All public and private data should be shared with the minimum number of people. This means that suppliers of LFR technology should not be able to access the images/data compiled in a watchlist, the results of biometric transactions, or image metadata, whether for refining algorithms or for other purposes.


3. Biometric data (including image data) must be safely and securely stored. Arrangements should be made for the safe and secure sharing and storage of data (including associated metadata) in P–PCs, such as those outlined in the Information Commissioner’s Office’s (ICO’s) Data Sharing Code of Practice (ICO, 2020, Security). Data should not be stored for any longer than is necessary.


4. Watchlists should be narrow and targeted. Private and public watchlists should be narrow, targeted and proportionate to the deployment to avoid the oversharing of personal data between private and public organisations.


5. A publicly accessible record of collaborative uses of LFR should be created. To ensure transparency, P–PCs should be publicly recorded, for example on the police force website, documenting for each deployment:

  • the purpose of the collaboration;

  • the identity of the collaborators; and

  • the types and amount of data that are being shared, with whom and for how long.

For example: Force A is collaborating with private organisation B by providing N images/records, which are stored for X time and will be used by Y actors now and in the future.


6. Collaborative use of LFR should be authorised by a senior police officer. P–PCs should proceed only if they have been authorised by a senior police officer (Superintendent or above).


7. An independent ethics group should oversee the use of LFR by the police and in P–PCs.

To maintain public confidence, the BFEG recommends that oversight mechanisms should be put in place. The BFEG suggests that an independent ethics group should be tasked to oversee a) individual deployments of biometric recognition technologies by the police and b) the use of biometric recognition technologies in P–PCs. This independent ethics group would require any proposed deployments and P–PCs to be reviewed when they are established and monitored at regular intervals during their operation.


Copyright: The Home Office


© CCTV User Group Ltd 2021
