Facial recognition
There are also various studies that demonstrate facial recognition
software can be tricked. In February, software
company trinamiX shared details about its ‘skin-detection’
technology – an alternative that it says can detect the
material it is analysing. This prevents masks, for example,
from hiding someone’s identity. It remains to be seen
whether that modification will become widespread.
The regulatory puzzle
The regulatory landscape around the use of this technology
is complex and varies around the world. It is not possible
to delve into these intricacies in one article.
In the UK, an interim report of the Biometrics and
Forensics Ethics Group’s Facial Recognition Working
Group was published in February 2019. It was a response
to live facial recognition (LFR) trials undertaken by South
Wales Police and the Metropolitan Police Service. The
report found that there is a lack of independent oversight
and governance of the use of LFR. It recommended that
police trials of LFR should comply with usual standards
of experimental trials until a legislative framework is
developed. Given that the Met Police has moved ahead with
operational use of LFR, it is clear the report needs to be
updated. A secretary for the working group advised that a
further report, in collaboration between police forces and
private entities, is expected this summer.
Then, in October 2019, Elizabeth Denham, the UK
information commissioner, published a blog that made her
feelings clear – “police forces need to slow down and justify
its use”. The Information Commissioner’s Office (ICO)
is the UK’s independent regulator for data protection
and information rights law; it has specific responsibilities
set out in the Data Protection Act 2018 and under the
General Data Protection Regulation (GDPR).
The ICO carried out its own research to understand the
thoughts of UK citizens, and found there is strong public
support for the use of LFR for law enforcement purposes
– some 72 per cent of those surveyed agreed or strongly
agreed that LFR should be used on a permanent basis in
areas of high crime. Denham links to that research in her
blog, so she presents a balanced approach to this technology.
The ICO is not trying to prevent its use; it is saying it
needs to be used cautiously.
Denham said: “Moving too quickly to deploy
technologies that can be overly invasive in people’s lawful
daily lives risks damaging trust not only in the
technology, but in the fundamental model of policing by
consent. We must all work together to protect and enhance
that consensus.”
The following day, Tony Porter, the independent
surveillance camera commissioner, added to the ICO’s
findings. Porter said that the use of automatic facial
recognition (AFR) should be within the confines of
existing regulatory guidelines. He pointed to a recent
Cardiff judgment that clearly set out the Surveillance
Camera Code of Practice (SC Code) and section 33 of the
Protection of Freedoms Act 2012 as key elements of the
legal framework for the use of AFR.
While that ruling found that South Wales Police’s use of
facial recognition was proportionate, it made clear that the
force should be prepared to demonstrate its use is justified
according to the particular facts of individual cases. This
means that facial recognition should not be used without
clear justification.
In the US, the regulatory landscape is even more
fractured. Forrester’s Maxim points to various examples
of state-level regulations, starting in Illinois in 2008 with
the Biometric Information Privacy Act (BIPA). He says
these regulations “reflect the growing interest and concern
around facial recognition”. The political landscape in the US
means that “no two laws will necessarily be written exactly
the same way, so it does create some real challenges if you
are a national organisation trying to deal with this growing
patchwork of biometric laws. In the European case, GDPR
is providing a more holistic view, but certainly here in the
US, this continues to be a real problem and probably is
not going to get any better because there’s really no real
momentum or interest in a national equivalent of GDPR
right now. That means regulations are going to be at the
state or local level for the time being, unfortunately.”
What is interesting about that BIPA law, explains Maxim,
is that it “was written well before touch ID or face ID even
existed, yet it provides protections and consent if you are
collecting biometric data. There have been, in the last year,
several court cases against organisations that have potentially
violated the spirit of that law. Various courts have ruled
against organisations that have been collecting biometric
data. This is a law that gives some consumers some
protection and means to challenge what’s happening and
potentially get some relief out of the misuse or miscollection
of data over a period of time.”
Last summer, Somerville, Massachusetts, became the
second US city to bar municipal use of facial-recognition
technology due to ethical concerns including the potential
for government misuse and its unequal performance across
racial and gender lines. At the time, Maxim explains that,
March 2020 @CritCommsToday 29