Facial recognition
lack of resources. Recent statistics indicate that nearly half
of all crimes in the UK are closed without a suspect being
identified, and that only nine per cent of all crimes result in a
charge or summons. There is clearly a pressing need for
new tools to help turn the tables.
Getting it right
Ensuring that the police continue to enjoy the support of
the public while adopting the technology in question will
probably be best achieved with a gradual approach. At the
same time, it will also need to be emphasised that such
technology is being used to address recent and pressing
problems (for instance, the ongoing population increase
coupled with an ever-diminishing number of officers on the
beat). Public acceptance is also likely to be aided by making it
clear where such technology is in use.
Another potential pain point is the need to ensure that, if
we go down this path, there is no further hollowing
out of police forces, with the technology being used as an
excuse. Clearly, officers will still need to respond to large and
serious incidents, as well as being able to gather information
from local communities in the traditional way.
Many of the types of crime that occupy a great deal of
officers’ time are also poorly suited to certain types of new
technology. These include domestic
abuse – although body-worn video cameras naturally have a
role to play here – as well as dealing with sexual offences. Nor
can such systems address the increasing amount of time that
police officers currently spend as unofficial social workers.
Technology change on this scale is always a challenge.
However, the successful integration of the likes of ANPR
and CCTV into modern workflows – and the fact that such
systems can be introduced as a bolt-on to traditional practices
– give some grounds for cautious optimism.
That said, any system dependent on facial recognition at
scale will need to address the problem of false positives. Big
Brother Watch’s table of facial-recognition deployments makes
for sobering reading in this regard, and highlights just how much
work has to be done in this area. It will also have to address
criminals’ use of simple but effective countermeasures, such
as hoods and other means of obscuring their faces, hence
the need to try to combine it with other types of data
where possible.
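The scale of the false-positive problem follows from a simple base-rate calculation. The sketch below is illustrative only — the one-million figure and the 0.1 per cent false-match rate are assumptions chosen for the example, not drawn from any real deployment:

```python
# Hypothetical base-rate sketch: why even a low false-match rate
# produces many false alerts at scale. All figures are illustrative.

def expected_false_alerts(faces_scanned, false_match_rate):
    """Expected number of people incorrectly flagged as a match."""
    return faces_scanned * false_match_rate

# Assume one million faces screened against a watchlist, with a
# 0.1 per cent false-match rate.
print(expected_false_alerts(1_000_000, 0.001))  # → 1000.0
```

Even before accounting for variation in accuracy across demographics, every one of those alerts would require human review — which is where much of the operational burden lies.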
More colourful evidence regarding the technology’s
fallibility was reported last year in Zhejiang province, China,
where a facial-recognition system picked up the face of an
air-conditioning company executive that appeared on an advert
on the side of a bus. The error was quickly fixed, but it does
emphasise the fact that such systems cannot be seen as ‘fire
and forget’ options.
Facial-recognition vendors will also need to ensure that the
accuracy of their underlying algorithms works well across all
ethnic groups, not just that of the person(s) developing the
algorithms and datasets.
Turning to the world of business, many low-level offences
and breaches of regulations currently fall under the radar.
For example, reporting of supermarket freezer temperatures,
sewage overflow events and similar matters is carried out
retrospectively and is therefore prone to someone falsifying the
information. This could change in future, with IoT-style
sensors potentially allowing real-time reporting, again opening
up the way for non-compliance to be detected automatically
and penalised where necessary.
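A minimal sketch of what such automatic detection might look like, assuming a hypothetical freezer-temperature limit and sensor feed — the threshold and the readings below are invented purely for illustration:

```python
# Hypothetical IoT-style compliance check. The -18 °C limit and the
# sensor readings are assumptions made for illustration only.

FREEZER_MAX_C = -18.0  # assumed regulatory maximum temperature

def flag_noncompliant(readings):
    """Return the (timestamp, temperature) readings over the limit."""
    return [(ts, temp) for ts, temp in readings if temp > FREEZER_MAX_C]

readings = [("09:00", -19.5), ("10:00", -17.2), ("11:00", -18.4)]
print(flag_noncompliant(readings))  # → [('10:00', -17.2)]
```

In practice a real system would also need tamper-resistant sensors and an audit trail, precisely because the current retrospective reporting is vulnerable to falsification.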
While this is some way from the crimes that a typical
police officer deals with, it could add to the
narrative around stronger enforcement of laws and regulations
through the use of technology.
Ultimately, we have laws and those who enforce them
to balance the freedom and rights of the individual against
their ability to impact those of another person (and the
well-being of society as a whole). Given the rise of facial
recognition and social media analysis, there is no doubt that
privacy and data protection concerns will play an increasingly
important part in this equation. While new technologies may
one day allow people to be fined for minor crimes such as
littering, they must be fit for purpose. There also needs to be
a full public debate about any shift in this direction, alongside
plenty of education on how the technology will be used, as
well as both its benefits and its limitations.
The backlash has started
A beginning is a very delicate time – and this is
especially true in the world of technology. Just look at
how the promise of genetically engineered crops, which
have the potential to alleviate a vast amount of human
suffering, has not been fully realised thanks to activism
over health and environmental concerns, as well as fears
that they increase the power of corporations, such as
Monsanto. There are signs that the tide of public opinion
may already be turning against facial recognition – at
the time of writing, three cities in the US have banned
the use of the technology (San Francisco and Oakland
in California and Somerville in Massachusetts). It
is also interesting to see that Axon, the body-worn
video camera vendor, has said that it “will not be
commercialising face-matching products on our body
cameras at this time”, although its AI team “will continue
to evaluate the state of face-recognition technologies”.
Part of the problem is that public perception of a
new technology can be heavily
influenced by its least ethical use, as this can galvanise
activism and polarise discussion. With this in mind,
the use of facial recognition, alongside the placing of
QR codes on homes that link to police files on their
occupants, in Xinjiang, China, as part of the widespread
programme of control that the government has
instigated against the minority Muslim Uighur population
is a matter of some concern.
22 www.criticalcomms.com July 2019