How facial recognition technology aids police

Police officers’ ability to recognize and locate individuals with a history of committing crime is vital to their work. In fact, it is so important that officers believe possessing it is fundamental to the craft of effective street policing, crime prevention and investigation. However, with the total police workforce falling by almost 20 percent since 2010 and recorded crime rising, police forces are turning to new technological solutions to help enhance their capability and capacity to monitor and track individuals about whom they have concerns.
One such technology is Automated Facial Recognition (known as AFR). This works by analyzing key facial features, generating a mathematical representation of them, and then comparing them against known faces in a database, to determine possible matches. While a number of UK and international police forces have been enthusiastically exploring the potential of AFR, some groups have spoken about its legal and ethical status. They are concerned that the technology significantly extends the reach and depth of surveillance by the state.
Until now, however, there has been no robust evidence about what AFR systems can and cannot deliver for policing. Although AFR has become increasingly familiar to the public through its use at airports to help manage passport checks, the environment in such settings is quite controlled. Applying similar procedures to street policing is far more complex. Individuals on the street will be moving and may not look directly towards the camera. Levels of lighting change, too, and the system will have to cope with the vagaries of the British weather.
[…]
As with all innovative policing technologies, there are important legal and ethical issues that still need to be considered. But for these to be meaningfully debated and assessed by citizens, regulators and law-makers, we need a detailed understanding of precisely what the technology can realistically accomplish. Sound evidence, rather than references to science-fiction technology -- as seen in films such as Minority Report -- is essential.
With this in mind, one of our conclusions is that in terms of describing how AFR is being applied in policing currently, it is more accurate to think of it as “assisted facial recognition,” as opposed to a fully automated system. Unlike border control functions -- where the facial recognition is more of an automated system -- when supporting street policing, the algorithm is not deciding whether there is a match between a person and what is stored in the database. Rather, the system makes suggestions to a police operator about possible similarities. It is then down to the operator to confirm or refute them.
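The matching process described above -- generating a mathematical representation of facial features, comparing it against a database, and surfacing candidate matches for a human operator to confirm or refute -- can be sketched in a few lines of code. This is an illustrative toy only, not any force's actual system: the embedding step, the similarity threshold, and all names here are assumptions, and a real deployment would use a trained face-embedding model rather than raw feature vectors.

```python
import math

def embed(face_features):
    # Hypothetical stand-in for a real face-embedding model:
    # here we simply normalize the feature vector to unit length.
    norm = math.sqrt(sum(x * x for x in face_features)) or 1.0
    return [x / norm for x in face_features]

def cosine_similarity(a, b):
    # Dot product of two unit-length vectors equals their cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def suggest_matches(probe_features, database, threshold=0.8):
    """Return candidate matches above the threshold, ranked by similarity.

    Note the "assisted" design: the system only suggests possible
    similarities; a human operator decides whether any is a true match.
    """
    probe = embed(probe_features)
    candidates = []
    for name, features in database.items():
        score = cosine_similarity(probe, embed(features))
        if score >= threshold:
            candidates.append((name, score))
    return sorted(candidates, key=lambda c: c[1], reverse=True)
```

In this sketch, lowering the threshold yields more suggestions (and more false positives) for the operator to review, which is precisely the trade-off that makes human confirmation part of the design.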
By Bethan Davies, Andrew Dawson, Martin Innes (Source: https://gcn.com/articles/2018/11/30/facial-recognitionpolicing.aspx, accessed May 30th, 2020)
READ TEXT I AND ANSWER QUESTIONS 11 TO 15
TEXT I
Will computers ever truly understand what we're saying?
Date: January 11, 2016
Source: University of California - Berkeley
Summary:
If you think computers are quickly approaching true human communication, think again. Computers like Siri often get confused because they judge meaning by looking at a word's statistical regularity. This is unlike humans, for whom context is more important than the word or signal, according to a researcher who invented a communication game allowing only nonverbal cues, and used it to pinpoint regions of the brain where mutual understanding takes place.
From Apple's Siri to Honda's robot Asimo, machines seem to be getting better and better at communicating with humans. But some neuroscientists caution that today's computers will never truly understand what we're saying because they do not take into account the context of a conversation the way people do.
Specifically, say University of California, Berkeley, postdoctoral fellow Arjen Stolk and his Dutch colleagues, machines don't develop a shared understanding of the people, place and situation - often including a long social history - that is key to human communication. Without such common ground, a computer cannot help but be confused.
“People tend to think of communication as an exchange of linguistic signs or gestures, forgetting that much of communication is about the social context, about who you are communicating with,” Stolk said.
The word “bank,” for example, would be interpreted one way if you're holding a credit card but a different way if you're holding a fishing pole. Without context, making a “V” with two fingers could mean victory, the number two, or “these are the two fingers I broke.”
“All these subtleties are quite crucial to understanding one another,” Stolk said, perhaps more so than the words and signals that computers and many neuroscientists focus on as the key to communication. “In fact, we can understand one another without language, without words and signs that already have a shared meaning.”
(Adapted from http://www.sciencedaily.com/releases/2016/01/160111135231.htm)
Mark the statements below as true ( T ) or false ( F ) according to Text I.
( ) Private sector auditors have more responsibility than government auditors on risk-taking decisions.
( ) The weight government auditors take upon themselves is quite heavy.
( ) Part of the auditor's job is one of mediation between the public and the government.
The correct sequence is: