For once, not being up with the latest tech is a good thing, according to the authors of a new 85-page report on facial recognition technology (FRT) by Nessa Lynch and Andrew Chen.
The authors, both local experts in the field of facial recognition and policy, state that they feel more comfortable with New Zealand Police “remaining cautious about the adoption of this controversial technology, rather than feeling the need to be a technology leader in this space.”
That caution follows the experience of comparable jurisdictions, where law enforcement agencies embraced FRT and are now pulling back from the technology or belatedly establishing regulations and guidelines. Some have banned or restricted certain high-risk applications of FRT.
NZ Police say they will adopt all 10 recommendations from the review; these include that Police review their current collection and retention of facial images and develop a policy statement on FRT surveillance in public places.
Police have, for now, put live facial recognition on hold and say they will consult with communities before making any decisions on live FRT.
Live FRT is a fast and efficient policing tool used by many forces around the world. It compares faces in a live video feed against a preselected watch list of images and automatically alerts Police when a match is detected.
While it can be a powerful crime-fighting tool, the technology could also give Police the ability to identify law-abiding people at a protest, handing New Zealand Police their names, location and a list of contacts. It brings with it a myriad of human rights, privacy and legal concerns, as well as the risk of false positives, where the system flags the wrong person.
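To see why false positives are such a concern, it helps to look at the basic mechanics. The sketch below is a deliberately simplified, hypothetical illustration of a watch-list check; it assumes face “embeddings” (numeric vectors) have already been extracted by some detection model, and the function names, threshold value and toy data are all invented for illustration rather than a description of any system Police actually use.

```python
# Hypothetical sketch of a live FRT watch-list check. Assumes face
# "embeddings" (fixed-length vectors) have already been extracted from
# the watch-list photos and from each face detected in a video frame.
from typing import Dict, Optional
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings: ~1.0 means near-identical."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watch_list(face: np.ndarray,
                             watch_list: Dict[str, np.ndarray],
                             threshold: float = 0.6) -> Optional[str]:
    """Return the best-matching watch-list name, or None if no score
    clears the threshold. The threshold is the crux: set it too low and
    passers-by get wrongly flagged (false positives); set it too high
    and genuine matches are missed."""
    best_name, best_score = None, threshold
    for name, reference in watch_list.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy example: a face close to "person_a" raises an alert; an unrelated
# face does not.
watch_list = {"person_a": np.array([0.9, 0.1, 0.2]),
              "person_b": np.array([0.1, 0.8, 0.3])}
print(check_against_watch_list(np.array([0.88, 0.12, 0.21]), watch_list))  # person_a
print(check_against_watch_list(np.array([-0.5, 0.1, 0.9]), watch_list))    # None
```

In real deployments, the accuracy of both the embedding model and the chosen threshold varies with lighting, camera angle and demographic group, which is where the accuracy and bias concerns raised elsewhere in the report come in.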
The report’s authors point out that using an FR-equipped camera in a public place could be considered a ‘search’, because of its increased technical capabilities compared with regular CCTV or recording. They also note that the use of FR appears to conflict with a person’s reasonable expectation of privacy in a public place, and that those affected could cite protections in the Search and Surveillance Act 2012.
(It should be noted that, while FRT is used regularly to locate persons of interest in many US states, at present it cannot be used as evidence in court proceedings, and any convictions must be supported by independent evidence.)
The report’s authors believe that the NZ public need clearer guidance on the threshold between when “Police are allowed to capture an image (particularly a facial image) and when a warrant is needed, to ensure confidence and trust.”
There’s also a grey area around officers taking photos and recording CCTV footage on their own phones.
The end of anonymity
The report was commissioned after it emerged that NZ Police had undertaken trials last year of controversial software from ClearviewAI without informing the government, the Privacy Commissioner, the public or even their own Commissioner, Andrew Coster.
After that hit the headlines, Police released a “tech stocktake” that gave a (rather opaque) glimpse into the array of technologies in use: these included 5G drones, a system to spot suspects in CCTV feeds, and tools from Israeli digital intelligence company Cellebrite that can search lawfully seized mobile phones.
A brief look at the headlines surrounding the new report (“Review prompts police to halt plans to use facial recognition technology”, ran one from RNZ) might lead many to think that images of themselves captured through third-party systems like CCTV aren’t subject to FRT. But Police still use non-live FRT to identify persons of interest, and it’s likely that, whatever FRT software they use, your face could already be in their database, or someone’s.
ClearviewAI, for example, holds over 3 billion images scraped from publicly available sources on the web and social media. That enables law enforcement agencies using the technology to easily match an image, find out who the person is, and see links to where those photos appeared.
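At a very high level, that kind of reverse face search boils down to ranking a gallery of scraped images by how similar their face embeddings are to a probe image, and returning the source links for the best matches. The sketch below is a hypothetical, heavily simplified illustration of that idea; real systems index billions of embeddings with approximate nearest-neighbour search rather than the linear scan shown here, and the URLs and vectors are made up.

```python
# Hypothetical sketch of a reverse face search over a scraped image
# gallery: rank stored embeddings by similarity to a probe face and
# return the source URLs of the closest matches.
from typing import List, Tuple
import numpy as np

def top_matches(probe: np.ndarray,
                gallery: List[Tuple[str, np.ndarray]],
                k: int = 3) -> List[Tuple[str, float]]:
    """Return the k source URLs whose embeddings are most similar to the
    probe, ranked by cosine similarity."""
    scored = []
    for url, embedding in gallery:
        score = float(np.dot(probe, embedding) /
                      (np.linalg.norm(probe) * np.linalg.norm(embedding)))
        scored.append((url, score))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

# Made-up example: the top results point back to where the best-matching
# photos were scraped from, which is how one image can lead to a name.
gallery = [("https://example.social/profiles/alice", np.array([0.9, 0.1])),
           ("https://example.blog/team-photo", np.array([0.7, 0.6])),
           ("https://example.news/crowd-shot", np.array([0.1, 0.9]))]
print(top_matches(np.array([0.85, 0.2]), gallery, k=2))
```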
The report also identifies government policy as an important step in clarifying issues around biometric data; at present, the rules governing it are complex and sit across numerous pieces of legislation, regulation and policy.
It cites Scotland as an example of a similar jurisdiction that has approached the technology in a considered way, establishing the office of Scottish Biometrics Commissioner and a code of practice for the use of biometrics by its police force.
The EU is taking a harder stance, with the European Data Protection Board calling for a complete ban on live automated FRT: “Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places. Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms.”
Police response
“It is critical that we continue to use technology safely and responsibly, as accuracy and bias are key concerns for FRT. We are committing today to engaging with communities before we make any decisions on the use of Live FRT,” said Deputy Chief Executive Mark Evans in a release responding to the report’s findings.
“This puts us in the best position to prepare for any considered future adoption of the technology.”
But is this really a win for privacy advocates?
Not really. Reading between the lines, Police, while accepting all ten recommendations of the report, can continue much as before, awaiting or commissioning internal reports and formulating their own policies.
- Re live FRT: Police will “continue to monitor technology developments in this space, to help inform any decisions on future use.”
- Re the current collection and retention of facial images: Police will “implement a set of rules for [facial images] to be used in FRT systems across varying contexts and will align this with recommendations of the upcoming Independent Police Conduct Authority and Office of Privacy Commissioner”, a report only undertaken after Police were revealed to have regularly taken photographs of Māori young people in Wairarapa in August 2020.
- Re Police developing a policy statement on FRT surveillance in public places: “Police will develop clear policy covering the application of principles which guides Police treatment of public surveillance obtained images in relation to FRT use.”
- Re the potential of algorithmic or human bias in FRT systems: “Police will develop further understanding of data ethics principles and bias across the organisation. This could include through recruit training, scenario examples, embedded messaging at points of access to devices and tools, and online learning.”
Police are currently developing a New Technology Framework, which considers some of the risks around these emerging technologies.
Still, it’s essentially a “trust us to do the right thing” response which, after a few public blunders in this space, doesn’t inspire confidence.
Matthew Guariglia, of the Electronic Frontier Foundation in California, told Stuff that retrospective policy regarding policing tech like this is often ineffective and self-serving.
“Once it’s already in the hands of the police, it becomes even harder to take it away from them. Because then they feel like they’re being denied technology they’ve already worked into their daily routines.
“And so then it’s often relied upon to bring experts in to justify how they’ve been using it, even though they did not get permission from anyone, and had no accountability or regulation, when they instituted the technology in the first place.”
—
I reached out to Police regarding the report; their responses to my questions follow.
Did the report offer any surprises to Police?
Police commissioned the review to get an impartial analysis from independent parties, who are experts in the field of FRT in NZ. On this basis, Police were open to, and prepared to consider, the recommendations made. The recommendations were not pre-empted by Police.
Will the Police keep the Privacy Commissioner informed of any updates on its use of FRT technology?
Yes. The Office of the Privacy Commissioner will continue to be updated on our considerations of new technology, and our progress and developments on the Response Plan.
What will change in the short term re third-party FRT systems the Police presently have access to?
Police use of current third-party systems will continue to be restricted to identification and evidence support, which uses Police staff verification processes. Police do not use any third-party live FRT systems, and any consideration of future use of third-party FRT systems will be subject to the same guidelines and principles being adopted in the Response Plan.
Can the public be confident that images already held by the Police will be used appropriately?
Yes. Police will implement a set of rules for the collection and retention of facial images to be used in FRT systems across varying contexts, including the collection and use of Open Source Intelligence (OSINT) data. Police will also develop rules for the use of images of children and young people whose rights and interests are likely to be impacted. Police will also ensure we align the Response Plan with upcoming recommendations of the Independent Police Conduct Authority and Office of the Privacy Commissioner reviews into use of facial identification photos taken in public by Police.
How long before an over-arching policy re FRT is ready for implementation?
Police will next develop a programme of work to understand how and when we will deliver on the recommendations. A priority will be gathering further information to inform new policy and guidelines for our future consideration and use of FRT systems, working alongside Māori in the co-design. This is expected to begin in early 2022.
Police can take photographs of people who are lawfully in a public space, unless the person has a reasonable expectation of privacy, and provided they are not being obstructed in any way. The Response Plan will take account of any recommendations made by the Independent Police Conduct Authority and Office of the Privacy Commissioner reviews into use of facial identification photos taken in public by Police.
Police guidelines relating to the use of iPhones on duty to collect images state:
“iPhones have the capacity to record photographic images, however, the first option for collecting images should always be a police photographer where available, or a police issued digital camera.
Images should not be recorded on an iPhone for police purposes unless there is an urgent and identifiable need to do so, i.e. vital evidence would be lost.
If images are recorded on an iPhone, to ensure they will be accepted by the Court as reliable evidence and to minimise the risk of legal challenges around whether they could have been compromised, employees should:
– record details (date, time and location) of images in their notebook; and
– follow the Photography (Forensic Imaging) Guidelines on the police intranet for downloading images to a Police Enterprise/desktop computer and subsequent secure storage”
How will Police ensure algorithmic bias is avoided in any use of FRT?
Last year, Police released a policy around the use of emergent technology, signed up to the Algorithm Charter for Aotearoa New Zealand, and set up an independent expert panel to externally peer-review the use of emergent technologies. The Charter sets out several commitments for algorithm development and use including ensuring data is fit for purpose by identifying and managing bias.
Police also commissioned specialist consultants Taylor Fry to conduct a stocktake of algorithms in use or development by New Zealand Police, and to provide advice on best practices to assure safe and ethical development and use of algorithms moving forward.
Police have implemented governance and guidelines on developing and managing algorithms and on what to consider to ensure these are fair and ethical. The guidance steps through the chronology of developing and managing a new algorithm.
If live FRT is implemented will Police make public what software is used?
Yes. Any further FRT development will include consultation, and processes to remove the risk of bias, inaccuracies and negative impacts on the public. This will involve co-design with Māori academics and messaging that informs our communities about what we are doing. Police will also consult with the public on any further consideration of Live FRT. We will continue to be transparent through our Emergent Technology programme, seeking advice from the Emergent Technology Expert Panel and proactively releasing information on our web page.