Sky’s the limit, as crime fighting AI drones take flight
A police ‘stakeout’ is a method of catching criminals red-handed. It involves, according to legend, an unassuming car crouched in a dark backstreet, and an undercover police officer in a dark overcoat, eyes burning and coffee steaming, for days on end.
That officer will watch their unsuspecting target like a hawk hunting its prey, collecting information and lying in wait for as long as it takes – for years, if necessary. It might sound like a rum job, and it certainly is. Which is why it’s going to be done by robots – or drones, to be more specific.
“I actually think in years to come, and it’s not that far off, we will have one of those drones in the back of every police vehicle so we will be able to deploy those all the time,” said Jeff Farrar, Chief Constable of Gwent Police in Wales.
Drones can provide HD-quality pictures of a scene via video link, explained Mr Farrar, which police officers can view from their offices. When searching large areas, he said, more can be done with drones than would ever have been possible before.
From a financial point of view, drones make perfect sense. Police in Cumbria, England, spent £4,600 to buy two drones in 2015, and they have been flown around 100 times. In June of the same year, the Metropolitan Police revealed they had spent £11.1 million in under three years on a manned stakeout outside the Ecuadorian Embassy in Knightsbridge, London, watching Julian Assange.
At a time when UK police are facing budget cuts in excess of 20%, the advent of cheap, robotic colleagues will be welcomed with open arms – unless they muscle in on jobs. Across the pond, police surveillance could well be part of the 6% of US jobs that forecasters predict robots will eliminate by 2020.
Mr Farrar maintained that there will still be police officers walking the beat, saying “the public like to see a police officer as it’s reassuring”. Reassuring, but not necessarily safer. After all, aren’t drones actually better than people at surveillance?
Eye in the sky
Take watching for sharks along coastlines, a job traditionally done by people in helicopters or small planes. New research suggests that human pilots fail to spot a shark in the water from above 87.5% of the time.
By contrast, the Westpac Little Ripper Lifesaver drone, developed in collaboration with the University of Technology Sydney (UTS), is said to spot sharks 90% of the time. On spotting a shark, the drone sounds a siren to warn bathers of the danger. It can even distinguish between sharks, whales, dolphins, and floating objects.
There were 26 shark attacks in Australia in 2016, including two fatal ones, so the introduction of further safety measures can only be a good thing. What remains to be seen is how much of an impact these drones will make in 2017 – and whether people are comfortable with the idea of being policed by AI.
As long as AI does the job well, you might say, what is the harm? After all, the public are not all so reassured by the presence of human police, particularly groups like African Americans in places like Oklahoma. The introduction of AI could massively improve policing in areas of social tension, and AI is currently being developed for police body cameras to reduce mistaken shootings.
But how would we feel if a robot made a mistake when policing us – if it failed to spot a colossal great white shark, or thought our sandwich was a gun? Would we be able to trust machines in the same way that we trust humans, even if it’s more logical to place trust in AI?
Well, it seems most people already do trust machines. A UK survey found 61% of people are comfortable with artificial intelligence recognising criminals’ faces via CCTV, believing the benefits outweigh the risks.
If robots start routinely mistaking innocent civilians for notorious criminals, then this attitude may change. But until then, what with the abundance of new AI technology and the ever-increasing demands on police forces around the world, it’s safe to say that Officer Drone is on the case.