Imagine a major city completely covered by a video surveillance system designed to monitor every move of its citizens. Now imagine that the system is run by a fast-learning machine intelligence designed to spot crimes before they even happen. No, this isn’t the dystopian dream of a cyberpunk science fiction author, or of the writers of the TV show “Person of Interest”. This is Boston, on the US East Coast, and it could soon be many more cities around the world.
In the aftermath of the Boston Marathon Bombings in April of last year, as law enforcement and the world’s media struggled to make sense of the tragedy, the Boston Police Department contacted a company well-known for developing innovative and cutting-edge surveillance technology based on advanced artificial intelligence.
Behavioral Recognition Systems, Inc. (BRS Labs) is a software development company based out of a nondescript office block in Houston, Texas, with the motto: “New World. New security.”
Headed by former Secret Service special agent John Frazzini, the company brings a crack team of security gurus to bear on its ambitious artificial intelligence projects. With the heavy traffic of Houston’s West Loop South Freeway churning out fumes and noise just outside, BRS Labs has developed one of the most advanced, and perhaps most sinister, surveillance platforms known to man.
AISight (pronounced “eyesight”) works by applying a form of reason-based analysis to video footage, and it promises to change the way humans conduct their surveillance of other humans.
Artificial intelligence is already in use across surveillance networks around the world. At high security sites like prisons, nuclear facilities or government agencies, it’s commonplace for security systems to set up a number of rules-based alerts for their video analytics. So if an object on the screen (a person, or a car, for instance) crosses a designated part of the scene, an alert is passed on to the human operator. The operator surveys the footage, and works out if further action needs to be taken.
This method of detecting suspicious behaviour has a number of drawbacks: it’s labour-intensive for the operators, each rule has to be programmed by a technician, and the system routinely generates more false positives than useful alerts. What’s more, it means you can’t move a camera or change the environment without reprogramming all your rules.
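The rules-based approach described above can be sketched in a few lines. This is a simplified illustration, not any vendor's actual software: a technician hand-codes a restricted zone, and any tracked object entering it raises an alert for a human operator. The zone coordinates and object positions are hypothetical.

```python
# A minimal sketch of a hand-programmed, rules-based video-analytics
# alert: any object entering a designated zone is flagged for review.

def in_zone(position, zone):
    """Return True if an (x, y) position falls inside a rectangular zone."""
    x, y = position
    x_min, y_min, x_max, y_max = zone
    return x_min <= x <= x_max and y_min <= y <= y_max

def check_frame(tracked_objects, restricted_zone):
    """Flag every tracked object whose position lies in the restricted zone."""
    return [obj_id for obj_id, pos in tracked_objects.items()
            if in_zone(pos, restricted_zone)]

# Example frame: positions supplied by an upstream tracker (hypothetical).
objects = {"person_1": (12, 40), "car_7": (85, 20)}
alerts = check_frame(objects, restricted_zone=(0, 30, 50, 60))
print(alerts)  # only person_1 is inside the zone
```

Note how brittle this is: move the camera or redraw the scene, and every hard-coded zone has to be re-entered by hand, which is exactly the drawback described above.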
BRS Labs’ AISight is different because it doesn’t rely on a human programmer to tell it what behaviour is suspicious. It learns that all by itself.
The system enables a machine to monitor its environment and build up a detailed profile of what can be considered “normal” behaviour. The AI can then determine what kind of behaviour is abnormal, without human pre-programming.
Artificial neural networks
What’s more, AISight continuously learns, registering changes in normal behaviour as they occur, so no ongoing programming is required from human operators. To do this, it employs a technology known as “artificial neural networks”, which mimics the function of the human brain.
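The learn-normal, flag-abnormal idea can be illustrated with a toy statistical detector. This is emphatically not BRS Labs’ algorithm (which the company does not publish), just a sketch of the principle: the detector keeps a running baseline of an observed feature, such as an object’s speed through a scene, and flags values far outside what it has learned.

```python
# Toy anomaly detector: learns a running mean and variance of a feature
# (Welford's online method) and flags values far from the baseline.
import math

class AnomalyDetector:
    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations
        self.threshold = threshold

    def update(self, value):
        """Fold one observation into the learned baseline."""
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)

    def is_anomalous(self, value):
        """True if value lies more than `threshold` std devs from the mean."""
        if self.n < 2:
            return False         # not enough history to judge yet
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(value - self.mean) > self.threshold * std

detector = AnomalyDetector()
for speed in [10, 11, 9, 10, 12, 11, 10, 9]:   # ordinary traffic speeds
    detector.update(speed)
print(detector.is_anomalous(10))   # typical speed: False
print(detector.is_anomalous(45))   # far outside the baseline: True
```

Because the baseline updates with every observation, nothing needs reprogramming when the scene changes; the “normal” simply drifts with what the camera sees, which is the property the article attributes to AISight.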
BRS Labs’ system is also extremely easy to implement, even across huge, disparate networks of outdated camera equipment. The company claims the complete hardware and software installation takes only a few days at most.
After that, the system sets about “autonomously building an ever-changing knowledgebase of activity seen through every camera on your video network.”
The software is already in place in other cities around the United States, such as Chicago and Washington.
“Our system will figure out things you never thought of looking for,” said Wesley Cobb, BRS’ chief science officer. “You never thought to look for a car driving backwards up the entrance of a parking garage, for example. Our system will find that and alert on it, because it’s different from what it usually sees. It’s taught itself what to look for.”
Preventing future crimes
BRS is currently working with the organisers of the World Cup to try to prevent crimes before they even take place.
“We can recognise a precursor pattern that could be associated with a crime before it happens,” Cobb told reporters. “In a lot of cases, you can see someone casing the joint, poking around the back of buildings, going where they shouldn’t be.”
The inevitable security concerns have already been raised. While BRS claims to be “concerned about the privacy rights of individuals everywhere,” it’s not hard to imagine a future where our every move is assessed, quantified and judged by ever-smarter generations of artificial intelligence.
There’s one security camera for every 11 people in the UK, and it has been reported that the average British citizen is recorded on camera over 300 times every day.
Countries such as the United States, Canada and Germany have strict rules on where CCTV systems can be installed and how images of the public can be used. But it wasn’t until the full implementation of the Data Protection Act in 2000 that any legal controls for CCTV were put in place in Britain.
If BRS’s system makes its way across the pond, we could be looking at more, not less, surveillance in the near future.