Researchers build AI-powered security systems that predict criminal behavior, claim 82.8% accuracy in predicting felonies with CCTV monitoring
1984 meets Minority Report.
According to a TechXplore report, South Korea's Electronics and Telecommunications Research Institute (ETRI) has developed an AI monitoring system that could be a step toward a real-life version of Minority Report's "Pre-Crime" unit. The system integrates CCTV footage, crime statistics, positioning data, and other sources to look for signals and estimate the probability that a crime will occur. The technology, called Dejaview, compares current environmental and social patterns against records of past criminal cases to forecast potential crimes.
Dejaview uses two different technologies to predict the chance of a crime happening in a given area, says TechXplore. The first analyzes historical criminal-activity data and correlates it with other factors, such as location and time. If a particular area has multiple reports of crimes committed late at night, the AI marks it as a high-risk location on its crime heat map, allowing the police department in charge of the area to allocate more officers to deter potential offenders.
This predictive crime map (PCM) gives law enforcement a real-time heat map of the area they cover, highlighting potential crime hotspots and enabling better allocation of police officers and more effective coverage.
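Neither ETRI nor TechXplore has published how the PCM actually scores an area, but the description above suggests counting past incidents by place and time of day. Here is a minimal Python sketch of that idea; the grid cells, the incident schema, and the two-hour window are all illustrative assumptions, not Dejaview's actual design.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical incident record -- Dejaview's real schema is not public.
@dataclass
class Incident:
    cell: tuple[int, int]  # grid cell (row, col) the report falls in
    hour: int              # hour of day, 0-23

def risk_heat_map(incidents: list[Incident], hour: int, window: int = 2) -> Counter:
    """Count historical incidents per grid cell within +/-window hours of
    the query hour; cells with higher counts are higher risk."""
    hot = Counter()
    for inc in incidents:
        # circular distance on the 24-hour clock
        diff = min(abs(inc.hour - hour), 24 - abs(inc.hour - hour))
        if diff <= window:
            hot[inc.cell] += 1
    return hot

# Example: repeated late-night reports make one cell the hotspot at midnight.
history = [Incident((3, 7), 23), Incident((3, 7), 1), Incident((0, 2), 14)]
print(risk_heat_map(history, hour=0).most_common(1))  # [((3, 7), 2)]
```

A production system would presumably weight offense type and fold in the CCTV-derived signals the article mentions, but that goes beyond what has been disclosed.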
The PCM was built from more than 32,656 individual CCTV clips collected over three years, starting in 2018.
Ultimately, this should help the police be proactive instead of merely reacting to events as they happen. If the data the AI system uses is actively updated, it could track changes as they occur. A strong police presence in a crime-prone area might push criminals to conduct their illicit activities elsewhere, but if Dejaview can predict where they will go, law enforcement could stop crime from spreading to other areas.
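One simple way a map could "track changes as they happen," as described above, is to weight recent reports more heavily than stale ones. The exponential decay and 30-day half-life below are invented for illustration; ETRI has not said how, or whether, Dejaview does this.

```python
# Illustrative recency weighting: a cell's heat fades as its reports age
# and rises where fresh reports cluster. The half-life is an assumption.
HALF_LIFE_DAYS = 30.0

def decayed_risk(report_ages_days: list[float]) -> float:
    """Sum per-report weights that halve every HALF_LIFE_DAYS days."""
    return sum(0.5 ** (age / HALF_LIFE_DAYS) for age in report_ages_days)

# Three fresh reports outscore five half-year-old ones, so the heat map
# follows crime as it displaces to new areas.
print(decayed_risk([1, 2, 3]) > decayed_risk([180, 200, 220, 240, 260]))  # True
```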
Using AI to track individuals
However, Dejaview also uses another, more controversial technology called 'individual-centered recidivism prediction'. TechXplore says it is applied strictly to individuals considered at high risk of committing another crime. The AI tracks the movements of these individuals and detects whether they have violated their location restrictions (such as those under house arrest or on parole). Beyond that, it can determine whether a deviation from those restrictions had a valid reason, like work or an emergency. The technology can then analyze that person's behavioral patterns and compute the probability that they will return to their criminal ways.
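The description of location restrictions and "valid reasons" reads like a geofence check with an allow-list of excused destinations. The sketch below is one hedged reading of that logic; the Restriction type, zone names, and three labels are hypothetical, not Dejaview's actual rules.

```python
from dataclasses import dataclass, field

@dataclass
class Restriction:
    allowed_zones: set[str]                                # e.g. home
    excused_zones: set[str] = field(default_factory=set)   # work, hospital, ...

def classify_ping(zone: str, r: Restriction) -> str:
    """Label a location ping as compliant, excused, or a violation."""
    if zone in r.allowed_zones:
        return "compliant"
    if zone in r.excused_zones:
        return "excused"   # valid deviation (work or an emergency)
    return "violation"     # would feed into the recidivism risk score

r = Restriction(allowed_zones={"home"}, excused_zones={"workplace", "hospital"})
for zone in ("home", "workplace", "bar_district"):
    print(zone, "->", classify_ping(zone, r))
# home -> compliant, workplace -> excused, bar_district -> violation
```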
The researchers have already tested the system with the help of the Telecommunications Technology Association (TTA) and say the PCM achieved a crime-prediction performance of 82.8%.
The team aims to develop Dejaview as a safety-related system for deployment in high-risk areas such as airports, energy facilities, national events, and more.
These AI-powered CCTV systems will make it easier for police and law enforcement not just to control crime, but to prevent it from happening in the first place. However, there is also a risk that the technology could be misused by governments and organizations. Safeguards must therefore be in place before this surveillance system on steroids is deployed, and only the most trustworthy personnel should be entrusted with its operation.
Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, focusing on tech hardware and consumer electronics.
-
MacZ24
80% of the time, it works every time.
And without much effort, I predict that skin color will be a big part of the predicting.
-
brandonjclark
MacZ24 said: 80% of the time, it works every time. And without much effort, I predict that skin color will be a big part of the predicting.
I predict it will predict what it was trained on.
-
ezst036
When I see things like this I sometimes think to myself:
Who were the unscrupulous software developers who actually agreed to build this and WTF is wrong with their brains?
It's like a few months ago, when Google had some layoffs that hit their software engineering teams. Sorry, no. I have zero sympathy for you. You willingly built a spying machine for Google to steal our data. Perhaps you even deserved to lose your job.
I just don't understand what is wrong with people like these.
-
MacZ24
brandonjclark said: I predict it will predict what it was trained on.
If it ever comes to a western country like the US or western Europe, where the population's ethnicity is far less homogenous than South Korea's, it will encounter a problem.
That's because some ethnicities have higher crime rates than others. This is denied by the left and used by the right as a way to justify their racism. But it is the result of different factors (including structural racism) that put these ethnicities more at the lower end (income-wise) of society, which means they are more susceptible to turning to crime.
Since the model will not have access to the socio-economic profile of the person it is monitoring via CCTV (presumably), I think you could try to make it ignore skin color, but that will reduce its efficacy, since skin color is, in reality, a predictor. We, as humans, understand that we should not use this parameter, because we have a moral compass. But AI models don't. And from the behavior of some policemen, you understand that their own 'training' incorporates this parameter.
TL;DR: very bad idea.
-
Co BIY
The ability of the AI to track a known habitual criminal and detect his "Pre-crime indicators" is amazing.
Unfortunately, the ability to track every single citizen and detect any "undesirable" behavior is just a matter of scaling up and changing a setting.
-
Ogotai
Heh, a few months ago with AI it seemed we were creating the beginnings of Skynet; now it seems we are creating Project Insight. :ROFLMAO:
-
t3t4
Oh, here we go again... Didn't anyone see Minority Report? Crime prediction doesn't work without the pre-cog triplets!
-
Wimpers
We will have to film you everywhere, all the time, even in your own house.
It's for your safety, people!
-
SyncroScales
Has anyone noticed that in retail stores facial recognition prompts security to target specific people, even when they know they are wrong, while ignoring actual thieves? Mothers and their children, families, the same ethnicities and cultural indifferences or racism.
Does everyone know that specific people and their likenesses were put into the AI, language models, CCTV, facial recognition, etc.? The families that do not like certain people and have multi-generational grudges target whom they choose to, even when they were at fault in the conflicts. It was designed to limit people's ability to even have access to goods and essentials, and to make them always feel uncomfortable, watched, and harassed, and even to provoke violent attacks or incidents to further justify the personal vendettas. Facial recognition was mass-implemented in, what, 2010-2012?
It is uncomfortable. But there are no regrets.