AI profiles predict crimes fast but can discriminate against the poor

On 1 March, more than 40 civil society organizations called on European governments and lawmakers to ban Artificial Intelligence (AI)-based predictive and profiling systems used in fighting crime.

The call comes amid attempts by the European Union to regulate the use of AI in its upcoming Artificial Intelligence Act (AIA).

Civil society organizations believe AI systems in law enforcement, especially predictive and profiling systems, discriminate against the most marginalized in society, infringe on liberty and fair-trial rights, and reinforce structural discrimination.

“Age-old discrimination is being hard-wired into new-age technologies in the form of predictive and profiling AI systems used by law enforcement and criminal justice authorities. Seeking to predict people’s future behaviour and punish them for it is completely incompatible with the fundamental right to be presumed innocent until proven guilty. The only way to protect people from these harms and other fundamental rights infringements is to prohibit their use,” said Griff Ferris, legal and policy officer at Fair Trials, in a statement.

Predictive policing uses advanced algorithms to analyze large amounts of data, drawing insights and identifying patterns that help authorities predict and prevent crimes. In the US, for instance, AI-powered predictive policing flags hotspots, areas where more crime is likely, allowing authorities to send more police there.
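As a rough illustration of the hotspot idea (not any vendor’s actual system), past incident locations can be clustered so that dense clusters are flagged for attention. The coordinates, thresholds and use of scikit-learn’s DBSCAN below are assumptions made for the sketch.

```python
# Minimal sketch of hotspot detection: cluster past incident locations
# so that dense clusters can be flagged for extra patrols.
# The data, eps/min_samples thresholds and choice of DBSCAN are
# illustrative assumptions, not details of any deployed system.
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical incident coordinates (latitude, longitude)
incidents = np.array([
    [28.6315, 77.2167],
    [28.6317, 77.2170],
    [28.6320, 77.2165],
    [28.7041, 77.1025],
])

# eps is in degrees here (~100 m); a real system would project to metres.
clustering = DBSCAN(eps=0.001, min_samples=3).fit(incidents)

for label in set(clustering.labels_):
    if label == -1:
        continue  # noise points that belong to no hotspot
    members = incidents[clustering.labels_ == label]
    centre = members.mean(axis=0)
    print(f"Hotspot {label}: {len(members)} incidents around {centre}")
```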

India is not far behind. In 2015, for instance, the Delhi Police and Indian Space Research Organisation–Advanced Data Processing Research Institute (ISRO-ADRIN) partnered to develop a Crime Mapping, Analytics and Predictive System (CMAPS) — a web-based application deployed in Delhi Police Headquarters and accessible via a browser at all police stations and districts of Delhi.

CMAPS generates crime-reporting queries and can identify crime hotspots by automatically sweeping the Dial 100 call database every one to three minutes, replacing a Delhi Police crime-mapping tool that required manual data gathering every 15 days.
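The sketch below shows how such a periodic auto-sweep could be structured: poll a call log on a fixed interval and count fresh calls per area. The database path, schema and two-minute interval are assumptions for illustration, not CMAPS internals.

```python
# Sketch of a periodic "auto sweep": poll an emergency-call table every
# few minutes and count fresh calls per police-station area.
# The database, schema (call_time, station_code) and 2-minute interval
# are assumptions, not details of the actual Dial 100 system.
import sqlite3
import time
from collections import Counter
from datetime import datetime, timedelta

DB_PATH = "dial100.db"        # hypothetical local copy of the call log
SWEEP_INTERVAL_SECONDS = 120  # within the reported one-to-three-minute window

def sweep_once(conn, since):
    rows = conn.execute(
        "SELECT station_code FROM calls WHERE call_time >= ?",
        (since.isoformat(),),
    ).fetchall()
    counts = Counter(code for (code,) in rows)
    for station, n in counts.most_common(5):
        print(f"{station}: {n} new calls since {since:%H:%M}")

def run():
    conn = sqlite3.connect(DB_PATH)
    while True:
        since = datetime.now() - timedelta(seconds=SWEEP_INTERVAL_SECONDS)
        sweep_once(conn, since)
        time.sleep(SWEEP_INTERVAL_SECONDS)

if __name__ == "__main__":
    run()
```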

It performs trend analysis, compiles crime and criminal profiles and analyses the behaviour of suspected offenders, all with accompanying graphics. CMAPS also has a security module for VIP threat rating, based on the vulnerability of the potential target and the security deployed, as well as advanced predictive analysis, among other features.
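A toy scoring function can show how a rating might combine those two factors, residual threat rising with vulnerability and falling with protection. The 1-10 scales and weighting below are invented; the article does not describe the actual CMAPS formula.

```python
# Toy illustration of a threat-rating score that combines target
# vulnerability with the security already deployed. The 1-10 scales and
# the weighting are invented for illustration only.
def threat_rating(vulnerability: float, security_deployed: float) -> float:
    """Both inputs on a 1-10 scale; a higher result means higher residual threat."""
    if not (1 <= vulnerability <= 10 and 1 <= security_deployed <= 10):
        raise ValueError("inputs must be on a 1-10 scale")
    # Residual threat rises with vulnerability and falls with protection.
    return round(vulnerability * (11 - security_deployed) / 10, 2)

print(threat_rating(vulnerability=8, security_deployed=3))  # 6.4
print(threat_rating(vulnerability=8, security_deployed=9))  # 1.6
```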

Likewise, police in Himachal Pradesh have installed hundreds of CCTV cameras to form a ‘CCTV Surveillance Matrix’, and similar steps have been taken by the governments of Telangana and Jharkhand.

That said, the findings are not admissible as evidence in courts. “CMAPS basically points the police to where crime is happening, etc. (But) I don’t think the CMAPS data itself can be used in court under the Evidence Act. It helps the police find crime, and then they’ll have to prove that the crime has been committed using traditional non-predictive methods,” explained Arindrajit Basu, research lead at the Centre for Internet and Society.

Gurugram-based Staqu, which built an AI system for the Punjab police, has applied for a tender under the Lucknow Smart City project to deploy a new voice-based surveillance system. For this, the company’s JARVIS AI system will use microphones attached to CCTV cameras to recognize specific sounds, such as a gunshot or a scream, and automatically alert the authorities.
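In outline, the alerting side of such a system can be as simple as the sketch below: if a clip is labelled “gunshot” or “scream” with high confidence, notify a control room. The classifier, the notify_authorities stub and the confidence threshold are placeholders; the article does not describe JARVIS’s internals.

```python
# Sketch of the alerting logic around a sound classifier: if a clip is
# labelled "gunshot" or "scream" with high confidence, notify a control
# room. The classifier and notify_authorities stub are placeholders.
from typing import Callable, Dict

ALERT_SOUNDS = {"gunshot", "scream"}
CONFIDENCE_THRESHOLD = 0.85  # assumed cut-off, not a documented value

def notify_authorities(camera_id: str, label: str, confidence: float) -> None:
    # A real deployment would call a control-room API or dispatch system.
    print(f"ALERT from {camera_id}: {label} ({confidence:.0%} confidence)")

def handle_clip(camera_id: str,
                clip: bytes,
                classify: Callable[[bytes], Dict[str, float]]) -> None:
    scores = classify(clip)
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    if label in ALERT_SOUNDS and confidence >= CONFIDENCE_THRESHOLD:
        notify_authorities(camera_id, label, confidence)

# Dummy classifier standing in for a trained audio model.
handle_clip("cam-42", b"...", classify=lambda clip: {"gunshot": 0.93, "traffic": 0.05})
```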
