
Title: AI systems like GPT-4 and Gemini misinterpreting Ring Camera footage could lead to false police calls, especially in minority neighborhoods
Post by: Redaktion on September 20, 2024, 11:56:36
Researchers at MIT analyzed Amazon Ring doorbell footage with AI models such as GPT-4, asking the models whether police should be called. The study found inconsistent judgments, including high false-positive rates and a bias toward flagging footage from minority neighborhoods, which could lead to unnecessary police calls in non-criminal situations.

https://www.notebookcheck.net/AI-systems-like-GPT-4-and-Gemini-misinterpreting-Ring-Camera-footage-could-lead-to-false-police-calls-especially-in-minority-neighborhoods.891320.0.html
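
To make the "false-positive rate by neighborhood" metric concrete, here is a minimal, hypothetical Python sketch (not the study's actual code; the record fields and function name are assumptions): it counts how often a model flags non-criminal clips for police, grouped by neighborhood.

# Hypothetical sketch, not the MIT study's code: given per-clip model
# verdicts, compute the false-positive rate -- clips with no crime that
# the model nonetheless flags for police -- per neighborhood group.
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of dicts with keys 'neighborhood' (str),
    'model_says_call_police' (bool), 'crime_actually_present' (bool).
    All field names are assumptions for illustration."""
    flagged = defaultdict(int)  # non-crime clips the model flagged
    benign = defaultdict(int)   # total non-crime clips per group
    for r in records:
        if not r["crime_actually_present"]:
            benign[r["neighborhood"]] += 1
            if r["model_says_call_police"]:
                flagged[r["neighborhood"]] += 1
    return {g: flagged[g] / benign[g] for g in benign if benign[g]}

# Made-up numbers: a gap between groups on benign footage is the kind
# of disparity the article describes.
sample = [
    {"neighborhood": "A", "model_says_call_police": True,  "crime_actually_present": False},
    {"neighborhood": "A", "model_says_call_police": False, "crime_actually_present": False},
    {"neighborhood": "B", "model_says_call_police": False, "crime_actually_present": False},
    {"neighborhood": "B", "model_says_call_police": False, "crime_actually_present": False},
]
print(false_positive_rates(sample))  # {'A': 0.5, 'B': 0.0}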
Title: Re: AI systems like GPT-4 and Gemini misinterpreting Ring Camera footage could lead to false police
Post by: mdongwe on September 20, 2024, 11:59:28
"We've tried several LLMs that were not trained to identify crime and found out they can't identify crime" .. Well, okay? I guess
Title: Re: AI systems like GPT-4 and Gemini misinterpreting Ring Camera footage could lead to false police
Post by: Anonymous453 on September 24, 2024, 22:44:22
AI analysis of video footage should be illegal. This is completely and utterly dystopian.