AI's getting better at faking crowds. Here's why that's cause for concern

<img src='https://npr.brightspotcdn.com/dims3/default/strip/false/crop/3600x2700+0+0/resize/3600x2700!/?url=http%3A%2F%2Fnpr-brightspot.s3.amazonaws.com%2F62%2F19%2F77617e0043bdb9ce68263d554756%2Fai-generated-4x3standard.jpg' alt='A still showing an AI-created crowd at a big public event, from OpenAI&#8217;s publicity video for its new video generation platform Sora 2. AI crowd scenes have traditionally posed a big technical challenge for companies like OpenAI and Google, but their models are improving all the time.'/>

Odd fingers and faces in the crowd of a recent Will Smith concert video led to suspicions of AI. But AI is improving fast, and there are serious implications for how “fake” crowds might be co-opted.

(Image credit: OpenAI)

