
Microsoft has a new plan to show what’s real and what’s AI online


Hany Farid, a professor at UC Berkeley who focuses on digital forensics but wasn’t involved in the Microsoft research, says that if the industry adopted the company’s blueprint, it would be meaningfully harder to deceive the public with manipulated content. Sophisticated individuals or governments can work to bypass such tools, he says, but the new standard could eliminate a significant portion of misleading material.

“I don’t think it solves the problem, but I think it takes a nice big chunk out of it,” he says.

Still, there are reasons to see Microsoft’s approach as an example of somewhat naïve techno-optimism. There’s growing evidence that people are swayed by AI-generated content even when they know it’s false. And in a recent study of pro-Russian AI-generated videos about the war in Ukraine, comments pointing out that the videos were made with AI received far less engagement than comments treating them as genuine.

“Are there people who, no matter what you tell them, are going to believe what they believe?” Farid asks. “Yes.” But, he adds, “there’s a vast majority of Americans and citizens around the world who I do think want to know the truth.”

That desire has not exactly led to urgent action from tech companies. Google started adding a watermark to content generated by its AI tools in 2023, which Farid says has been helpful in his investigations. Some platforms use C2PA, a provenance standard Microsoft helped launch in 2021. But the full suite of changes that Microsoft suggests, powerful as they are, might remain only suggestions if they threaten the business models of AI companies or social media platforms.
