
Deepfake of principal's voice is the latest case of AI being used for harm

Author: Global Gallery news portal | Source: sport | 2024-05-21 12:45:37

The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.

The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deepfake technology, experts say.

“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.

Here’s what to know about some of the latest uses of AI to cause harm:

AI HAS BECOME VERY ACCESSIBLE

Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon, as is the speed with which it can spread on social media.

The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI, which can create hyper-realistic new images, videos and audio clips. It has become cheaper and easier to use in recent years, lowering the barrier for anyone with an internet connection.

