Gym teacher arrested for AI clone of principal’s voice


A Baltimore high school athletic director was arrested on Thursday after police said he had allegedly used AI to create a fake audio clip of the school's principal.

In January, we reported on an audio clip that purported to be a recording of Pikesville High principal Eric Eiswert making racist and antisemitic comments about staff and students.

At the time, Eiswert denied the authenticity of the audio, and claims were made that the clip was an AI fake.


The widely shared audio clip led to Eiswert temporarily losing his job while the authenticity of the audio was investigated. The backlash from teachers, students, and others in the community who believed the clip was genuine upended Eiswert's life.

When you listen to the audio clip, it's easy to understand why people believed it was genuine.

Experts have now concluded that the audio was an AI fake based on its flat tone, lack of consistent breathing sounds or pauses, and unusually clean background sounds.

It is alleged that Dazhon Darien, Pikesville High's former athletic director, made the AI fake in retaliation for Eiswert launching an investigation into his misuse of school funds.


Darien has been arrested and charged with disrupting school operations, theft, retaliation against a witness, and stalking. The charge sheet noted that Darien had used the school's computers "to access OpenAI tools and Microsoft Bing Chat services."

Absent from the list of charges was something that may soon be a crime: using AI to fake someone's voice.

Bills like the No Fakes Act and the No AI Fraud Act have been filed in the US Congress but are yet to be passed. So when Darien used AI to create a non-consensual fake audio clip of Eiswert's voice, it technically wasn't a crime.


The fact that it took weeks for an expert to finally confirm that the audio clip was a fake highlights just how unprepared society and governments are to handle the new issues that generative AI presents.

The ease with which these fakes can be generated compounds the problem. If a gym teacher can put together a convincing fake, imagine what more technically competent bad actors could achieve.

In the absence of an easy way to identify fake audio and video, our best defense may be to assume things are fake until they're proven to be real.
