Their children were shot, so they used AI to recreate their voices and call lawmakers

The parents of a teenager who was killed in Florida’s Parkland school shooting in 2018 have started a bold new project called The Shotline to lobby for stricter gun laws in the country. The Shotline uses AI to recreate the voices of children killed by gun violence and send recordings through automated calls to lawmakers, The Wall Street Journal reported.

The project launched on Wednesday, six years after a gunman killed 17 people and injured more than a dozen others at a high school in Parkland, Florida. It features the voices of six children and young adults, some as young as ten, who lost their lives to gun violence across the US. Once you type in your zip code, The Shotline finds your local representative and lets you place an automated call that delivers a message from one of the six victims in their own recreated voice, urging stronger gun control laws. “I’m back today because my parents used AI to recreate my voice to call you,” says the AI-generated voice of Joaquin Oliver, one of the teenagers killed in the Parkland shooting. “Other victims like me will be calling too.” At the time of publishing, more than 8,000 AI calls had been submitted to lawmakers through the website.

“This is a United States problem and we have not been able to fix it,” Oliver’s father Manuel, who started the project along with his wife Patricia, told the Journal. “If we need to use creepy stuff to fix it, welcome to the creepy.”

To recreate the voices, the Olivers used a voice cloning service from ElevenLabs, a two-year-old startup that recently raised $80 million in a funding round led by Andreessen Horowitz. Using just a few minutes of vocal samples, the software can recreate a voice in more than two dozen languages. The Olivers reportedly drew their son’s voice samples from his social media posts. Parents and legal guardians of gun violence victims can fill out a form to submit a victim’s voice to The Shotline to be added to its repository of AI-generated voices.
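For readers curious what that workflow looks like in practice, below is a minimal sketch of cloning a voice and generating speech through ElevenLabs’ public REST API. The endpoint paths, field names, and model ID are assumptions drawn from the API’s public documentation, not details confirmed by this article or by The Shotline’s creators.

# Minimal sketch: clone a voice from audio samples, then synthesize a message.
# Endpoint paths, field names, and the model ID are assumptions based on
# ElevenLabs' public API docs, not details from this article.
import requests

API_KEY = "YOUR_ELEVENLABS_API_KEY"  # placeholder
BASE = "https://api.elevenlabs.io/v1"
HEADERS = {"xi-api-key": API_KEY}

# 1) Create a cloned voice from a short audio sample.
with open("sample_clip.mp3", "rb") as clip:
    resp = requests.post(
        f"{BASE}/voices/add",
        headers=HEADERS,
        data={"name": "example-voice"},
        files=[("files", ("sample_clip.mp3", clip, "audio/mpeg"))],
    )
resp.raise_for_status()
voice_id = resp.json()["voice_id"]

# 2) Synthesize a message in the cloned voice.
resp = requests.post(
    f"{BASE}/text-to-speech/{voice_id}",
    headers={**HEADERS, "Content-Type": "application/json"},
    json={
        "text": "Hello, this is a test message.",
        "model_id": "eleven_multilingual_v2",  # multilingual model (assumed name)
    },
)
resp.raise_for_status()

# The response body is audio bytes (MP3 by default).
with open("message.mp3", "wb") as out:
    out.write(resp.content)

The resulting audio file could then be played back over an automated phone call, which is roughly the pipeline The Shotline describes, though the site’s actual implementation details have not been published.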

The project raises ethical questions about using AI to generate deepfakes of dead people’s voices. Last week, the Federal Communications Commission declared that robocalls made using AI-generated voices are illegal, a decision that came weeks after voters in New Hampshire received calls impersonating President Joe Biden telling them not to vote in their state’s primary. An analysis by the security company Pindrop revealed that the Biden audio deepfake was created using software from ElevenLabs.

The company’s co-founder Mati Staniszewski told the Journal that ElevenLabs allows people to recreate the voices of dead relatives if they have the rights and permissions. So far, though, it’s not clear whether the parents of minors hold the rights to their children’s likenesses.

