
Artificial Intelligence 'True Crime' TikTok Trend Raises Horrifying Ethical Concerns

AP Photo/Ted S. Warren

As Artificial Intelligence continues to dominate social media, seemingly no corner of the art and information world remains untouched by the new phenomenon, including true crime.


Although AI's controversial nature has already garnered backlash over the legal use of "real voices" and art, its entrance into the true-crime sphere on TikTok raises these ethical concerns to another disturbing level.

One such video is an Artificial Intelligence depiction of Brianna Lopez, known as "Baby Brianna," who died at the hands of her mother, father, and uncle in 2002 in New Mexico. As if the details of her death weren't gruesome enough, the video tells her story through a hyper-realistic depiction of what Lopez would have looked and sounded like as a toddler, covered in blood and other signs of abuse.

"From the day I was born, until five months later, when I died, I had received no love from anyone," the voice says in the video. "I would go through abuse every single day from the people that should have loved me."

And this isn't the only example. The TikTok account @mycriminalstory is just one of many that solely post these types of videos, in which victims, and in some cases perpetrators, of unspeakable crimes tell their side of the story.

Many users in the comments sections support the videos, saying they give crime victims (many of whom are no longer alive) a voice they never had. Some defenders of the trend even say it brings awareness to horrific crimes that might otherwise be forgotten.


At the same time, the videos have received intense backlash from users and creators alike, who point out that the victims' families never gave permission. Some also argue that the videos are made for clout rather than awareness. Although the voices can't be traced back to the victims themselves, they are designed to look and sound nearly identical to them.

Here's what Paul Bleakley, assistant professor in criminal justice at the University of New Haven, told Rolling Stone about the issue: 

“They’re quite strange and creepy,” Bleakley said. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.”

The music publication described the phenomenon as a "walking nightmare."

Artificial Intelligence has already created problems elsewhere. In the music realm, creators have used the technology to mimic artists' voices and produce songs those artists never recorded, as when TikTok user Ghostwriter released an AI-generated Drake song to promote his own name. Another complicated side of AI is that its models are trained on artistic styles scraped from the internet, a practice that is difficult to challenge legally because of how the technology works, but which may have unknown ramifications for the value of art created by actual artists.


AI-created crime victims, however, generate a different level of concern. Not only do these videos risk reopening old wounds for survivors and surviving family members of a tragic crime, but they also often depict young children. Supporters might say that the U.S. has always been known for its sensationalized violence, so how is this different? Opponents might ask: will we stop this before there are literal live holographic depictions of the murders, featuring young victims who can't give permission for the use of their face and voice?

Ultimately, the law has not yet caught up with Artificial Intelligence enough to impose consequences for unethical behavior, so the users and viewers of AI generators must ask themselves: when has it gone too far?
