Artificial Intelligence 'True Crime' TikTok Trend Raises Horrifying Ethical Concerns

As Artificial Intelligence continues to dominate social media, seemingly no corner of the art and information world is untouched by the new phenomenon, including true crime.

Although AI has already garnered backlash over the legal use of "real voices" and artwork, its entrance into the true-crime sphere on TikTok raises these ethical concerns to another disturbing level.

One such video is an Artificial Intelligence depiction of Brianna Lopez, known as "Baby Brianna," who died at the hands of her mother, father, and uncle in New Mexico in 2002. As if the details of her death weren't gruesome enough, the video tells her story through a hyper-realistic depiction of what Lopez would have looked and sounded like as a toddler, covered in blood and other signs of abuse.

"From the day I was born, until five months later, when I died, I had received no love from anyone," the voice says in the video. "I would go through abuse every single day from the people that should have loved me."

And this isn't the only example. The TikTok account @mycriminalstory is just one of many that post only these types of videos, in which victims, and in some cases perpetrators, of unspeakable crimes tell their side of the story.

Many users in the comments sections support the videos, saying they give crime victims (many of whom are no longer alive) a voice they never had. Some defenders of this true-crime trend even say it brings awareness to horrific crimes that might otherwise be forgotten.

At the same time, these videos have drawn intense backlash from users and creators alike, who point out that the victims' families never gave permission. Some even argue that the videos are made for clout rather than to raise awareness. Although the voices cannot be traced back to the victims themselves, the depictions are designed to look and sound nearly identical to them.

Here's what one criminal justice expert told Rolling Stone about the issue:

“They’re quite strange and creepy,” says Paul Bleakley, assistant professor in criminal justice at the University of New Haven. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.” 

The music publication described the phenomenon as a "walking nightmare."

Artificial Intelligence has already created problems elsewhere. In the music realm, creators have used the technology to replicate artists' voices and produce songs they never recorded, as when TikTok user Ghostwriter released an AI-generated Drake track to promote his own name. Another complicated side of AI is that its models essentially scrape artistic styles from across the internet in ways that are difficult to prove legally, which may have unknown ramifications for the value of art created by actual artists.

AI-created crime victims, however, raise a different level of concern. These videos not only risk reopening old wounds for survivors or the surviving family members of a tragic crime; they also often depict young children. Supporters could argue that the U.S. has always been known for sensationalized violence, so how is this any different? Opponents could ask: will we stop this before there are literal live holographic depictions of the murders and their young victims, who can't give permission for the use of their faces and voices?

Ultimately, the law has yet to catch up with Artificial Intelligence, so there are few legal ramifications for unethical behavior. That leaves the users and viewers of AI generators to ask themselves: when has it gone too far?
