Artificial Intelligence 'True Crime' TikTok Trend Raises Horrifying Ethical Concerns

AP Photo/Ted S. Warren

As artificial intelligence continues to dominate social media, seemingly no corner of the art and information world is untouched by the new phenomenon, including true crime.


Although AI's controversial nature has already drawn backlash over the legal use of "real voices" and art, its entrance into the true-crime sphere on TikTok raises these ethical concerns to another disturbing level.

One such video is an artificial intelligence depiction of Brianna Lopez, known as "Baby Brianna," who died at the hands of her mother, father, and uncle in 2002 in New Mexico. As if the details of her death weren't gruesome enough, the video tells her story through a hyper-realistic depiction of what Lopez would have looked and sounded like as a toddler, covered in blood and other signs of abuse.

"From the day I was born, until five months later, when I died, I had received no love from anyone," the voice says in the video. "I would go through abuse every single day from the people that should have loved me."

And this isn't the only example. The TikTok account @mycriminalstory is just one of many that solely post these types of videos, in which victims, and, in some cases, perpetrators, of unspeakable crimes tell their side of the story.

Many of the users in the comments section support the videos, saying they give the crime victims (many of whom are no longer alive) a voice they never had. Some defenders of this true crime trend even say it brings awareness to horrific crimes that otherwise might be forgotten.


On the other hand, these videos have also received intense backlash from many users and creators alike, who point out that the victims' families never gave permission. Some even argue that the videos are made for clout rather than to raise awareness. Although the voices can't be traced back to the victims themselves, they are designed to look and sound nearly identical to them.

Here's what Paul Bleakley, assistant professor in criminal justice at the University of New Haven, told Rolling Stone about the issue: 

“They’re quite strange and creepy,” says Paul Bleakley, assistant professor in criminal justice at the University of New Haven. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.” 

The music publication described the phenomenon as a "walking nightmare."

Artificial intelligence has already created problems. In the music realm, creators have used the technology to mimic artists' voices and produce songs those artists never recorded, as in the case of TikTok user Ghostwriter, who made an AI-generated Drake song to promote his own name. Another complicated side of AI is that its models essentially lift artistic styles from across the internet in ways that are difficult to prove legally, which may have unknown ramifications for the value of art created by actual artists.


AI-created crime victims, however, generate a different level of concern. Not only could these videos reopen old wounds for survivors and the surviving family members of a tragic crime, but they also often involve depictions of young children. Supporters could say that the U.S. has always been known for its sensationalized violence, so how is this different? Opponents could ask: will we stop this before there are literal holographic depictions of the murders and their young victims, who can't give permission for the use of their faces and voices?

Ultimately, artificial intelligence is too new for the law to have caught up with its unethical uses, so the users and viewers of AI generators must ask themselves: when has it gone too far?
