Tipsheet

More Than 90 Organizations Call on Apple CEO to Abandon New iPhone Surveillance System
AP Photo/Kiichiro Sato, File

More than 90 policy and rights organizations have asked Apple not to move forward with its plan to monitor children's iPhones for sexually explicit content, citing concerns about censorship and privacy violations.

Apple hopes to protect minors by implementing software in iPhones that will scan devices for child pornography and warn users that the content they are sending or receiving may be sexually explicit. The new system will then inform the organizer of a family account, typically a parent, when someone under 13 years old elects to send or accept graphic content.

In a letter to Apple CEO Tim Cook, dated Thursday, the groups advocating for civil, human and digital rights outlined the dangers that could come from a surveillance system while also acknowledging that the tech manufacturer's intentions may have been pure. 

"Algorithms designed to detect sexually explicit material are notoriously unreliable," the letter reads. "They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child." 

"Moreover, the system Apple has developed assumes that the 'parent' and 'child' accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship," it continued. "This may not always be the case; an abusive adult may be the organiser [sic] of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing." 


The letter went on to note that while Apple currently intends to flag only sexual content found in iMessages, the federal government may later attempt to compel the company to identify non-sexually explicit content that it deems objectionable.

Apple's operating system will also feature a tool that detects child sexual abuse material using a database provided by the National Center for Missing and Exploited Children. Each image uploaded to iCloud will be checked against that database of known child abuse imagery. If a match is found, the user's account will be disabled and authorities will be alerted.
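In rough terms, the matching step described above works like a lookup: each upload is reduced to a fingerprint and compared against a set of known fingerprints. The sketch below is a deliberately simplified illustration, not Apple's implementation; Apple's actual system uses a perceptual "NeuralHash" plus cryptographic matching techniques rather than the plain SHA-256 exact-match shown here, and the database names are hypothetical.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-provided database: in the real
# system these would be perceptual hashes, not SHA-256 digests.
KNOWN_MATCH_DATABASE = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def scan_upload(image_bytes: bytes) -> bool:
    """Return True if the upload's fingerprint matches the database.

    Simplified: an exact cryptographic hash only matches byte-identical
    files, whereas a perceptual hash also matches resized or recompressed
    copies of the same image.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_MATCH_DATABASE
```

The key design point the letter objects to is visible even in this toy version: whoever controls the contents of the database controls what gets flagged, so extending the system to new categories of content requires only adding new entries.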

The letter warns that, once this surveillance system is in effect, Apple could face governmental pressure and legal requirements to monitor devices for content beyond just underage sexual material. 

"Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them," the letter says of what content the federal government could consider to be objectionable. "And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis."


This comes after officials in the Biden administration made efforts to identify individuals who participated in the Jan. 6 Capitol riot by scanning images posted to social media. Last year, Trump officials used the same tactic to discover who took part in the Black Lives Matter riots that destroyed several U.S. cities following the death of George Floyd.
