Apple to permanently scan users’ photos and messages for abusive content | WHAT REALLY HAPPENED

Apple to permanently scan users’ photos and messages for abusive content

Apple announced on Thursday plans to scan users’ iPhones for child sexual abuse content in an effort to “protect children from predators who use communication tools to recruit and exploit them,” in addition to limiting the spread of Child Sexual Abuse Material (CSAM).

According to Apple’s announcement, new operating system technology will allow the company to identify known CSAM images and report incidents to the National Center for Missing and Exploited Children, an organization that collaborates with law enforcement nationwide to combat child sexual abuse.

Webmaster's Commentary: 

I just put this on my iPhone.

Comments

So anyone inconveniencing the regime can get nabbed...

George

...with a bunch of pedo porn, whether s/he put the files on the phone or not.

Anyone inconveniencing the Philippine regime is a "drug dealer". And anyone inconveniencing the US regime will be a "pedophile" now. I guess they realized that no one believes "terrorist" anymore.

Can you imagine how they decided on the change? Smoke filled room in DC...

"Guys, we need a new label for regime targets. No one buys 'terrorist' anymore"
"OK, let's think. Pass me the thinking powder. What do Americans hate the most?"
"Us."
"(Choking cough) OK, what they hate the most next?"
"Being lied to."
"You are not helping. How about a 'War on Pedophilia'?"
"Hey, this might work. Let me run it by Hunter and Howard Rubin."
"Not on your life! Just run with it."
