kscherer
2021-08-05, 15:39

No matter how you stack it, this is Apple assuming that all of their users are potential pedophiles. There's no arguing around it. It's casting a wide net, dragging up all the fish, and then sorting through the whole catch to find the one fish you want.

Bad plan, and way too open to abuse. I'm not saying Apple will abuse this, but they may not be given a choice. With all of the antitrust scrutiny floating about, what happens when developers demand access to this feature, and the government (being obtuse and uneducated in security matters) puts forward a law that requires Apple to open up all of their security features to third parties? I'm pretty sure this is coming, sooner rather than later. Touch ID, Face ID, hash-scanner thingies. Grief, the potential for abuse is just unfathomable.

And then there is the international abuse that is sure to come. What happens when, say, China demands Apple open this up to track unpopular populations like, oh, I don't know, the Uyghurs? Will China force Apple to scan for Uyghur hashes? "Oh, we see you took a picture of a criminal element. Now, tell us everything you know! Where are they? Who are they? Where did you take this photo?" And on and on.

"Oh, I see you have a picture of Hitler on your phone."

"Oh, I see you and your significant other are into kinky things the gubmint does not like."

"Oh, I see you took a photo of a person of whom it is illegal to photograph."

"Oh, I see …"

And they do "see".

Better hope you don't have any baby nudies on your phone.

And I'm apparently not the only person thinking like this:

Quote:
At the current time, Apple is using its image scanning and matching technology to look for child abuse, but researchers worry that in the future, it could be adapted to scan for other kinds of imagery that are more concerning, like anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to iCloud. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns over the hashes that Apple plans to use because there could potentially be "collisions," where someone sends a harmless file that shares a hash with CSAM and could result in a false flag.
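
To put Green's collision concern in concrete terms, here's a rough Python sketch using a crude "average hash" as a stand-in (it is not Apple's NeuralHash, which is a neural-network perceptual hash whose details weren't public at the time). Two images with completely different pixel values can still collapse to the same short hash, which is exactly the kind of false flag he's describing:

Code:
# Toy perceptual-hash collision demo (illustrative only, not NeuralHash).
# An "average hash" sets a bit to 1 wherever a pixel is brighter than
# the image's mean brightness.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# A harmless 4x4 grayscale "image" (a smooth gradient) ...
image_a = [
    [ 10,  20,  30,  40],
    [ 50,  60,  70,  80],
    [ 90, 100, 110, 120],
    [130, 140, 150, 160],
]

# ... and a very different image whose bright/dark layout happens to match.
image_b = [
    [  5,  15,  25,  35],
    [ 45,  55,  65,  75],
    [200, 210, 220, 230],
    [240, 250, 255, 255],
]

print(average_hash(image_a))                            # 0000000011111111
print(average_hash(image_b))                            # 0000000011111111
print(average_hash(image_a) == average_hash(image_b))   # True -> a "match"

Real perceptual hashes are far more robust than this toy, but the principle Green is pointing at is the same: the hash deliberately tolerates differences between images so that resized or recompressed copies still match, and that same tolerance is what makes collisions, and therefore false flags, possible.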
- Settings / Apple ID / iCloud / Photos = OFF


- AppleNova is the best Mac-users forum on the internet. We are smart, educated, capable, and helpful. We are also loaded with smart-alecks! :)
- Blessed are the peacemakers, for they shall be called sons of God. (Mat 5:9)

Last edited by kscherer : 2021-08-05 at 15:49.