Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.

Good riddance to bad spyware.

squid

The best way to fix this issue is for parents to monitor their kids’ internet usage, not Big Brother. The online pics and the people acting on these urges are the end result of a bigger issue that, if we focused on it, would better benefit everyone’s lives. And let’s be honest, who truly believes that Apple or any big tech company is acting in the interest of mankind? It’s a company whose sole purpose is to make money.

phillaholic

I don’t see how this topic has anything to do with money at all.

Dizzy Devil Ducky

Apple will find a way to make it about money if at all possible. Large companies are known for following the law as minimally as possible and pretty much only making decisions that benefit them financially.

phillaholic

You’re just saying a bunch of generic words that have nothing to do with this specific situation.

@Rai@lemmy.dbzer0.com

All I read from their post is “Apple bad because… Apple bad”

Well, I assume they stopped doing it cuz it cost them money to do it… So he’s not wrong that it’s about money.

phillaholic

You should look up the definition of Circular Logic.

KrombopulosMikl

Can we please start ignoring these groups? “The Children” are their only concern no matter how it might negatively affect anyone else. Kids who are abused or trafficked should be helped, but that’s not the only issue we have right now and it’s definitely not more important than the effects eroding privacy would have for people and groups around the world. But these “wHaT aBoUt ThE cHiLdReN?!” groups really don’t care - as long as their concerns are addressed everyone else can go fuck themselves. Plus, lots of these groups are primarily staffed and run by former or current LEO who would like to see less privacy anyways.

Children being sexually abused is absolutely more important than iPhones automatically scanning things you upload to their private servers.

KrombopulosMikl

And this is exactly what I’m talking about

Completely agree. They should follow Jeffrey Epstein’s links and networks if they have the balls. You don’t need to scan every person’s phone on earth. Well, that would include high-profile people, so good luck with that.

Bipta

The other day I saw a story where “legal marijuana hurts the children” was filed as a court argument. No support for that sensationalist claim, of course.

@PleasantAura@lemmy.one

Almost none of these groups actually care about the kids. Most of them actively support policies that are proven to enable/cause more abuse because it feels like they’re hurting the bad guys. As a childhood survivor of a bunch of awful shit that I don’t want to get into specifics on, I’ve never seen a single “for the children” group advocate anything that wouldn’t have caused more trauma for me when I was younger. There’s no care about fixing problems and preventing childhood trauma/abuse, just care about asserting control and investing in what “feels good”: retributive justice (that’s more likely to cause recidivism) against one single specific style of abuser while ignoring others (and the survivors) entirely.

This is more about feeling good (and, for some, more authoritarian control) than about actually helping the issue of child abuse.

@OscarRobin@lemmy.world

Also I highly suspect groups like this one in particular are funded or even founded by government organizations with a high interest in accessing all your data.

I’m glad that this is finally put to rest. I ditched iCloud in protest of this but came back after Advanced Data Protection came along.

Although now I just send my photos to my PC directly (PhotoSync) and copy files with iTunes, since iCloud Drive (on Windows) kept breaking and ballooning its logs to absurd sizes. So I’m just paying for cloud backups and message storage.

@mahony@lemmy.world

The client side scanning of contents of your phone is the most 1984 thing you will hear.

@HughJanus@lemmy.ml

THINK OF THE CHILDREN

@Asudox@lemmy.world

WHO ELSE WILL PROTECT THE CHILDREN IF THE GOVERNMENT ISN’T THERE FOR THEM?! CHILDREN… THINK OF THE CHILDREN!!!

phillaholic

It was client side scanning if you chose to upload those files to iCloud. The equivalent of having your ID checked before you enter a club.

deleted by creator

phillaholic

You’re thinking of Google, where they data mine you as their primary business model. Google Photos scans your photos for object recognition; what do you think that is? There’s no E2E there at all. Apple’s object detection is done on device. It amazes me that Apple got attacked over this when literally everyone else is just doing it without telling you and not offering encryption.

Uriel238 [all pronouns]

Let’s say my grandson came to a realization that he was actually my granddaughter. She grows her hair long. She practices with make-up and gets some cute dresses and skirts, and is totally into it.

Now Apple knows.

And any law-enforcement interests that think it’s wrong or abusive by fiat can force Apple to let them know.

Same if my grandkid decides they are pagan and goes from wearing a cross to wearing a pentacle.

Same if law enforcement notices that they are caramel colored, that mom is Germanic pale and dad is dark brown.

The US is a society in which neither law nor law enforcement are on our side, and either can at any time decide that arbitrary life shit is worthy of sending a SWAT team to collect us. And the GOP is determined to make it worse.

Not really. The plan that Apple backpedaled on was to compare hashes of photos on device to hashes of known CSAM material. They wouldn’t see any user-generated photos unless there was a hash collision. Other companies have been known to report false positives on user-generated photos and delete accounts with no process to recover them.

Uriel238 [all pronouns]

This assumes the program stays that way. Much the way Google promised no human would look at (or be able to look at) the data set, we don’t have an external oversight entity watching over Apple.

And then there’s the matter of mission creep, much the way the NSA PRISM program was supposed to only deal with foreign threats to national security (specifically Islamist terrorism) yet now it tells local precincts about large liquidatable assets that can be easily seized.

Even if it only looks at hash codes, it means law enforcement can add its own catalog of hashes to isolate and secure, say, content that is embarrassing to law enforcement, like videos of police gunning down unarmed, unresisting suspects in cold blood, which are challenged only when the event is captured on a private smartphone.

phillaholic

They published a white paper on it. It would have taken many detected examples before they did anything about it. It’s not strictly a hash, as it’s not looking for exact copies but similar ones. Collisions have been proven, but afaik they are all reverse-engineered: just grey blobs of nonsense that match CSAM examples. I don’t recall hearing about someone’s randomly taken photo matching with anything, but correct me if I’m wrong.

True, it’s hash-like in that the comparison uses some mathematical representation of the source material. It was intended to be a little fuzzy so it would still catch minor alterations like cropping, watermarks, rendering to a new format, etc.
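The fuzzy matching described above can be sketched with a classic difference hash (dHash). To be clear, this is a hypothetical pure-Python illustration of the general near-duplicate idea, not Apple’s NeuralHash (which is a learned perceptual hash); the matching principle is the same: compare compact hashes and treat a small bit distance as “probably the same image” even after re-encoding or minor edits.

```python
def dhash(pixels):
    """Difference hash over a greyscale image given as a 2D list of
    brightness rows. Each pixel is compared to its right-hand
    neighbour, yielding one bit per comparison."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes; a small distance
    means a likely match despite small alterations."""
    return bin(a ^ b).count("1")

original = [[10, 20, 30], [30, 20, 10]]
tweaked  = [[10, 21, 30], [30, 20, 11]]  # slightly altered brightness
other    = [[30, 20, 10], [10, 20, 30]]  # genuinely different image

print(hamming(dhash(original), dhash(tweaked)))  # small distance: match
print(hamming(dhash(original), dhash(other)))    # large distance: no match
```

Real systems use far larger hashes (e.g. 64 bits or learned embeddings) and a tuned distance threshold, but the match/no-match decision works the same way.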

The example I heard of was someone who was using an app for a remote doctor’s appointment. The doctor requested photos of the issue, a rash in the genital area of a minor; supposedly one included an adult hand touching the area involved. That photo ended up in Google’s cloud service, where it was flagged and reported to law enforcement, and that user’s whole Google account was frozen. The investigation quickly confirmed the innocence of the photo and provided official documentation of such, but last I heard Google would not release the account.

phillaholic

Google has unencrypted access to your files to do whatever they want with; do we know this was the same CSAM system or one of Google’s internal ones? Google Photos does its face and object scanning in the cloud, whereas Apple does it on device.

regalia

You’re paying to reserve some space in their cloud to store your encrypted bits. If you exchange money for that space, then you’re entitled to have it be encrypted and private.

phillaholic

Find me any place you don’t own that you can store your stuff that has no restrictions on what you can store there.

regalia

Something like Proton Cloud, or a self hosted Nextcloud instance. If it’s encrypted, it’s nobody’s business.

phillaholic

Not according to their terms of service:

You agree not to use your Account or the Services for any illegal or prohibited activities. Unauthorized activities include, but are not limited to: Disrupting the Company’s networks and Servers in your use of the Services; Accessing/sharing/downloading/uploading illegal content, including but not limited to Child Sexual Abuse Material (CSAM) or content related to CSAM;

regalia

It’s E2EE; that’s just for them to legally cover their ass. They have zero knowledge of what’s uploaded.

phillaholic

Proton hasn’t really gotten pushback yet, as they are small. If pedophiles start utilizing Proton for CSAM, I guarantee you things will change or they will shut down. Another full E2E provider, can’t recall the name at the moment, just ended up shutting their service down when governments started coming after them. They aren’t the guys from The Pirate Bay.

@rikonium@discuss.tchncs.de

Yes, however my chief concern (others may have other concerns; this is just off the top of my head) was the breaking of a major barrier: explicitly user-hostile code would be running on the device itself, one I own. I’d say it’s more the equivalent of club employees entering your home to check your ID prior to, or during, your club visit, and using your restroom/eating a snack while they’re there (scanning would use “your” device’s resources).

There’s also the trivial nature of flipping the require_iCloud_photos=“true” value to “false”, whether by intention or by accident. I have an open ticket with Apple support where my Apple Maps saved locations, favorites, guides, Home, reports, and reviews ALL vanished without a trace. I just got a callback today saying that engineering is aware of the problem and that it’s expected to be resolved in the next iOS update. In the meantime, I’m SOL, so accidents and problems can and do happen, nor is Apple the police.

And on top of that there are also concerns about upstream perversion of the CSAM database for other purposes. After all, who can audit it to ensure its use for CSAM exclusively, and who can add to it? Will those images from the device and database be pulled out for trials, or would it be a “trust the machine, the odds of false positives are x%” situation? (I believe those questions might have already been answered when the controversy was flying, but there are just a lot of cans of worms waiting to be opened with this, as well as Apple being pressured to scan for more things once the technology has been made.)

phillaholic

The CSAM database isn’t controlled by Apple. It’s already in use practically everywhere. Apple tried to compromise between allowing private encrypted image storage at scale and making sure they aren’t a hotbed for CSAM. Their competitors just keep it unencrypted and scan it for content, which last time I checked is worse 🤷‍♂️

@Natanael@slrpnk.net

But Apple still fetches that list of hashes and can be made to send an alternative list to scan for

phillaholic

It’s not very useful for much else. It only finds known copies of existing CSAM; it doesn’t detect new ones. Governments could already force Apple to do whatever they want, so it’s a leap to say this is going to do much more.

@mahony@lemmy.world

You go way out of your way to lick Apple’s boot here. By comparing hashes to whatever Apple wants or is told to look for, you can profile everyone, find leaked material the government doesn’t want you to have, and so on. The fact that people just accept it, or endorse it, is beyond me, but again, after the last 3 years I came to the conclusion that most people are scared to be free.

phillaholic

While scanning for leaked government documents is the first thing I’ve heard that could be a problem for whistleblowers, I’ll point out this scanning tech is already in use in major cloud platforms and no government has forced anyone to do it. Having a database of all government documents like that wouldn’t be trivial to put together either. It’s just not practical to be used that way.

I don’t care that it was Apple who did this; it presents a legitimate answer to E2E encryption of data while cutting many government arguments off at the legs. Without an answer, we are closer to E2E being made illegal than we are to nothing happening.

@CrypticCoffee@lemmy.ml

I don’t agree. It is perfectly normal to spy on people and make sure they’re not committing a crime. I’m sure the execs of these companies and those politicians would be fine with us watching from their windows, just to make sure they aren’t using illegal content… /s


In the digital age, protecting your personal information might seem like an impossible task. We’re here to help.

This is a community for sharing news about privacy, posting information about cool privacy tools and services, and getting advice about your privacy journey.


You can subscribe to this community from any Kbin or Lemmy instance.


Check out our website at privacyguides.org before asking your questions here. We’ve tried answering the common questions and recommendations there!

Want to get involved? The website is open-source on GitHub, and your help would be appreciated!


This community is the “official” Privacy Guides community on Lemmy, which can be verified here. Other “Privacy Guides” communities on other Lemmy servers are not moderated by this team or associated with the website.


Moderation Rules:

  1. We prefer posting about open-source software whenever possible.
  2. This is not the place for self-promotion if you are not listed on privacyguides.org. If you want to be listed, make a suggestion on our forum first.
  3. No soliciting engagement: Don’t ask for upvotes, follows, etc.
  4. Surveys, Fundraising, and Petitions must be pre-approved by the mod team.
  5. Be civil, no violence, hate speech. Assume people here are posting in good faith.
  6. Don’t repost topics which have already been covered here.
  7. News posts must be related to privacy and security, and your post title must match the article headline exactly. Do not editorialize titles, you can post your opinions in the post body or a comment.
  8. Memes/images/video posts that could be summarized as text explanations should not be posted. Infographics and conference talks from reputable sources are acceptable.
  9. No help vampires: This is not a tech support subreddit, don’t abuse our community’s willingness to help. Questions related to privacy, security or privacy/security related software and their configurations are acceptable.
  10. No misinformation: Extraordinary claims must be matched with evidence.
  11. Do not post about VPNs or cryptocurrencies which are not listed on privacyguides.org. See Rule 2 for info on adding new recommendations to the website.
  12. General guides or software lists are not permitted. Original sources and research about specific topics are allowed as long as they are high quality and factual. We are not providing a platform for poorly-vetted, out-of-date or conflicting recommendations.
