This post was submitted on 06 Jun 2023
3 points (100.0% liked)

Privacy

31815 readers
293 users here now

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

Many thanks to @gary_host_laptop for the logo design :)

founded 5 years ago
top 13 comments
[–] [email protected] 3 points 1 year ago (1 children)

Combating CSAM is great and all, but something tells me this will also be used for far more sinister purposes.

[–] [email protected] 2 points 1 year ago

Always has been, always will be, unfortunately. It's a classic "think of the children" change.

[–] [email protected] 2 points 1 year ago (2 children)

Does this really even matter for combating this? Were pedos really so stupid that they were putting their shit on the cloud?

I'm all for stopping pedophiles, but this seems like a scary breach of privacy. Is an Apple person going to look at my young-looking but completely legal nudes? This seems scary.

[–] [email protected] 3 points 1 year ago (2 children)

Agreed, my penis could easily be mistaken for a child's

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

That's a Reddit-style funny comment, and I hope this type of humor transfers here with the Reddit refugees.

[–] [email protected] 1 points 1 year ago

Lemmy is maturing! More than your penis apparently.

[–] [email protected] 1 points 1 year ago
[–] [email protected] 1 points 1 year ago

Any detection logic has to take the video in, i.e. the series of images, feed it to the code, and return a result. In a hidden or very big program, the code can just as well be:

if nudity:
    send_to_apple_server()
    print("We detected nudity, and flagged this video")

The user cannot differentiate it from well-intended code. The right thing to do is not to track at all! No "SMART" logic to "HELP"!
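The commenter's point can be sketched in a few lines. This is a hypothetical illustration (all function and field names are invented, and the "classifier" is just a stand-in flag check): from the caller's side, a benign scanner and one that silently exfiltrates flagged media return identical results, so the side effect is invisible without auditing the code itself.

```python
def detect_nudity(frames):
    # Stand-in for a real classifier: here it just reads a flag
    # attached to each frame for demonstration purposes.
    return any(f.get("nudity") for f in frames)

def benign_scan(frames):
    # Scans locally and only returns the result.
    return detect_nudity(frames)

def exfiltrating_scan(frames, uploaded):
    # Same interface, same return value, but flagged media is
    # silently copied to a "server" (here, a list) as a side effect.
    flagged = detect_nudity(frames)
    if flagged:
        uploaded.append(frames)
    return flagged

frames = [{"nudity": False}, {"nudity": True}]
uploaded = []
# Both calls return True; nothing in the return value reveals the upload.
print(benign_scan(frames), exfiltrating_scan(frames, uploaded))
```

The two functions are indistinguishable by their outputs alone, which is the commenter's argument for not running such logic on-device at all.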

[–] [email protected] 1 points 1 year ago (1 children)

@Tretiak I wonder when they will expand it to combat dissent against the establishment, and free speech in general?

[–] [email protected] 1 points 1 year ago

Don't worry, most social media platforms already do that free of charge 😁

[–] [email protected] 0 points 1 year ago (2 children)

This would scan regardless of whether iCloud is enabled or not. But only for minors. Correct?

[–] [email protected] 2 points 1 year ago

Once the capabilities exist, how hard would it be for future fascist regimes to tell Apple to turn it on for whatever other purposes?

Under His Eye. Blessed be the fruit.

[–] [email protected] 1 points 1 year ago

If you believe Apple then yes.
