• lepinkainen@lemmy.world · +9/−10 · edited 19 days ago

    The irony is that Apple’s CSAM detection system was about as good as such a system could be at the time, with multiple safeguards to protect people from accidental false positives.
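
    For anyone who didn’t read it: the core safeguard was a threshold scheme, where a single hash collision did nothing and matched items only became readable to Apple after roughly 30 matches accumulated. Here’s a minimal sketch of that idea in Python. This is my simplification, not Apple’s code; the real design used NeuralHash with private set intersection and threshold secret sharing, and all the names here are made up:

    ```python
    # Sketch of the threshold safeguard (illustrative, not Apple's code).
    MATCH_THRESHOLD = 30  # Apple's published figure was about 30 matches

    def count_matches(image_hashes: list[bytes], known_hashes: set[bytes]) -> int:
        """Count uploads whose perceptual hash is in the known-bad set."""
        return sum(1 for h in image_hashes if h in known_hashes)

    def should_flag_for_review(image_hashes: list[bytes],
                               known_hashes: set[bytes]) -> bool:
        # A single accidental collision does nothing; only crossing the
        # threshold would have unlocked matched items for human review.
        return count_matches(image_hashes, known_hashes) >= MATCH_THRESHOLD
    ```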

    But, as usual, I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.

    • lurklurk@lemmy.world · +13/−2 · 19 days ago

      You should have, though. This type of scanning is the thin end of the wedge toward complete surveillance. If it’s added, next year it’s extended to cover terrorism. Then to look for missing people. Then “illegal content” in general.

      The reason most people seem to disagree with you in this case is that you’re wrong.

      • lepinkainen@lemmy.world · +2/−3 · 19 days ago

        We could’ve burned that bridge when we got to it. If Apple had been allowed to implement on-device scanning, they could’ve shipped proper E2E “we don’t have the keys, officer, we can’t unlock it” encryption for iCloud.

        Instead, what we have now is what EVERY SINGLE other cloud provider does: they scan your shit in the cloud all the time unless you specifically upload only locally-encrypted content, which 99.9999% of people will never bother to do.
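
        “Locally-encrypted” here just means encrypting on your own machine before the bytes ever leave it, so the provider only stores ciphertext it can’t scan. A minimal sketch in Python, assuming the third-party `cryptography` package; `upload_to_cloud` is a hypothetical placeholder for whatever provider API you’d actually use:

        ```python
        # Encrypt locally before upload (illustrative sketch; assumes the
        # third-party `cryptography` package: pip install cryptography).
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()   # stays on your machine, never uploaded
        cipher = Fernet(key)

        with open("photo.jpg", "rb") as f:
            ciphertext = cipher.encrypt(f.read())

        # The provider only ever stores ciphertext, so server-side scanning
        # sees nothing but noise.
        # upload_to_cloud("photo.jpg.enc", ciphertext)  # hypothetical call
        ```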