• bassomitron@lemmy.world · 9 days ago

    I hope they win their lawsuit. I listened to a Radiolab episode a few years ago about FB moderators. The shit they have to see day in and day out sounds absolutely horrible. Pics and videos of extreme violence and child pornography sound like they’d give any normal person some major trauma.

      • bassomitron@lemmy.world · edited · 8 days ago

        The company should be doing more to support these employees, that’s the point. Right now, Meta doesn’t give a fuck if their employees are getting severely traumatized trying to keep content off their platforms. They don’t pay them much, don’t offer resources for mental health, etc. A maybe-bad analogy: it’s like a construction company having no heavy-machinery safety policies and then, when those employees get hurt and can’t work anymore, just firing them with no worker’s comp.

        For comparison, hospitals and law enforcement provide therapy and/or other mental health resources for their employees, since those jobs put people in potentially traumatic situations with some frequency (e.g. a doctor/nurse witnessing death a lot).

  • FourPacketsOfPeanuts@lemmy.world · 9 days ago

    Look… this is going to sound exceedingly stupid… but shouldn’t they find a way to use convicted sex offenders to filter out CSAM? They’re not going to be traumatised by it, and it saves some other poor soul from taking the brunt. How do you motivate them to actually do it? Well, the first one has to flag, and a second one earns a bonus every time the first one flags wrong. Motivation aligned!

    </joking… but seriously though…>