I hope they win their lawsuit. I listened to a Radiolab episode a few years ago about FB moderators. The shit they have to see day in and day out sounds absolutely horrible. Pics and videos of extreme violence and child pornography would give any normal person some major trauma.
Exposed to a firehose of the worst humanity has to offer. I can’t even imagine
That’s not fb’s fault though?
The company should be doing more to support these employees, that's the point. Right now, Meta doesn't give a fuck if their employees are getting severely traumatized trying to keep content off their platforms. They don't pay them much, don't offer mental health resources, etc. A maybe-bad analogy: it's like a construction company with no heavy machinery safety policies that, when employees get hurt and can't work anymore, just fires them with no workers' comp.
For comparison, hospitals and law enforcement provide therapy and/or other mental health resources for their employees, since those jobs put people in potentially traumatic situations with some frequency (e.g. a doctor/nurse witnessing death a lot).
Yeah you’re right
Look… this is going to sound exceedingly stupid… but shouldn't they find a way to use convicted sex offenders to filter out CSAM? They're not going to be traumatised by it, and it saves some other poor soul from taking the brunt. How do you motivate them to actually do it? Well, the first one has to flag, and a second one earns a bonus every time the first one flags wrong. Motivation aligned!
</joking… but seriously though…>