Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.
The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were overwhelming. Those images are burned into my mind, and I would love to be rid of them, but I don't know how, or whether it is even possible. Maybe time will fade them.
In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see images like these: a software platform that makes it nearly impossible for Beehaw to host CSAM in any way.
If the other admins want to give their opinions about this, then I am all ears.
I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we go next.
Sorry to hear that, mate! That's one of the biggest reasons I've never wanted to move toward IT forensics, even though I think I'd enjoy the actual work. Having to regularly sift through the absolute worst humanity has to offer sounds awful.
Hope the immediate pain of it settles as soon as possible!
This might not be what people want, but since Beehaw is going to leave Lemmy anyway, couldn't you just defederate completely and run as an isolated instance? Then you'd have control over what gets published without having to deal with federated nastiness.