Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.
The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were overwhelming. Those images are burnt into my mind, and I would love to get rid of them, but I don’t know how or if it is even possible. Maybe time will take them out of my mind.
In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any way.
If the other admins want to give their opinions about this, then I am all ears.
I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.
I think temporarily suspending images until you guys can settle on a safer alternative to Lemmy is a good idea.
There is no such thing as a safer alternative to Lemmy. It’s very easy to say things like “use tools” to filter these things, but in actuality it’s anything but easy; it’s way beyond a FOSS project. (Or Reddit for that matter, though they are trying. And good gawd, I just remembered something I saw on Reddit that I have not thought of for years, damn it.)
Well, true, but I meant more like a forum with limited access (no images or links) until you meet certain requirements, etc. So not totally safe, but a bit safer than the current setup.