Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.
The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.
In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images. A software platform that makes it nearly impossible for Beehaw to host CSAM in any form.
If the other admins want to give their opinions about this, then I am all ears.
I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.
I'd be willing to consider either, and would love your feedback on this in particular as well.
Did you forget to log into your alts or are you unaware of how the edit button functions?
Storage is super cheap, fwiw.
Now be nice. Of course I know about the edit button. The comments were not posted at the same time, and editing long after the fact is generally discouraged. Nor are overly long comments, or a single comment covering several different topics, great either.
Why on earth would I have multiple accounts? I am sure people do, but that too is kind of strange behavior and perhaps abusive depending on how they are used.
Edits are not frowned upon unless you’re just editing a post to make someone look bad
The RATE of storage growth, and the bandwidth to transfer it, are the expensive parts.
I think if a platform has image capabilities, this is to be expected. I guess the only exception is if there are filters that can be used, but this seems unlikely. So I think it is an image vs. no-image decision. The other problem with images is that they can be attack vectors from a security point of view. Any complex file format can be an attack vector, as interpreters of complex file formats often have bugs.
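One common mitigation (not something Beehaw or Lemmy necessarily does) is to never serve uploaded bytes directly, but to decode and fully re-encode every image on the server, discarding metadata and any payload that isn't plain pixel data. A minimal sketch in Python using the Pillow library; the `sanitize_upload` helper and its parameters are hypothetical names for illustration:

```python
from io import BytesIO

from PIL import Image

# Cap total pixel count before decoding; Pillow warns above this
# threshold and raises DecompressionBombError at twice it, which
# guards against "decompression bomb" images.
Image.MAX_IMAGE_PIXELS = 25_000_000


def sanitize_upload(raw: bytes) -> bytes:
    """Re-encode an uploaded image, keeping only the pixel data."""
    # First pass: structural integrity check; raises on corrupt files.
    with Image.open(BytesIO(raw)) as img:
        img.verify()
    # verify() leaves the file object unusable, so reopen to re-encode.
    with Image.open(BytesIO(raw)) as img:
        out = BytesIO()
        img.convert("RGB").save(out, format="JPEG", quality=90)
        return out.getvalue()
```

This doesn't stop someone from uploading abusive imagery, of course; it only shrinks the parser attack surface by ensuring the stored file is one the server itself produced.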
Can you imagine? The large platforms have whole teams of people who have to look at this stuff all day and filter it out. Not sure how that works, but it is probably the reality. Notice R$ never hosted images.