Hello everyone,
We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do because they simply post from another instance now that we have changed our registration policy.
We keep working on a solution, we have a few things in the works but that won’t help us now.
Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.
Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it wasn’t his community it would have been another one. It is clear this could happen on any instance.
But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.
Edit 2: removed that bit about the moderator tools. It came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember that we also had to deal with people posting scat not too long ago, so this isn’t the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.
Thanks.
Genuine question: won’t they just move to spamming CSAM in other communities?
With how slowly Lemmy moves anyway, it wouldn’t be hard to require mod approval for anything that’s a picture or video.
Or it could even just ask 50 random instance users to approve it; to get around that, more than 50% of the selected accounts would have to be bots, which is unlikely.
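For what it’s worth, here is a minimal sketch of how that random-reviewer quorum could work; the 50-reviewer pool, the simple-majority threshold, and the data structures are all hypothetical, not anything Lemmy actually implements:

```python
import random
from dataclasses import dataclass, field

QUORUM_SIZE = 50         # hypothetical: random users asked to review each image post
APPROVAL_THRESHOLD = 26  # simple majority of the 50

@dataclass
class PendingImagePost:
    post_id: int
    reviewers: set = field(default_factory=set)  # user ids still allowed to vote
    approvals: int = 0
    rejections: int = 0

def pick_reviewers(post: PendingImagePost, active_user_ids: list) -> None:
    """Assign a random quorum of active users to a held image post."""
    k = min(QUORUM_SIZE, len(active_user_ids))
    post.reviewers = set(random.sample(active_user_ids, k))

def record_vote(post: PendingImagePost, user_id: int, approve: bool) -> str:
    """Record one reviewer's vote and decide the post once a majority is reached."""
    if user_id not in post.reviewers:
        return "not-a-reviewer"
    post.reviewers.discard(user_id)  # one vote per selected reviewer
    if approve:
        post.approvals += 1
    else:
        post.rejections += 1
    if post.rejections >= APPROVAL_THRESHOLD:
        return "remove-and-report"
    if post.approvals >= APPROVAL_THRESHOLD:
        return "publish"
    return "pending"
```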
But then people would have to see the horrible content first
That definitely is a downside
This, or blocking self-hosting of pictures.
Honestly, this sounds like the best start until they develop better moderation tools.
This seems like the better approach. Let other sites that theoretically have image detection in place sort this out. We can just link to images hosted elsewhere.
I generally use imgur anyway because I don’t like loading my home instance with storage + bandwidth. Imgur is simply made for it.
Yes, and only whitelist trusted image-hosting services (that is, ones that have the resources to deal with any illegal material).
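Enforcing a whitelist like that at post-creation time is cheap; here is a rough sketch, where the allowed hosts are just placeholder examples rather than a recommendation:

```python
from urllib.parse import urlparse

# Placeholder allowlist of external image hosts assumed to run their own abuse detection.
TRUSTED_IMAGE_HOSTS = {"i.imgur.com", "imgur.com"}

def is_allowed_image_url(url: str) -> bool:
    """Accept only https links whose host is on the allowlist."""
    parsed = urlparse(url)
    return parsed.scheme == "https" and parsed.hostname in TRUSTED_IMAGE_HOSTS
```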
The problem is that those sites can also misuse the same tools in a way that harms the privacy of their users. We shouldn’t resort to “hacks” to fix real problems, like using client-side scanning to break E2EE. One solution might be an open-source, community-maintained automod bot…
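As a rough illustration of what such a community automod bot could do without touching anyone’s encryption: compare new uploads against a locally maintained list of perceptual hashes of images the admins have already removed. The blocklist file and distance threshold below are hypothetical, the sketch leans on the Pillow and ImageHash libraries, and unlike industrial tools such as PhotoDNA it only catches re-uploads of material the instance has already seen:

```python
from pathlib import Path

import imagehash       # pip install ImageHash
from PIL import Image  # pip install Pillow

# Hypothetical blocklist: one perceptual hash (hex) per line, taken from already-removed images.
BLOCKLIST_FILE = Path("blocked_phashes.txt")
MAX_HAMMING_DISTANCE = 6  # how similar an upload must be to count as a match

def load_blocklist() -> list:
    return [imagehash.hex_to_hash(line.strip())
            for line in BLOCKLIST_FILE.read_text().splitlines()
            if line.strip()]

def is_known_bad(image_path: str, blocklist: list) -> bool:
    """True if the image is perceptually close to any previously removed image."""
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= MAX_HAMMING_DISTANCE for known in blocklist)
```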
This seems like a really good solution for the time being.
Not-so-fun fact: the FBI has a hard limit on how long an individual agent can spend on CSAM-related work. Any agent who does so is mandated to go to therapy afterwards.
It’s not an easy task at all and does emotionally destroy you. There’s a reason why you can find dozens of different tools to automate the detection and reporting.
Yep. I know someone that does related work for a living, and there are definite time limits and so on for exactly the reasons you say. This kind of stuff leaves a mark on normal people.
I had no idea that this was going on. I expect that, like me, most people are horrified.
wow that’s fucked, but thanks for the transparency
Sorry for my ignorance, what does CSAM mean? Thank you in advance.
Child Sexual Abuse Material
Thanks.
I’m gone for a few days and some assholes are trying to fuck up this instance. Smh.
Correction: they fucked up the instance.
Saw that one post (unfortunately). How come people who spread content like that on the “open” internet (not the dark web) don’t get arrested?
They use the dark web as a proxy to the clear web. It’s a legitimate tool that is sometimes used for horrific things; Tor is a huge double-edged sword.
If these people were distributing, the most obvious strategy would be to hide in plain sight, say in the dark corners of some well-known mainstream but negligent social media site. These people are here to terrorize.
Usually they do, eventually (even on the darkweb).
Though it does take time to pin them down sometimes because many criminals take steps to hide their identities and make it very difficult.
Lots of people are caught easily and quickly, though; hopefully these guys will be among them and will get caught quickly and put away for good.
I’m in the US and grew up here. My dad is a piece of shit pedophile who exploited me for several websites. None on darkweb but at the time, they didn’t really need to be.
Word got out and cops came to interrogate ME, the person who was the victim in this situation. They also blamed me for what was going on (how? I don’t know, I was a teen who was being exploited underage, but cops gonna cop), and they basically intimidated me into dropping out of school and taking the blame for my dad, because the other option was that I be sent to a home as an orphan.
My dad got away with a slap on the wrist basically because cops in the US don’t do their job. They even cover for people. I ended up literally running away from home.
I think the issue is that people see this topic as all outrage - I mean, look at the comments here. Everyone is so mad they can’t even think straight. And I’m personally noticing that some of the outrage doesn’t even seem to be directed at the right people. Like everyone is super willing to shit on pedos left and right, but I wonder if they would be willing to listen to someone who has been knee-deep in this shit before and was innocent because they were being exploited by a pedophile.
Like a lot of comments say “disgusting” but then don’t say anything about how the person that it happened to must feel. Everyone’s upset they saw something but they don’t seem to be upset about who they saw it happening to.
Like I wonder who they think hurts more in this situation. The person who was made into a victim or the people who just saw it happen.
I’m sorry for your experience and I hope you’re doing better now. It’s very important to note that the major victims in this “attack” are absolutely the abused children. That’s exactly what makes this the most disgusting attack on Lemmy to date.
First, thank you for your comment and I’m glad you posted it.
Second, I want to take a moment to clarify what people mean when they say “disgusting” - at least for me - it’s not a matter of “oh my god I saw something awful, my day is ruined, that’s disgusting”, it’s instead “oh my god I can’t believe that happened, that’s disgusting”. The nuance is that the “disgust” is against the people who allowed it to happen - The perpetrators, the cops who did nothing, the people who shared it, and the world that allowed it to happen.
Again, thank you for sharing your story, it’s a great reminder of what the actual issue at hand is. Much love!
They would need to be reported with some credible and actionable evidence to the appropriate authorities. Just having the material they are posting isn’t enough. If they are taking precautions, it’s gonna be hard to figure out who and where they are from what the site would be able to log while they use it.
GET OUT THE 4CHAN
The legal system also takes time. The material could be disseminated fairly widely before an investigation would be started and completed. Especially with bots. Hundreds of bots are a lot faster than a team of humans.
We just had something like 90 (probably unrelated) arrests in one state in Germany. There are just way too many to arrest them all.
I hardly know anything about the dark web, but my take is that they operate from there. They open temporary email accounts, then free computing or free hosting accounts, or are part of a botnet. The rest is relatively easy.
Law enforcement monitors the dark web, though, so if they make a stupid move (and they will), they will eventually get caught.
God I hope they get caught.
Was wondering why I couldn’t reply to a comment from earlier today when I was sure I hadn’t broken any rule to get banned. Hope it’s back soon, but more importantly that you can stop all the damn CP.
Please get some legal advice, this is so fucked up.
Why would someone post that crap? If you’ve been banning and removing posts all day, it seems like someone malicious is trying to get Lemmy in trouble. I don’t know, just a guess, but someone needs to go to prison for doing that.
This sucks, but we were probably due to have a Grand Registration Security Hash-Out at some point; going forward, any instance that wants others to federate with it is probably going to have to have a system in place to make it impossible for jackasses like this to create endless spam accounts.
Looks like Google has some tooling available that might help: https://protectingchildren.google/tools-for-partners
Probably other options too.
I have detailed why in plenty of my other comments. Here is one. https://programming.dev/comment/2426720
This is seriously fucked up, but won’t closing lemmyshitpost just lead them to target other LW communities?
Probably unrelated, but I once saw that community get attacked with scat porn; it could be the same person.
There’s also another issue. If you upload an image to Lemmy but then cancel, the image is still hosted on the Lemmy instance and one can still access it if one copies the image’s URL before canceling. This basically means that there might be other illegal stuff that’s being hosted on Lemmy instances without anyone noticing.
That should probably be fixed at the Lemmy level.
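A fix at the Lemmy level could be as simple as a periodic job that purges uploads nothing ever ended up referencing. The sketch below is purely illustrative: the `list_uploads`, `referenced_urls`, and `delete_upload` arguments are hypothetical stand-ins for whatever the instance’s image backend (pict-rs) and database actually expose.

```python
from datetime import datetime, timedelta, timezone

GRACE_PERIOD = timedelta(hours=24)  # leave very recent uploads alone; they may still be drafts

def purge_orphaned_uploads(list_uploads, referenced_urls, delete_upload) -> int:
    """Delete stored images that no post or comment references.

    list_uploads()   -- yields (url, uploaded_at) pairs from the image store (hypothetical)
    referenced_urls  -- set of image URLs that appear in posts/comments (hypothetical)
    delete_upload()  -- removes one stored image by URL (hypothetical)
    """
    cutoff = datetime.now(timezone.utc) - GRACE_PERIOD
    removed = 0
    for url, uploaded_at in list_uploads():
        if uploaded_at < cutoff and url not in referenced_urls:
            delete_upload(url)
            removed += 1
    return removed
```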
I also ran into the opposite: I had a post whose image disappeared with an “error in image” message, for a JPEG. I wonder if whoever did this was aiming for exactly that outcome.
Who the fuck has such a problem with this instance?
It’s a great question. What’s the motivation? Who loses if Lemmy succeeds?
The motivation could also be just trying to get the instance shut down, or otherwise break it, like the user that was spamming communities not that long ago.
Nothing too conspiratorial.
Spez loses. Maybe he’s the one behind it cause he’s salty that people called him out for being a pedophile.
It wouldn’t surprise me if an associate of Reddit (or someone they hired at arm’s length) was responsible for the DDoS attacks, but I highly doubt Reddit as a company is hiring anyone to spam CSAM.
Think about what sort of motivations and personality someone would have to do it. IMO it’s the guy who got butthurt over not being able to spam and threatened that he would wreck the site.
It’s highly unlikely. There are better, less CSAM ways for a company to make lemmy instances unstable. This is clearly for shock and troll value.
I know; I’m just being sarcastic. I didn’t think I had to pull out the /s here on Lemmy, cause everyone here seems more intelligent (for the most part).
Yeah it’s definitely a targeted campaign of sabotage. I’m glad people have found a way to stem the flow, but we’ll need a better solution long term.
I don’t think either a corporate entity or another instance/software type is behind this. The motivation for these people is the same as those who used to post a certain image with the word ‘goat’ as part of the title all over usenet binary groups and web forums. They simply find it funny that so many people are appalled and they feel a sense of power that they’ve affected so large a community. There’s nothing more complicated than that to it.
Exactly.
Rivals such as Reddit, and the groups that benefit from the ease with which they can manipulate what people see on Reddit to further their agendas (companies, governments, other groups).
And fascists in general, who hate open global community platforms that are hard to control, and hate things that bring people together, things that give people strength to fight together against things bigger than them.
All it takes is one of those interested parties to fund a couple of low rent hackers to poke at Lemmy until it’s so unstable and untrustworthy that people stop using it.
Cheap and effective if done right, a good investment in the long run for any discerning fascist.
I doubt it would be company rivals for the time being. Lemmy is nowhere big enough, and even the largest instances are dwarfed by either other social networks, or other projects like Mastodon.
Reddit is much more concerned about Facebook or Twitter trying to co-opt its model than about some small open-source project; see how Meta released Threads during the recent Twitter controversy.
… I can think of one possibility: a certain instance that lemmy.world recently defederated from.
I was thinking of this in combination with the DDoS attack; personally I believe the same people are behind both, though obviously not everybody will share that belief.
Yup. It’s this. This is the mentality behind trolling. The point is to antagonize people just to get a reaction. That’s all they care about. They want to do the most outrageous thing in a public space because they know people will respond, and they think it’s funny when people react to anything they do.
People need to realize: They want to get banned. They want you to try and sit around figuring out why they’re doing this.
Last week an explodingheads shithead kept posting racist (Also antisemitic etc etc) “memes”. People told them to get fucked. It was wonderful to see everyone be incredibly mean to the fucker.
If we know one thing about nazis, it’s that when they get busted for any crime, investigators always end up finding CSAM as well. So it wouldn’t surprise me if some nazi got angry that they weren’t welcome here and cracked open his personal collection.
Huh, interesting. I was thinking Hexbear because of the Tiananmen Square post.
It is seriously sad and awful that people would go this far to derail a community, and it makes me concerned for other communities as well. Since they have succeeded in having !lemmyshitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and what can be done to help curb CSAM.
The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.
The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.
The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.
Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.
Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.
Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.
Here are some tips to prevent CSAM:
Talk to your children about online safety and the dangers of CSAM.
Teach your children about the importance of keeping their personal information private.
Monitor your children’s online activity.
Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.
Report any suspected CSAM to the authorities immediately.