human resources (department) is for punishing the human resources (employees).
I try things on the internet.
rarely, shit just works.
Thank you. I’m cured.
And if we have a gun problem, it’s not a gun problem, it’s a mental health problem. If we have a mental health problem, then it’s not my problem. Now if you’ll excuse me, I need to buy another gun to protect myself. No, I don’t have a mental health problem. Hoarding guns is the American way!
Plex, running locally, on my server: “You should add a server!”
Plex, running locally, on my server: “Claim 10.0.0.10!”
Plex, running locally, on my server, after claiming my server: “You should add a server!”
You’re a lawyer?
This Advil hits harder than I expected.
I am my own hexbear. The hexbear is inside of us all.
Maybe we shouldn’t defederate because a user has an app preference and a feature preference that don’t match up?
If an app doesn’t have a feature you like, try an app that has a feature you like. There’s lots to choose from. In fact, there’s the web interface itself, and the ability to write your own scripts, too.
You can block instances too. That’s why I said block instances. Are you having a hard time reading?
You can block any instance you like. Super easy to do so. Just click block instance and it’s like you didn’t even need to post this!
You misspelled “dumb”. We aren’t dumb enough to be police officers.
You misspelled “cowardly”. We aren’t cowardly enough to stand outside of buildings while lives are in danger.
It doesn’t take much bravery to stand outside a school and watch kids get shot.
All Cats Are Beautiful but All Cops Are Bastards.
too much colloidal silver = any amount of colloidal silver.
This is an interesting idea. So if I’m understanding you correctly the workflow would be like this:
user uploads 4 images… 2 are flagged as CSAM.
user overrides the flag on one image, asserting that “no, this isn’t CSAM”
On other sites, I’ve seen this work by keeping the content hidden from everyone except the uploader until a team reviews it. If the team agrees, it’s allowed on the site. I think this is different from what you’re describing, though. I think you’re suggesting that the content stays online after the user overrides the flag, but a mod later double-checks whether the user was indeed trustworthy.
I only worry that an untrustworthy user will keep the content online until a mod reviews it, increasing the time the material is online and increasing the risk. It would be difficult to argue that “this was done in the interest of user satisfaction, even though it means that more CSAM got out”. Ultimately I don’t know how many people want to argue that to a judge.
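To make the difference between the two approaches concrete, here’s a minimal sketch in Python. Everything in it (the Upload record, the two visibility functions, the field names) is a hypothetical illustration of the logic, not any real Lemmy or pict-rs API.

```python
# Hypothetical sketch of the two review models discussed above.
# None of these names correspond to a real Lemmy/pict-rs interface.
from dataclasses import dataclass

@dataclass
class Upload:
    flagged_csam: bool = False    # set by an automated scanner
    user_override: bool = False   # uploader asserts "this isn't CSAM"
    mod_reviewed: bool = False
    mod_approved: bool = False

def visible_hide_until_review(u: Upload) -> bool:
    """Flagged content stays hidden (except to the uploader) until a mod approves it."""
    if not u.flagged_csam:
        return True
    return u.mod_reviewed and u.mod_approved

def visible_trust_override(u: Upload) -> bool:
    """The uploader's override keeps the content online; a mod double-checks later."""
    if not u.flagged_csam:
        return True
    if u.mod_reviewed:
        return u.mod_approved
    return u.user_override  # risk window: content stays live until the review happens
```

The second function is what I understand you to be proposing; the risk window is exactly the gap between the override and the mod review, which is what worries me above.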
Lemmy admins need to do whatever they can to handle CSAM if and when it arises. Users need to be understanding about this because, as I’ve argued in other threads, CSAM poses a threat to the instance itself and to the admins personally if they cannot clean up the material in a timely manner.
This is likely going to get weird for a bit, including but not limited to:
I just want folks to know that major sites like Reddit and Facebook usually have (not very well) paid teams of people whose sole job is to remove this material. Lemmy has overworked volunteers. Please have patience, and if you feel like arguing about why any of the methods I mentioned above are BS, or if you have any questions, reply to this message.
I’m not an admin, but I’m planning on being one, and I’m sort of getting a feel for how the community responds to this sort of action. We don’t get to see it a lot on major social media sites because they aren’t as transparent (or as understaffed) as Lemmy instances are.
Too bad you couldn’t copy it over with low-speed dubbing.