
Posts: 0
Comments: 198
Joined: 2 yr. ago

  • You realize lip and gum cancer are caused by smoking as well? Not to mention the yellowing of teeth and skin? There are more, but you get my point.

    There are so many different issues caused by so many different things in cigarettes. Smoke in the lungs is just part of the problem.

  • For the very well-known health concerns of smoking?

  • I don’t know the logistics behind why they went that route. Eventually they upped the physical security on the electronics they were stealing, and then things just went quiet. 🤷🏻‍♂️

  • I’m looking at the GitHub now, but I’m not seeing anything that screams “must have.” Is there something not obvious that it does?

  • It was an “extra security” procedure put in place because at the time a gang had been targeting our stores by breaking in through the emergency exit, grabbing expensive electronics, and getting out in under 2 minutes. The machinery was only meant to be in place while the building was empty, the idea being that they would open the door, decide it would take too long to maneuver around it, and just leave.

  • Note: I deleted my comment by mistake. X.x

    So I think most of the time we would be in the clear, as long as actual CSAM is handled when it is found/reported.

    Just like Reddit doesn’t get hauled to court when CSAM is posted. And mods don’t get arrested for viewing it while they are removing it.

  • This was likely worse; the intent was explicitly to block the emergency exit. That was the point of the request.

  • Honestly, I don’t know how the law would handle this kind of situation. But in my mind, the only time you’re in legal hot water is when (a) there is actual CSAM involved, and (b) nothing is done to prevent that association.

    In this case, (a) was proven to be false. So there’s no concern. But if it had been the case, then defederation makes sense.

    Otherwise, there’s no reason to federate at all. Anyone can post CSAM on any instance at any time. There’s nothing in place to detect it, and nothing in place to handle it other than manual moderation. That’s just a hard fact of Lemmy instance hosting.

  • So, yes. Their instance would have copies of content viewed by their users. That said, they didn’t defederate because of CSAM, which would make perfect sense. They defederated because they made an incorrect assumption, and then wanted an entire community nuked because of that assumption… even after they were corrected.

    The moment things were made clear, they should have said “oh okay, our bad.” But instead they doubled down.

  • Honestly, I’d just be happy if they fixed the shitshow that is Siri.

  • rule

  • And none of them will bother googling adam.the.creator

  • Why would a pedophile be interested in a grown ass adult pretending to be a kid? That’s like saying you’d be attracted to someone pretending to be short by walking on their knees…

  • Once had a manager instruct me to block an emergency exit with an extremely large piece of machinery. While the building was still full of customers.

  • Ageplay is absolutely a thing, but the point is they are adults. Pretending to be something else doesn’t change what they are.

    It’s creepy, and I would certainly not take part. But the bottom line is that, in reality, it’s just two adults playing pretend.

  • I feel this needs to be clarified. The point is that anyone of legal age deserves to be lusted after if that’s what they want. You telling them “you look too young, no one is allowed to find you attractive” is a bit… fucked.

  • An adult role playing as a kid isn’t any different than them role playing as a dog, or a car, or a dragon. Are you going to tell me I can’t role play as a dragon while my partner role plays as a car?

  • “Love me and accept me, or I’ll have someone else torture you for eternity!”

    Sounds like a suuuper healthy relationship.

  • Oh, that’s odd. Wonder if that’s just a quirk of Mastodon interacting with Lemmy. On this end it looks like you’re manually typing out the user’s name on its own line.

  • I’m inclined to agree with you. Though I’ll argue that most users over there are agreeing based on a colorful interpretation of what happened, assuming that there is indeed a community based around legal porn meant to look like CSAM… which doesn’t appear to be the case at all. Look at the community in question (!adorableporn@lemmynsfw.com) and you’ll notice a lack of anything encouraging people to present as underage.

  • Looking through the alleged community (!adorableporn@lemmynsfw.com), I’m really not seeing anything that inherently looks like it could be misinterpreted as CSAM.