Hm, yeah, I guess no one has been speculating about this part of the de/federate Threads debate. Everyone's worried about Meta and EEE, but what we should really have been discussing is the history of Meta's moderation and community guidelines, which have often cited "free speech" when people use white supremacist dog whistles but cite "calls to violence" when people of color actively complain about white supremacy.
There's a reason we've seen news articles about large LEO Facebook groups trading racist memes and joking about them...
We were worried about the technology, but we should have been worried about cultural infiltration.
White supremacists are like that guy nobody ever wants at their party but who always invites himself anyway. It's hard enough to keep him from washing his balls in the punch bowl when you're actively trying to keep him out. Meta doesn't even try except to the meager extent required by law.
I mean… I wasn’t expecting this to not happen eventually… I’m just surprised it happened so quickly, and that Meta has done nothing in terms of mitigation - and moreover, didn’t see this as a thing they’d need to guard against out of the gates (unless, I suppose, this isn’t intended to be a Twitter clone, and it’s more shooting for being a Parler clone).
There’s probably a lesson somewhere in there about the benefits of growing your userbase organically instead of trying to force-march users over by creating shadow accounts, but applying that lesson would be unprofitable, so Meta definitely won’t care.
I guess I just assumed that was commonly understood. As soon as I saw that it was going to be run according to Facebook's moderation standards, I took that to mean it was going to be tailored to suit white supremacists and Christian nationalists, like Facebook.
Supporting free speech means allowing people you hate to talk too. Censor a Nazi one day, then the next day it's something your weird friend likes, then the next day it's something you like.
Everyone deserves a platform online, but they have to earn their audience. Censoring them is only going to make more people want to go to other platforms to hear and see what they have to say.
@MiscreantMouse This is why I’m of the opinion that defederating from anything that smacks of Meta or Threads should be done immediately. Zsuck supports Russian bots, Alt-right Insurrectionists and hate speech and has done so since 2015, in other words, longer than Elon. Should be walled off and removed like a cancerous tumor. In my view, that should include any instance that signed an NDA with them.
I saw a survey of instances that indicated many are taking a “wait and see” approach, which is mystifying. What do people think they're going to find that they don't already know about Meta?
It's just free speech that nobody has to listen to, right? Lemmy has no ads anyway, so what if there's some nonsense mixed in? I doubt it would outnumber the people who want good content to prevail.
They flock everywhere that will have them. There are definitely far-right shitstains on every platform, even this one. They're like a gas: they expand to fill whatever volume they're in, or in this case, whatever platform doesn't ban them. Simply "flocking" to a platform isn't really an indictment.
I fail to believe a company that is so focused on staying in line with whatever is politically correct would allow Nazis onto their platform. HELL, I don't believe ANY social media that wants a good rep would allow Nazis. They must be getting angry at people for having an opinion and jumping straight to "THEY'RE A FUCKING NAZI!"
I did not read the article just the title so I don't even know why I'm commenting.