"We're suffering from success and our Discord was the first casualty. You know as well as I do that if you gather 250k people in one spot someone is going to say something that makes you look bad. That room was golden and the people that run it are awesome. We blocked all bad words with a bot, which should be enough, but apparently if someone can say a bad word with weird unicode icelandic characters and someone can screenshot it you don't get to hang out with your friends anymore. Discord did us dirty and I am not impressed with them destroying our community"
Key point for me: under the guise of fighting 'hate speech', suppression is occurring. Of course every platform has the right to dictate terms to users of their platform, yada yada...
I think the long history of events like these, particularly the more recent, publicised ones, is going to push lots of communities, and by extension the Internet, toward decentralisation.
That's good and bad: communities can't be shut down when they shouldn't be, but they also can't be shut down when they should be.
That's a bold statement; you think it's so obvious? Do you think not being able to shut down communities trading child pornography is a lesser evil than shutting down the WSB Discord?
This tired justification of "child pornography", "terrorism" and "hate speech" has been beaten to death to ban just about everything these days.
Why not ban cars? After all, both paedos and terrorists use them to carry out their evil, nefarious deeds. They both also use toothpaste.
I think it's perfectly fine to be unable to shut down sites for "social justice". If it ever escalates into a major criminal problem, a legal case can be filed and it can go through the courts before the ban-hammer falls.
> a legal case can be filed and it can go through the courts before the ban-hammer falls
Once a community that cannot be shut down is declared illegal, what happens? One of the major points of decentralized communities is that there is no effective "ban hammer" to shut them down.
Then it is declared illegal and simply continues to exist. If this is truly a major crime (loss of life, major financial loss), then governments have many means outside of technology to shut stuff down: old-fashioned detective work, cross-national police work, enforcement of treaty obligations or even para-military action in severe cases.
Right now, a sneeze can ban you on the cloud. And the bans are just a click away - and in many cases not even that.
CP trading chat rooms should be shut down by arresting the people creating and selling content in there, not by giving someone the right to shut down any chat room they want at the click of a button. Because if you can shut down CP trading chat rooms at the click of a button, we all know this button will mainly be used to target good causes, as CP represents a tiny proportion of users, or even of criminal activity.
Child pornography is illegal, IMO rightly so, and people trading in it should be arrested.
I will never cease to be amused by people who invoke CP to defend censorship when a discussion about control of online discourse is ongoing. We are talking about companies shutting down a group organizing against wall street hedge funds on the pretext that they're Nazi white supremacists, we aren't talking about pedos physically abusing children for profit. If you expect that this is a gotcha and that I have no choice but to declare that I think CP should be protected or else I have to support banning WSB from discord you really need to refine your conversational skills.
I get they're probably scared as hell, but they don't even take care of the child porn or child grooming Discords.
It's not even slightly paranoid to say that somebody gave financial incentive to make this happen, but for them to censor with absolute bullshit reasoning behind it... This is just not good.
Are you saying Discord openly permits groups that share child porn? Or discuss how to find it? I could understand the latter if the Discord itself has nothing illegal, but that’s still pretty shocking.
I know for a fact that it happens. Certainly they would stop it if they knew about it, but they don’t police this stuff like you might think. They rely on their users to self-report for the most part. [1]
“On April 23, 2020 at 1:55 p.m., the Marysville Police received a tip that child pornography had been uploaded while using a discord online service.”
There are groups like this all over the place. I know who this person is.
“A discord online service” is Discord.
Also, kind of interesting: from the writing here you can see some local papers are just way out of touch with the tech community.
Don't services that accept large amounts of user-generated content scan for illegal stuff against a database of known material? I find it odd that Discord wouldn't do that.
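They typically do: uploads are hashed and checked against databases of known material (Microsoft's PhotoDNA is the best-known example). A minimal sketch of the exact-match variant, assuming a made-up hash set; real deployments use perceptual hashes that survive resizing and re-encoding, which is considerably harder:

```python
import hashlib

# Hypothetical stand-in for a clearinghouse-supplied set of digests of
# known-illegal files. For this demo it just contains SHA-256(b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_material(data: bytes) -> bool:
    """Return True if the upload's digest matches a known-bad hash."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
```

The weakness of exact hashing is obvious: flipping a single byte defeats it, which is exactly why perceptual hashing schemes exist.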
It's been said that some Discord investors are also investors in Steve Cohen's Point72. The fund targeted by WSB, Melvin Capital, managed $1bn for Point72.
Beyond them possibly being directly or indirectly involved on the (currently) losing end of GameStop stock, they may be worried about the SEC paying attention to them. I personally think they're in the clear here, but maybe they aren't so sure.
The buses were public and segregation was a law. Blacks made up 75% of the bus customers [0]. A competing, nonsegregated, service would have owned the market if it were possible. So it probably wasn't.
The situation here is very different. There is no government monopoly favouring Reddit or Discord.
That’s a weird analogy that doesn’t make any sense. It’s actually quite offensive to compare the segregation of a whole people to what’s basically a sign on Discord’s door that said “We don’t allow this type of speech. We have bouncers here.”
Also Rosa Parks wasn’t anonymous, and I assume many on WSB are. I wonder if they would act the same if they weren’t? Or in real life?
It was more a comment on the general "Oh, big tech banned you?" well just create your own payment network, dns service, cdn, hosting, etc etc.
And to extend your analogy, it's like a whole nightclub being permanently closed because one of the 250,000 people in the club (admittedly a big club) said a bad word. It's just a new form of Tone Policing (https://en.wikipedia.org/wiki/Tone_policing)
It’s like a nightclub being closed by the nightclub’s owner, which I think most would agree the nightclub’s owner has the right to do by virtue of being the owner. It’s not closed by police or any outside entity.
> You know as well as I do that if you gather 250k people in one spot someone is going to say something that makes you look bad.
The problem with this statement is the shifting use of "you". I might know that a mob acts like a mob, but I'm certainly not going to be surprised if "my" rights are affected if I'm part of the group from which the mob manifested. That doesn't mean the decision to alter my rights is directed at me, personally.
Random speculation about why this was done or who did it, for what reasons, is also a type of mob thinking.
> On both Discord and Reddit, wallstreetbets users frequently refer to themselves collectively as “retards” and “autists,” and have been known to deploy the kinds of racial slurs and deliberately offensive language that have become commonplace in 4chan-style posting forums.
I read WSB daily for years and cannot recall a single usage of a racial slur. They definitely referred to themselves as autists/retards in a self-deprecating manner but that's hardly the same thing.
This is one of the dangers of using "hate speech" as a nebulous term. You can then turn it around and use it as an excuse to ban whatever you want.
I could easily read through any of the front page posts on Reddit, and find what I consider to be hate speech, and if I were in a position of power, I could just use that as a justification to ban the entire domain.
Moreover, it easily allows adversaries the ability to plant hate speech in forums they dislike, and then pressure the authorities to ban it.
At the end of the day, I am an adult, and I reject the notion that sites need to protect me from speech.
I'm confused, what is the lesson here? That quote seems to get thrown around so often that it can be applied to anytime that anyone gets criticized for anything. The moderator's statement appears to be asking for more help with moderation, not advocating for less moderation or advocating for more hateful behavior to be allowed.
> That quote seems to get thrown around so often that it can be applied to anytime that anyone gets criticized for anything.
That's the point. It serves to show the perils of judging people by what they say, particularly if a single offence against some perceived norms becomes the standard for punishment.
And there's another quote from that same period: "like Saturn, the Revolution devours its children" (Jacques Mallet du Pan).
I have to say, that still doesn't mean much to me here. In a real court system, we judge people by their actions and behaviors; sometimes their speech can play into that. I assume that's what the author was referring to. (Apparently the origin of this quote is not clear, but it does seem to come from that same time period.)

This can't be compared to social media: the only way to judge anyone in the context of a social media platform is by what they say. That's all there is on these platforms; if you take that away, there's nothing left. You are judging someone every time you decide whether to like/follow/reply/retweet or not. If your view is that all that doesn't matter and the speech is meaningless, then it follows that any perceived punishments are also meaningless.
1) Language is imprecise. We do not communicate with the idea that our words have a unique, specific meaning. This means that you can do things like quote my statements out of context in order to change their intent.
2) People are not perfect. You can almost always find some "fault" in even the best people. People should not be punished for a "fault" unless it does measurable harm to someone else.
Also that meaning is created by the listener, not defined by the speaker. Anything you say can be misinterpreted, either genuinely or maliciously. The message communicated is whatever the listener hears, with all their history, biases and intentions.
The lesson in that quote is that it doesn't matter what you write, if someone with the right authority wants you hanged, or wants to shut down your discord, it'll happen regardless of due process.
This is different from what the parent poster is implying. It's about the dangers of authoritarianism. No plants or justification are necessary.
I don't think they're protecting you from speech, but protecting themselves from it. A murky reputation harms their ad revenue or corporate opportunities, I imagine.
It makes no sense to pick a specific word and draw an arbitrary line. You can be far more offensive without any "slurs" by asking if one's mother was frequently drunk while pregnant.
Yes, and I see no problems with being offensive in a joking or ironic manner (as seems to be the case on WSB). However, when you decide to use a slur to do so, you are deciding to insult an entire subgroup of the population as collateral damage.
isn't the above poster's example explicitly insulting to everyone whose mother was drunk repeatedly/for long periods of time while pregnant? And everyone with foetal alcohol syndrome?
No? The insult is to the person you say "was your mother frequently drunk while pregnant with you", not everyone else who might fit that particular profile.
If what you're saying would be true, then calling someone an idiot would be an insult to all the idiots out there. They are clearly not the target of the insult, the person you say it to is the insulted.
And more importantly why have we ever allowed it to get so far.
"oh, he hurt my feelings, stop him"
It's fucking moronic for us to have demonised speech like this, we should have held on to the old "sticks and stones" because frankly this Orwellian alternative is bonkers.
it seems pretty clear to me the insult of "idiot" can only be a successful insult if it is inherently offensive to be in the state of idiocy (well, I mean, they're idiots so maybe they don't know they're being insulted, but the point stands).
Or in short, of course it's an insult to the idiots out there. That's literally what makes idiot an insult.
I'm not sure what your point is. There are appropriate and inappropriate ways to use words. Making fun of the neuro-atypical is inappropriate; joking with likeminded folks in a community built around such is likely appropriate. You've just used that word you are complaining about, should you be banned?
I think it depends. Certainly it could be used in a way that conveys racist positions and it could be used in a way that doesn't. Is someone singing along to a rap song a racist? Kids being edgy?
I am not saying though that I think use of the term is okay, only that I don't think it should be prohibited either by law or by the pressure of big companies punishing people who say what is forbidden for them to say. I'm perfectly happy with people who make the judgement that they may never say certain words. I'm not happy with people telling others what they can or cannot say.
Yes. And even with them around. All depends on the context and the willingness to get one's nose bopped.
I would, for example, encourage an actor to use overly offensive expletives, even repeating them ad nauseam in the presence of their fellow actors so they feel more comfortable using such words in a production dealing with racial strife, historical storytelling or what have you.
No word is so beyond the pale it must be Voldemorted.
> You've just used that word you are complaining about, should you be banned?
The commenting guidelines state: Be kind. Don't be snarky. [...] Don't sneer. Regardless, I will indulge your curiosity.
1) There are certainly other terms in use on WSB that one could take offense to. I personally find this one to be particularly problematic. (The fact that I am myself autistic probably plays a big role in this ;))
2) I have posted many comments here on HN, and as far as I know this one and the one you replied to are the only ones I've posted on the topic of slurs. I fail to see why I should be banned over this discussion.
I don't mean that comment to be snarky or sneering and I apologize if it came off as unkind. What I mean to convey with that comment is that clearly you believe there are acceptable uses of the word. In other words, you don't think that every use of the word should be prohibited.
Once we've reached that point it seems like a negotiation. Why would it be wrong to say sometimes and not wrong other times? My intuition here is that this is tied to whether the word is used with intent to offend or harass someone. In your case, you weren't using the word that way, and I would argue that the same applies largely to uses of the term on wallstreetbets.
Joking among friends using slurs is not appropriate in my eyes. Doesn’t matter if the slur is based on neurodiversity, race, age, gender or something else.
People of various backgrounds often refer to themselves or close friends of theirs by what would otherwise be offensive slurs. That goes for black people, LGBT people, women, Asians, Jews, heck even white people often refer to themselves or another white person they are close with using a slur. There is actually evidence that people who refer to one another using slurs that would otherwise be offensive do so to strengthen social bonds with each other and there's some interesting game theory for why this is the case.
It's also quite common among comedians and other entertainers to make fun of their own ethnicity using slurs or use slurs in various performances.
You personally may not find it appropriate and that's a subjective point of view that you are entitled to have, but consider whether you have a good rational justification for it or whether you are exercising too much caution to the point that you are getting offended on behalf of others even when those others are not themselves offended.
I don't think people are talking about using slurs to describe themselves, but slurs to describe others.
The WSB folks are in a weird position where they're using slurs to describe themselves that don't actually apply to them (well, not as a whole; I imagine there are some autistic WSB members, just like there are in most populations). I'm bewildered as to why they think that's funny (or whatever justification they have for it), but it does strike me as potentially offensive.
Why does it matter what you find offensive though? What you're saying sounds like an excellent reason for you personally not to visit wallstreetbets, it doesn't sound like any kind of reason at all for the community to be curtailed, censored, or banned.
As an example, I grew up in a very religious home. My family would find things like cursing, taking the Lord's name in vain, or advocating for atheism to be very offensive. And yet, I hope it is very easy for you to understand that it would be wrong to take what my religious family finds offensive and apply those ethical rules to other people. Jehovah's Witnesses think birthdays and holidays are offensive, Muslims think graven images are offensive, etc etc. All sorts of people believe all sorts of things. Every group of people is not obliged to not be offensive to every other group of people.
You may be inclined to say that people can choose not to be religious but have no say over things like mental type or race. I don't think that's true because the majority of religious people are in the same religion as their parents, implying there isn't much free choice involved. Even if that were true, why is something you believe less respected than something that you are?
How can you know what is a slur and what is not when judging the way others communicate in their relationships? What if it’s a nickname from childhood? What if...anything really?
How on earth would you take action on this point, besides teaching our children to be kind, thoughtful, and curious.
> How can you know what is a slur and what is not when judging the way others communicate in their relationships? What if it’s a nickname from childhood?
How can you know what is an insult and what is not when judging the way others communicate in their relationships? What if it’s a nickname from childhood?
How can you know what is a threat and what is not when judging the way others communicate in their relationships? What if it’s a nickname from childhood?
Spillover from medical terms is very common.
You sometimes hear someone being called a "psycho" or a "sociopath" but you don't make the assumption that the accuser is a trained psychiatrist.
"Idiot" used to be a medical condition, someone suffering from idiocy.
"Retard"; someone whose mental development had been retarded and remained in a childhood state into adulthood.
These words are not supposed to be taken literally (when used colloquially) but to suggest that the person in question displays signs of the medical condition.
I guess we could try to label this as misuse of the terms and try to end the practice, but where would that lead us?
"Bob is double plus ungood at thinking"?
You make some good points, I think. The extrapolation I’d make is that even that last statement is offensive. As some say, “if you don’t have anything nice to say, don’t say anything at all”. There’s a culture of decency, and it is opposed to a culture of liberty. Which isn’t to say we shouldn’t have decency, but that everything should be kept in balance. We shouldn’t embrace Gilead’s “freedom from”.
I'm not quite sure how these are inherently in conflict. It seems like a community might allow the use of slur words while also discouraging insults in general. In different words, "Please be nice to each other. We don't ban for the use of any particular words, but we do ban for disparaging others."
I could easily imagine a policy like this in, say, a linguistics forum, where it's totally allowed to discuss the usage of slurs, as long as they're not directed at someone.
Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
– Michael Crichton (1942-2008)
Every time you read an article saying “X community is filled with Y horrible thing,” remember this.
> We're suffering from success and our Discord was the first casualty. You know as well as I do that if you gather 250k people in one spot someone is going to say something that makes you look bad. That room was golden and the people that run it are awesome. We blocked all bad words with a bot, which should be enough, but apparently if someone can say a bad word with weird unicode icelandic characters and someone can screenshot it you don't get to hang out with your friends anymore. Discord did us dirty...
> That is why I'm throwing my support behind the Twitter handle in general.
> Is "my house, my rules" really so hard to understand?
People are waking up to the bait & switch that on-line platforms are. They do everything short of outright saying it to pretend they're the public square, but when push comes to shove, they reveal themselves as private owners with an agenda.
What's doubly annoying is that governments worldwide decided to participate. When politicians' primary channel of communicating with their constituents/subjects is through Twitter, you could morally argue that it should be treated as public square. That one is on the politicians though.
Parler didn't go down because it's impossible to host platforms without getting taken down. Parler got taken down because they went with hosting providers who have proven time and time again that they are not independent platform providers who just follow the law; they are platform providers with opinions that will act on those opinions if they feel like it in the morning.
If Parler wanted, they could have gone a different route and survived without any issues.
I agree there ought to be publicly-owned infrastructure.
But the infrastructure we have is privately-owned, so yeah, the owners can do what they like.
We'll never get rid of greedy quasi-monopolies unless we let them do a crap job so that people demand alternatives.
This is how market capitalism is supposed to work. (And similarly, we get rid of market capitalism by letting it demonstrate that it just doesn't work very well.)
I'm surprised they haven't formed stonkchan and begun migrating there.
If they don't use this moment to formalize self-managed comms, they're not going to be a community for much longer, given they're putting the squeeze on people to the tune of billions of dollars.
> We wrote software to do most of the moderation for us but that software isn't allowed to read the Reddit new feed fast enough and submit responses, and the admins haven't given us special access despite asking for it.
Interesting that they made software to moderate but reddit either doesn't want to or can't handle the throughput required to run it. These threads are some of the biggest ever seen on reddit. What happens if they get shut down again? Should they begin migrating to telegram groups?
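On the throughput point: third-party Reddit clients are rate-limited at the API level (roughly tens of requests per minute for ordinary OAuth clients; the exact quota varies and is an assumption here), so a moderation bot's read-and-respond loop is paced by a client-side limiter no matter how fast the software itself is. A hypothetical token-bucket sketch of that constraint:

```python
import time

class TokenBucket:
    """Minimal client-side rate limiter. The 60 req/min figure below is
    an illustrative assumption, not Reddit's documented quota."""

    def __init__(self, rate_per_min: int):
        self.capacity = rate_per_min
        self.tokens = float(rate_per_min)
        self.rate = rate_per_min / 60.0      # tokens replenished per second
        self.last = time.monotonic()

    def acquire(self) -> float:
        """Consume one token if available; return seconds to wait otherwise."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return 0.0
        return (1 - self.tokens) / self.rate
```

During a thread with thousands of comments per minute, a bucket like this drains immediately, which is why the mods asked Reddit for elevated API access rather than trying to out-engineer the limit.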
Reddit provides pretty much no support at all for reasonable moderation tools and has as recently as this month migrated services and taken away key features in the process.
Most often mods are complicit in the bullshit their sub participates in (Trump-related subs, /r/fatpeoplehate, etc). In this case I can see the mods wanting to keep the community afloat, but Reddit banning them for reasons effectively outside their control, or which they are unable to moderate.
Even the “official” AutoModerator bot is, I think, just an underpowered cluster running some Python.
They will make one phone call to the CEO saying they are opening an investigation into Reddit facilitating an unlicensed forum for providing investment advice, and that to limit potential liability, the SEC recommends that Reddit close the sub in question.
Reddit's one lawyer will sit with the CEO, and they will have a 4 second conversation "Yeah, we're not sticking our neck out for those self-proclaimed retards.".
...and that's it. Reddit has banned subs for far far less.
I have run a web forum for 15 or so years. There are often threads or comments I think should remain up, but that legal pressure pushes me to remove. There's "I think this is probably fair in a legal sense" and "I think this is probably fair in a legal sense and I'm willing to gamble my family's assets/wellbeing on proving that in court for the benefit of random forum users that don't care about me for a second."
There is definitely chatter in the finance community about regulating this kind of behaviour to prevent it from happening again.
They're all bad-faith arguments, ranging from claiming the collusive behaviour is insider trading (even though the messages were public) to foreign interference destabilising US markets.
There are multiple attack points at which pressure can be applied: the trading platforms (Robinhood et al.), the communication platforms (Reddit, Discord), the regulatory framework, or back channels at the board level.
If several funds do end up going bankrupt, which is unlikely to happen now, regulation or some legal framework would likely follow.
This highlights a need in the market for tools to help moderate a community. Imagine a tool that automatically detects hate speech and either auto-deletes it or brings it to a moderator’s attention. Certain communities are being hijacked by extremist, racist, and simply malicious actors. The current method of reading chat and banning users doesn’t scale when sudden growth occurs.
If effective moderation can occur at smaller levels, like Discord channels or subreddits, then those communities won’t need to be removed by the larger platform. This would also be helpful for startup social media platforms that have yet to bring in enough revenue to afford a Facebook-sized moderation team.
Technically speaking it can do things like:
* Flag posts that contain blacklisted words, including non-obvious spellings of those words (e.g. non-standard Unicode characters in place of letters)
* Cross reference IP addresses or user names with banned users in other communities
* Notify moderators of trending slogans, phrases, or hashtags that have non-obvious extremist roots.
* Flag content that contains any political discussion for communities that want to be completely apolitical.
* Flag pornography
EDIT: To be clear, the target audience for this would be community moderators/admins or startup social networks that haven’t built their own moderation infrastructure.
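The first bullet (catching blacklisted words spelled with look-alike Unicode characters, the trick the WSB mod described) is usually handled by reducing text to a plain "skeleton" before matching. A rough stdlib-only sketch; the confusables map here is a tiny illustrative stand-in for the full Unicode confusables data (UTS #39):

```python
import unicodedata

# Tiny illustrative look-alike map; a real filter would load the full
# Unicode confusables table. These few entries are assumptions for the demo.
CONFUSABLES = {
    "þ": "p",   # Icelandic thorn, visually close to 'p'
    "ð": "d",   # Icelandic eth
    "ı": "i",   # dotless i
    "ο": "o",   # Greek omicron
    "а": "a",   # Cyrillic a
}

def skeleton(text: str) -> str:
    """Reduce text to a plain-ASCII-ish skeleton for blacklist matching."""
    # NFKD splits characters like 'à' into 'a' + a combining accent mark.
    decomposed = unicodedata.normalize("NFKD", text.lower())
    out = []
    for ch in decomposed:
        if unicodedata.combining(ch):
            continue                      # drop accents entirely
        out.append(CONFUSABLES.get(ch, ch))
    return "".join(out)

def hits_blacklist(message: str, blacklist: set[str]) -> bool:
    """True if any skeletonized word in the message is blacklisted."""
    return any(w in blacklist for w in skeleton(message).split())
```

Even this is easy to evade (zero-width joiners, spacing the letters out), which is part of why word filters alone, as the WSB mods discovered, are never considered sufficient moderation.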
This sounds dystopian to me. I already can't stand Google Docs grammar checker trying to re-write my sentences to match some AI's idea of a correct sentence. Sure, when it actually detects a mistake I'm happy but 20% of the time it's just wrong and feels like it's trying to take my personality out of my writing.
Your suggestion sounds like a step toward the Black Mirror Season 2 "White Christmas" episode, where the main character gets banned from all social interaction (not just online) forever, until death.
I think the current situation, where a herd of deplatformed white supremacists and conspiracy-theorist cultists can hijack your community, is dystopian. Leaders in a community should be given the power to determine who is and isn’t a part of it.
Some of this exists, and both Quora and Facebook (among others) use it extensively. Both hate speech and porn are good targets for machine learning. It needs supervision, but it can take a lot of load off human moderators.
I suspect more message boards will want to start applying these sooner rather than later. Most have already figured out that they need anti-spam tools, rather than it coming as a surprise when they roll things out and it fills up with bots. The technology is similar.
You mention being able to share that information across boards, and I don't know of any widespread implementation of that. You can, at least, let somebody else handle your authentication, which slightly slows their ability to create new accounts when you blacklist one. I'd like to see those sites distinguish "aged" accounts, so that it at least takes some effort or cost to use a new account.
Except that they are also saying they weren't given the mod tools they needed. And I'm not sure how you moderate decentralised social media.
I say this as the founder of Retalk.com, which we built exactly for this type of situation. I sincerely hope WSB decide to move platform and join us. We can give them the exact mod tools they need.
You might not want to publicly state you don't know how to deal with moderation (centralized or decentralized) when you're running a public forum, there are tons of solutions out there in the public. Also, why would WSB decide to join you when you're stating you don't really know what you're doing?
"Lemmy is similar to sites like Reddit, Lobste.rs, Raddle, or Hacker News: you subscribe to forums you're interested in, post links and discussions, then vote, and comment on them. Behind the scenes, it is very different; anyone can easily run a server, and all these servers are federated (think email), and connected to the same universe, called the Fediverse.
For a link aggregator, this means a user registered on one server can subscribe to forums on any other server, and can have discussions with users registered elsewhere."
Short interest is reported daily. You don't have to do much guess work to know who is short, in general. And funds have to report beneficial ownership above certain thresholds, so highly concentrated "tactical" managers are susceptible to news or sentiment driven herd events involving stocks in which they have large positions.
Short sellers first short the shares and then publicly declare that they are short for XYZ reason.
If others find the reasons given by the short sellers legit, more people short, driving down the price; with GME, the opposite happened.
I'm pretty sure that it is public information, though I'm not sure where to obtain it. Presumably the exchange? The WSB members know when the shorts expire etc, so it's detailed info.
Shorts don’t ever expire. Call and put options expire, almost always on Fridays.
Short float stats are assembled by data aggregators who work with the exchanges and brokers. It's not directly available for free, but through subscription services like Bloomberg or others. Many websites post the information publicly, but it is often delayed and stripped of much additional information.
Isn't the lesson to take away from this that online mega-communities simply don't work after they reach a certain size? Humans aren't wired in a way that allows them to gather in those numbers and communicate with each other well. The only workable solution is software moderation, which can never possess the moral agency required to legitimize tough decisions and make ethical calls.
Rather than one big community, why not a whole tree of sub-communities. With some clever discovery mechanisms to share content and discussions between sub-communities, this would enable both the larger sense of community while allowing individual circles of tighter knit communities to also form. I think this is the basic idea behind federated social media too where the "clever discovery mechanisms" are human curated lists. It's a good pattern that could work in a more centralized model too.
Reddit owns the whole thing and can boot individual communities. I believe they are talking about self owned communities that form a shared meta-community - bottom up, rather than top down, organisation.
That seems to be contrary to the sentiment expressed in the reddit post - "There's also too much political bullshit in a community that was never ever political. The only way I want to occupy Wall St is in a suit myself or rent-free in the mind of a blown up short."
It's just manipulation: as a flood of rubes comes in to "buy and hold", the price explodes, and then some rubes get into options trades that give them downside exposure, some short the stock, some believe the hype and think it's an investment, and some are there to speculate but can either only go long or get the timing wrong.
So the buying pressure at some point subsides, and momentum to the short side builds as people exit.
Participants in these forums, like Yahoo Finance before it, are constantly trying to exhort the group to move into names in which the touts already have positions set up.
> a Discord spokesperson told Gizmodo that the channel had been banned “for continuing to allow hateful and discriminatory content after repeated warnings.” On both Discord and Reddit, WallStreetBets users frequently refer to themselves collectively as “retards” and “autists,” and have been known to deploy the kinds of racial slurs and deliberately offensive language that have become commonplace in 4chan-style posting forums. [0]
They could've just as well "repeated" the "warning" one last time and shut them down next week. They chose to act now. At the very least, this was a PR mistake.
They are perpetrating a financial bubble that will harm its participants, the economy (possibly to a significant degree), and that is specifically designed to hurt certain people. They are encouraging a mechanism that could do far more damage in the future. It's an angry, violent mob, and what I see looks very much like other violent, angry mobs I've seen recently, including the portrayal of victimhood in the post, used as a justification for the behavior.
Mobs don't end well for the community and especially for the mob. Those who think 'well, I don't like the target so I'll overlook some things' soon find the mob doesn't stop when and where they hoped it would. Inviting and encouraging riots in your city is not a way to solve problems. Sometimes mobs do end well for people behind the scenes that use and manipulate the mob.
What good are the participants doing society? I think if you step back and take a deep breath, there's no way that this activity is a good idea.
What r/wsb is doing is fundamentally no different than hedge funds. Wall Street constantly makes trades that move the market, and nobody bats an eye.
Now that some self-deprecating band of screaming mad lads has beaten hedge fund managers at their own game, suddenly it's foul play and the free market must be regulated. Doubly so if they are loud, make stupidly idiotic trades that go against the grain of any financial wisdom passed out in the past century, and win out big time - how dare they?!
PS: They haven't sold their positions, and I'm sure by the time they expire some of them will definitely have won big time.
PPS: I also like to think of the man or woman who had some shares sitting in the account that they got from working for Gamestop a couple of decades ago, thinking it was worthless now. Until some dumbasses on the internet went and skyrocketed their net worth for no apparent reason other than for shits and giggles.
You have fallen victim to propaganda by wall street billionaire vultures. This is not a bubble. This is a high stakes market movement speculation game that billionaire vulture short sellers are playing. They just got careless and lazy and got caught in a money-losing position. Now, these billionaire vultures are crying and whining to the government to help them with bailouts and change the rules in the middle of the game.
WSB is playing the game fair and square. It's the billionaires who have been short selling and manipulating the markets that got caught with their pants down.
WSB is squeezing the greed out of billionaire vulture short sellers, whose greed has gotten so brazen that they have painted themselves into a corner. Usually, they get away with these kinds of shenanigans. But this time, someone who knew the game saw the positions and realized the other player had handed them the edge and a winning strategy: all they have to do is drive the price higher, and the short sellers have to buy at whatever price to cover their positions.
Think of it like this: someone has to buy an item that you have, at any price. What price would you charge?
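To make the asymmetry concrete, here is a toy calculation of a short seller's P&L as the price is driven up. The prices and share counts are hypothetical, chosen only to illustrate the squeeze dynamic, not actual GME figures:

```python
# Toy short-position P&L: the loss grows without bound as the price
# rises, which is why a cornered short must buy back "at any price".
# All numbers are hypothetical illustrations.

def short_pnl(entry_price: float, cover_price: float, shares: int) -> float:
    """Profit (positive) or loss (negative) on closing a short position."""
    return (entry_price - cover_price) * shares

shares = 10_000
entry = 20.0  # price at which the shares were shorted

for cover in (10.0, 20.0, 100.0, 300.0):
    print(f"cover at ${cover}: P&L = ${short_pnl(entry, cover, shares):,.0f}")
```

A long position's loss is capped at the purchase price (the stock can only fall to zero), but a short's loss has no ceiling, which is the mathematical core of the squeeze.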
Genuine question: If you believe AMC and / or Gamestop is a failing business model, why is shorting the stock bad? (NOTE I am not well read on what's happening, other than the gist)
This sounds like an aristocratic response to a revolution. I don’t mean this as dismissive, I mean that clearly there are enough people dissatisfied with the status quo that they are taking measures surprising to those satisfied with the status quo.
I think you are talking about a subject that's completely out of your depth. WSB has always been about greed and making profit.
The so-called revolution is just a halo that they attached to themselves so they can all look like heroes. From an economic perspective, what they did is irresponsible and dangerous.
I agree with most of what OP said. It's just that some facts are harder to swallow in the current political climate of poor = good and rich = evil.
So it's nihilism all the way down. I suppose then that shutting this trade down to protect the hedge fund is morally equivalent to letting them lose their shirts.
You make a compelling point about why I shouldn’t care about Wall Street hedge funds. You are right that they form financial bubbles, that this harms participants and the economy, and that they are specifically designed to bankrupt companies.
Unfortunately even if I don’t participate in these games, my tax money is used to bail them out when they lose.
I was under the impression that the historical consensus is that it was an important stepping stone to building modern civilization and granting people the rights they enjoy today, but it was an absolute disaster for everyone who participated in the revolution itself.
The reason censorship doesn't work is because the idea is still in the banned person's head. Denying access hardly does anything to change someone's mind.
Sure, every company has the right to do what they want with their "platform", but don't be surprised if in the future these people persist into new platforms to share their ideas. New platforms that will be controlled by them, with no possibility of censorship, or worse, with censorship of the opposite kind in an attempt to reverse the damage.
"keep your friends close, but your enemies closer"
And this is granting the justification for censorship, which of course most of the time is just about power and control, with innocent people paying the price.
I don't understand why this is an issue? Reddit/Discord are companies and they get to decide who they serve or not serve. WSB guys can just go off and make their own reddit and discord if they want. Why all the hullabaloo?
Hopefully they can migrate to a forum. Some subs that have been deplatformed have been very successful with phpBB, believe it or not. Moderation tools make it much easier to ban and harder for potential bad actors to have an impact on the community.
Downside is growth, but I don't think you can get much better than a tweet from Elon and attention from every major news network.
> so the world can see that we aren't doing anything wrong here
The world will see whatever the media, in aggregate, report. They got lucky that the first reports were “Elon Musk tweets about this quirky thing happening”, but the media will soon find another angle to the story. The only thing selling right now is outrage, and since people can’t be quite as outraged about Trump anymore, anything to fill the void is welcomed. You don’t have to do anything outrageous for the outrage machine to get you; they’ll get you anyway for whatever you were alleged to have been associated with, or for somebody who agreed with something somebody else claimed was from you. Add to this that there’s huge financial pressure for this kind of thing to be put out of business, and they’re probably completely doomed.
Discord bans everyone. You can't even talk about Team Fortress 2 bots without getting banned. What's frightening here is all the reports of the companies people were using to trade stocks cutting off the users from being able to buy gamestop stock arbitrarily. That kind of activity is surely against some sort of law.
This is precisely what works, unfortunately. Each time you deplatform a group, only a certain percentage of the most dedicated users follow to the next iteration.
> only a certain percentage of the most dedicated users follow to the next iteration.
And they become the seed around which the new community crystallizes.
As the social media and tech oligarchies get emboldened to attack less extreme groups, the people who find themselves deplatformed end up taking refuge in the only sites that are prepared to stand against the ratcheting up of restrictions on speech.
Don't be surprised if, as a result of pressure on WSB, a lot of casual traders find themselves mixing and sympathizing with violent extremists on Gab and other platforms.
I'm not trying to suggest that "they're out to get us", just that once a group gets a taste of the power that comes from suppressing speech, it can become quite addictive and the goal posts for "safe" speech can shift.
To see this dynamic played out in different circumstances, look at the Euphemism Treadmill[0] (as mentioned elsewhere in this discussion) and Purity Spirals[1]. It may be bad for business to encourage this nonsense, but as echo-chambers become less diverse, they can exert more pressure on platforms to enforce their rules for them.
> I read WSB daily for years and cannot recall a single usage of a racial slur.
Same, I've been a subscriber of WSB for at least 4 years. I also check TheDonald.win (now Patriots.win) on a regular basis. Both are constantly accused of racism and/or white supremacy and yet I see none of either.
People use terms like 'racist', 'nazi', 'fascist' and many others as synonyms for "any person or thing I don't like or disagree with". It's also the media's favorite new strategy for discrediting sources that contradict their chosen narrative.
> I also check TheDonald.win (now Patriots.win) on a regular basis. Both are constantly accused of racism and/or white supremacy and yet I see none of either.
The 8th post on the homepage is this: https://patriots.win/p/11SK7JKMbx/guess-what-my-favorite-thi..., and the top comment is a list of racial slurs for Asian/Chinese people. There's a few other threads that include similar lists, as well as comments making fun of Asian/Chinese people in general.
I saw a few slurs for gay people as well.
I found all this in 5 minutes, so I'm surprised you haven't seen anything even though you browse it "on a regular basis".
Did you actually click the link and look at the very top comment? It's full of racial slurs and making fun of Chinese people. And it's not even average Chinese people doing the censorship, it's Biden. So they're going after the wrong people anyway.
Yes I browse regularly and there is definitely prejudice on display. However the general tenor is to downvote or at least not upvote such content. Certainly not as bad as I expected given their reputation. Leads me to see them in a more sympathetic light than as a bunch of racist revolutionaries.
I must admit the list is not as extreme as I expected given your comment. It is par for the course of what I see on the site.
I am not saying the group is good. There are indeed many lightly veiled hints towards violent armed insurrection, as the members fear imminent communist takeover of the USA. But it is best to carefully characterize the forum as the whiplash of overreaction can push people to be more sympathetic to their claims, and feeds their oppression narrative. I have certainly felt this effect.
"Ch*nk flu" is a clear racial slur. The rest of the list is based on stereotypes of Chinese people. The key thing is that regular Chinese people aren't even responsible for the virus or the censorship (the censorship was done by Biden, a noted non-Chinese person), so targeting them seems misplaced and clearly racist.
> It is par for the course of what I see on the site.
Well yes, if a site is full of racism, a comment with racism in it is "par for the course".
Edit: To reply to your second paragraph that you added after I posted my comment:
Yes, I understand where they're coming from, and that they feel like the US is under attack. However, it should be obvious to anyone that China isn't governed by the average Chinese person; it's governed by the CCP. By choosing to insult and make fun of the average Chinese person, they're clearly engaging in mean-spirited racism (which has led to increased racial violence against Asians), and that has to be condemned as the racism it is.
This thread seems half full of downvoters who seem like they'd also be the ones to coddle those irrationally offended by people who will continue to use the words master-slave in computing contexts.