

Study finds there are nasty people on social media
#1
Moreover, the posts users write in hate subreddits tend to demonstrate hatred or even violence towards these out-groups. Both findings lead us to conclude the users exhibit greater out-group hostility. Because we do not know what users think, we cannot directly determine if they are rejecting egalitarian and democratic values more than they did prior to exposure (Marwick et al. 2022), but this is a reasonable if still hypothetical conclusion. Our work therefore provides evidence that users adopt these extremist beliefs simply from exposure to hate subreddits.

https://link.springer.com/article/10.100...23-01184-8

I am still pondering this study, but it is strange or dicey in places. I detest racism, sexism and fat shaming, but labelling those Reddit users extremists is problematic. The same applies to the alt-right label.

Mathematical calculations can't substitute for knowing someone's character before they join a social media platform or community. People might seek scapegoats for their circumstances, but they don't magically join a social media community and become morons.

The authors admit the evidence behind their findings is tentative. So, are they intellectually honest or just seeking to justify a pre-determined conclusion?


Incidentally, I live in regional New Zealand, where the people are extremely cliquey, judgemental, set in their ways, and often casually racist. Nobody is a harsher critic of their local population than me, but I wouldn't label the majority of the people in my area as extremists.
#2
Fascinating stuff (and rather pertinent to a paper I intend to write about the evolution of a particular pejorative) and yes, I think it's solid.

Have you heard of something called "operant conditioning"?  It can cause psychological harm, but it can also be used by a psychologist and a willing client to help that client heal.  Basically, it's all about rewards for behavior (WebMD source: https://www.webmd.com/mental-health/what...nditioning)
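A toy way to see that "rewards for behavior" loop in action: suppose every cruel "zinger" someone posts gets rewarded with upvotes and laughing replies. With a very simple reinforcement update, the tendency to post another one climbs toward certainty. The update rule and every number below are invented purely to illustrate the mechanism; nothing here comes from the paper or from WebMD.

```python
# Toy operant-conditioning curve.  Each rewarded "zinger" nudges the
# habit strength toward 1.0 ("always do it").  All values are invented
# for illustration only.
p = 0.10      # initial chance of posting a zinger
alpha = 0.2   # how strongly each reward reinforces the habit

history = [p]
for reward in range(10):      # ten straight rewarded zingers
    p += alpha * (1.0 - p)    # reinforcement: move toward "always"
    history.append(p)

print(f"after 10 rewards: {p:.3f}")  # prints "after 10 rewards: 0.903"
```

Ten rewards in a row take the habit from a 10% chance to roughly 90%; that is the whole point of the conditioning loop described above.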

So... what's going on is operant conditioning, and the poor moderators (if any) have to balance between a very boring board, something worthwhile with good information and lively engagement, or the Pits of Sheol.

So let's say I start a subreddit about... Nash equilibria in mathematical game theory (to pick a weird topic -- I'm sure the entire world will flock to it to hear us discuss zero-sum games and the like).  I mention in passing that Trump's behavior in Kaplan's courtroom (the E. Jean Carroll trial, for those of you reading this post many years in the future) is a losing strategy and point to the Prisoner's Dilemma game (no, I'm not going to discuss this point.  It's an example, folks.  Be nice or I bring out the Math.)
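(For anyone who does want a little of the Math: here is a minimal sketch of the Prisoner's Dilemma with a brute-force check for Nash equilibria. The payoff numbers are the standard textbook values, nothing from the trial or this thread.)

```python
# Classic Prisoner's Dilemma payoffs as (row player, column player).
# C = cooperate, D = defect.  Textbook values, purely illustrative.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def is_nash(row, col):
    """A profile is a Nash equilibrium if neither player can gain
    by unilaterally switching strategies."""
    r_pay, c_pay = PAYOFFS[(row, col)]
    # Can the row player do better by deviating alone?
    if any(PAYOFFS[(alt, col)][0] > r_pay for alt in "CD" if alt != row):
        return False
    # Can the column player do better by deviating alone?
    if any(PAYOFFS[(row, alt)][1] > c_pay for alt in "CD" if alt != col):
        return False
    return True

equilibria = [p for p in PAYOFFS if is_nash(*p)]
print(equilibria)  # [('D', 'D')] -- mutual defection is the only equilibrium
```

Mutual cooperation pays both players better, but it is not an equilibrium, because either player profits by defecting alone; that tension is what makes the game a useful metaphor for "losing strategies."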

All the liberals hop on and say "yes!  Delightful!" and praise me by making meme photos that say something like "Can't deviate from strategy, obvious loser!" and "Orange Toast" (or something else.)  I feel very empowered by this and go on to talk about Trump in terms of other game theory ideas, and since everyone laughs and starts giving him scathing nicknames, I feel even bolder.  And perhaps for my next post (or the one after that or after that) I am emboldened to repeat these playground taunts (for a man who'll never read them... how lame is that?)  The moderators don't stop me, and the few rational conservatives on the board who say "you're using some extreme examples and you've forgotten a few factors" get laughed at with the old "ooooh!  Tears of the conservatives for their deity!" sort of thing.

Now every angry statement about "liberal misunderstanding of conservative views" gets laughed at, ignored, or simply ends up with people posting a bunch of meme pictures (basically implying that the one protesting has no understanding of reality and is probably a complete fool).  So this tiny example suddenly turns into a whole lot of "pat on the back" sort of stuff for me.

Everyone's giving me (and my cronies) attention.  It's great stuff!  I feel validated.  I write more, and people come to see the fights... not to discuss game theory.  Word spreads.  

Who comes to the fights?  The less empowered people who want to "get back" at the ones they feel are causing them misery.  They're glad to fabricate and deliberately misunderstand situations and events in order to get pats for cutting down Trump and his supporters.  Moderators don't step in (moderation is VERY loose on Reddit).

Rational mathematicians say "there's better places" and leave.  Casual readers who came for discussion of Dominant Strategies are out-shouted by the ones who are getting a rush of excitement or pleasure in joining the fight. 

Now the board has turned toxic.  

What the participants may not notice is that their views are getting more extreme because they're accepting the created misinformation of others since that misinformation helps them do battle with the Other Side.  And they'll end up believing this misinformation and never bother to check.

Misinformation never goes toward a middle ground.

Now, here's an even uglier thing as mentioned in the paper: "Causal analysis through interrupted time series leads us to conclude that becoming active in a hate community leads to a measurable spillover in hate speech to other non-hate communities, meaning that a user’s hate is not self-contained within the subreddit that they join."
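To make the quoted method concrete: "interrupted time series" just means watching a measure over time and testing whether it jumps at the moment of the "interruption" (here, the week a user joins a hate subreddit). Below is a deliberately crude sketch with invented weekly counts; the paper's actual model is far richer (trends, matched control users, and so on), so treat this as an illustration of the idea only.

```python
# Toy interrupted time series: weekly counts of hate-flagged comments
# a user posts OUTSIDE the hate subreddit, before vs. after joining it.
# All numbers are invented for illustration.
pre  = [2, 3, 2, 4, 3, 2, 3, 3]   # weeks before joining
post = [5, 6, 7, 6, 8, 7, 6, 7]   # weeks after joining

def mean(xs):
    return sum(xs) / len(xs)

# Crudest possible ITS estimate: the jump in average level at the
# interruption.  A real analysis would also model pre-existing trends.
level_shift = mean(post) - mean(pre)
print(f"estimated spillover: +{level_shift:.2f} flagged comments/week")
```

If the post-joining level is clearly above the pre-joining level (and no trend explains it), the "spillover into non-hate communities" claim in the quote is exactly this kind of jump, measured at scale.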

Let's take a trip to another discipline to show you how poisonous this is -- have you heard of something called "method acting"?  In a nutshell, you "program" your mind to become like the character so you can portray them convincingly.  After playing a certain character for some time, some actors unconsciously adopt some of the attitudes and behaviors of that character.  It can become a REAL problem in relationships (families, spouses, friends).

If someone gets a rush from "getting a zinger in on someone" by a cruel comment or nasty remark and people praise them, over time they will (operant conditioning) repeat this behavior more often and it will spill over into their lives.  And that's exactly what they're reporting on.




...and that, friends and neighbors, is why your friendly local Byrd never engages in rude behaviors.   You become the self that you practice most.  I don't want to become THAT self.
#3
I hadn't heard of operant conditioning specifically, but it gives me much to ponder, both within this topic and in unrelated matters. For instance, shaping people's behaviour to develop critical thinking skills is done more indirectly (outcomes-based learning).

Granted, critical thinking is another topic, but that skill spills over into a person's daily life and isn't confined to a single exercise. And I can see the spillover effect in another context. So, it makes sense that human behaviour isn't narrowly confined on or off social media.


Concerning "method acting", that condition plagues members of my family. I cut those toxic people out of my life years ago. I wasn't prepared to play my "assigned role," which caused waves. Nor could I grow as a person until I found stable ground.

Those actors won't change their stripes either, but their obsession with their "roles" drives them to more extreme acts.
#4
(01-18-2024, 01:49 AM)xpert11 Wrote: Concerning "method acting", that condition plagues members of my family. I cut those toxic people out of my life years ago. I wasn't prepared to play my "assigned role," which caused waves. Nor could I grow as a person until I found stable ground.

My heart goes out to you.  I had the same type of experience and it was very painful.
#5
The problem with any social media is that it works as an amplifier of people's opinions: those who agree will flock to support an opinion and those who disagree will flock to attack it.
The fact that all this happens without physical presence makes it easier to take extreme positions that people would not take in a face-to-face discussion.
#6
(01-17-2024, 11:18 PM)xpert11 Wrote:  I detest racism, sexism and fat shaming, 

Nobody EVER includes the opposite -- 'skinny shaming'.  Lol  Believe me, it's just as big (and hurtful) a thing as fat shaming.
I'm not a Domestic Engineer; I'm still feral.
#7
(01-18-2024, 05:54 AM)ArMaP Wrote: The problem with any social media is that it works as an amplifier of people's opinions: those who agree will flock to support an opinion and those who disagree will flock to attack it.
The fact that all this happens without physical presence makes it easier to take extreme positions that people would not take in a face-to-face discussion.

It's long been known that anonymity gives people space and the ability to say things and to behave in ways that they wouldn't if they met face to face.  Basically, it removes some social accountability from the entire process.

(01-18-2024, 11:40 AM)Nugget Wrote: Nobody EVER includes the opposite -- 'skinny shaming'.  Lol  Believe me, it's just as big (and hurtful) a thing as fat shaming.

Yes, that does happen, and when a man is the target it often involves questioning his manhood.

Now, to add a bit more flavor to the discussion here, it turns out (https://d1wqtxts1xzle7.cloudfront.net/67...GGSLRBV4ZA) that trolling is sometimes accepted in an online space under certain conditions -- when it's tied into community beliefs.  I've seen this before when an atheist showed up on an evangelical board (to discuss and not to troll) and they got trolled by several members of the community.  I found this ironic because this caused the trolls to engage in behaviors that they said were against their principles ("turn the other cheek" was one of them.)

They also enjoyed trolling, which is one of the hallmarks of the behavior -- as I said above, not only the "rush" from getting in a good "zing" but also community approval feeds right into it.

More interestingly, one study found that male trolls are often perceived very negatively, but female trolls are perceived as "confused."  (Fichman, P. and M. R. Sanfilippo. 2016. Online Trolling and Its Perpetrators: Under the Cyberbridge. Rowman and Littlefield Publishers.)

...a little food for conversation...
#8
Quote:Social media offers an avenue for like-minded individuals to interact in ways that were previously not possible. Yet, it can also be a breeding ground for hate and extremism to spread.

I think anything beyond what I underlined is a case of a group of like-minded researchers attempting to mold their findings to their own preconceived notions and biases.

Most of its conclusions should be tossed aside due to the fact that the subreddits attract like-minded people in the first place. Makes me wonder why they bothered, unless the intent was to confirm that like-minded people associate with one another.
#9
(01-18-2024, 01:25 PM)Byrd Wrote: It's long been known that anonymity gives people space and the ability to say things and to behave in ways that they wouldn't if they met face to face.  Basically, it removes some social accountability from the entire process.

Also, if I'm not mistaken (I can't find any reference to it), someone ran an experiment in which a few people started destroying a car with hammers and baseball bats, and soon other people joined in the destruction, showing that if we see we are not alone in doing something, we are more likely to do it, even if we think it may be wrong.
#10
(01-18-2024, 05:00 PM)Blaine91555 Wrote: I think anything beyond what I underlined is a case of a group of like-minded researchers attempting to mold their findings to their own preconceived notions and biases.

Most of its conclusions should be tossed aside due to the fact that the subreddits attract like-minded people in the first place. Makes me wonder why they bothered, unless the intent was to confirm that like-minded people associate with one another.

Their study found that the behavior then spilled over into other subreddits where these people interacted with others.

I've noticed this in World of Warcraft.  They have "channels" which are live talk (well, text talk) that initially were fairly innocuous -- people trading things and looking for others to run dungeons.  It was pretty heavily moderated and there were sub-channels as well.

Then someone bought Blizzard (if I remember correctly) and decided to save money (and give it to investors) by eliminating active moderators.  Without anything to hold them in check, we got trolls and misogynists and it started turning ugly.  Then they made the chats "world wide" - any time you were in any city you heard all the chats from all the cities.

And it became a race for "who could be the most notorious."  Most of us turned the channel off, but it was still there.  If "ugly behavior" is accepted by the community (in spite of protests against it by others) then it will continue.  And if it shocks people (gets a reaction) this will encourage the behavior to continue.

I've seen it in other places as well.

(01-18-2024, 08:08 PM)ArMaP Wrote: Also, if I'm not mistaken (I can't find any reference to it), someone made an experiment, with some people destroying a car with hammers and baseball bats, and soon other people joined in the destruction, showing that if we see we are not alone in doing something we are more likely to do it, even if we think it may be wrong.

Not familiar with that one, though I *am* familiar with the experiments that show people are more likely to behave politely and honestly if they've got "eyes on them"... either live people or even posters or images of an eye staring at them.