Free speech on social media | INFJ Forum

wolly.green

Permanent Fixture
Jul 20, 2016
1,067
2,718
1,236
MBTI
ENTP
Enneagram
4w5
To people who are concerned about free speech on social media: yes, free speech is important, but it is just as important to sanitise the way we engage with others online.

Cyber bullying has become a prevalent issue in society, especially among young people. Bullying, in general, is a known contributor to mental health issues, such as depression, anxiety, and suicide. Now, with the advent of cyber bullying, which can include anything from sending mean texts or emails to posting hurtful comments on social media, the mental health risks for victims are even greater.

A study by the American Academy of Pediatrics found that cyber bullying is linked to depression, anxiety, and suicidal thoughts in young people. The study surveyed nearly 1,500 young people, ages 10 to 18, and found that those who were cyber bullied were more likely to suffer from mental health issues. In particular, those who were cyber bullied were two times more likely to report suicidal thoughts and three times more likely to report depression than those who were not cyber bullied.

Another study, this one by the University of Calgary, found that cyber bullying is associated with increased levels of anxiety and depression in adolescents. The study surveyed nearly 1,500 students and found that those who were cyber bullied were more likely to experience mental health issues, including depression, social phobia, and general anxiety disorder.

Cyberbullying, or bullying through electronic means, is a particularly insidious form of bullying as it can be relentless and constant, reaching a victim at any time of the day or night. Free speech is important. But so is sanitising the way we interact with others on social media.

For more information, take a look at the 'Ledger of Harms' by the Center for Humane Technology.

References

http://www.aap.org/en-us/about-the-aap/aap-press-room/pages/Cyberbullying-and-Mental-Health.aspx

https://www.thorn.org/resources-and-research/
http://www.ucalgary.ca/utoday/issue...vels-of-anxiety-and-depression-in-adolescents

https://jamanetwork.com/journals/jamapediatrics/fullarticle/1840250
 
I don't understand why content control isn't just left to the discretion of the user.

I'm not referring to simply blocking individuals, but in addition to that, one could block entire categories of people. For example, it would be good to be able to block anyone who comments on covid more than once a fortnight, or anyone who posts news articles related to US politics daily, etc.

I don't use social media, so my suggestions are theoretical. I'm not a fan of either censorship, or not being able to filter what one is exposed to.
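The category-level blocking described above could, in principle, be expressed as user-defined rules the client applies to its own feed. A minimal sketch in Python of the "block anyone who comments on covid more than once a fortnight" rule (all names, topics, and thresholds here are hypothetical, purely to illustrate the idea):

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    topic: str
    when: datetime

@dataclass
class CategoryFilter:
    """Hide posts from an author once they exceed `limit` posts
    on `topic` within the rolling time `window`."""
    topic: str
    limit: int
    window: timedelta
    history: dict = field(default_factory=dict)  # author -> recent timestamps

    def allows(self, post: Post) -> bool:
        if post.topic != self.topic:
            return True  # rule only applies to the chosen category
        cutoff = post.when - self.window
        recent = [t for t in self.history.get(post.author, []) if t > cutoff]
        recent.append(post.when)
        self.history[post.author] = recent
        return len(recent) <= self.limit

# "Block anyone who comments on covid more than once a fortnight"
covid_rule = CategoryFilter(topic="covid", limit=1, window=timedelta(weeks=2))

feed = [
    Post("alice", "covid", datetime(2021, 1, 1)),
    Post("alice", "covid", datetime(2021, 1, 5)),   # second covid post in a fortnight
    Post("bob", "surfing", datetime(2021, 1, 5)),
]
visible = [p for p in feed if covid_rule.allows(p)]  # alice's second post is hidden
```

The point of the sketch is that such rules are trivially cheap to evaluate client-side; whether platforms would ever expose the hooks for them is another matter, as the reply below argues.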
 

Unfortunately, that won't happen because it will slash profits.

Quite simply, social media companies make a windfall by monetising your attention and selling it to advertisers. The more attention they get, the more money they make. And can you guess what gets the most attention? Content that makes you either angry, scared or paranoid.

Anger and outrage get the most attention and make the most money for social media companies. Allowing users to filter their content would make social media less attractive and less addictive, so unless there is a business model that makes more money, this solution will never happen.
 
The next best thing I can think of is teaching social media literacy.

What do you think?

Yeah, I've long thought there's a need for this space to develop.
 
I agree that people shouldn't be forced not to say insulting shit online; the people being insulted can stand up for themselves, provide an alternative perspective, or block people as needed. But especially when you're young and don't have the experience to obviously disprove what people say, the particular way people bully online (putting on an air of talking about facts) is pretty insidious.

It's down to the community cleaning up itself I think. Which it's actually starting to do.
 
Looks like the OP paid no attention in history class growing up, or didn't take it at all. Every time there was a lack of free speech, there were abuses on some level or another, and while there are real problems now, it could be worse. Just imagine how easy it would be, for example, should one or more of the big platforms decide overnight to censor a target demographic, say a racial or ethnic group, for narrative control, etc. The same already happens in some capacity with religious types on the fringes, as well as those with non-conforming opinions on certain topics, which quickly result in a swift ban on Twitter or Fakebook, for example.

If anything, this serves those with totalitarian leanings or intentions to control what the public is allowed to see, much the same as in Hitler's Germany and the Soviet Union, two examples where there was heavy control over what the public was allowed to see while dissenters got sent to the camps. If it gets to be that way again, where saying the wrong thing means losing not only one's job, status, and place in society, but even one's life, then this world has truly become a darker place. The path to hell is often paved with good intentions.
 

Social media is a new kind of beast that flips conventional arguments about free speech. Here are a few facts you may not be aware of.

Social media companies profit immensely by amplifying messages of extreme content, which then attract more views and hike up advertising revenue. For example, one week after the Capitol attacks, military gear ads continued to be attached to content on the US elections and the attacks, despite Facebook staff and external watchdogs flagging these instances. [1]

Analyzing over 2 million recommendations and 72 million comments on YouTube in 2019, researchers demonstrated that viewers consistently moved from watching moderate to extremist videos; simulation experiments run on YouTube revealed that its recommendation system steers viewers towards politically extreme content. The study notes "a comprehensive picture of user radicalization on YouTube". [2]

Exposure to a fake political news story can rewire your memories: in a study, where over 3,000 voters were shown fake stories, many voters later not only “remembered” the fake stories as if they were real events but also "remembered" additional, rich details of how and when the events took place. [3]

The order in which search engines present results has a powerful impact on users' political opinions. Experimental studies show that when undecided voters search for information about political candidates, more than 20% will change their opinion based on the ordering of their search results. Few people are aware of bias in search engine results or how their own choice of political candidate changed as a result. [4]

Game theory analysis has shown how a few bots with extreme political views, carefully placed within a network of real people, can have a disproportionate effect within current social media systems. Studies demonstrate how an extremist minority political group can have undue influence using such bots—for example, reversing a 2:1 voter difference to win a majority of the votes. [5]

More fake political headlines were shared on Facebook than real ones during the last 3 months of the 2016 US elections. [6]

The outcomes of elections around the world are being more easily manipulated via social media: during the 2018 Mexican election, 25% of Facebook and Twitter posts were created by bots and trolls; during Ecuador's 2017 elections, President Lenín Moreno's advisors bought tens of thousands of fake followers; China's state-run news agency (Xinhua) has paid for hundreds of thousands of fake followers, tweeting propaganda to the Twitter accounts of Western users. [7]

The 2017 genocide in Myanmar was exacerbated by unmoderated fake news, with only 4 Burmese speakers at Facebook to monitor its 7.3 million Burmese users.[8]

References:

[1] https://www.buzzfeednews.com/article/ryanmac/facebook-profits-military-gear-ads-capitol-riot
[2] https://dl.acm.org/doi/abs/10.1145/3351095.3372879
[3] https://journals.sagepub.com/doi/10.1177/0956797619864887
[4] https://www.pnas.org/content/112/33/E4512
[5] https://www.nature.com/articles/s41586-019-1507-6
[6] https://www.buzzfeednews.com/articl...ction-news-outperformed-real-news-on-facebook
[7] https://www.nytimes.com/interactive/2018/01/27/technology/social-media-bots.html
[8] https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/
 
I'm still amazed that people actually consume news, be it fake or otherwise.

My YouTube suggestions are littered with girl fails, surfing videos, physics and astronomy videos, bushcraft tutorials, Bach organ and harpsichord music, and a few Thomistic theologians.

A couple of years ago, I kept clicking "do not recommend videos from this channel" every time anything from CNN, FOX, Reuters, etc. popped up. Now I get a minimal gist of what's going on by looking through headlines on a dedicated news collating app, but besides a couple of economics articles, I haven't read any news articles in months.
 
Very compelling there Reason.
I have no interest in convincing you of anything; I just don't want to live in the shithole that government censorship will create. There is nobody alive on this earth who I trust to decide for me what I am allowed to see or read or think. I decide that for myself.
 
Unfortunately, on social media, you have less choice than you realize. Under the hood of these giant social media platforms is a powerful technology that is perfectly optimized to persuade you and modify your behaviour, to tap your brain and exploit your weaknesses.

Our evolved biology serves us brilliantly in many ways but also includes vulnerabilities that can be exploited. Persuasive technology—technology that shapes attitudes and behavior—pushes many of these buttons, leveraging our vulnerabilities to generate engagement and, ultimately, corporate revenue.

Our brains are more porous than we tend to believe. We shape our environment and, for better or worse, our environment shapes our brains. When we engage persuasive technology repeatedly, it begins to train us: our thoughts, feelings, motivations and attention start to replicate what the technology is designed to produce. This training creates a kind of neural momentum that makes us more likely to persist in those behaviors, even when they’re not good for us.

Social media presents a special case of persuasive technology where psychological levers are poked and prodded again and again, often without our conscious awareness. We don’t click randomly: many designs deliberately leverage our deepest vulnerabilities by promoting compulsive behavior that compromises our autonomy and well-being.

Social media platforms, Facebook in particular, are the world's largest persuasive technologies. Though they are marketed as 'bringing people together', their true purpose is to make money by monetising your attention. They do this by stripping us of our autonomy and grooming us into addicts for the sake of profit. If you'd like to know more, feel free to ask questions.
 
I don't trust a stranger's ability to filter misinformation better than my own ability to do so. There are specific people I would somewhat trust on specific topics but I would never trust some random stooge to decide for me what I'm allowed to see.

If you want to only use social media that has strict moderation and government oversight, I believe you should be able to make that choice, but I'm not interested in having others decide that for me.
 

The government doesn't understand how social media works. It will be the tech companies doing the filtering.

Regardless, moderating free speech isn't the way to tackle this problem. A better solution is to teach social media literacy, particularly to young people, who are disproportionately affected by the vitriol that surfaces.

Ideally, social media companies would change their business model. Unfortunately, not only will that not happen, but leaving a vacuum won't solve the problem either. You may be able to get Facebook to change, but persuasive technologies will always be hugely profitable; there is no reason to think another company won't just step into the void.

So regulating free speech isn't a solution. But teaching people social media literacy and helping them understand just how their 'speech' shapes the algorithm would be a good start.
 
Giving humans the illusion of free speech while making a profit off of the turmoil.
Hmm where've I heard that before...

:looninati:
 
The combination of anonymity, algorithms, late-stage capitalism, and a median human IQ of 100 means this is inevitable.

I’m with @Reason on this.

I have enough sense to never go on Facebook.
I have enough sense to never watch anything political on YouTube.
My memory is shit, so I don’t really remember any news stories.
I don’t think I have ever been an undecided voter.
Instagram, for me, is for pretty pictures, not politics.

I can’t think of a single news source that isn’t part of the functional arm of the military-industrial complex. All news is a bias-and-degree-of-fake sandwich.

None of this is new. The only change is the use of the internet.

This will sound jaded, and it is...if someone else falls prey to those things, that’s their choice, and their problem.

The whole world is like that, always has been...watch out, or they will eat you alive. You are in a crab bucket, and you won’t make it out.

That Facebook is running ads for arms manufacturers right after Jan 6 means the system is working exactly as designed. If something needs to be flagged, make sure it’s ’murrican.

Here’s all you need to know...there’s a sucker born every minute...don’t be that sucker.

As for the cyber bullying, I’m older, so I never had to deal with that. Instead, I got physically bullied, and sexually assaulted. Kids are little shits, with all the monster and none of the restraint. Solution? Beat the living shit out of them. If they can get up, you did it wrong.

Keep Calm, and Get Back to Consuming,
Ian