
Yes...the entire democratic party concerns me.

*******************************

Facebook users were disturbed, shocked, and angered to receive a new Orwellian pop-up warning asking if they have been exposed to harmful extremist content or if they know anyone who is becoming an extremist.

On Thursday, Facebook users began receiving a new notification that warned they might have been exposed to “extremist content.” It started a buzz and a number of conservatives shared screenshots of the warning. Editor Kira Davis of RedState was one of those who received the alarming message. She shared it on her Twitter account.

“Hey has anyone had this message pop up on their FB? My friend (who is not an ideologue but hosts lots of competing chatter) got this message twice. He’s very disturbed,” Davis asked in her tweet.

The message that raised alarm bells stated: “Are you concerned that someone you know is becoming an extremist?” That was followed by: “We care about preventing extremism on Facebook. Others in your situation have received confidential support” along with the allegedly helpful option to “Get Support.” Davis was not alone in experiencing Facebook’s new trial pop-ups. Others included writer Alex Berenson, Newsmax host John Cardillo, and Virginia House Delegate Nick Freitas.

Cardillo noted: “This just popped up when I checked my Facebook app. And you doubt the NSA is illegally spying on Tucker Carlson and every other conservative?”

Berenson quipped: “Yeah, I’m becoming an extremist. An anti-@Facebook extremist. ‘Confidential help is available?’ Who do they think they are? Either they’re a publisher and a political platform legally liable for every bit of content they host, or they need to STAY OUT OF THE WAY. Zuck’s choice.”

Freitas voiced what many were thinking about the message: “Yes…actually I have a real concern that some leftist technocrats are creating an Orwellian environment where people are being arbitrarily silenced or banned for saying something the ‘thought police’ doesn’t like.”

Another alert was also reportedly seen by Facebook users: “You may have been exposed to harmful extremist content recently.” It was followed by: “Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others.”

The warning is accompanied by a link inviting the user to “Get support from experts,” which explains how to “spot the signs, understand the dangers of extremism and hear from people who escaped violent groups.” This appears to be the next step in Facebook’s ongoing efforts to crack down on “extremists” online.

Even more disturbing is where the “support” link takes users who dare to click on it out of morbid curiosity. It leads to a support page for “Life After Hate,” billed as “a nonprofit that provides support to anyone who wants to leave hate behind and solve problems in nonviolent ways.”

According to Fox News, Facebook said the warnings were a “test”: “This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk. We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”

Some believe that the messages are targeting those who are concerned about Critical Race Theory or those who are questioning election results and the nature of the riots on Capitol Hill.

The Verge reports that Facebook says the tests are part of its Redirect Initiative, which “helps combat violent extremism and dangerous organizations” in numerous countries. According to its webpage, the program redirects users to educational resources instead of further content it deems hateful. The test is part of Facebook’s response to the Christchurch Call to Action campaign, which recruits major tech platforms to counter violent extremist content online and was launched following a 2019 attack in New Zealand that was live-streamed on Facebook. The program seeks to identify both users who may have seen what Facebook considers “extremist content” and those who have been flagged by the platform in the past.

Facebook has not explained exactly who is deciding what is “harmful” or “extremist.” Users have noted that the warning sometimes doesn’t include any indication of what content, user, page, or group prompted the warning. Some are even reporting that the warning occurs every time they open their Facebook app.
The new alerts follow a DHS bulletin warning of possible, non-specific domestic terrorist attacks over the July 4th holiday involving white supremacists.
