Incels are targeting ‘next generation of extremists’ online. Could algorithm reform help?



Four clicks on an incognito browser is all it takes for YouTube to churn up a video about, as the host puts it, “embracing the idea of violence” in a society that “despises” what it means to be a man.

Four clicks, and the column of recommended next views is full of videos touting the (false) benefits of ivermectin in the treatment of COVID-19, while the Instagram profile linked to by the host directs the viewer to an account promoting interviews about seemingly run-of-the-mill topics like traditional masculinity, fighting, and hunting.

In between are posts interviewing and promoting political figures linked to the far-right, including those endorsing the violent mobs that attacked the U.S. Capitol on Jan. 6, 2021. Others call gender fluidity a “delusion” and “child abuse,” while another urges the men in Kenosha to “get ready” and “load your firearms” on the day jury deliberations began in the Kyle Rittenhouse trial.

And it all started with four words, typed into the YouTube search bar: “self esteem for guys.”

“There’s a lot of gateways into radicalization, and often they’re through fitness and health and on platforms like YouTube,” said Rachel Giese, author of the book Boys: What It Means to Become a Man.

“You can actually see the way in which these algorithms on platforms start putting more and more radicalizing content in front of young men who may have gone simply seeking information about self-esteem or dating or fitness or weight lifting and suddenly find themselves exposed to very hateful ideology,” Giese added.

“So it’s very frightening how quickly a young man who might be lonely or might be troubled or might be seeking information can have disinformation and hateful ideology put in front of him.”


Video: Young men are finding extremist content online ‘quite easily,’ warns expert

To a certain degree, the damaging effect social media can have — particularly on young people — is nothing new. But in the decades since the introduction of MySpace and the eventual rise of Facebook and Instagram, there are indications it’s getting worse.


A recent deep dive into Facebook’s operations by the Wall Street Journal revealed the company is well aware of its platforms’ negative influence on the mental health of users — a sizable share of them young people.

Despite the negative effects coming into clearer focus, the entrenchment of social media in the day-to-day lives of Canadians is nearly inescapable. Global News is unravelling the many facets of influence these platforms have — both offline and on — when it comes to the wellbeing of Canadian young men and boys navigating online platforms increasingly used by the far-right.

READ MORE: Canadians falling prey to conspiracy theories despite strong trust in institutions: poll

There are different facets of the far-right online: white supremacy, anti-immigrant, anti-government. Many of them bleed into and overlap with each other, creating multiplying layers of ideological narratives that fuel a near-endless stream of content creation.

Among the facets increasingly gaining attention over recent years has been the ideology of “incels” — men who feel cheated of sex and who blame their involuntary celibacy on women, as well as on social standards they feel work against them.

There’s no one easy answer for where a young man or boy will find incel-related content online. The sheer amount of it available on Facebook, Instagram, YouTube and video game platforms means navigating the potential for radicalization is a bit like learning to identify trap doors.

But part of the challenge lies in the fact that even seemingly normal activities — looking up YouTube videos about how to gain confidence, or posts on Facebook or Instagram about body image or weightlifting — can quickly steer users down rabbit holes of extremist content.

And those being caught up are no longer just the so-called crazy uncles of years past.

They are children, too.

Incels eyeing ‘next generation of extremists’

“We’ve had several people between the ages of 10 to 12 years old,” said one expert, David.

David works with a Toronto area social support program called Estimated Time of Arrival. It’s one of the only programs in the country specifically focused on deradicalizing young men and boys attracted by the incel ideology, including those who are actively considering violence.

But it’s also one increasingly being targeted with violent threats as a result, which is why Global News is identifying David only by his first name.

He said the targeting of children is a relatively new trend, but one that’s amped up over the last two years as the world grapples with the societal trauma of the COVID-19 pandemic.

The purpose he describes is simple, yet chilling: “to really grow the next generation of extremists.”

“It’s easy to recruit young people, especially during a pandemic when everybody was at home and we were all anxious about what the world may look like,” David said.

“We have kids that during the pandemic or lockdowns were spending 16 to 18 hours online.”

READ MORE: CSIS sees ‘unprecedented’ increase in violent online rhetoric during COVID

Estimated Time of Arrival receives grants through Public Safety Canada and works with individuals who have been referred by the RCMP and police services. And David said there are a number of escalating behavioural red flags common among the dozens of young men and boys coming to the program.

More time spent online — in chat rooms, in particular. Unfamiliar websites, and ones that seem strange.

Next comes a shift in politics. New ideas, new language that’s anti-women or anti-immigrant or anti-government. Secrecy, too. An unwillingness by a child or teen to share whom they’re talking with online, or what websites they spent all those hours scrolling or clicking through during the latest round of isolation or lockdown, or while catching up with friends after virtual school.

Consuming fake news typically factors in — a lack of critical thinking, and a mistrust not just of government or institutions, but of family and friends, too.

And any conversation about the spread of fake news and radicalizing content online would be incomplete without talking about how it actually gets in front of eyeballs: algorithms.



Video: University collaborating with Facebook to create Global Network Against Hate – Jul 29, 2020

“I do fundamentally think that the algorithms need to be shifted,” said Barbara Perry, director of the Centre on Hate, Bias and Extremism, which was created in 2018 at Ontario Tech University.

The Centre announced last year it was partnering with Facebook Canada to create the Global Network Against Hate to work with academics and experts to develop policies to limit the spread of hate online.

Facebook is contributing $500,000 over five years to the network.

“I think we are collectively coming to that point where we are asking — if not demanding — that social media companies really do rethink those algorithms and the way that they work in this particular context and around conspiracy theories, for example, other sorts of disinformation.”

READ MORE: Ontario Tech University partners with Facebook to tackle online hate

Perry noted one concern raised by many of the experts Global News spoke with on this issue. Specifically, while algorithm reform and disincentivizing the creation of extremist content may be among possible solutions, there is no one answer that will solve the problem.

Other aspects are crucial and need to be ramped up as well, like funding more community-level groups to deradicalize and to prevent radicalization in the first place, and educating teachers and caregivers on the warning signs of extremist content.

Many schools, for example, run workshops and programs to teach young girls to identify and avoid sexual predators online — yet there are few, if any, comparable programs teaching young boys how to identify and turn away from incel-related content they may encounter, too.


Video: Here’s why one expert says calls are growing for social media algorithm reforms

The counter-violent extremism firm Moonshot issued a report last year that looked at the behaviour of Canadian incels online and found YouTube to be a “major hub” for incel content.

The YouTube video that Global News was directed to after four clicks on the platform belongs to an account and creator that one expert said is on researchers’ radar as a potential entry point from which young men may go on to access more extreme content on the dark web.

Global News is not sharing the specifics of the account out of concern about potentially directing vulnerable users to that content.

Global News also sent a detailed breakdown to a YouTube spokesperson of the searches and clicks that preceded the platform’s recommendation to watch the video about “embracing the idea of violence.”

In response, the company said it takes concerns about violence seriously.

“The safety of our community is our top priority, and we take this responsibility seriously. While YouTube is a platform for free and creative expression, we strictly prohibit videos that promote violence or criminal activity,” said a spokesperson for YouTube.

“Over the last several years, we’ve taken additional steps to ensure that those who aim to spread violent, extremist ideology cannot do so on YouTube,” the spokesperson added.

“We strengthened our hate speech policy, and made updates to our search and discovery systems to reduce the spread of borderline videos so this content is less likely to be recommended to users. While these interventions have had a significant impact, our work here is ongoing.”

According to YouTube, the company has been working since 2019 to limit the reach of what it calls “borderline content,” which approaches but does not cross the lines drawn by its community guidelines.

Prohibited content under those guidelines includes incitement to violence and promotion of terrorism, but the company did not provide a precise definition of what constitutes borderline content.

Rather, YouTube says the company has a responsibility to reduce recommendations of such content on its platform, but any content that doesn’t violate its community guidelines will remain online, even if it is promoted less often as a recommended next watch.


The video in question remains online.


Video: The Facebook Papers: Internal documents reveal company failed to stop spread of abusive content – Oct 25, 2021

Facebook — or “Meta,” as the company is now named — also knows its products can radicalize users, according to troves of internal documents reported on by the Wall Street Journal as part of the Facebook Papers. Those records showed the extent to which the company is aware of the spread of hate speech, misinformation and extremism across its platforms.

Those revelations put the spotlight on the hugely powerful yet deliberately opaque algorithms churning away beneath the slick surface of products like Facebook (the platform) and Instagram, where there are more than 22,000 posts using what experts say is a common incel-linked hashtag.

Again, Global News is not identifying that hashtag in order to avoid directing vulnerable users to it.

EXPLAINED: What are the Facebook Papers?

Reports published about the Facebook Papers said the information in them showed users being directed into “rabbit holes” on the platforms, where the content served to them and prioritized by the algorithms becomes narrower, more violent and more conspiratorial in nature.

The purpose of most recommendation algorithms is straightforward: drive profits by figuring out what kind of content will inspire engagement and shares, and then push that content toward users.

While the specifics of the algorithms are not publicly available, what has become increasingly clear in recent years is that algorithms often struggle to distinguish between extremist content and legitimate content when it comes to serving up what their data suggests will generate shares, likes or more views.
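The engagement-first logic the experts describe can be pictured in a few lines of code. The sketch below is a toy illustration only, with invented names and numbers; real recommendation systems are proprietary and vastly more complex. What it shows is the core problem: a ranking objective built purely on predicted engagement has no notion of whether content is harmful.

```python
# Toy illustration of engagement-driven ranking. NOT any platform's
# actual algorithm, whose details are proprietary: it simply scores
# items by predicted engagement and recommends the highest scorers,
# with no check on whether the content is extremist or legitimate.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_time: float   # minutes the model expects the user to watch
    predicted_share_rate: float   # estimated probability the user shares it

def engagement_score(v: Video) -> float:
    # A naive engagement objective: weigh watch time and shares together.
    return v.predicted_watch_time + 10.0 * v.predicted_share_rate

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    # Rank purely by predicted engagement; nothing here asks whether
    # the content is harmful.
    return sorted(candidates, key=engagement_score, reverse=True)[:k]

videos = [
    Video("Beginner weightlifting tips", 4.0, 0.02),
    Video("Why society despises what it means to be a man", 9.5, 0.08),
    Video("How to build self-esteem", 5.0, 0.03),
]
for video in recommend(videos, k=2):
    print(video.title)  # the provocative video scores highest and is served first
```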

“The effect of the radicalizing algorithm is a real concern when you think about this topic,” said one of the analysts working at Moonshot to track incel behaviour on platforms like YouTube and others.

The individual agreed to speak on background because their job requires interacting with and tracking incel activity online, and they fear revealing their name would make that work impossible and potentially make them a target.

“We shouldn’t be discouraging people from searching for or wanting to find out more information about how to date or how to have sexual or romantic relationships. The problem is that a young boy, a young man searching for that content is very quickly and very easily pulled down that rabbit hole.”

Algorithms, however, are central to how social media platforms make money.

“So they guard that secret very closely, which makes it very difficult for researchers studying any type of extremism to understand how the algorithm works and to think about programming that can combat that radicalizing effect,” said the Moonshot analyst.

What actions could help address the problem?

Moonshot and other experts have proposed a range of actions taking aim at the platforms where these algorithms operate: among them, adjusting the algorithms to include “safeguards” so incel content can’t be promoted, and working with experts to remove content and accounts.
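To make the “safeguards” proposal concrete, here is a minimal hypothetical sketch: a filtering step that removes expert-flagged content from the ranked recommendation pool before it is served. The classifier and blocklist here are invented stand-ins, not a description of any platform’s actual tooling.

```python
# Hypothetical sketch of a recommendation "safeguard": before ranked
# items are served, anything flagged by a classifier or an
# expert-maintained blocklist is dropped from the recommendation pool,
# so engagement scores alone can no longer promote it. This
# illustrates the proposal, not any platform's real pipeline.

def is_flagged(title: str, blocklist: set[str]) -> bool:
    # Stand-in for a real harmful-content classifier: a simple
    # keyword check against an expert-maintained list.
    lowered = title.lower()
    return any(term in lowered for term in blocklist)

def safe_recommendations(ranked_titles: list[str], blocklist: set[str]) -> list[str]:
    # Remove flagged items from the ranked recommendations entirely.
    return [t for t in ranked_titles if not is_flagged(t, blocklist)]

ranked = [
    "Why society despises what it means to be a man",
    "How to build self-esteem",
    "Beginner weightlifting tips",
]
print(safe_recommendations(ranked, blocklist={"despises"}))
# ['How to build self-esteem', 'Beginner weightlifting tips']
```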

Another recommendation aims to hit extremists where it hurts: their bank accounts.

“Several incel creators on YouTube have taken advantage of opportunities to monetize violent misogynist content through ad revenue or by promoting their account on Patreon, a subscription service,” the analysts at the firm noted in their report.

“YouTube and other video hosting platforms should ensure the removal of violent misogynist content, and ensure that creators promoting violent incel content cannot benefit financially from ad revenue or by promoting a paid subscription service on their channel.”

For years, platforms like Facebook and Google, which owns YouTube, have insisted when questioned in forums such as parliamentary committees or congressional hearings that they take seriously concerns about extremist content and content that could harm young users.

“Misogyny-based hate groups and violent extremists have no place on our platforms. We take this extremely seriously and remove content and accounts as soon as we become aware of it,” said a spokesperson for Meta to Global News.


“This is an ongoing process and we’re working closely with academics and independent experts to make sure we’re approaching this issue in a thoughtful way.”

One senior leader at the company, who spoke to Global News on background, said Meta has been increasing efforts over the past five years to deal more seriously with the circulation of far-right and incel-related content posted on its platforms like Facebook and Instagram.

That person noted that Meta has also been developing policies around borderline content and how best to reduce the visibility of it on users’ newsfeeds.

Just because people have a right to free speech doesn’t mean they have a right to reach others, the senior leader said, describing individuals and groups affiliated with incel culture or the far-right as endlessly creative at finding ways to get around enforcement tools.

The leader said that Meta is providing users with more information on how the algorithms work and why they see different kinds of content in their newsfeeds, but that the company does not plan to prevent users from searching out or following content that might qualify as borderline without directly breaking the rules.

READ MORE: Facebook failed to stop global spread of abusive content, documents reveal

Are lawmakers losing patience?

Lawmakers in a number of countries have signalled they are eyeing tougher rules around how social media platforms operate and evaluate the content published on their products.

In the United States, a bipartisan group of lawmakers introduced a piece of legislation that takes aim squarely at the algorithms used by social media platforms.

That bill, put forward in the House of Representatives last month, proposes forcing social media companies like Meta to let users opt out of seeing content served up to them by algorithms.

READ MORE: The dark side of social media: What Canada is — and isn’t — doing about it

In particular, the bill zeroes in on the “opaque” nature of the algorithms and says that users should have “the option to engage with a platform without being manipulated by algorithms driven by user-specific data.”

A companion bill was also put forward in the U.S. Senate in June, also from a bipartisan group.
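As a rough illustration of what such an opt-out could mean in practice, the sketch below switches a feed between a personalized, engagement-ranked ordering and a plain reverse-chronological one that uses no user-specific data. It is a hypothetical toy, not language from the bill or any platform’s implementation.

```python
# Hypothetical sketch of an algorithmic-feed opt-out: one flag
# switches between engagement-ranked ordering (built from
# user-specific data) and reverse-chronological ordering (which
# uses none). Invented for illustration only.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    timestamp: float             # seconds since epoch
    predicted_engagement: float  # score derived from user-specific data

def build_feed(posts: list[Post], opted_out: bool) -> list[Post]:
    if opted_out:
        # No personalization: newest first.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    # Default: personalized, engagement-ranked ordering.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

posts = [Post("older, high-engagement post", 100.0, 0.9),
         Post("newer, low-engagement post", 200.0, 0.1)]
print([p.text for p in build_feed(posts, opted_out=True)])   # newest first
print([p.text for p in build_feed(posts, opted_out=False)])  # engagement first
```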

The European Union, too, is considering regulations for the algorithms used by artificial intelligence, though the proposals have faced criticism for not targeting social media companies specifically.


Video: Strengthened EU tech rules could become global standard, Facebook whistleblower says – Nov 9, 2021

As Facebook data-scientist-turned-whistleblower Frances Haugen has attested in the wake of the Facebook Papers, leaving the decisions on how to regulate content in the hands of the company may amount to asking the impossible.

“Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” Haugen said in an October interview with 60 Minutes.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” she added.

“And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

A spokesperson for Facebook also told Global News that the company takes seriously the need to keep young users safe, and that it supports updating regulations around the internet, such as creating industry standards that will apply to all.

The Canadian government has promised to reintroduce legislation cracking down on online hate, and create stiff new penalties aimed at getting social media companies to remove hateful content from their platforms.

Previous versions of those proposals, though, have not included measures that would force social media companies to disincentivize the creation of that content in the first place, and have faced questions about whether increased content regulation alone could solve the problem.


READ MORE: Facebook Papers are a ‘call to arms’ over pressing need to regulate, say MPs

Canadian MPs who are part of the International Grand Committee on Disinformation — one Liberal, one NDP — have said the time is now to act on imposing tougher regulations on those companies, with Liberal Nathaniel Erskine-Smith billing the current moment as a “call to arms for public rules.”

A spokesperson for Heritage Minister Pablo Rodriguez, though, would not say whether the minister is considering or open to such moves, saying in an email only that consultations on a proposed regulatory framework had ended in September.

“The Department is reviewing submissions to inform decisions on next steps,” said Camille Gagne in an email last month. “Other elements will be answered in due time.”

In the meantime, experts say one of the best things people can do on an individual level may be one of the hardest: get more comfortable talking about hard things.

READ MORE: Influenced: A Global News series about social media’s impact on and offline

David from Estimated Time of Arrival said that ideologies like that of incels thrive on mistrust. That can mean conversations around difficult topics, like why someone is latching onto or espousing hateful views such as misogyny, can often backfire and reinforce those views.

He said every caregiver who has come forward to seek help for a young man or boy through the program has also been afraid of what will happen if others find out, signalling a key challenge for those who work on the ground to prevent extremist thoughts from becoming violent action.

“If you have a child that’s misogynistic or racist, it’s one of the worst things possible in this world today that can happen,” he said.

“People are terrified to come forward because they feel like they’re judged, and that really is a really dangerous situation we’re in.”

Giese added that over the course of researching her book, she heard a lot of stories about the challenges young men and boys face in trying to figure out what it means to be a man at a time when gender roles are rapidly evolving.

Some ask questions like: will more rights for girls mean fewer rights for boys?

Others expressed concerns about feeling like they don’t know their role in the world, or how to act when they feel attacked over something they said or did.

Creating safe spaces to work through those kinds of questions is crucial, Giese said.

“If they don’t have this space, there’s plenty of people in the very dark corners of the web who will play on those insecurities, who are actually there recruiting young men who are coming into this feeling insecure, feeling lost,” she said.

“They’re taking those feelings and turning them into something very twisted and violent, as opposed to starting at that place of vulnerability and fear and maybe even a bit of resentment or bitterness and saying, ‘OK, so why do you feel that way?’”
