What connects a dad living in Lahore, Pakistan, an amateur hockey player from Nova Scotia - and a man named Kevin from Houston, Texas?

They’re all linked to Channel3Now - a website whose story giving a false name for the 17-year-old charged over the Southport attack was widely quoted in viral posts on X. Channel3Now also wrongly suggested the attacker was an asylum seeker who arrived in the UK by boat last year.

This, combined with untrue claims from other sources that the attacker was a Muslim, has been widely blamed for contributing to riots across the UK - some of which have targeted mosques and Muslim communities.

[…]

The BBC has tracked down several people linked to Channel3Now.

[…]

The person who gets in touch [from Channel3Now’s official email] says he is called Kevin, and that he is based in Houston, Texas. He declines to share his surname and it is unclear if Kevin is actually who he says he is, but he agrees to answer questions over email.

Kevin says he is speaking to me from the site’s “main office” in the US - which fits with both the timings of posts on some of the site’s social media profiles and the times Kevin replies to my emails.

He signs off initially as “the editor-in-chief” before he tells me he is actually the “verification producer”. He refuses to share the name of the owner of the site who he says is worried “not only about himself but also about everyone working for him”.

[…]

Although [there is] no evidence to back up these claims of Russian links to Channel3Now, pro-Kremlin Telegram channels did reshare and amplify the site’s false posts. This is a tactic they often use.

Kevin said the site is a commercial operation and “covering as many stories as possible” helps it generate income. The majority of its stories are accurate - seemingly drawing from reliable sources about shootings and car accidents in the US. However, the site has shared further false speculation about the Southport attacker, and also about the man who attempted to assassinate Donald Trump.

Following the false Southport story and media coverage about Channel3Now, Kevin says its YouTube channel and almost all of its “multiple Facebook pages” have been suspended, but not its X accounts. A Facebook page called the Daily Felon, which exclusively re-shares content from the site, also remains live.

[…]

Some profiles [across several social media sites] have racked up millions of views over the past week posting about the Southport attacks and subsequent riots. X’s “ads revenue sharing” means that blue-tick users can earn a share of revenue from the ads in their replies.

Users with fewer than half a million followers who have generated income in this way estimate that accounts can make $10-20 per million views or impressions on X. Some of the accounts sharing disinformation are racking up more than a million impressions on almost every post, and are posting several times a day.
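To put those figures in perspective, here is a rough back-of-the-envelope sketch in Python. The $10-20 per million impressions range comes from the estimates quoted above; the post count and impression figures are illustrative assumptions, not data reported in the article.

```python
# Back-of-the-envelope estimate of X ad-revenue-sharing income,
# using the $10-20 per million impressions range cited above.
# Post volume and impression counts below are illustrative assumptions.

LOW_RATE = 10.0    # USD per million impressions (lower bound cited)
HIGH_RATE = 20.0   # USD per million impressions (upper bound cited)

def daily_earnings(posts_per_day: int, impressions_per_post: int) -> tuple[float, float]:
    """Return (low, high) USD estimates for one day of posting."""
    millions = posts_per_day * impressions_per_post / 1_000_000
    return millions * LOW_RATE, millions * HIGH_RATE

# Assumption: 5 posts a day, each clearing a million impressions -
# roughly matching "more than a million impressions on almost every
# post" and "posting several times a day".
low, high = daily_earnings(posts_per_day=5, impressions_per_post=1_000_000)
print(f"~${low:.0f}-{high:.0f} per day, ~${low * 30:.0f}-{high * 30:.0f} per month")
# ~$50-100 per day, ~$1500-3000 per month
```

On these assumptions, a single account posting at that rate would earn on the order of a few thousand dollars a month - modest per account, but it scales with the number of accounts and with every extra viral post.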

Other social media companies also allow users to make money from views. But YouTube, TikTok, Instagram and Facebook have previously de-monetised or suspended some profiles posting content that breaks their guidelines on misinformation. Apart from rules against faked AI content, X does not have guidelines on misinformation.

  • tardigrada@beehaw.orgOP

There is a good series of research articles on mis/disinformation. Its content isn’t completely new and it focuses on the U.S., but it applies to other regions too, imho.

    Reckoning with Mis/Disinformation in 2024: A four-part series

[…] Political rhetoric is again ramping up and new technologies are casting a shadow over public deliberation […] In this four-part series, members of our research network reflect on their influential work on mis- and disinformation in the context of today’s challenges, laying out some essential questions and insights to bolster public discourse in fraught times.