
What connects a father from Lahore, Pakistan, an amateur hockey player from Nova Scotia, and a guy named Kevin from Houston, Texas?
They are all linked to the website Channel3Now, which published a false name for the 17-year-old charged over the Southport attack, in an article that was widely quoted in posts on X. Channel3Now also wrongly suggested the attacker was an asylum seeker who had arrived in the UK by boat last year.
This, combined with false claims from other sources that the attacker was Muslim, has been widely blamed for contributing to the riots across Britain, some of which have targeted mosques and Muslim communities.
I tracked down several people associated with Channel3Now, spoke to their friends and colleagues to confirm they are who they appear to be, and put questions to someone claiming to be the site’s “administrator”.
What I found appears to be a commercial operation attempting to aggregate crime news while making money on social media. I found no evidence to support claims that Channel3Now’s misinformation is linked to the Russian government.
A person identifying themselves as an executive at Channel3Now told me that publishing the false name “should never have happened, but it was unintentional and a mistake”.
The false article was unsigned and it is unclear exactly who wrote it.
———
James, an amateur hockey player from Nova Scotia, is the first person I could track down with a connection to Channel3Now: his name appears in a rare byline on another article on the site, and an associated LinkedIn page features his photo.
The Facebook account linked to James has only four friends, one of whom is named Farhan, whose Facebook profile says he is a journalist for the site.
I messaged dozens of their friends and followers. Social media accounts belonging to the school where James played hockey, and to one of his friends, confirmed that he is a real person who graduated four years ago. When I reached out, the friend said James wanted to know “what he was involved in” in the article. When I replied, the friend did not deny that James was associated with the site, but then stopped responding.
Several of Farhan’s former colleagues in Pakistan confirmed his identity. He posts about his Islamic faith and his children on his social media profiles. His name did not appear on the false article.
Shortly after I messaged him, Farhan blocked me on Instagram, but a response eventually arrived from an official Channel3Now email address.

The person who contacted me said his name was Kevin and that he lives in Houston, Texas. He declined to give his surname, so I cannot be sure he really is called Kevin, but he agreed to answer questions by email.
Kevin said he was speaking to me from the site’s “headquarters” in the United States, which is consistent with the timing of posts on several of the site’s social media profiles and of Kevin’s replies to my emails.
He initially introduced himself as the “editor-in-chief” but later told me he was actually the “verification producer”. He refused to reveal the name of the site’s owner, saying the owner was “not only concerned about himself, but about everyone who works for him.”
Kevin claims that “30+ people” work on the site, among them Farhan and James, spread across the US, UK, Pakistan and India, and typically recruited through freelancing websites. He says Farhan specifically was not involved in the false Southport story, for which the site has publicly apologised, blaming its “UK-based team”.
After Channel3Now spread the false claims, old Russian-language videos found on its YouTube channel led to accusations that the site has ties to the Russian government.
Kevin said the site purchased a Russian-language YouTube channel focused on car rallies “many years ago” and later renamed it.
The account hadn’t posted any videos for about six years before it began uploading content related to Pakistan, where Farhan is based and where the site confirms it has writers.
“Just because you bought a YouTube channel from a Russian seller doesn’t mean you have any kind of relationship with them,” Kevin said.
“We are an independent digital news media website covering news from around the world.”
Buying and repurposing a channel that is already monetised on YouTube is a quick way to grow an audience and start earning money from an account right away.
“As many stories as possible”
While no evidence has been found to support these claims about Channel3Now’s ties to Russia, pro-Kremlin Telegram channels have reshared and amplified the site’s false posts, a common tactic.
Kevin said the site is a commercial operation whose revenue comes from “featuring as many stories as possible”. The majority of its stories appear to be accurate, drawn from reliable sources and covering shootings and car accidents in the US. However, the site has also shared further unfounded speculation about the Southport attacker and about the man who attempted to assassinate Donald Trump.
Following the false Southport article and the media coverage of Channel3Now, Kevin said, the site’s YouTube channel and “several Facebook pages” were suspended, but its X account was not. A Facebook page called Daily Felon, which does little but reshare content from the site, also remains live.
Kevin said the social media furore over the Southport suspect and the ensuing violence could not be blamed solely on the “mistake” of a “small Twitter account”.
To an extent, he is right: Channel3Now’s erroneous article was cited by many social media accounts, spreading the false accusation.
Some of these accounts are based in the UK and the US and have a track record of posting misinformation about topics like the pandemic, vaccines and climate change. Changes made by Elon Musk after he bought Twitter have allowed these accounts to amass significant followings and reach a wider audience with their content.

A woman named Bernadette Spofforth has been accused of making the original post containing the Southport attacker’s false name. She denied being the source and said she had seen the name in another post, which has since been deleted.
In a phone interview with the BBC, she said she had been “horrified” by the attack and had quickly deleted the post once she realised it was false. She said she was “not trying to make money” from her account.
“Why would I make up something like that? There is nothing to gain and everything to lose,” she said, condemning the recent violence.
Spofforth had previously posted content questioning lockdowns and net-zero climate measures, and in 2021 Twitter temporarily removed her profile after allegations emerged that she was spreading misinformation about Covid-19 vaccines and the pandemic. She disputed the claims and said she believed Covid-19 was real.
Since Musk’s acquisition, her posts have regularly garnered more than a million views.
The false claims Spofforth posted about the Southport attacker were quickly re-shared and picked up by a loose group of conspiracy influencers and profiles with a history of sharing anti-immigration and far-right ideology.
Many of them have paid for the blue tick, and their posts have become more visible since Musk took over Twitter.
Another change Musk made to X means it can be profitable for conspiracy accounts, and for commercially minded operations like Channel3Now, to spread these ideas.
Millions of views
Some of these profiles, including those posting about the Southport attack and the subsequent riots, have racked up millions of views in the past week. X’s “ad revenue share” scheme means that users who pay for the blue tick can receive a cut of the revenue from ads shown in the replies to their posts.
Estimates shared by users with fewer than 500,000 followers who make money this way suggest accounts can earn anywhere from $1 to $20 per million views, or impressions, on X. Some of the accounts sharing misinformation get more than a million views on almost every post and post several times a day.
Other social media companies besides X also allow users to monetise views, but YouTube, TikTok, Instagram and Facebook have previously demonetised or suspended profiles that post content violating their misinformation guidelines. Beyond its rules against fake AI content, X does not have any specific guidelines on misinformation.
The riots have led some politicians to call for more action from social media companies, but Britain’s recently enacted Online Safety Act does not currently cover disinformation, amid concerns that doing so could limit freedom of expression.
Moreover, as I found by tracking down Channel3Now’s writers, people involved in posting false information are often based overseas, making it very difficult to take action against them.
Instead, the power to deal with this type of content now lies with the social media companies themselves. X did not respond to the BBC’s request for comment.