
Censorship-Industrial Complex Enlists U.K. ‘Misinformation’ Group Logically.AI To Meddle In 2024 Election

Brian Murphy, a former FBI agent who once led the intelligence wing of the Department of Homeland Security, reflected last summer on the failures of the Disinformation Governance Board — the panel formed to actively police misinformation. The board, proposed in April 2022 after he left DHS, was shelved by the Biden administration within months amid criticism that it would be an Orwellian, state-sponsored “Ministry of Truth.”

In a July podcast, Murphy said the threat of state-sponsored disinformation meant the executive branch has an “ethical responsibility” to rein in the social media companies. American citizens, he said, must give up “some of your freedoms that you need and deserve so that you get security back.”

The legal problems and public backlash to the Disinformation Governance Board also demonstrated to him that “the government has a major role to play, but they cannot be out in front.”

Murphy, who made headlines late in the Trump administration for improperly building dossiers on journalists, has spent the last few years trying to help the government find ways to suppress and censor speech it doesn’t like without being so “out in front” that it runs afoul of the Constitution. He has proposed that law enforcement and intelligence agencies formalize the process of sharing tips with private sector actors — a “hybrid constellation” including the press, academia, researchers, nonpartisan organizations, and social media companies — to dismantle “misinformation” campaigns before they take hold.

More recently, Murphy has worked to make his vision of countering misinformation a reality by joining a United Kingdom-based tech firm, Logically.AI, whose eponymous product identifies and removes content from social media. Since joining the firm, Murphy has met with military and other government officials in the U.S., many of whom have gone on to contract or pilot Logically’s platform.

Logically says it uses artificial intelligence to keep tabs on over 1 million conversations. It also maintains a public-facing editorial team that produces viral content and liaises with the traditional news media. It differs from other players in this industry by actively deploying what it calls “countermeasures” to dispute or remove problematic content from social media platforms.
 
The business is even experimenting with natural language models, according to one corporate disclosure, “to generate effective counter speech outputs that can be leveraged to deliver novel solutions for content moderation and fact-checking.” In other words, artificial intelligence-powered bots that produce, in real time, original arguments to dispute content labeled as misinformation.

In many respects, Logically is fulfilling the role Murphy has articulated for a vast public-private partnership to shape social media content decisions. The firm has already become a key player in a much larger movement that seeks to clamp down on what the government and others deem misinformation or disinformation. A raft of developing evidence — including the “Twitter Files,” the Moderna Reports, the proposed Disinformation Governance Board, and other reports — has shown how governments and industry are determined to monitor, delegitimize, and sometimes censor protected speech. The story of Logically.AI illustrates how sophisticated, and how global, this effort has become. The use of its technology in Britain and Canada raises red flags as it seeks a stronger foothold in the United States.

Logically was founded in 2017 by a then-22-year-old British entrepreneur named Lyric Jain, who was inspired to form the company to combat what he believed were the lies that pushed the U.K. into voting in favor of Brexit, or leaving the European Union. The once-minor startup now has broad contracts across Europe and India, and has worked closely with Microsoft, Google, PwC, TikTok, and other major firms. Meta contracts with Logically to help the company fact-check content on all of its platforms: WhatsApp, Instagram, and Facebook.

The close ties to Silicon Valley provide unusual reach. “When Logically rates a piece of content as false, Facebook will significantly reduce its distribution so that fewer people see it, apply a warning label to let people know that the content has been rated false, and notify people who try to share it,” Meta and Logically announced in a 2021 press release on the partnership.

Meta and Logically did not respond to repeated requests for comment.

During the 2021 local elections in the U.K., Logically monitored up to “one million pieces of harmful content,” some of which it relayed to government officials, according to a document reviewed by RealClearInvestigations. The firm claimed to spot coordinated activity to manipulate narratives around the election, information it reported to tech giants for takedowns.

The following year, the state of Oregon negotiated with Logically for a wide-ranging effort to monitor campaign-related content during the 2022 midterm elections. In a redacted proposal for the project, Logically noted that it would check claims against its “single source of truth database,” which relied on government data, and would also crack down on “malinformation” — a term of art that refers to accurate information that fuels dangerous narratives. The firm similarly sold Oregon on its ability to pressure social media platforms for content removal.

Oregon state Rep. Ed Diehl has led a push to stop the state from renewing its work with Logically for the election this year. The company, he said in an interview, violates “our constitutional rights to free speech and privacy” by “flagging true information as false, claiming legitimate dissent is a threat, and then promoting ‘counter-narratives’ against valid forms of public debate.”

In response, the Oregon secretary of state’s office, which initiated the contract with Logically, claimed it has “no authority, ability, or desire to censor speech.” Diehl disputes this. He pointed out that the original proposal with Logically clearly states that its service “enables the opportunity for unlimited takedown attempts” of alleged misinformation content and the ability for the Oregon secretary of state’s office to “flag for removal” any “problematic narratives and content.” The contract document touts Logically’s standing as a “trusted entity within the social media community,” which it says gives it “preferred status that enables us to support our client’s needs at a moment’s notice.”

Diehl, who shared a copy of the Logically contract with RCI, called the issue a vital “civil rights” fight, and noted that in an ironic twist, the state’s anti-misinformation speech suppression work further inflames distrust in “election systems and government institutions in general.”

Logically’s reach into the U.S. market is quickly growing. The company has piloted programs for the Chicago Police Department to use artificial intelligence to analyze local rap music and deploy predictions on violence in the community, according to a confidential proposal obtained by RCI. Pentagon records show that the firm is a subcontractor to a program run by the U.S. Army’s elite Special Operations Command for work conducted in 2022 and 2023. Via funding from DHS, Logically also conducts research on gamer culture and radicalization.

The company has claimed in its ethics statements that it will not employ any person who holds “a salaried or prominent position” in government. But records show closely entrenched state influence. For instance, Kevin Gross, a director of the U.S. Navy NAVAIR division, was previously embedded within Logically’s team during a 2022 fellowship program. The exchange program supported Logically’s efforts to assist NATO on the analysis of Russian social media.

Other contracts in the U.S. may be shrouded in secrecy. Logically partners with ThunderCat Technologies, a contracting firm that assists tech companies when competing for government work. Such arrangements have helped tech giants conceal secretive work in the past. Google previously attempted to hide its artificial intelligence drone-targeting contracts with the Defense Department through a similar third-party contracting vendor.

But questions swirl over the methods and reach of the firm as it entrenches itself into American life, especially as Logically angles to play a prominent role in the 2024 presidential election. 

Pandemic Policing

In March 2020, as Britain confronted the spread of Covid-19, the government convened a new task force, the Counter Disinformation Unit (CDU). The secretive task force was created with little fanfare but was advertised as a public health measure to protect against dangerous misinformation. Caroline Dinenage, the member of Parliament overseeing media issues, later explained that the unit’s purpose was to provide authoritative sources of information and to “take action to remove misinformation” relating to “misleading narratives related to COVID-19.”

The CDU, it later emerged, had largely outsourced its work to private contractors such as Logically. In January 2021, the company received its first contract from the agency overseeing the CDU, for £400,000, to monitor “potentially harmful disinformation online.” The contracts later swelled, with the U.K. agency overseeing media issues eventually providing contracts with a combined value of £1.2 million and the Department of Health providing another £1.3 million, for a total of roughly $3.2 million.

That money went into far-reaching surveillance that monitored journalists, activists, and lawmakers who criticized pandemic policies. Logically, according to an investigation last year in the Telegraph, recorded comments from activist Silkie Carlo criticizing vaccine passports in its “Mis/Disinformation” reports.

Logically’s reports similarly collected information on Dr. Alexandre de Figueiredo, a research fellow at the London School of Hygiene and Tropical Medicine. Figueiredo had published research on how vaccine passports could undermine vaccine confidence and had publicly criticized policies aimed at the mass vaccination of children. Despite his expertise, his tweet was included in a disinformation report Logically filed with the government. While some of the reports were categorized as evidence of terms of service violations, many were, in fact, routine forms of dissent aired by prominent voices in the U.K. on policies hotly contested by expert opinion.

The documents showing Logically’s role were later uncovered by Carlo’s watchdog group, Big Brother Watch, which produced a detailed report on the surveillance effort. The CDU reports targeted a former judge who argued against coercive lockdowns as a violation of civil liberties and journalists criticizing government corruption. Some of the surveillance documents suggest a mission creep for the unit, as media monitoring emails show that the agency targeted anti-war groups that were vocal against NATO’s policies.

Carlo was surprised to even find her name on posts closely monitored and flagged by Logically. “We found that the company exploits millions of online posts to monitor, record and flag online political dissent to the central government under the banner of countering ‘disinformation,’” she noted in a statement to RCI.

Marketing materials published by Logically suggest its view of Covid-19 went well beyond fact-checking and veered into suppressing dissenting opinions. A case study published by the firm claimed that the #KBF hashtag, referring to Keep Britain Free, an activist group against school and business shutdowns, was a dangerous “anti-vax” narrative. The case study also claimed the suggestion that “the virus was created in a Chinese laboratory” was one of the “conspiracy theories” that “have received government support” in the U.S. — despite the fact that a preponderance of evidence now points to a likely lab leak from the Wuhan Institute of Virology as the origin of the pandemic.

Logically was also involved in pandemic work that blurred the lines of traditional fact-checking operations. In India, the firm helped actively persuade patients to take the vaccine. In 2021, Jain, the founder and CEO of the company, said in an interview with an Indian news outlet that his company worked “closely with communities that are today vaccine hesitant.” The company, he said, recruited “advocates and evangelists” to shape local opinion.

Questionable Fact-Checking

In 2022, Logically used its technology on behalf of Canadian law enforcement to target the trucker-led “Freedom Convoy” against Covid-19 mandates, according to government records. Logically’s team floated theories that the truckers were “likely influenced by foreign adversaries,” a widely repeated claim used to denigrate the protests as inauthentic.

The push to discredit the Canadian protests showed the overlapping power of Logically’s multiple arms. While its social media surveillance wing fed reports to the Canadian government, its editorial team worked to influence opinion through the news media. When the Financial Times reported on the protest phenomenon, the outlet quoted Murphy, the former FBI man who now works for Logically, who asserted that the truckers were influenced by coordinated “conspiracy theorist groups” in the U.S. and Canada. Vice similarly quoted Joe Ondrak, Logically’s head of investigations, to report that the “Freedom Convoy” had generated excitement among global conspiracy theorists. Neither outlet disclosed Logically’s work for Canadian law enforcement at the time.

Other targets of Logically are quick to point out that the firm has taken liberties with what it classifies as misinformation.

Will Jones, the editor of the Daily Sceptic, a British news outlet with a libertarian bent, has detailed an unusual fact-check from Logically Facts, the company’s editorial site. Jones said the site targeted him for pointing out that data in 2022 showed 71 percent of patients hospitalized for Covid-19 were vaccinated. Logically’s fact-check acknowledged Jones had accurately used statistics from the U.K. Health Security Agency, but tried to undermine him by asserting that he was still misleading by suggesting that “vaccines are ineffective.”

But Jones, in a reply, noted that he never made that argument and that Logically was batting away at a straw man. In fact, his original piece plainly took issue with a Guardian article that incorrectly claimed that “COVID-19 has largely become a disease of the unvaccinated.”

Other Logically fact-checks have bizarrely targeted the Daily Sceptic for reporting on news in January 2022 that vaccine mandates might soon be lifted. The site dinged the Daily Sceptic for challenging the evidence behind the vaccine policy and declared, “COVID-19 vaccines have been proven effective in fighting the pandemic.” And yet, at the end of that month, the mandate was lifted for health care workers, and the following month, all other pandemic restrictions were revoked, just as the Daily Sceptic had reported.

“As far as I can work out, it’s a grift,” said Daily Sceptic founder Toby Young, of Logically. “A group of shysters offer to help the government censor any criticism of its policies under the pretense that they’re not silencing dissent — God forbid! — but merely ‘cleansing’ social media of misinformation, disinformation and hate speech.”

Jones was similarly dismissive of the company, which he said disputes anything that runs contrary to popular consensus. “The consensus of course is that set by the people who pay Logically for their services,” Jones added. “The company claims to protect democratic debate by providing access to ‘reliable information,’ but in reality, it is paid to bark and savage on command whenever genuine free speech makes an inconvenient appearance.”

In some cases, Logically has piled on to news stories to help discredit voices of dissent. Last September, the anti-misinformation site leaped into action after British news outlets published reports about sexual misconduct allegations surrounding comedian and online broadcaster Russell Brand — an outspoken critic of government policy in Britain who has been compared to Joe Rogan for his heterodox views and large audience.

Brand, a vocal opponent of pandemic policies, had been targeted by Logically in the past for airing opinions critical of the U.S. and U.K. response to the virus outbreak, and in other moments for criticizing new laws in the European Union that compel social media platforms to take down content.

But the site took dramatic action when the sexual allegations, none of which have been proved in court, were published in the media. Ondrak, Logically’s investigations head, provided different quotes to nearly half a dozen news outlets — including Vice, Wired, the BBC, and two separate articles in The Times — that depicted Brand as a dangerous purveyor of misinformation who had finally been held to account.

“He follows a lot of the ostensibly healthy yoga retreat, kind of left-leaning, anti-capitalist figures, who got really suckered into Covid skepticism, Covid denialism, and anti-vax, and then spat out of the Great Reset at the other end,” Ondrak told Wired. In one of the articles published by The Times, Ondrak aired frustration over the obstacles to demonetizing Brand on the Rumble streaming network. In an interview with the BBC, Ondrak gave a curious condemnation, noting Brand stops short of airing any actual conspiracy theories or falsehoods but is guilty of giving audiences “the ingredients to make the disinformation themselves.”

Dinenage, the member of Parliament who spearheaded the CDU anti-misinformation push with Logically during the pandemic, also leapt into action. In the immediate aftermath of the scandal, she sent nearly identical letters to Rumble, TikTok, and Meta to demand that the platforms follow YouTube’s lead in demonetizing Brand. Dinenage couched her official request to censor Brand as a part of a public interest inquiry, to protect the “welfare of victims of inappropriate and potentially illegal behaviour.”

Logically’s editorial team went a step further. In its report on the Brand allegations published on Logically Facts, it claimed that social media accounts “trotting out the ‘innocent until proven guilty’ refrain” for the comedian were among those perpetuating “common myths about sexual assault.” The site published a follow-up video reiterating the claim that those seeking the presumption of innocence for Brand, a principle dating back to the Magna Carta, were spreading a dangerous “myth.”

The unusual advocacy campaign against Brand represented a typical approach for a company that has long touted itself as a hammer against spreaders of misinformation. The opportunity to remove Brand from the media ecosystem meant throwing as much at him as possible, despite the absence of any clear misinformation or disinformation angle in the sexual assault allegations. Rather, he was a leading critic of government censorship and pandemic policy, so the scandal represented a weakness to be exploited.

Such heavy-handed tactics may be on the horizon for American voters. The firm is now a member of the U.S. Election Infrastructure Information Sharing & Analysis Center, the group managed by the Center for Internet Security that helps facilitate misinformation reports on behalf of election officials across the country. Logically has been in talks with Oregon and other states, as well as DHS, to expand its social media surveillance role for the presidential election later this year.

Previous targets of the company, though, are issuing a warning. 

“It appears that Logically’s lucrative and frankly sinister business effectively produced multi-million pound misinformation for the government that may have played a role in the censorship of citizens’ lawful speech,” said Carlo of Big Brother Watch.

“Politicians and senior officials happily pay these grifters millions of pounds to wield the red pen, telling themselves that they’re ‘protecting’ democracy rather than undermining it,” said Young of the Daily Sceptic. “It’s a boondoggle and it should be against the law.”

This article was originally published by RealClearInvestigations and LeeFang.com.



