US companies are unwittingly pushing foreign disinformation, IC warns
Foreign actors are working hard to spread harmful narratives in the run-up to the November elections, even hiring U.S. marketing and communications firms to help, U.S. officials said Monday.
The officials spoke as the Office of the Director of National Intelligence released an advisory titled “100 Days Until Election 2024; Election Security Update as of Late-July 2024.” Officials said malicious foreign actors are using social media and other online platforms to plant and push their narratives, and are purchasing the services of legitimate marketing and communications firms that are often unaware of whom they are really working for.
“The American public should know that content that they read online, especially on social media, could be foreign propaganda, even if it appears to be coming from fellow Americans or originating in the United States,” an ODNI official said on a Monday press call. “In short, foreign influence actors are getting better at hiding their hand and using Americans to do it.”
Key culprits behind these operations are the “Big Three”: Russia, China, and Iran. Intelligence community officials told reporters that Russia has a documented history of building networks within the U.S. and other Western countries to create content across websites that promotes narratives favorable to the Russian government.
Officials noted that Chinese government-backed users have also turned to social media to continue sowing discord among American audiences. Similarly, the ODNI official said that Iranian online actors rely on “vast webs of online personas and propaganda mills to spread disinformation, and have notably been active in exacerbating tensions around the Israel-Gaza conflict.”
Many of these operations lean on commercial contractors to do the work. “These firms essentially offer election influence-in-a-box services,” the ODNI official said.
The origin of these firms varies; the ODNI official noted that Russian cyber actors tend to use Russia-based firms to distribute their content, but foreign actors have also employed services based in Latin America, the Middle East, and other locations.
A key example of this kind of arrangement was the Peace Data operation, a sham news outlet covertly run by Russia’s Internet Research Agency, which contracted freelance writers to produce news articles without their knowing they were working on behalf of the Russian government.
Foreign malign influence operations are incorporating generative artificial intelligence and other technologies, a CISA official told reporters. Other tactics discussed on the Monday call include pushing content through disguised proxy media outlets, cloning the voices of public figures, conducting cyber-enabled information operations, and fabricating data to suggest cybersecurity incidents have occurred.
These tactics are expected to worsen ahead of November’s elections.
“Foreign efforts to influence the election are part of broader influence campaigns focused on a country’s core interests and to undermine the United States’ global role,” the ODNI official said. “Thus, looking ahead, expect these actors will continue to calibrate their efforts to any shift during this election cycle to achieve those core objectives.”
History suggests it takes time for malicious actors to craft a narrative or perspective and deploy it online. The ODNI official said that the IC expects these actors to pay attention to federal announcements, like this very advisory, and adjust their tactics accordingly.
“We view changes to influence themes as more likely than changes to larger strategies or preferences,” the official said.
Halting and prosecuting these operations is not black-and-white. An FBI official on the call said that law enforcement responses to covert operations that disseminate foreign narratives “run the gamut” and are case-specific.
The FBI official said that when an unwitting American entity appears to be involved in one of these schemes, remedial actions are discussed within the intelligence community.
“We work very closely with our interagency colleagues and the DNI to determine whether or not there’s instances where defensively briefing or warning individuals who are unwitting would be appropriate,” the official said. “Similarly, we will obviously look at investigative leads, whether it’s potential violation of federal crime or working on behalf of a foreign adversary.”
With the continued onslaught of foreign-run accounts online and the advent of emerging technologies, the CISA official said the agency is promoting mitigation measures that can keep pace with rapid technical advancement.
“Given the speed at which technology is evolving, where you’ve seen prior guidance of how to look for potential examples of generative AI, synthetic media…I think that’s becoming increasingly difficult to do with confidence and applicability,” the CISA official said. “Instead, the mitigation measures that we’re really encouraging and pushing folks to do is to again communicate early and promote transparency around the elections processes and for entities to ensure that their accounts are not utilized in these types of campaigns, making sure you’re securing your systems, locking down your accounts, using things like multi-factor authentication to prevent your accounts from being utilized for those types of purposes.”