Posted by Curt on 26 January, 2023 at 8:42 am.


By Nick Givas

The government’s campaign to fight “misinformation” has expanded to adapt military-grade artificial intelligence once used to silence the Islamic State (ISIS) to quickly identify and censor American dissent on issues like vaccine safety and election integrity, according to grant documents and cyber experts.
 
The National Science Foundation (NSF) has awarded several million dollars in grants recently to universities and private firms to develop tools eerily similar to those developed in 2011 by the Defense Advanced Research Projects Agency (DARPA) in its Social Media in Strategic Communication (SMISC) program.
 
DARPA said those tools were used “to help identify misinformation or deception campaigns and counter them with truthful information,” beginning with the Arab Spring uprisings in the Middle East that spawned ISIS over a decade ago.
 
The initial idea was to track dissidents who were interested in toppling U.S.-friendly regimes or to follow any potentially radical threats by examining political posts on Big Tech platforms.
 

DARPA set four specific goals for the program:

 

  1. “Detect, classify, measure and track the (a) formation, development and spread of ideas and concepts (memes), and (b) purposeful or deceptive messaging and misinformation.
  2. Recognize persuasion campaign structures and influence operations across social media sites and communities.
  3. Identify participants and intent, and measure effects of persuasion campaigns.
  4. Counter messaging of detected adversary influence operations.”
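
The grant documents describe these goals only in abstract terms. As a purely illustrative sketch, and not DARPA's, NSF's, or any grantee's actual code, the short Python example below shows what the first goal, tracking the formation and spread of an idea, can mean at its most basic: counting which accounts repeat a given phrase and how its mentions grow day by day. The posts, hashtag, and account names are invented for the example.

# Illustrative sketch only: not DARPA/SMISC code. Counts how often a phrase
# (a "meme") is repeated, by whom, and on which days.
from collections import Counter, defaultdict
from datetime import date

# Hypothetical sample posts: (author, date, text)
posts = [
    ("user_a", date(2023, 1, 10), "New report questions the rollout #examplememe"),
    ("user_b", date(2023, 1, 10), "Sharing this #examplememe thread, worth a read"),
    ("user_c", date(2023, 1, 11), "#examplememe is trending again"),
    ("user_a", date(2023, 1, 12), "Follow-up on the #examplememe claims"),
]

MEME = "#examplememe"  # the idea being tracked

# Detect: which posts carry the meme?
carriers = [(author, day) for author, day, text in posts if MEME in text.lower()]

# Measure and track: mentions per day.
daily_volume = Counter(day for _, day in carriers)

# Identify participants: which accounts repeat the meme most often?
per_author = defaultdict(int)
for author, _ in carriers:
    per_author[author] += 1

print("Daily volume:", dict(daily_volume))
print("Top spreaders:", sorted(per_author.items(), key=lambda kv: -kv[1]))

Real influence-detection systems layer network analysis and machine-learned language models on top of this kind of bookkeeping, but the raw material is the same: who said what, when, and how far it spread.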

Mike Benz, executive director of the Foundation for Freedom Online, has compiled a report detailing how this technology is being developed to manipulate the speech of Americans via the NSF and other organizations.

 
“One of the most disturbing aspects of the Convergence Accelerator Track F domestic censorship projects is how similar they are to military-grade social media network censorship and monitoring tools developed by the Pentagon for the counterinsurgency and counterterrorism contexts abroad,” reads the report.
 
“DARPA’s been funding an AI network using the science of social media mapping dating back to at least 2011-2012, during the Arab Spring abroad and during the Occupy Wall Street movement here at home,” Benz told Just The News. “They then bolstered it during the time of ISIS to identify homegrown ISIS threats in 2014-2015.”
 
The new version of this technology, he added, is openly targeting two groups: Those wary of potential adverse effects from the COVID-19 vaccine and those skeptical of recent U.S. election results.
 
“The terrifying thing is, as all of this played out, it was redirected inward during 2016 — domestic populism was treated as a foreign national security threat,” Benz said.
 
“What you’ve seen is a grafting on of these concepts of mis- and disinformation that were escalated to such high intensity levels in the news over the past several years being converted into a tangible, formal government program to fund and accelerate the science of censorship,” he said.
 
“You had this project at the National Science Foundation called the Convergence Accelerator,” Benz recounted, “which was created by the Trump administration to tackle grand challenges like quantum technology. When the Biden administration came to power, they basically took this infrastructure for multidisciplinary science work to converge on a common science problem and took the problem of what people say on social media as being on the level of, say, quantum technology.
 
“And so they created a new track called the track F program … and it’s for ‘trust and authenticity,’ but what that means is, and what it’s a code word for is, if trust in the government or trust in the media cannot be earned, it must be installed. And so they are funding artificial intelligence, censorship capacities, to censor people who distrust government or media.”
 
Benz went on to describe intricate flows of taxpayer cash funding the far-flung, public-private censorship regime. The funds flow from the federal government to universities and NGOs via grant awards to develop censorship technology. The universities or nonprofits then share those tools with news media fact-checkers, who in turn assist private sector tech platforms and tool developers that continue to refine the tools’ capabilities to censor online content.
 
“This is really an embodiment of the whole of society censorship framework that departments like DHS talked about as being their utopian vision for censorship only a few years ago,” Benz said. “We see it now truly in fruition.”
 

 
Members of the media, along with fact-checkers, also serve as arbiters of what is acceptable to post and what isn’t, by selectively flagging content for the social media platforms and issuing complaints against specific narratives.
 
There is a push, said Benz during an appearance on “Just The News No Noise” this week, to fold the media into branches of the federal government in an effort to dissolve the Fourth Estate, in favor of an Orwellian and incestuous partnership to destroy the independence of the press.
 
The advent of COVID led to “normalizing censorship in the name of public health,” Benz recounted, “and then in the run to the 2020 election, all manner of political censorship was shoehorned in as being okay to be targetable using AI because of issues around mail-in ballots and early voting drop boxes and issues around January 6th.
 
“What’s happened now is the government says, ‘Okay, we’ve established this normative foothold in it being okay to [censor political speech], now we’re going to supercharge you guys with all sorts of DARPA military-grade censorship weaponry, so that you can now take what you’ve achieved in the censorship space and scale it to the level of a U.S. counterinsurgency operation.'”
 
One academic institution involved in this tangled web is the University of Wisconsin, which received a $5 million grant in 2022 “for researchers to further develop” its Course Correct program, “a precision tool providing journalists with guidance against misinformation,” according to a press release from the university’s School of Journalism and Mass Communication.

Read more
 
