Pentagon wants better AI tools to enhance its online fakes – report

The DoD’s Special Operations Command is seeking advanced technology to deploy falsified human behavior, according to The Intercept

The Pentagon’s Joint Special Operations Command (JSOC) wants better tools that can fabricate a living person’s online footprint using advanced generative technologies, The Intercept reported on Thursday, citing a procurement document.

The unclassified acquisitions wishlist expresses interest in producing fake imagery, including of individuals with different facial expressions, virtual environments, and “selfie videos” that can withstand scrutiny by social media algorithms and real people. The solutions should also provide region-specific audio layers for the simulated footage.

The Pentagon’s use of fake online personas, or “sock puppets,” dates back more than a decade. Such digital constructs are used to spread American propaganda, shape or falsify public opinion, and collect intelligence, according to media reports.

Earlier this year, Reuters uncovered a US military operation to undermine public trust in a Chinese vaccine against Covid-19 in the Philippines, a country Washington wants to keep in its orbit while curbing Beijing’s regional influence.

In 2022, the Pentagon ordered a review of its psychological warfare operations after the social media giants Facebook and Twitter (now X) reported detecting and banning dozens of bots operated by US Central Command.

The US government has regularly accused its geopolitical rivals, including China, Russia and Iran, of conducting “malign influence operations” online using AI-generated content. Among other forms of meddling, foreign governments have allegedly been attempting to influence elections in the US.

The purported methods resemble what The New York Times described in June when it uncovered an Israeli influence operation targeting Americans. The campaign, sponsored by Israel’s Ministry of Diaspora Affairs, used AI-generated content to boost narratives favorable to the close US ally, the report claimed.

Daniel Byman, a professor of security studies at Georgetown University, commented on the disparity between US denunciations of its adversaries’ methods and its apparent intention to use the same tactics in its own offensive operations.

The US “has a strong interest in the public believing that the government consistently puts out truthful (to the best of knowledge) information and is not deliberately deceiving people,” he said. “So, there is a legitimate concern that the U.S. will be seen as hypocritical.”
