
    New London native getting traction with report on social media disinformation

    Jack Nassetta, a New London native and graduate of Saint Bernard, poses for a photo in his New London home Saturday, Sept. 22, 2018. (Sean D. Elliot/The Day)

    A New London native and Saint Bernard School alumnus has been getting a lot of attention for a report he recently co-authored that details how "synthetic actors" on Twitter, presumed to be Russian, have sought to influence American public opinion on military strikes in Syria.

    A junior at George Washington University, Jack Nassetta spent his summer as a visiting fellow at the James Martin Center for Nonproliferation Studies in Monterey, Calif.

    The result was the report titled "All the World is Staged: An Analysis of Social Media Influence Operations against U.S. Counterproliferation Efforts in Syria," which he co-authored with another student, Ethan Fecht.

    Since last Monday, Nassetta has published an analysis in The Washington Post, summarized his findings on Sky News and given a lecture at Middlebury College. Nassetta said he also will be sitting down with a U.S. Department of State official next week to talk about the research.

    "It's kind of overwhelming but fantastic at the same time," Nassetta told The Day by phone last Thursday evening, shortly after his flight to Boston landed.

    The report focuses primarily on tweets posted in the aftermath of, and pertaining to, the April 7, 2018, chemical weapons attack in Douma, Syria. Russia's ambassador to the United Nations and its defense ministry spokesperson claimed there was evidence the chemical attack was staged and that the U.K. was involved in its execution, putting Russia at odds with the Trump administration.

    The U.S., the U.K. and France launched retaliatory strikes a week later.

    The report found that on Twitter "synthetic actors generally masquerade as reluctantly disillusioned supporters of President Donald Trump who disagree with his stance on launching strikes." It defines a synthetic actor as one "masquerading under false pretenses in order to accomplish a political end."

    Nassetta found that nearly every account he saw was pretending to be a conservative, because the goal was to undermine Trump's political base.

    "Russia understands that this president does not care what liberal America thinks, so they went right where it hurt him," he said.

    Nassetta and Fecht began their analysis by using the analytics tool DiscoverText to do a raw pull of 31 categories of Twitter data. From a pool of 850,000 tweets, they narrowed their sample to 3,740 tweets posted by 3,081 unique users between March 28 and May 5.

    The tweets were all in reply to Trump and contained one or more of 14 terms, such as "chemical attack fake," "douma staged" or "soros white helmets." Nassetta explained that he and Fecht included the billionaire and political activist George Soros because he is a common target of conspiracy theories.
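
    DiscoverText is a commercial tool and the report does not publish its exact query, but the filtering logic reduces to three tests per tweet. A minimal Python sketch, with hypothetical field names and only three of the 14 terms, might look like this:

        from datetime import datetime

        # Illustrative reconstruction only: the actual pull used DiscoverText,
        # and the field names, handle and keyword list here are hypothetical.
        KEYWORDS = ["chemical attack fake", "douma staged", "soros white helmets"]
        START, END = datetime(2018, 3, 28), datetime(2018, 5, 5)

        def matches(tweet):
            """Keep replies to Trump, inside the study window, containing a term."""
            text = tweet["text"].lower()
            created = datetime.fromisoformat(tweet["created_at"])
            return (
                tweet.get("in_reply_to") == "realDonaldTrump"
                and START <= created <= END
                and any(term in text for term in KEYWORDS)
            )

        raw_tweets = [
            {"text": "The Douma staged narrative proves it was fake",
             "created_at": "2018-04-10T12:00:00", "in_reply_to": "realDonaldTrump"},
            {"text": "Great rally tonight!",
             "created_at": "2018-04-10T13:00:00", "in_reply_to": "realDonaldTrump"},
        ]
        print([t["text"] for t in raw_tweets if matches(t)])  # only the first passes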

    The authors then used Tableau software to create a timeline of account creation dates.
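
    Tableau is a point-and-click tool, so there is no script to quote; the underlying step is simply bucketing creation dates and counting. A minimal stand-in using Python's standard library, with invented dates, could look like this:

        from collections import Counter
        from datetime import date

        # Hypothetical stand-in for the Tableau step: bucket account creation
        # dates by ISO week to make a spike after the April 7 attack visible.
        creation_dates = [date(2018, 4, d) for d in (2, 8, 9, 9, 10, 11, 13, 20)]

        by_week = Counter(d.isocalendar()[1] for d in creation_dates)
        for week, n in sorted(by_week.items()):
            print(f"ISO week {week}: {'#' * n}")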

    They found that nearly half of the "counternarrative accounts" created in the week between the Douma attack and the retaliatory strikes from the U.S., U.K. and France were synthetic actors.

    So how did they assess these accounts as synthetic actors?

    Nassetta and Fecht looked for a combination of 13 features. For example, bot accounts — and sometimes trolls, which in this case are undeclared state actors pretending to be an average Westerner — often have seemingly random alphanumeric Twitter handles, like "12wewvT5dWLwtyK."
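
    The report does not publish code for these features, but the handle check is straightforward to approximate. The toy heuristic below, with an invented entropy threshold, flags handles that mix digits and letter cases and look statistically random:

        import math
        from collections import Counter

        # Toy heuristic, not the authors' code: a handle like "12wewvT5dWLwtyK"
        # mixes digits and cases with high character-level entropy, unlike a
        # chosen name. The 3.4-bit threshold is an invented example value.
        def looks_random(handle, min_entropy=3.4):
            counts = Counter(handle)
            entropy = -sum((n / len(handle)) * math.log2(n / len(handle))
                           for n in counts.values())
            mixed = any(c.isdigit() for c in handle) and any(c.isupper() for c in handle)
            return mixed and entropy >= min_entropy

        print(looks_random("12wewvT5dWLwtyK"))  # True
        print(looks_random("PatriotMom76"))     # False: entropy is too low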

    Bots, trolls and cyborgs, accounts that mix automated and human posting, also frequently use stock photos or appropriated images for their profile pictures; the report found photos lifted from a U.K. fashion model and a Russian nature photographer, for example.

    Other signs of fake accounts include inactivity for months between periods of posting dozens or hundreds of times per day, out-of-place political commentary and little to no interaction with genuine users.
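
    A similar toy check, again not drawn from the report itself and with invented thresholds, can flag that burst pattern, long dormancy followed by days of very heavy posting:

        from collections import Counter
        from datetime import date, timedelta

        # Another illustrative heuristic: months of silence followed by
        # bursts of dozens of posts per day.
        def bursty(post_dates, quiet_days=60, burst_posts=50):
            per_day = Counter(post_dates)
            days = sorted(per_day)
            long_gap = any(b - a >= timedelta(days=quiet_days)
                           for a, b in zip(days, days[1:]))
            return long_gap and max(per_day.values()) >= burst_posts

        # Invented history: one post in January, then 80 posts on April 9.
        history = [date(2018, 1, 5)] + [date(2018, 4, 9)] * 80
        print(bursty(history))  # True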

    While fake accounts were "extremely obvious" in 2016, Nassetta said, it now could take him hours just to decide on the validity of one or two accounts. The report acknowledges that "classifying our sample dataset into the synthetic actor typology is an intrinsically subjective task."

    He and Fecht determined the tweets "almost certainly" came from Russia in part because accounts repeatedly tweeted about domestic affairs in Russia when they weren't tweeting about Syria, and in part because of similarities with previous studies on Russian actors.

    The report recommends that Twitter scrutinize accounts created immediately after controversial events, ban scripted bot accounts, and add an optional tier of verification below the blue checkmark that could be provided through a phone number or photo ID.

    It also suggests that Trump reserve contentious statements, like military plans, for more formal channels than Twitter.

    An interest in electoral politics shifts to the theoretical

    A 2016 graduate of Saint Bernard, Nassetta credits teachers such as Frederic Smith and the now-retired Art Lamoreaux with shaping his writing style.

    Nassetta, who is studying political communication, entered college looking to get into electoral politics. He interned with the Hillary Clinton campaign and then in the office of U.S. Sen. Richard Blumenthal.

    But it was through taking a class with Steven Livingston — a professor of media, public affairs and international affairs at George Washington University — that he became more interested "in the theoretical side."

    Nassetta was Livingston's research assistant this past spring, and Livingston introduced him to international security and disinformation analyst Ben Nimmo, who connected him to the Center for Nonproliferation Studies. Nassetta spent about three months in Monterey developing the report.

    After graduating from George Washington, Nassetta hopes to study computational propaganda at the Oxford Internet Institute.

    e.moser@theday.com
