
                                not be possible or prudent to disclose” those activities due to operational, investigative or other constraints. Even when it takes action against those operations, a federal agency “will not necessarily be the appropriate entity to disclose information publicly” about the activities.
At an event in Washington in August, Adam Hickey, Justice's deputy assistant attorney general, told FCW that the question of when and how to publicly identify and mitigate an ongoing influence campaign touches on "a real sensitivity." Even though many officials believe inaction by the U.S. government might have emboldened Russia during the 2016 elections, agencies must avoid delegitimizing certain points of view or appearing to favor a particular political party when they choose to push back more forcefully against foreign influence.
“I don’t want to suggest the government is likely or largely or frequently going to be weighing in on the truth of a particular argument,” Hickey said. “In fact...there are a lot of reasons why we might not do that, and one of the principal reasons is avoiding even the appearance of partiality. So if you’re talking about misinformation in the context of an election, that’s going to be a situation where we’re particularly cautious.”
By contrast, Hickey said disinformation about the government, such as online or text campaigns that spread false information about voting locations or times, is a situation in which Justice and others might have a higher interest in alerting the public.
In a conversation after his speech, he declined to say what kind of technologies Justice's or other agencies' task forces might be using to track and monitor campaigns in the digital space, saying only that any such tools would need to be lawful and focused on passing relevant evidence gleaned from technical forensics and intelligence sources to social media companies and other online providers.
“Obviously, there has to be a way to lawfully deploy technology to do that,” he said. “I don’t have a clean, easy answer for you [about what that is]. The folks who have an edge here are providers who obviously see what happens on their systems, who have more of a technology base and who are in a position to make decisions about how best to make sure that platforms are used in the way they want them to be used.”
Whether current technology is capable of accurately identifying and separating malicious foreign content online from legitimate speech is an open question. Similar efforts outside government, such as the Hamilton 68 dashboard, purport to identify and publicize Russian influence operations by tracking what that country’s bot networks are amplifying on social media. However, the dashboard has been criticized in the media and even by one of its creators for issuing dubious assertions or misidentifying organic online activity as nefarious and coordinated foreign influence.
More harm than good?
During a hearing of the Senate Select Committee on Intelligence in July, Sen. James Risch (R-Idaho) asked a panel of experts how their methods of tracking foreign influence operations online could surmount the enormous, if not impossible, task of separating malicious activity from American citizens' protected free speech.
In response, Todd Helmus, a senior behavioral scientist at RAND, said: “That’s challenged our bot detectors. There are bot detectors available that can detect some types of content that mimic the characteristics of bots, but it is an arms race as developers develop ways to detect bots based on their inhuman levels of content, timing of tweets or what have you. Producers of those bots will then identify other ways of circumventing that and staying covert.”
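Helmus describes detectors that flag accounts by the "inhuman" volume and timing of their posts. The short Python sketch below illustrates that general idea with two simple heuristics; the thresholds, data layout and function name are illustrative assumptions rather than the method of any detector cited here, and real systems combine many more signals precisely because bot operators adapt to each one.

```python
# Hypothetical sketch of heuristic bot flagging based on posting volume and timing.
# Thresholds and field conventions are illustrative assumptions only.
from statistics import pstdev
from typing import List

def looks_automated(post_times: List[float],
                    max_daily_posts: int = 500,
                    min_interval_stdev: float = 5.0) -> bool:
    """Return True if an account's posting pattern resembles automation.

    post_times: POSIX timestamps of the account's posts, sorted ascending.
    """
    if len(post_times) < 2:
        return False

    # Inhuman volume: far more posts per day than a person plausibly writes.
    span_days = max((post_times[-1] - post_times[0]) / 86400, 1 / 86400)
    if len(post_times) / span_days > max_daily_posts:
        return True

    # Inhuman regularity: intervals between posts are nearly identical,
    # which suggests scheduled, scripted activity.
    intervals = [b - a for a, b in zip(post_times, post_times[1:])]
    return pstdev(intervals) < min_interval_stdev

# An account posting exactly every 60 seconds gets flagged.
print(looks_automated([i * 60.0 for i in range(200)]))  # True
```

As Helmus notes, any single heuristic like this is quickly circumvented, which is why he frames detection as an arms race rather than a solved problem.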
DHS’ 10-member task force is currently looking for partners in the research, policymaking and technology communities. Speaking on background, two DHS officials said the task force wanted to build long-term capability in that space and examine past disinformation campaigns related to incidents such as the poisoning of Russian double agent Sergei Skripal or the mass shooting at Marjory Stoneman Douglas High School in Florida to spot common behaviors or anomalies in online discourse that indicate a coordinated campaign with the intent to spread false information.
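One anomaly the officials allude to, the same message pushed by many accounts in a short window, can be sketched in a few lines. The function below is a hypothetical illustration only; the thresholds, the (account, text, timestamp) layout and the name coordinated_clusters are assumptions, and real coordination analysis relies on far richer signals than verbatim repetition.

```python
# Hypothetical sketch: flag messages posted verbatim by many distinct accounts
# within a narrow time window, one rough signature of coordinated amplification.
from collections import defaultdict
from typing import Iterable, List, Tuple

def coordinated_clusters(posts: Iterable[Tuple[str, str, float]],
                         min_accounts: int = 20,
                         window_seconds: float = 3600.0) -> List[str]:
    """Return messages that many distinct accounts posted within one window.

    posts: (account_id, text, posix_timestamp) tuples.
    """
    by_text = defaultdict(list)  # normalized text -> list of (timestamp, account_id)
    for account, text, ts in posts:
        by_text[text.strip().lower()].append((ts, account))

    flagged = []
    for text, hits in by_text.items():
        hits.sort()
        # Slide a time window over the posts; flag the message if enough
        # distinct accounts pushed it within any single window.
        for i, (start, _) in enumerate(hits):
            accounts = {a for t, a in hits[i:] if t - start <= window_seconds}
            if len(accounts) >= min_accounts:
                flagged.append(text)
                break
    return flagged
```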
Like representatives from other agencies, the DHS officials said they are concerned about whether the government could do more harm than good by taking a more active role in publicly identifying such campaigns.
“We are fairly certain that in a lot of contexts, we are not the right messenger,” one official said. “Hasty policy is rife with unintended consequences, so part of this is mapping out an idealized solution and figuring out a right way to tack a policy solution onto it because it’s really easy to say, ‘Hey, the government’s not doing this. Why isn’t the government doing that?’ But it’s more useful to understand government’s role in context with the other actors in this space.”
Nevertheless, some officials said doing something is better than letting misinformation and disinformation go unchallenged. According to the State Department spokesperson, when the government is faced with a choice between publicly identifying an untrue statement that is harmful to U.S. interests and letting it fester, pushing back against false information should win out.
“If there’s a campaign out there in the Baltics with NATO allies and there’s a narrative — whether foreign-pushed or not — that they’re engaged in horrible behavior and raping and pillaging and we know that’s not true, it’s our job to blunt that narrative,” the spokesperson said.