3. Being overly reliant on automation
The logs and event data generated by security tools and network devices can reveal a great deal about the security status of an organization’s network. Many agencies have implemented tools for collecting and correlating such data from multiple sources in order to better understand the nature of security events on their networks.
Karen Evans, former administrator of the Office of Management and Budget’s Office of E-Government and IT, acknowledged that such automation is critical to incident detection and response, but she said it is a mistake to rely on automated tools exclusively.
As one example of the danger of such over-reliance, Evans cited a situation in which an automated tool keeps fixing the same hole without triggering a follow-up investigation by a human analyst. If a vulnerability that has already been patched is repeatedly fixed, it means the automated tools have missed something on the network that is capable of exploiting the weakness, Evans said.
An intruder with legitimate credentials, for instance, could be moving around inside the network and exploiting the same vulnerability over and over again to exfiltrate data. When an organization trusts automated tools to do too much, it is easy to miss such red flags, she added.
“Don’t over-automate,” Evans said. “You want to push and do automation to maximize the way you use your analysts, but you don’t want to automate to where you don’t involve your analysts at all.”
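One way to keep analysts in the loop, loosely following Evans’ example, is to have a script sweep remediation logs and escalate any vulnerability the tooling has had to fix more than once on the same host, since repeated fixes suggest something on the network keeps reintroducing or exploiting the weakness. The sketch below assumes a simple CSV log with hypothetical "host" and "vuln_id" columns; treat it as an illustration rather than a drop-in tool.

```python
from collections import Counter
import csv

def flag_repeat_fixes(log_path, threshold=2):
    """Return (host, vuln_id) pairs the automated tooling has fixed
    'threshold' or more times, as candidates for analyst follow-up."""
    counts = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            # Hypothetical columns: one row per automated fix event.
            counts[(row["host"], row["vuln_id"])] += 1
    return [key for key, n in counts.items() if n >= threshold]

if __name__ == "__main__":
    for host, vuln in flag_repeat_fixes("remediation_log.csv"):
        print(f"Escalate to an analyst: {vuln} re-fixed on {host}")
```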
4. Neglecting to retain evidence
Touhill said cybersecurity is a risk management issue, and although IT teams are typically under pressure to restore operations as soon as possible, they should resist that urge.
“In rushing to do so, many cyber incident victims — both public- and private-sector — take tactical actions to restore the assets through actions like wipes and reloading,” he said. Unfortunately, those moves can also clear out system logs and other forensic evidence that is essential for responders such as US-CERT and law enforcement officials.
Furthermore, “if you don’t have the forensic info to figure out how the bad guys got in, you may rebuild your network with the same flaws the bad actors exploited to penetrate your network in the first place,” Touhill said.
It is vital, therefore, for public and private entities to have clearly defined procedures for retaining forensic information when an incident occurs. It is equally important to regularly test those procedures through drills and exercises. And agencies should include incident responders such as US-CERT, the Industrial Control Systems Cyber Emergency Response Team and law enforcement in their incident response plans, Touhill said.
Gathering all the evidence needed to determine what happened can take time. Calkin said that in several situations in which MS-ISAC assisted a state or local government in a breach investigation, it took one to two weeks to complete the necessary analysis. Sometimes, organizations need to make a forensic image of infected systems before shutting them down to ensure they capture any memory-resident malware, he added.
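As a rough illustration of that kind of evidence retention, the sketch below copies a host’s logs into a separate evidence store and records SHA-256 hashes before any wipe or reload. The paths are placeholders, and a real procedure would also capture disk and memory images with dedicated forensic tooling before the machine is powered down.

```python
import hashlib
import shutil
from pathlib import Path

# Placeholder paths; a production procedure would write to
# dedicated, write-protected evidence storage.
LOG_DIRS = [Path("/var/log")]                      # assumption: Linux-style host
EVIDENCE_DIR = Path("/mnt/evidence/incident-001")  # hypothetical destination

def preserve_logs():
    """Copy log files to the evidence store and record their hashes."""
    EVIDENCE_DIR.mkdir(parents=True, exist_ok=True)
    manifest = []
    for log_dir in LOG_DIRS:
        for src in log_dir.rglob("*"):
            if not src.is_file():
                continue
            dest = EVIDENCE_DIR / src.relative_to(log_dir.anchor)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # preserves timestamps
            digest = hashlib.sha256(dest.read_bytes()).hexdigest()
            manifest.append(f"{digest}  {src}")
    (EVIDENCE_DIR / "MANIFEST.sha256").write_text("\n".join(manifest) + "\n")

if __name__ == "__main__":
    preserve_logs()
```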
5. Declaring victory prematurely
After US-CERT informed OPM that its network had been breached in March 2014, agency officials spent over two months monitoring the hackers’ movements within its network. In May 2014, after determining what they thought was the full extent of the compromise, OPM officials initiated a coordinated remediation plan, internally dubbed Big Bang, to eradicate the intruders from its network and restore compromised systems.
Confident that the monitoring and subsequent remediation had worked, OPM officials missed the activities of a second hacker, who used an OPM contractor’s credentials to log into the agency’s network and then plant a backdoor around the time officials were monitoring the first hacker. Over the next several months, the second hacker or hackers systematically exfiltrated millions of records containing Social Security numbers and information on background investigations.
Such lapses are not uncommon. A big mistake organizations often make is declaring victory after finding the most obviously involved machines and re-imaging them, said John Pescatore, director of emerging security trends at the SANS Institute.
That approach often causes serious business interruptions because of lost or corrupted data, he added. Moreover, most threats embed themselves in a network in ways designed to survive all but the most thorough eradication plans.
Schmidt agreed that once they have compromised a network, intruders will almost always install multiple backdoors and other mechanisms to stay hidden and defend their presence in the network against eradication efforts. Some will even close or patch the vulnerability they used to gain access to prevent other intruders from finding their way in.
“When cleaning up after an intrusion, assume there is an adversary acting against you and avoid a net reduction of controls or security posture during the process,” Pescatore said. He advised agencies to avoid installing or patching systems in place. Instead, systems should be patched or reinstalled from a known clean or isolated environment, especially when the network’s trustworthiness is not clear.
“Even brief windows of vulnerability can be used against you,” he added.
Schmidt said agencies must make sure they understand the nature of the initial infection so intruders cannot persist through the cleanup.
“Look for backdoors, new accounts, new services, new open ports and other mechanisms intruders use to attempt to survive cleanup operations,” he added. •
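A sketch of one such check appears below: it compares the local accounts and listening TCP ports on a Linux host against a baseline captured before the incident and reports anything new. The baseline file format and path are assumptions, and a real review would also cover services, scheduled tasks and startup entries, as Schmidt’s list suggests.

```python
import json
import subprocess
from pathlib import Path

def current_accounts():
    """Local account names; assumes a Linux host with /etc/passwd."""
    return {line.split(":")[0]
            for line in Path("/etc/passwd").read_text().splitlines() if line}

def current_listening_ports():
    """Listening TCP ports, parsed from 'ss -tlnH' output (assumes iproute2)."""
    out = subprocess.run(["ss", "-tlnH"], capture_output=True,
                         text=True, check=True).stdout
    ports = set()
    for line in out.splitlines():
        if not line.strip():
            continue
        local = line.split()[3]              # e.g. "0.0.0.0:22" or "[::]:80"
        ports.add(int(local.rsplit(":", 1)[1]))
    return ports

def compare_to_baseline(baseline_path="baseline.json"):
    """Report accounts and ports not present in the pre-incident baseline.

    Assumed baseline format: {"accounts": [...], "ports": [...]}.
    """
    baseline = json.loads(Path(baseline_path).read_text())
    new_accounts = current_accounts() - set(baseline["accounts"])
    new_ports = current_listening_ports() - set(baseline["ports"])
    return new_accounts, new_ports

if __name__ == "__main__":
    accounts, ports = compare_to_baseline()
    for name in sorted(accounts):
        print(f"New account since baseline: {name}")
    for port in sorted(ports):
        print(f"New listening port since baseline: {port}")
```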





































































