The Forever-expanding Attack Surface
One more example of the real cause of data breaches.
We continue to do it to ourselves. I have been around this space a long time, and if I had a dollar for every time a self-assured software or network engineer told me in no uncertain terms that "I've got this," I could endow a cybersecurity chair at Swarthmore. It turns out that every freaking time, he (almost always a he) didn't really have this at all. And it was always sometime later, well into the future, that we would discover the "oops" that led to a system crash or logic failure.
These days it shows up most frequently in DevOps, server configuration, network design, cloud implementations, or SIEM tuning. Every critical aspect of cybersecurity relies on some form of human decision making and is, as a result, doomed.
Micro-segmentation is a good example. Here we have well-intentioned network engineers and operations staff designing and configuring segmented networks to reduce our attack surfaces, while corporate line-of-business (LoB) owners push for the productivity gains and business goals promised by the adoption of digitization strategies.
It's a confluence of human decision-making fallibility that inflames a highly complex and increasingly combustible problem space. And here's why.
The typical user base and the devices and applications on a corporate network are increasingly geographically dispersed. As we have accommodated mobile and Internet-of-Things (IoT) technologies and adopted Software-as-a-Service (SaaS) applications in multiple public clouds over which we have limited security control, our attack surfaces have become increasingly difficult to protect.
These rapidly expanding and fragmenting attack surfaces have created an array of new paths through which criminals can attack. The threats are increasingly sophisticated, automatically seeking out and exploiting any exposed vulnerability.
It's almost as if we felt that the cybersecurity challenge wasn't big enough, so we decided to throw in cloud services, IoT connections, open APIs, and third-party threats just for laughs. What's next? Oh. Security for wireless sensor networks using identity-based cryptography. What could possibly go wrong?
In most organizations, cybersecurity has become a reactive exercise, as we have thrown in the towel on trying to predict the behavior of the monster we unleashed. In this new, highly dispersed and heavily connected environment, IT is unable to prevent intrusions from moving laterally across the devices and applications connected to and traversing the network, resulting in complete proactive impotence.
Historically, the response from smart network engineering and operations leaders has been to move to micro-segmented networks. The best techniques for doing so were based on IP addresses augmented with VLAN segmentation and VMware NSX segmentation for virtualized workloads. Networks based on Cisco hardware relied on Cisco ACI segmentation using physical switches and VXLANs.
These micro-segmentation techniques enabled access control policies to be defined by workloads, by applications, or by architectural attributes such as the virtual machines (VMs) on which the applications, data, and operating systems resided.
In these segmentation approaches, firewalls were commonly used to separate the network resources for each group. In theory, this approach was meant to prohibit any unauthorized traffic from moving between segments. When an attack breached network security in one area, micro-segmentation would prevent it from spreading laterally to other areas of the network.
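To make the idea concrete, here is a minimal sketch in Python of the policy logic a segmentation firewall enforces. The segment names, subnets, and allowed flows are all invented for illustration; in practice this logic lives in switch ACLs, NSX, or ACI policy, not in application code.

```python
from ipaddress import ip_address, ip_network
from typing import Optional

# Hypothetical segment map: each named segment is a subnet, and traffic
# between segments is denied unless the (source, destination) pair is
# explicitly allowed -- the core idea behind micro-segmentation.
SEGMENTS = {
    "web":      ip_network("10.0.1.0/24"),
    "app":      ip_network("10.0.2.0/24"),
    "database": ip_network("10.0.3.0/24"),
}

ALLOWED_FLOWS = {
    ("web", "app"),       # web tier may call the app tier
    ("app", "database"),  # app tier may call the database tier
    # no ("web", "database") entry: a compromised web host cannot
    # move laterally straight to the database segment
}

def segment_of(addr: str) -> Optional[str]:
    """Return the segment containing this address, if any."""
    ip = ip_address(addr)
    for name, net in SEGMENTS.items():
        if ip in net:
            return name
    return None

def flow_permitted(src: str, dst: str) -> bool:
    """Default-deny check for traffic crossing segment boundaries."""
    s, d = segment_of(src), segment_of(dst)
    if s is None or d is None:
        return False      # unknown endpoints are denied outright
    if s == d:
        return True       # intra-segment traffic is allowed in this sketch
    return (s, d) in ALLOWED_FLOWS

print(flow_permitted("10.0.1.5", "10.0.2.7"))  # True: web -> app
print(flow_permitted("10.0.1.5", "10.0.3.9"))  # False: web -> database blocked
```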
In reality, as is the case with so many of the other well-intentioned paths to the cybersecurity promised land, micro-segmentation isn't the panacea that many sought. It turns out that unless the network infrastructure is designed properly, dividing a complex corporate network into a large number of small segments may limit visibility into threats and attack mitigation activities across the entire network.
There are three primary problems with the segmentation techniques currently in vogue:
1. Access control for internal network segments is designed from the architecture up, a tactical approach that cannot easily adapt to changing business needs.
2. The trust valuations on which access policies are based tend to be static and become quickly outdated.
3. Access control policies cannot be effectively enforced, because advanced (Layer 7) security components are missing between the data center and the network edge, and the components that do exist cannot be seen and controlled efficiently.
These all stem from the fact that IT network engineering and operations folks plan the segmentation architecture without adequate attention to cybersecurity best practices and without consulting their counterparts in information security. This is not a criticism of IT network engineering or operations personnel. We never have put, and do not now put, sufficient emphasis on cybersecurity in the design stages of anything we do, whether it be DevOps, network engineering, or cloud migration.
If we want a more risk-wise approach to network segmentation, we need to start at the beginning.
On paper, the design of the corporate network is dictated by the needs of the organization as it evolves. The rules governing who and what can access which network resources are determined by business policies, industry standards, and government regulations. The network operations team should follow these rules when configuring the access control settings that permit users, devices, or applications to access specific network resources.
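As a hedged sketch of that translation step (the roles, resources, and addresses are all invented here), business policy expressed in terms of roles and resources gets compiled down into the network-level allow rules the operations team would actually configure:

```python
# Hypothetical example: role-based business policy compiled into the
# network-level allow rules that operations staff would push out to
# firewalls or ACLs.
BUSINESS_POLICY = [
    {"role": "finance",  "may_access": ["erp", "payroll"]},
    {"role": "engineer", "may_access": ["git", "ci"]},
]

RESOURCE_ADDRS = {                      # assumed resource-to-address mapping
    "erp": "10.0.5.10", "payroll": "10.0.5.11",
    "git": "10.0.6.10", "ci":      "10.0.6.11",
}

def compile_rules(policy):
    """Flatten role-based policy into (role, destination, action) tuples."""
    rules = [(entry["role"], RESOURCE_ADDRS[r], "allow")
             for entry in policy for r in entry["may_access"]]
    rules.append(("*", "*", "deny"))    # default deny closes unforeseen gaps
    return rules

for rule in compile_rules(BUSINESS_POLICY):
    print(rule)
```

The gaps described in the next paragraphs open up precisely where the real network cannot express the business rules this cleanly.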
Network engineering and operations leaders will immediately recognize two downsides to this approach.
First, the business processes, compliance requirements, and network access needs of an organization are vastly more complex than the structure of its network. Consequently, it is very difficult to use the network architecture to define secure segments for network resources that will be simultaneously accessible to all authorized users and applications and completely inaccessible to all others.
In practice, there will be security gaps: access scenarios that the network architects did not envision, which bad actors can take advantage of. With today's advanced, sophisticated malware, they are doing so already.
Second, any process, regulation, or organizational structure is liable to change. So, even if the optimally secure network design were achieved, it would have to be amended. Once again, there are numerous opportunities for security gaps, not to mention the time and cost involved in the reconfiguration, which few networking teams can afford.
To effectively manage cyber-risk, network engineering and operations leaders need current and accurate information on the trustworthiness of users, applications, and network assets. The internal firewalls or other access control mechanisms that enable or prohibit traffic flow between network segments must always work from up-to-date trust data. If trust assessments are out of date, the segmentation technologies become useless at preventing potential threats from moving laterally through the network.
The quality of trust data is becoming a pressing issue in network segmentation security because the actual trustworthiness of network resources can change unexpectedly. Lots of companies have been surprised by attacks from within the ranks of their trusted employees and contractors.
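Here is a minimal sketch of that point, assuming a hypothetical trust store that records when each entity was last assessed; every name and value below is invented. The idea is simply that a stale trust score, however high, should count for nothing:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical trust store: each entity carries a trust score plus the
# time that score was last re-evaluated.
TRUST_STORE = {
    "contractor-laptop-17": {"score": 0.9,
                             "assessed": datetime(2024, 1, 2, tzinfo=timezone.utc)},
    "hr-app-server":        {"score": 0.8,
                             "assessed": datetime.now(timezone.utc)},
}

MAX_AGE   = timedelta(hours=24)   # assumed freshness window
MIN_SCORE = 0.7                   # assumed trust threshold

def is_trusted(entity: str) -> bool:
    record = TRUST_STORE.get(entity)
    if record is None:
        return False              # unknown entities are untrusted
    stale = datetime.now(timezone.utc) - record["assessed"] > MAX_AGE
    # A high score from months ago says nothing about today, so a stale
    # assessment is treated exactly like a failing one.
    return record["score"] >= MIN_SCORE and not stale

print(is_trusted("hr-app-server"))         # True: fresh and above threshold
print(is_trusted("contractor-laptop-17"))  # False: high score, but stale
```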
More than one-third of reported breaches involve internal users, and 29% involve stolen credentials.
Some businesses have responded to these dangers by locking down their networks, trusting no user or application and creating layers of verification before permitting access. This is, of course, not a practical solution: network engineering and operations must protect sensitive assets, but not at the expense of imposing undue burdens on those who legitimately require access to them.
Access control policies cannot work as expected if the network is missing key elements of an effective security infrastructure. Traditional approaches to network segmentation assume that all the necessary network security components are in place to execute whatever access control policies the IT team defines.
However, this assumption is usually wrong.
The network engineering and operations team driving segmentation may decide that some network segments with smaller attack surfaces are adequately protected without Layer 7 advanced enforcement.
Due to lack of budget, or simply because deployment and management require too many resources, network engineering and operations teams seldom deploy next-generation firewalls (NGFWs) and other advanced threat-protection solutions everywhere they are needed (like in every cloud in which they operate, and at every endpoint and IoT device).
The security components that are in place are often not fully functional. Examples abound, but the most common is a network team that intentionally turns off secure sockets layer (SSL)/transport layer security (TLS) inspection in their NGFWs in order to optimize network performance.
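A deliberately simplified toy model of that trade-off (the signature and payloads are invented, and real NGFW inspection is vastly richer than a substring match): with inspection off, the device waves encrypted traffic through on metadata alone, and payload checks never run.

```python
# Toy model only: real NGFW inspection involves decryption, certificate
# handling, and far richer detection than a substring match.
MALWARE_SIGNATURES = [b"EVIL_PAYLOAD"]    # invented signature

def firewall_verdict(payload: bytes, tls_inspection: bool) -> str:
    if not tls_inspection:
        # Encrypted traffic passes on metadata alone: faster, but blind.
        return "allow (payload never inspected)"
    for sig in MALWARE_SIGNATURES:
        if sig in payload:
            return "block (signature match)"
    return "allow (clean)"

print(firewall_verdict(b"...EVIL_PAYLOAD...", tls_inspection=False))  # sails through
print(firewall_verdict(b"...EVIL_PAYLOAD...", tls_inspection=True))   # blocked
```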
We all want fast networks, but opening the door (front or back) to illegitimate traffic negates all other efforts to secure the computing environment and is a really bad idea. Again, this is not a critique of network engineers or operations folks. In almost every case, these folks are simply trying to do the impossible job of serving a multi-headed monster with conflicting requirements under extreme pressure.
The effectiveness of cybersecurity components is reduced if they are not tightly integrated.
Lack of integration has several implications. For example, when one firewall detects a suspicious packet, it can take several hours before the information is picked up by the security team and disseminated to the rest of the network. Additionally, disparate security solutions cannot easily share threat intelligence, whether globally acquired intelligence on known and emerging threats or zero-day intelligence on newly discovered ones.
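For contrast, here is a minimal sketch of what tight integration buys (all names are invented): a shared intelligence bus through which one sensor's detection immediately updates every subscribed enforcement point, instead of waiting hours for manual dissemination.

```python
from typing import Callable, List

class IntelBus:
    """Minimal publish/subscribe channel for threat indicators."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[str], None]] = []

    def subscribe(self, handler: Callable[[str], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, indicator: str) -> None:
        for handler in self._subscribers:   # fan out immediately
            handler(indicator)

class Firewall:
    def __init__(self, name: str, bus: IntelBus) -> None:
        self.name = name
        self.blocklist: set = set()
        bus.subscribe(self.learn)

    def learn(self, indicator: str) -> None:
        self.blocklist.add(indicator)
        print(f"{self.name}: now blocking {indicator}")

bus = IntelBus()
edge = Firewall("edge-fw", bus)
dc   = Firewall("dc-fw", bus)
bus.publish("203.0.113.66")   # one detection propagates to every firewall
```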
It is the principal reason that the mean time to identify a breach remains high, at 197 days.
These conditions leave network engineering and operations leaders who believe their segmented networks are well protected with a false sense of security. An ongoing end-to-end security assessment would tell them how their security platform is performing and whether their access control policies are achieving their business intent. But without breadth of security and end-to-end visibility, a reliable assessment is simply not possible, and that prevents network engineering and operations leaders from reporting accurately on their company's cybersecurity posture.
As we've acknowledged, networks in which the segmentation architecture constrains business intent do not support progress toward organizational goals, but if performance priorities supersede security concerns, segmentation will likely result in reactive and ineffective threat mitigation.
Without adequate visibility into the exposures caused by network segmentation, well-intentioned network engineering leaders may be adding fuel to an already combustible attack surface. We are all humans, and humans make mistakes. Especially under pressure. And in today's cybersecurity world, we operate under constant and unreasonable pressure.
Having said all of that, no amount of warning or lengthy argument in favor of cautious planning and testing is ever going to stop or slow this train. We see it every day.
So, as I always try to offer some positive steps we might take to avoid a train wreck, my advice with network segmentation is this: accept the inherent dangers in the design and implementation processes, but invest in a third-party audit so that another set of eyes can review and certify that the work was done correctly and makes technical sense. An external audit of your segmentation design and a related risk assessment might be the best money you can spend.
(Full disclosure: Blackhawk doesn't do network segmentation design audits or vulnerability assessments. But there are many companies that do. My only dog in this hunt is reduced cyber-risk. Good luck.)