In plain sight: Avoiding loss of visibility and security in virtualised environments

21 Jan 16

‘Check your blind spot’ is a message most of us remember from our driving instructor. The hazard you can see is one problem; the hazard you did not even know was there is potentially a much bigger one. 

The same principles apply to enterprise data security, where dealing with network blind spots is an emerging challenge for many organisations.

The volume and variety of data carried by a typical enterprise network is growing all the time, creating a complex, changeable and noisy environment that makes analysing security and performance increasingly difficult and yet more critical than ever.

Looking into blind spots

To counter these blind spots, many organisations are using Network Packet Brokers (NPBs) as a core element of their network visibility environments to receive packet-level data from their virtualised and physical networks. 

It is the job of the NPB to sit between the network taps and the organisation's security and performance monitoring solutions, aggregating and filtering all data packets and feeding them to the security and monitoring tools. This enables the tools to analyse the information and detect any potential security or performance issues.

Intelligent NPBs perform a range of packet operations to preprocess data packets they pass on to monitoring tools, such as data deduplication and packet trimming, that are intended to reduce total solution cost by improving tool efficiency.
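To make these preprocessing steps concrete, here is a minimal sketch of hash-based deduplication and payload trimming. It is purely illustrative – a real NPB performs these operations in hardware at line rate, and all class and function names below are hypothetical:

```python
import hashlib
from collections import OrderedDict

class PacketDeduplicator:
    """Illustrative hash-based deduplication: a packet whose bytes
    match one seen within the recent window is dropped before it
    reaches the monitoring tools."""

    def __init__(self, window_size=1024):
        self.window_size = window_size   # how many recent hashes to remember
        self.recent = OrderedDict()      # insertion-ordered hash window

    def is_duplicate(self, packet: bytes) -> bool:
        digest = hashlib.sha256(packet).digest()
        if digest in self.recent:
            return True
        self.recent[digest] = True
        if len(self.recent) > self.window_size:
            self.recent.popitem(last=False)  # evict the oldest hash
        return False

def trim_payload(packet: bytes, header_len: int = 64) -> bytes:
    """Packet trimming: keep only the first header_len bytes, since
    many tools need headers rather than full payloads."""
    return packet[:header_len]

# A stream containing one duplicated packet: only unique packets pass.
dedup = PacketDeduplicator()
stream = [b"pkt-A", b"pkt-B", b"pkt-A", b"pkt-C"]
forwarded = [p for p in stream if not dedup.is_duplicate(p)]
# forwarded == [b"pkt-A", b"pkt-B", b"pkt-C"]
```

Both operations reduce the volume of data the downstream tools must inspect – which is exactly why any *unintended* drops on top of them are so hard to spot.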

An effective NPB intelligently processes all data packets – without losing any. At least, that’s the theory. It turns out that one of the most hazardous blind spots facing IT and infosecurity teams today can actually be caused by an intelligent NPB that is intended to improve visibility! 

The first issue is that some NPBs can drop data packets while aggregating and deduping them. Consequently, the security and monitoring tools do not just receive filtered, streamlined data – they receive incomplete data.

Packet losses of up to 30% are not unusual with some NPB solutions under typical operating conditions. Any packet loss in the NPB directly and dramatically reduces the effectiveness of security tools. 

For example, if a hacker uses packet fragmentation to split an exploit across multiple packets, then an Intrusion Detection System (IDS) will likely be unable to detect the attack if any of the packets involved are lost.
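A toy illustration of why this matters, assuming a simple signature-matching IDS (the function name and signature string are hypothetical):

```python
# Hypothetical exploit signature, split by the attacker across fragments.
SIGNATURE = b"EVIL_EXPLOIT"

def ids_detects(fragments) -> bool:
    """Simplified IDS check: reassemble whatever fragments arrived
    and scan the result for the known signature."""
    return SIGNATURE in b"".join(fragments)

fragments = [b"...EVIL_", b"EXPLOIT..."]  # exploit spans two packets

print(ids_detects(fragments))       # all fragments arrive: detected
print(ids_detects(fragments[1:]))   # first fragment dropped: missed
```

With every fragment delivered, the reassembled stream contains the signature; drop just one fragment in the NPB and the same IDS sees nothing suspicious.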

The second issue is detecting this data loss. Because an intelligent NPB's function is to reduce the load on security and monitoring tools, it will normally discard extraneous packets during proper operation. That makes it practically impossible to tell, just by examining the counters on the NPB, when it is also dropping critical packets.

Live networks are constantly changing, so it is impossible to know what the packet counts should be at any point in time. The only way to determine whether an NPB is prone to dropping critical traffic in your deployment is to evaluate it with a controlled load before making a purchasing decision and placing the unit in service on your live network.
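Such a controlled-load evaluation boils down to comparing what the test generator sent against what the tool-side port actually received, with filtering and deduplication disabled. A minimal sketch, with illustrative numbers:

```python
def packet_loss_rate(sent: int, received: int) -> float:
    """Fraction of packets lost between the test generator and the
    tool side of the device under test, for a controlled load with
    no intentional filtering or deduplication configured."""
    if sent == 0:
        raise ValueError("no packets sent")
    return (sent - received) / sent

# e.g. 1,000,000 packets generated, 970,000 observed at the tool port
loss = packet_loss_rate(1_000_000, 970_000)
# loss == 0.03 -> 3% of packets silently dropped by the device under test
```

Repeating the measurement at several load levels (and with the intelligent features enabled, using known duplicate counts) reveals whether the device holds up under the conditions it will face in production.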

So some NPBs potentially cause not only data blind spots, but blind spots you do not even know exist. The implications of these blind spots for security and your company are business critical.

You cannot secure what you cannot see

If network visibility information is missing, then it does not matter how effective your organisation's security tools are; they will always miss security events that they cannot see.

So the blind spot could be concealing a network intrusion attempt, signs of abnormal bot traffic, or data exfiltration following a successful exploit.  The longer it takes to identify abnormal network activity, such as a hacker infiltrating your network, the more information that hacker will steal.

It also creates a compliance issue. Organisations subject to data security standards like PCI DSS or government regulations like HIPAA could be vulnerable to compliance violations by failing to monitor 100% of their data traffic.

Such violations can be costly in terms of reputational damage as well as government fines. Even if sensitive data is never actually compromised, the inability to demonstrate comprehensive data monitoring is enough to fall short of many regulations.

Incomplete data monitoring also means inadequate understanding of traffic volumes, and therefore an inability to predict when systems may be about to fail – none of which helps IT and security teams to gain control of their networks. 

Ensuring total visibility

So what is the solution? Not all NPBs are created equal. NPB solutions remain the most intelligent and effective way of gaining visibility across network environments, ensuring that security and performance monitoring tools have efficient access to 100% of enterprise information – provided that they do not drop data while aggregating packets.

As such, when looking to implement a new NPB, it is critical to ask serious questions of the chosen vendor, such as:

  • How does your solution trim data packets and eliminate duplicate ones?
  • How does it carry out these functions without introducing any additional packet loss?  
  • How does it perform under varying network loads? 

Savvy decision makers will not rely on vendor claims, but will be sure to test the solutions using known loads prior to making a purchase decision.

Clarity on these issues will help to guarantee the selection of a truly efficient NPB solution: one that ensures network blind spots are eliminated, and potential threats can be seen – and secured.

Article by Glenn Chagnot, Ixia senior director product management
