
It’s the complexity, not the size, that makes DDoS effective


DDoS's popularity as an attack method can be explained by how important availability is to most organizations' ability to function.

Availability is as critical to an organization today as electricity. If an organization is taken offline, it can lose the ability to generate revenue from its customers, or the ability to access cloud-based data and applications. And, if publicized, the downtime can damage its reputation and brand.

Arbor Networks' data, gathered from more than 240 service provider deployments, shows that, without question, DDoS attacks are getting bigger. Much bigger. Consider the statistics:

  • The average attack in September was 1.67 Gbps, up 72 percent from September 2011.
  • The number of mid-range attacks, those between 2 and 10 Gbps, has also increased, up 14.35 percent so far in 2012.
  • Very large attacks of more than 10 Gbps were up 90 percent during 2011.
  • The largest attack this year measured 100.84 Gbps.

Hackers seek out pain points for an organization, such as maintaining availability, and look to exploit weaknesses in infrastructure and existing security defenses. From that perspective, DDoS is an ideal tool. There are three main categories of DDoS attack:

Volumetric attacks

These attacks attempt to consume bandwidth, either within the target network or service or between the target and the rest of the internet. They are simply about causing congestion.
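
To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of the kind of rate-based check an operator might run against per-second traffic counters to spot a volumetric flood. The baseline, multiplier, window and names such as BASELINE_BPS are assumptions for illustration, not any vendor's actual detection logic.

```python
# Illustrative rate-based volumetric-flood check.
# All thresholds below are invented for the example.
from collections import deque
import time

BASELINE_BPS = 500_000_000   # assumed "normal" inbound rate: 500 Mbps
ALERT_MULTIPLIER = 4         # flag sustained traffic at 4x the baseline
WINDOW_SECONDS = 10          # length of the sliding window

samples = deque()            # (timestamp, bits_observed) pairs

def record_sample(bits_observed: int, now: float | None = None) -> bool:
    """Record one traffic sample and return True if the average rate
    over the window looks like a volumetric flood."""
    now = time.time() if now is None else now
    samples.append((now, bits_observed))
    # Drop samples that have aged out of the window.
    while samples and now - samples[0][0] > WINDOW_SECONDS:
        samples.popleft()
    elapsed = max(now - samples[0][0], 1.0)
    avg_bps = sum(bits for _, bits in samples) / elapsed
    return avg_bps > BASELINE_BPS * ALERT_MULTIPLIER
```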

Volumetric attacks first emerged in 2001, when Microsoft, eBay and Yahoo were taken offline by what were then considered large attacks, in the 300 Mbps range, relatively low volumes by today's standards. With DDoS attacks now exceeding 100 Gbps, internet service providers face new challenges in protecting their networks and infrastructure.

TCP state-exhaustion attacks

These attacks attempt to consume the connection state tables that are present in many infrastructure components, such as load balancers, firewalls and the application servers themselves. Even high-capacity devices capable of maintaining state on millions of connections can be taken down by these attacks.
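
The toy model below, with made-up capacity and timeout values, shows why state exhaustion works: once the tracked-connection table fills with junk flows, legitimate new connections are refused even though plenty of bandwidth remains.

```python
# Toy model of the per-connection state a firewall or load balancer
# keeps. Capacity and timeout values are hypothetical.
class ConnectionTable:
    def __init__(self, capacity: int = 1_000_000, idle_timeout: float = 300.0):
        self.capacity = capacity
        self.idle_timeout = idle_timeout
        self.entries: dict[tuple, float] = {}  # flow 5-tuple -> last-seen time

    def admit(self, flow: tuple, now: float) -> bool:
        """Try to track a flow; return False once the table is exhausted."""
        # Expire idle entries first. The attacker's goal is simply to
        # create entries faster than they can be expired.
        self.entries = {f: t for f, t in self.entries.items()
                        if now - t < self.idle_timeout}
        if flow in self.entries or len(self.entries) < self.capacity:
            self.entries[flow] = now
            return True
        return False  # table full: legitimate new connections now fail
```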

Application layer attacks

In 2010, there was a dramatic shift in DDoS from primarily large volumetric attacks to smaller, harder-to-detect application-layer attacks that target some aspect of an application or service at Layer 7. These are the most sophisticated and stealthy attacks, because they can be effective with as little as one attacking machine generating traffic at a low rate, which makes them very difficult to detect and mitigate proactively.
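
One illustrative heuristic for this class of attack, again with invented thresholds and names, is to look for clients that hold many connections open while sending only a trickle of bytes, a profile that a pure volume threshold would never flag.

```python
# Sketch of a "low-and-slow" Layer 7 heuristic: many open connections
# per client combined with a very low send rate. Thresholds are assumed.
from dataclasses import dataclass

MAX_OPEN_CONNS = 20      # assumed per-client concurrency ceiling
MIN_BYTES_PER_SEC = 50   # assumed minimum "healthy" send rate

@dataclass
class ClientState:
    open_connections: int = 0
    bytes_sent: int = 0
    seconds_connected: float = 0.0

def looks_low_and_slow(client: ClientState) -> bool:
    """Flag clients that hold many connections open while sending
    almost nothing, the opposite profile of a volumetric flood."""
    if client.open_connections < MAX_OPEN_CONNS:
        return False
    rate = client.bytes_sent / max(client.seconds_connected, 1.0)
    return rate < MIN_BYTES_PER_SEC
```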

Each of these attack types presents unique challenges to network operators.

Volumetric attacks are the easiest to counter, since cloud-based managed security services can absorb them effectively. Attacks targeting existing infrastructure, and “low-and-slow” attacks targeting applications, are the most difficult to identify and mitigate. What has made DDoS such an effective weapon in recent years is the increasing complexity of attacks: the blending of attack types, targets and techniques.

Take, for example, the recent attacks on financial institutions in the United States. These attacks used a combination of attack tools, with vectors mixing application-layer attacks on HTTP, HTTPS and DNS with volumetric attack traffic over a variety of protocols, including TCP, UDP and ICMP. Another distinguishing characteristic of these attacks was the targeting of multiple companies in the same vertical at very high bandwidth.

Compromised PHP web application servers were used as bots in the attacks. Additionally, many WordPress sites, often running the out-of-date TimThumb plug-in, were compromised around the same time, and Joomla and other PHP-based applications were also leveraged. The attackers uploaded PHP web shells to unmaintained servers and then used those shells to deploy further attack tools.

The attackers connected to the tools either directly or through intermediate servers/proxies/scripts, and therefore the concept of command-and-control did not apply in the usual manner.

This complex, rapidly evolving mix of attack vectors requires purpose-built protection, both on-premises and cloud-based, against large attacks and application-layer attacks alike. And until we see pervasive deployment of best-practice defenses, we can expect to see DDoS in the headlines for years to come.

Winston Churchill offered some great advice that IT security professionals should keep top of mind as they adapt their defenses to the threat landscape:

“Success is not final, failure is not fatal: It is the courage to continue that counts.”


