biggest lesson from rsa: security really is hard

The RSA breach details will spark discussion and armchair quarterbacking for years; that’s a given. But I can at least pile on a little bit more here and there, yeah? The SANS ISC weighed in on some of the RSA details, and I wanted to pull out small bits to tackle briefly. Here’s what I consider the prefacing assertion:

There are just too many ways to circumvent the perimeter, spear phishing being just one.

1. “The thing is I don’t think this new paradigm is so new. Many have been advocating for years moving the prevention and detection closer to the data.” – We wouldn’t be *quite* (important word!) as concerned about the circumvention of the perimeter if we didn’t have such awfully porous technologies sitting on the desktops. Yes, you, web browsers and attendant tech (Flash), Adobe, and Office. You’ve all become too bloated to be secured anymore, and it’s your own damn fault. Just think if these were much better secured by default. We’d have fewer updates, which should mean better ways to keep them updated on desktops, etc. (Some may argue that others will just take over the role, and perhaps that is the inevitable result…but you can’t ignore that this porous software is making things worse. The insecurity of the interior is making the security of the perimeter worse. If you improve the middle, the perimeter is better valued…from a certain perspective.)

2. “There are a lot of approaches that can be used here, but in my mind it begins with segregating servers from the desktop LAN and controlling access to these protected enclaves as thoroughly or better as we do our perimeters today.” – This is one area where the “cloud” is actually useful; it moves data away from these workstations…sort of. One could still argue that access is access, whether you have 3 firewalls or 0 between them. But any push to get users better segregated from servers, when so many are in a shared network by default, is a good thing. If nothing else, this can push better documentation on data flow needs. This should also include better egress controls…yeah, I’m looking at you, FTP exfiltration. (Of course, lock that down, and even more people/devs will just use 80/443 more…)
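The egress-control idea boils down to something simple: know what outbound connections you actually expect, and flag the rest. Here’s a minimal sketch of that check; the hostnames, ports, and log shape are all made up for illustration, not anyone’s real policy.

```python
# Sketch: flag egress connections that fall outside an approved allowlist.
# The destinations below are hypothetical examples, not a real policy.

APPROVED_EGRESS = {
    ("partner-sftp.example.com", 22),
    ("api.example.com", 443),
}

def flag_egress(connections):
    """Return outbound connections not on the allowlist.

    `connections` is an iterable of (dest_host, dest_port) tuples,
    e.g. parsed out of firewall or flow logs.
    """
    return [c for c in connections if c not in APPROVED_EGRESS]

observed = [
    ("api.example.com", 443),
    ("203.0.113.50", 21),  # FTP to a host we've never approved -- worth a look
]
print(flag_egress(observed))  # [('203.0.113.50', 21)]
```

Of course, per the parenthetical above, once you lock down 21, the interesting traffic just moves to 80/443 and the allowlist alone stops being enough.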

3. “It means classifying your data and installing protection and detection technologies appropriate to the sensitivity of the data.” – I imagine the most common way of classifying data in an SMB is saying it’s all secret. Classifying data is great and makes a lot of sense, until you get into the reality of *gasp* actually doing it. This is where the CISSP hits the road and suffers the knee and elbow scrapes.
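For the SMB that wants to get past “it’s all secret,” even a dumb rules-based first pass beats nothing. The labels and patterns below are assumptions for illustration; real classification schemes and matchers will be messier.

```python
# Sketch: a minimal rules-based data-classification pass.
# Labels and matching rules are illustrative assumptions only.
import re

RULES = [
    ("restricted", re.compile(r"\b\d{3}-\d{2}-\d{4}\b")),   # SSN-like pattern
    ("confidential", re.compile(r"(?i)salary|contract")),   # keyword match
]

def classify(text):
    """Return the first classification label whose rule matches the text."""
    for label, pattern in RULES:
        if pattern.search(text):
            return label
    return "internal"  # default when nothing sensitive is spotted

print(classify("Employee SSN: 078-05-1120"))  # restricted
print(classify("Q3 roadmap notes"))           # internal
```

The scrapes come from everything this skips: files, databases, that spreadsheet on a share nobody owns. But a rough pass at least gives DLP (next point) something to key on.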

4. “It means installing and tuning Data Loss Prevention (DLP) technologies to detect when your sensitive data is leaving your company.” – Just don’t fall into these three traps with DLP: First, don’t expect to plug it in, do a few hours of tuning, and then forget about it. It’ll need ongoing love. You will have false positives and small incidents constantly, or it’s not tight enough. Second, don’t think you can tackle DLP without first coming to terms with data classification, or at least doing *something* to identify your data and flows. Third, don’t think that DLP will block/detect everything. Does it interrogate 443? Should it? And so on…
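On the false-positives point: the core of what a DLP engine does to a payload is pattern matching, and a naive pattern drowns you. A sketch of the idea, assuming credit-card detection as the example; a bare 16-digit regex flags every invoice number, so pairing it with a Luhn checksum is one common way to tighten it.

```python
# Sketch: the kind of content check a DLP engine runs on outbound payloads.
# Pattern and validation are illustrative, not any vendor's actual logic.
import re

CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")  # card-number-ish runs of digits

def luhn_ok(digits):
    """Luhn checksum, used to weed out random digit runs (false positives)."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(payload):
    """Return digit strings in the payload that look like valid card numbers."""
    hits = []
    for m in CARD_RE.finditer(payload):
        digits = re.sub(r"\D", "", m.group())
        if luhn_ok(digits):
            hits.append(digits)
    return hits

print(find_card_numbers("order ref 4111 1111 1111 1111 shipped"))
# ['4111111111111111']
```

And this is exactly why trap three bites: none of this matters if the payload rides out inside TLS on 443 and nothing is in a position to look inside it.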

5. “It means instrumenting company LANs and WANs so a network baseline can be determined and deviations from this baseline be detected and investigated” – This is another idea I find compelling, but the cloud isn’t helping, nor are consumerland technologies that just spray garbage into your baselines and everyday traffic patterns. Still, if someone FTPs a large amount of data to an external source you’ve not seen before, you really want to know that happened. But again, this is just a part of a blended network security posture and not something to even attempt until you’ve got a maturing security team/process.
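Stripped down, baseline-and-deviation is just per-destination statistics over time. A crude sketch of the idea, assuming you already have per-destination daily byte counts from some instrumentation (NetFlow or similar); the hosts and numbers are invented:

```python
# Sketch: flag destinations whose outbound transfer deviates from baseline.
# History/today structures and thresholds are illustrative assumptions.
import statistics

def deviations(history, today, z_threshold=3.0):
    """Flag destinations whose transfer today looks abnormal.

    `history` maps dest -> list of past daily byte counts;
    `today` maps dest -> today's byte count.
    Brand-new destinations get flagged outright -- no baseline exists.
    """
    flagged = []
    for dest, sent in today.items():
        past = history.get(dest)
        if not past or len(past) < 2:
            flagged.append((dest, "no baseline"))
            continue
        mean = statistics.mean(past)
        stdev = statistics.stdev(past) or 1.0  # avoid dividing by zero
        if (sent - mean) / stdev > z_threshold:
            flagged.append((dest, "spike"))
    return flagged

history = {"10.0.0.5": [1200, 1100, 1300, 1250]}
today = {"10.0.0.5": 1280, "203.0.113.9": 50_000_000}  # big push to a new host
print(deviations(history, today))  # [('203.0.113.9', 'no baseline')]
```

The consumerland-garbage problem shows up here directly: every chatty gadget and sync client widens the baseline until real spikes stop clearing the threshold, which is why this needs an actual team behind it and not just a script.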

The end result of all of this is: SECURITY IS HARD. And it’s only getting harder.