helping home users be more secure: just a dream?

I started out the week pointing towards people doing some thinking. I figure I’ll end the week the same way.

Bruce Schneier posted an article about home user security that I really like, since I've been saying roughly the same thing.

At work, I have an entire IT department I can call on if I have a problem. They filter my net connection so that I don’t see spam, and most attacks are blocked before they even get to my computer. They tell me which updates to install on my system and when. And they’re available to help me recover if something untoward does happen to my system. Home users have none of this support. They’re on their own.

Absolutely true. When I purchase a car, do I have a manual on how to tune and maintain it or troubleshoot it when things go wrong? Do I even get to see the standard specs for safety and security? Hell, do I get a lesson in changing my oil? Nope. And we expect people to “get” the much more ephemeral workings of a computer when not everyone has nearly the logical mind that most techies have? Yikes!

If we want home users to be secure, we need to design computers and networks that are secure out of the box, without any work by the end users. There simply isn’t any other way.

I agree, although that doesn’t mean we should dump user awareness totally. But really, corporations (and us geeks!) need to buck up and help their own employees at least a little. Training at work about security and computer usage will carry over into their home life. If nothing else, perhaps they can bounce home computer questions off the cyber talent present in the organization. I know us techs hate troubleshooting home PCs, but giving free advice is not nearly as painful.

What digs at this approach, however, is that while advice is free, most people just want someone else to do the thinking and the dirty work. Not everyone is into computers as much as us geeks, and they simply don't want to be. Just like I don't change my own oil, and really don't want to be troubled with it, despite how necessary it is to protect my investment. Anything beyond "don't install random things" and "don't click links in email" is still too much to trust most end users to understand.

Sadly, we have a huge computer security industry now, and they simply will not let someone like Microsoft put out a solid, more secure OS. Which puts us in a real bind… In the end, insecurity may just be a permanent reality, just like crime in general is a permanent reality, or home insecurity is a permanent reality (assuming realistic costs).

fake web filter pages

April Fool’s Day idea for sites bigger than mine: Replace the site front page with a fake Websense/SurfControl blocked message and get everyone to ask their admins what’s up. “I swear, we’re not blocking it! I don’t know what’s going on!”

training the technologists

As this year has gone by, one thing has become pretty solidified in my mind: training for security and IT/developers is necessary. I'd rather have training for them than for users in general. Not all security measures can be adopted in every organization, so they need not just technical training, but training to be aware of the risks and how they affect the business needs. For instance, I can see some organizations thriving while users run as local admins. Why? Because the risks are known and dealt with in other, oftentimes more creative ways. And yes, this may incorporate user awareness training. I'm not against user awareness; I just put it lower on the priority list.

If you can’t build things securely, or secure them accurately and quickly, then business needs will almost always win over security. From tasks to projects to software.

One might think training should be for manager levels as well. But I would counter that managers can learn a hell of a lot from their employees, with good, trusting communication.

has security gently guided technology development?

Does information really just want to be free? Or systems, that is?

In the beginning we had ports on systems running their own services. Port 80 had HTTP. We blocked ports we wanted to stop.

Then services started tunneling themselves through port 80. We started inspecting traffic over port 80 and denying what was obviously not a proper HTTP request. We even added software installation denials.

Applications started going to the web, because there they look like the normal HTTP traffic we didn't want to block, and they use an application on the desktop (the browser) they knew we couldn't fully deny. We need more application-aware blocking (deeper inspection, HIPS, and even DLP types of technology).

Soon, I suppose Google will offer up the OS on the web, and we’ll connect to a portal that will offer us everything we need, a veritable AOL “walled garden” on the web. What then? Vista is portending the death of the OS as we know it…right? A return to dummy terminals, only this time enabled on the Internet through the browser?

Is security to blame for part of this?
(Let's say we do get back to client-server types of architecture; does that mean we're done with endpoint security because the endpoints will become expendable plastic? Will the Web OS go the way of AOL? Sure, it may eventually offer a ton, but do users really want the freedom to do what they want, even if those choices and risks are bad? Do you want to decorate your house your own way, adhering only to slim building and fire codes, or would you rather have a cookie-cutter home with small cosmetic differences? Ahh…)

document your code

Over on Chris Shiflett’s blog is a guest post from Elizabeth Naramore, php/web developer, in which she talks about commenting and documenting code, using a dishwasher as a common analogy. The post is well-written and can apply not just to code documentation, but security process documentation as well. Many of my colleagues hate doing documentation and as such we have painfully little of it, but I’ll always do my best with it because I think it is especially valuable. I think some people think it is so simple, they never get around to it, and as such, this “simple” thing never gets done.

linux networking cookbook

Today at the bookstore I ran out of magazines to browse over lunch, so I meandered to the book section and picked up Linux Cookbook by Carla Shroder. I really enjoyed the parts I skimmed through, both her style of presentation (excellent!) and the subject matter itself. Very good stuff! Unfortunately, the book is getting dated, and I just really can’t justify buying a dated book when many of the topics I could find updated through Google.

Still, I really do like to find recent material from authors whose style I dig and which works for me. I saw she did some weblogging with O'Reilly, meandered over there, and I see she has a brand new book out, Linux Networking Cookbook. Oh my, right up my alley! If her style of writing has not changed, I will be picking this book up from BookPool once it is available. I may actually pre-order it now to get the current sale discount… I really like the "cookbook" style books from O'Reilly. I totally enjoy being able to put a book on my desk and look up various things I want to do or learn at various times, as opposed to reading a book cover to cover.

the application aware firewall

DarkReading has an article up about next generation firewalls including true IPS and application awareness. First, read the article.

Second, the inadequacy of firewalls that only go by ports has been known for what, a decade now? And the trend of applications moving over port 80 is about as old. I just don’t like reading “news” about such ideas. But that’s my only real complaint on this article.

This is all an interesting topic; getting firewalls more in touch with the applications, and as Hoff suggests, getting more in touch with the data. “Even so, giving the firewall an application protocol view still isn’t enough, security experts say. ‘The problem is that applications are merely conduits. Data is the real problem,’ Hoff says.”

Unfortunately, 20 years from now, will we be saying this new next-gen application firewall, with its signatures and traffic inspection, is yet another colander, where all applications not only tunnel through port 80 and use the web browser, but also avoid known bad signatures? Will this be any better than blacklisting traffic/domains/ports and trying to keep up with them? Perhaps, perhaps not. But technology has moved more emphasis onto applications (or even just one application: the browser), and thus firewalls (and security) need to keep up.

Regardless of the effectiveness or IPS-like ability of such firewalls, we still cannot begin to replace a human analyst looking at such gathered data. And we can’t begin to properly protect the networks without being able to inspect application traffic. We can’t stop what we don’t know is happening. If nothing else, I welcome the day when firewalls will be able to be their own IDS, with the ability and accuracy of a best-of-breed standalone IDS.

laptop users who replace their hard drives

Do your laptop users complain about lack of admin access when on the road and trying to install a new printer or some such device? Are you *sure* they’re not just buying their own laptop hard drive, replacing the corporate one with their own, and running anything they want?

Of course, should you care?

security requires imperfection

Yup, it’s still a thinking week! Rybolov has joined in posting about security vs a zero-defect perception.

Of course, what does this have to do with security? Well, in most companies and the government in particular, you’re trying to project a zero-defects image to your customers. That’s the way the business and marketing side works. Marketing and security don’t mix precisely for this reason: one is trying to project an image of perfection, the other needs understanding of flaws and risks in order to make informed decisions.

Yup! That's why people get their faces all scrunched up when the security guys say, "well, we could still be penetrated by a really skilled hacker…" They want zero-defect perfection: a state where they can sit back and be ultimately secure. Even if they realize technology changes, they still want some state of "secure" for the now. We actually require the imperfection in order to evaluate and improve (and prove!).

passing back values to a calling powershell script

I've previously posted on how to use a PowerShell script to call another PowerShell script, even with a variable passed! What about returning a value to the calling PowerShell script? This is actually pretty easy and intuitive for a single variable. In my case, I want to know whether the called script failed or not.
This first script simply calls the second script, test2.ps1, and sets its result to $return. Then I echo back the $return value to make sure it stuck.

script1:
$return = & "c:\test2.ps1"
$return

This second script simply prints some text to prove it was called, then returns $true.

script2:
Write-Host "Hello World!" -BackgroundColor Yellow -ForegroundColor Green
return $true

And this is the result:

PS > ./test1.ps1
Hello World!
True
PS >

There are no doubt more sophisticated means to return multiple values and even whole objects, which may or may not work the same way as the above, but this sufficed to meet my need to just pass back a complete/fail value.
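For instance, here's a minimal sketch of one way to pass back multiple values at once, by returning a hashtable (the script name test3.ps1 and its contents are hypothetical, just to illustrate):

```powershell
# test3.ps1 - return several values bundled in one hashtable
return @{
    Success = $true                  # the complete/fail flag
    Message = "All steps completed"  # extra detail for the caller
}
```

The calling script can then grab each piece by name:

```powershell
$return = & "c:\test3.ps1"
$return.Success   # True
$return.Message   # All steps completed
```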

sans top 20 has lost its flavor…

Yes, SANS has released their latest top 20 Internet Security Risks report. And Dark Reading points to it.

Tim Wilson at Dark Reading opens up: "There are two major problems with the security of computers: the people who use them and the people who write software for them." You don't say?!? I think that covers everyone except my grandmother…

Ok, so Tim’s article gets better and I like his pointing out that home-grown apps are big threats, which will make people think a bit more about open source and other, well, home-grown apps. Paying for software every cycle sucks, but is the cost of the software worth the possibly improved security and support? Good question.

My biggest complaint about the SANS Top 20 list these days? It’s too nebulous. Let’s see…web browser, email clients, media players, and office software. Did they leave anything out?!?! Yes, IM services…oh wait, they got that too.

Windows, *nix, and Mac. Uhh…again, did they leave anything out? Well, yes, they may have missed something, but the catch-all Zero-Days kinda covers the ass end of the list.

Yeah, thanks for this wonderfully nebulous list that really is far less actionable than it used to be. Sure, it illustrates our security risk landscape fairly well, but it is definitely targeting managers and less involved/informed people these days. Rather than being the top 20 risks, it is basically an all-encompassing "here are all the risks you need to worry about" list for CSOs and journalists to care about.

Fine, there is at least one thing missing: wireless issues, both with regard to 802.11 devices and Bluetooth. Sure, they mention it twice, once in Unauthorized Devices and again in Instant Messaging, but that's just lame and really does downplay the issues. Sure, you can't have someone in Russia sit down and pwn every Starbucks wireless user in 60 seconds, but the problem still exists on a microscopic level. Want to fly under the radar, or target an exec because the competition is paying you to do so? Hell, it would have been trendy to include this with a simple mention of the alleged intrusion vector for the TJX breach.

Alas, I still like the list because it gives us something to point to when management thinks the world is peachy-keen and full of rainbows in our office. Still, I'd rather this list were still interesting and relevant to me, rather than trying to be a "list" that captures everything. Maybe it's just a sign of a maturing industry and a much wider interested audience that needs to be included…

deep thoughts by jack infosec

It’s that time of year. We can sit back on cold near-winter nights in front of a fire with a pipe in hand, rocking back in a comfy chair and muse. Yup, it’s a time this week for discussions in information security!

Hoff has been talking about valuing information security, always a passionate subject for everyone, and one without a clear (or even muddy!) answer. He's also talking about security and disruptive innovation. Good stuff to read! Oh, and while you read what he has to say, try to convince him to change back to "Rational Security." I tried to register rationalsurvivability.typepad.com but wasn't willing to pay the initial fee…doh! There isn't even a category on his site for survivability! Fad! Fad! I predict he'll quietly revert back after the start of the year. 😉

It really felt like Bejtlich was gearing up for some revelatory posts, and he pushed one out in talking about how controls are not the solution. Instead, look at the outputs.

And Mogull had a nice comment in a recent post of his: "While the encryption market isn't nearly as big as most of the world wants you to believe…". I agree. I think many are waiting for this "market" to turn into the inevitable fea…no, it won't be a "feature," it'll eventually be standard and just accepted. For now, HDE/FDE is still difficult to manage across an enterprise, fraught with frustrations, and managers would rather see fewer mobile devices anyway. Why protect the laptops we really dislike deploying? Just deploy fewer! And so on…

morrill’s top ten things in info security to do now

Dan Morrill posted the “top ten information security issues to tackle now” which I find extremely cool. I’ve jotted some reactions below.

Get an Evangelist– I just wanted to highlight this option as an alternative to the misguided efforts to "make IT more business savvy" and the vice-versa option. A liaison is truly what is needed. You don't tell Accountants to throw down a sales pitch to a client, nor ask Sales to troubleshoot their own PCs (oh christ do they try though!). You get people to interface across the boundaries, not try to get everyone able to do everything. Sure, IT people do need to come out of their shells a bit and yes, be a bit more business savvy, but let's not turn that into the savior of "IT vs business side" heartaches like I've seen attempted.

Train IT– YES! And remember that training can also include self-training. Give us some time during our days to properly self-train on new technology. This can save a new hire or formal (spendy) training. Most of us are in IT for various reasons, the most common I bet would be our joy at solving problems and puzzles. Yes, we also do get depressed when we can’t tackle the new VOIP system properly because we just don’t have the free time in our schedules…

Develop a defense in depth program for the company…Listen to your IT department; they know where the bodies are buried.– Amen! Talk to IT, and have them list their pet projects or things that just have never gotten done but they'd like to get done. I bet a lot of those projects are solid projects that would fit into a defense in depth strategy. Keep that master list and start ranking and evaluating the options. Then start knocking some of them away! Sure, the list may be depressing at times, but we all need roadmaps, and IT workers have their fingers on the pulse of the company's technology and information.