Mobility has its limits, especially if your ISP prefers you use its DNS servers but then refuses queries from IPs it does not own. What to do? Many tech geeks have favorite DNS servers, some run their own or borrow servers from work, but your average home user probably wouldn’t know what to do. OpenDNS sounds like a nice idea for free DNS use. In fact, it offers some services of limited (read: better than none) security value, blocking phishing sites and correcting common misspellings (of domains that would otherwise take you places you’d rather not see). Sounds like a nice enough deal to try out. However, its marketing material about being blazingly fast is really just talk; it feels no faster than any other DNS server. I suppose, though, that a service like this could also be programmed to block adware, spyware, and even botnet DNS calls, helping to quell botnets and other malware from contacting dynamic home addresses.
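If you want to check the "blazingly fast" claim yourself rather than take the marketing at its word, timing a raw lookup is easy. This is a minimal sketch that hand-builds a bare-bones DNS A-record query and measures round-trip time against whatever resolver you point it at; the OpenDNS addresses in the comment are the ones they publish, while the ISP resolver address is just a placeholder you would swap for your own.

```python
import socket
import struct
import time

def build_dns_query(hostname, query_id=0x1234):
    """Build a minimal DNS A-record query packet (RFC 1035)."""
    # Header: id, flags (standard query, recursion desired), 1 question
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Question: hostname as length-prefixed labels, then a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    # QTYPE=1 (A record), QCLASS=1 (IN)
    question = qname + struct.pack(">HH", 1, 1)
    return header + question

def time_resolver(server_ip, hostname, timeout=2.0):
    """Send one UDP query and return round-trip time in ms, or None on timeout."""
    query = build_dns_query(hostname)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    start = time.perf_counter()
    try:
        sock.sendto(query, (server_ip, 53))
        sock.recv(512)
    except socket.timeout:
        return None
    finally:
        sock.close()
    return (time.perf_counter() - start) * 1000.0

# Compare OpenDNS against your ISP's resolver (second address is a placeholder):
# for server in ("208.67.222.222", "192.168.1.1"):
#     print(server, time_resolver(server, "example.com"))
```

Run it a few dozen times per server and average; a single lookup proves nothing once caching kicks in.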

The only thing to keep in mind is this service’s business model. It is a free service, but nothing is ever really free, no? Perhaps they gather statistics on DNS queries and sell that information in creative ways. Perhaps they log your queries and tailor results to you, much like Google places relevant ads on Gmail or alongside searches. Either way, there shouldn’t be too much “badness” involved in something like this, and even if there is, it is only IP address and DNS query badness. For someone like me who will use this on a laptop that roams around, I’ll end up fairly anonymous as it is.

10 tips for using vpns

I know this is ComputerWorld, one of those ad-driven free mags that tend to review products and state the obvious, but this quick article on 10 tips to secure VPNs is a pretty good and quick read, with some specific technical details as well as common-sense items that are sometimes hard to get management to listen to (such as only opening the VPN to those who truly need it). I like that some of the points are actually alternatives, such as secured mail or SSL/passworded web sites when the need is smaller than the justification for a full VPN solution. Unfortunately, other suggestions, like jailing users off from the rest of the network, are a bit more advanced and complicated.

Of note, this response was given on Infosec News and deserves to be read in conjunction with the original article, as the author makes some excellent points.

crimeware and phishes

I think one barometer of how IT and security are moving more in tune with the business world, instead of being some back-room geek department, is how often I read buzzwords and newly coined words.
I just read the Websense H1 2006 Security Trends Report and was amazed at all the new words I found.
We have malware, adware, and spyware. I guess I should have seen crimeware coming. Per Websense, I gather crimeware is software used to commit a crime? I think I will stick with malware as my term of choice. I have also seen eCrime.
I also gleefully read how a host with multiple phishing sites is termed a host to numerous phishes. Phishes…does that mean the host can be called a phish tank, or perhaps a pond? And would abandoned sites be phishheads? The report also referenced spear phishing, which is a more targeted phishing attack. Honestly, I think almost all phishing attacks are a bit targeted. While I have heard that term before, it still amused me once I started looking for these creative terms.
Screen scraping applies to those malware components that take screenshots of the user’s screen, a means to thwart CAPTCHAs and virtual keypads, kind of like a keylogger for the whole screen. Screen scraping just does not sound fun, and reminds me of a window washer or perhaps a visit to the dentist.
Now, while I might poke fun at the report for the terms used, the information presented is excellent and a very good read on the trends that Websense has been experiencing so far this year.

google desktop search forensics

This paper about the use of Google Desktop in forensics is concise and informative. The most interesting aspect is just how much Google Desktop indexes and makes copies of. Email, local files, network files, and even web surfing histories are stored independently of those applications or the OS. This means that even a laptop that shouldn’t have sensitive data on it may still contain copies of open network share files the user has access to, confidential emails, or even files from other users on the same system. In addition, web surfing history and some artifacts are also retained, even if the user attempts to clear those things in the browser options or with a third-party privacy tool.

The only limitation so far is the inability to just read the files directly. You have to copy the files to a separate machine, mark them read-only, and then open them in that machine’s Google Desktop Search tool. Still, this can act as a powerful tool to find artifacts. It can also act as a surprising vector for data leakage in an organization.
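The copy-and-mark-read-only step is easy to script. Here is a hypothetical sketch: the index path and case folder below are made-up examples (the real profile path varies by machine and user), but the idea is to copy the index directory to the analysis box and strip the write bits so the local Google Desktop instance cannot modify the evidence.

```python
import shutil
import stat
from pathlib import Path

# Assumed default index location on the mounted suspect drive; adjust for
# the actual profile path found during acquisition. Both paths are examples.
SOURCE = Path(r"E:\Documents and Settings\suspect\Local Settings"
              r"\Application Data\Google\Google Desktop Search")
DEST = Path(r"C:\cases\0042\gds_index")

def acquire_index(source: Path, dest: Path) -> int:
    """Copy the Google Desktop index to the analysis machine and mark
    every file read-only so nothing modifies the copied evidence.
    Returns the number of files copied."""
    shutil.copytree(source, dest)
    count = 0
    for path in dest.rglob("*"):
        if path.is_file():
            path.chmod(stat.S_IREAD)   # owner read-only; write bits cleared
            count += 1
    return count
```

Working hashes of the copied files before and after analysis would also be prudent if this ever needs to hold up as evidence.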

application whitelisting

Read this article about application whitelisting. I like this point:

But whitelisting has a down side. These endpoint tools come with plenty of administrative overhead as well as security risks. “The institutional overhead in maintaining them is extreme,” says Thomas Ptacek, a researcher with Matasano Security. “Some poor group of souls in IT is charged with deciding which applications every sales person or project manager can run, and has to backstop all the ensuing arguments.”

What are the pros and cons of application whitelisting, and where do I stand?
First, when machines are imaged or supported by IT, IT should have a list of applications that need to be loaded for new hires or replacement machines.
They should also have a list of applications to expect, which IT may or may not have to support at least a little bit (yes, we’ll help you with Outlook; no, we won’t help you with Alefox or IE toolbars). Related to support, security persons responsible for keeping up to date on patches need a list of applications they should be checking. IT should not be expected to be knowledgeable about patches for every toolbar app that may be used in the corporate environment.
Additionally, disaster recovery may require knowledge of what is necessary for groups such as sales people to do their jobs.
Much like firewall rules, default deny with a whitelist of allowances is much easier to maintain than a blacklist. You can blacklist categories of applications (P2P, IM, etc.), but even those lines continue to blur.
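To be concrete about what default deny means here, the core check in a whitelisting tool boils down to something like the following sketch. The hash list and its single placeholder entry are hypothetical; a real product would enforce this in the kernel and manage the list centrally, but the logic is the same: if an executable’s hash isn’t on the approved list, it doesn’t run.

```python
import hashlib

# Hypothetical allowlist of approved executables, keyed by SHA-1 hash.
# The entry below is just the hash of an empty file, as a stand-in.
APPROVED_HASHES = {
    "da39a3ee5e6b4b0d3255bfef95601890afd80709",
}

def file_sha1(path):
    """Hash a file in chunks so large binaries don't blow up memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_allowed(path):
    """Default deny: an executable runs only if its hash is on the list."""
    return file_sha1(path) in APPROVED_HASHES
```

Note that hashing means every patch or version bump changes the hash, so someone has to update the list constantly, which is exactly the administrative overhead Ptacek is complaining about.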
Take this scenario: sales requests a new application on their machines. Those “poor souls” in IT then have to research it and either add it to the whitelist or explain why it should not be allowed. With strong policies and management support behind those policies, this might be okay, but I believe most companies will put those “poor souls” in the unfortunate position of either saying “yes” to every request or being in a hard place when trying to say “no.” The end result is wasted resources, unnecessary ill will toward IT from the sales group, and less authority overall. And what if the sales group has already been using the application for 4 months? Those “poor souls” really are poor souls.
(Honestly, those “poor souls” need to be backed heavily by a manager-level person, otherwise anyone smart enough to do proper evaluations and even backstop the ensuing arguments is not going to be in this sort of a position for very long.)
And what if each department is asked to create such a whitelist of needed programs? I’ve seen managers throw back every single program they can think of, whether it is really necessary or not. “All of them.” Many managers and business users do not care to be bothered by such things, but will detest IT making the decisions for them.
As long as users run Windows as Administrator and all sorts of things want to get installed or used (some as benign as a proprietary web player like Flash), maintaining a whitelist of necessary programs is difficult.
Whitelisting will stifle innovation and the ability to try out new applications and tools.
So, where do I stand in all of this?
I think some whitelisting is necessary, but it cannot end up being heavy-handed unless the company has serious security requirements, a small niche of computer use, or a network so large that application management is nearly impossible otherwise. IT certainly needs to maintain a list for proper imaging and support of workstations.
This goes back to what I said in my previous post: less rules.
Less rules. Smarter rules. Better mitigation, response, tracking. Better perception of organizational IT. Let people, within reason, do as they wish on their workstations in order to have a productive, happy life with the company.

user education does not work

From a CNET article,

“I don’t believe user education will solve problems with security because security will always be a secondary goal for users,” Gorling said. “In order for security to work, it must be embedded in the process. It must be designed so that it does not conflict with the users’ primary goal. It can’t work if it interferes.”

His first sentence is correct: user education will not solve our problems. If education solved our problems, we would have a different president right now.
Indeed, as I always say, security is a secondary goal, even for developers and network administrators, let alone your average regular user. Functionality is always first, i.e. getting things done. “Getting it done securely,” while a way for managers to package in security as just as important to getting it done, is still just a qualifier to “getting it done.”
The second sentence is correct as well: we need to embed security in the process as much as possible. The systems need to be better protected from dumb users or just simple mistakes in judgement. The network needs to be better protected. This is where prevention systems really come into play. Detection works wonders, but the assumption that users will make mistakes means you need prevention, mitigation and incident response, and audit trails (detection and logging).
And then his last few sentences are the real problem. We have to do these security things without impacting the user’s primary goal of getting things done.
Now, I really believe education will not solve our problems, but it will go a LONG way toward helping. Just because education doesn’t solve all our problems is not a valid argument to throw our hands in the air and not do any education at all. I like the article’s mention of giving users some education while actually attending to a problem; that sort of focused, in-the-moment education can have a real impact. Some people do want to learn and get better at this. But the speaker is correct that we shouldn’t hold up education as the root solution to our problems.
It is highly important to make sure security does not unduly interfere with employees getting their jobs done. However, this goes both ways. Employees need to be receptive to changes in their job. A security-induced change may not even impact users if they were to just adopt the new way of doing their job. Sometimes this battle between security and usability is just human nature being stubborn and unwilling to change, even if those changes result in less work for the user.
I’ve slowly become a minor proponent of having less rules and less impact on users. I detest rules and limitations on my computer use at work; they impact my happiness and thus my productivity. Now, I may be a bit more progressive in my use of the Internet than many people I work with, but slowly, attitudes will change as more and more people enter the business world who have grown up with a computer in their rooms, their social lives having long incorporated web pages, blogs, IM, email, music, and so on.
We still need education, but we also need to make sure we do our professional diligence on the back systems and networks before dictating what users can and cannot do. And I truly believe we need less rules, overall, in our businesses. We just need smarter rules and enhanced incident response. Rules stifle innovation and happiness, and we need both in our businesses.

productivity gain from 30-inch monitor?

A researcher has posited that it is worthwhile to get a 30-inch Apple monitor ($1999) because it improves worker productivity.
I really think some researchers are just not that thorough. Yes, you can likely get more work done with more desktop real estate, but how does this compare to a dual-monitor setup with, say, two 17-inch or 19-inch monitors, which would cost far less than $1999? Unless you need contiguous screen space (such as with AutoCAD, Photoshop, or maybe movie editing), the dual or even triple monitor approach is much more worthwhile than one huge single screen.
Do we even need dual monitors? Not necessarily. I currently work on just my laptop screen, although I certainly would make full use of dual monitors like at my last job or at home. As a networking and security geek, I could actually make use of 10 monitors if I had them, displaying things like dashboards, traffic sniffing, alerts, remote control sessions, etc. But for your normal workers, one monitor, maybe two, is sufficient for their job. Eventually, I get into the realm of wanting separate systems as opposed to more desktops or monitors.
I will say, if you want to impress pretty much anyone at work, grab a spare system or two, set it up next to you, and have it running pretty graphs, traces, and dashboards nearby. People seem to think that amazing, even if it is just gibberish. 🙂

analogy thursday: web surfing

I am going to deem today analogy Thursday, as I was looking for some ideas on analogies for how dangerous the Internet is, namely the web. It is just an odd situation that the Internet is inherently bad and malicious and that users need to take care when surfing. Yeah, like many people really truly take care…
What if television surfing were as dangerous as web surfing? This means that as you flip channels into some of those more obscure higher-digit stations, one might hijack your television box and switch channels around, force them to change much more slowly, or show only its own station until you reset the box and start from scratch. Oops!
What if shopping in a mall were as bad as surfing for places to shop online? Outside of some shops we’d have people jumping up in front of you with signs and coupons and good deals in hand, sometimes getting right in your face and flashing their goofy colorful smiles, causing young children to begin crying. In addition, random stores may put things in your pockets that you won’t realize melted in the hot sun until you get home and put your hand in there. Oops! They might even put an RFID tag on you while you’re not paying attention, and then follow you around through the rest of the mall. And city. And into your home, happily writing down everything you do on the off chance that they will learn how to market better to you. Who knows when you get into those stores!
And those free samples of chicken at the grocery store? Yeah, nothing is free. In fact, those samples contain powerful lingering doses of laxatives that will force you to stay on the pot for an hour each day for a month. But hey, the grocery store offers toilet paper and other remedies for a fee to help deal with that!
What if browsing a library for books to read were like browsing web sites? Every now and then, a book would take it upon itself to grab your arm and not let go, despite the alarms you cause when you walk out of the building and the nasty looks you get. In fact, some books may look like children’s books, but inside are pop-up porn cut-outs. Oh, those long-lost joys of pop-up books! Yay!
Now, the one place where an analogy is a lot more appropriate for the web would be roaming around in nature. You never know if you might turn a bend and run into a bear, a rattlesnake, or even swim up on a stingray. You might just get chomped, bit, or speared if you’re not constantly careful and aware of the dangers. And the more dangerous a particular area seems, the more likely it is dangerous. Thankfully nature typically provides warnings, such as a snake’s rattle or colorful markings on dangerous creatures. Likewise, web sites give off warnings too, if you know how to look for them. And would you stick your hand in a strange hole in the ground or a sleazy-looking pond without first doing some risk analysis on the odds of a badger or water-borne parasite being present? And let’s not even think about ninjas and how they might stealth up on the trail when you least expect it.
The web isn’t what it used to be. While it has become prettier (not counting MySpace pages, the new GeoCities) and more useful and informative, it has certainly become a lot more dangerous, insidious, and complex.

is security possible?

This topic has been buzzing around in my head a while now, and is finally ready to trickle out. But first, I need to set the stage. (This is going to sound preachier than I intend, and has also become the unfortunate victim of me being interrupted a couple times at work, unable to put all of this down coherently…sigh.)
– You are never 100% secure, nor is there any silver bullet device, application, or methodology to security in this information age.
– Technology keeps moving at a fast pace, faster than it takes for any security team to dig solid trenches and fox holes and fortify the hills.
– And it keeps getting more complex, sometimes piling more complexity on top of insecure technologies. Complexity yields less security.
– Just today I read a couple doom-and-gloom articles by Richard Grimes, one recent and one from a few months ago. He has a point that security is largely lip service until AFTER “the big one.”
– Also some talk about more appropriate consulting and pen-testing from Dan Morrill and Wendy.
– Let’s face it, with so many different technologies, business needs, solutions (in-house and out-house), people, and problems, no two corporate networks are alike. Not even close.
Based on all of this, I am convinced of a number of things. First, we should all continue to share as much information as possible and keep working at those communication lines. One thing I don’t think there is enough of is on-site tours and demonstrations. Case studies are one thing, but get me and some buddies in the industry into each other’s NOCs and systems and let’s see first hand what is working or not working. I would love to see how a company like Boeing manages and works their campus wireless systems. Yes, it might be a security concern to let me know, but as Schneier would say about crypto algorithms, if disclosure hurts it, it’s not secure anyway. Many corps have excellent processes and setups, but they never get talked about in meaningful ways that could help the rest of us. This is one reason I would love to become a pen-tester, assessor, or consultant…so I can see these solutions and build upon other people’s hard work and loving efforts.
Second, we need to look to securing our own islands first, before we’re going to be able to help with the whole world’s picture. What works for one island may not necessarily work for another island. We need to be aware of that, such that not only is there no one device or application that can give 100% security, but there is also no such device or application that is appropriate for all environments (something the sales people don’t understand). If we can’t handle the microcosm of our own networks, we have no hope to make sense of the macrocosm of the Internet and the world’s networks. Your island may be the only place you’ll be able to experience a wave of security nirvana…at least for a few moments. Besides, if internally we are unable to quickly show who has access to our client XYZ’s data that we are a custodian of, how can we begin to counsel other islands on how they should handle information?
Third, we need to fight the battle of complexity. Technology will move on and keep getting complex, but many attacks and defenses and competencies of security and security professionals remain grounded in simple basics. We need to keep those basics at the forefront of our minds, not make the security process so complex that we all stand up so high on rickety scaffolding as our foundation to climb to the clouds. Yes, it can be complex and full of frills and thrills, but never compromise the basics for those complexities.
Yes, security seems like a losing battle, but that is what makes this field exciting, ever-changing, a challenge, and a solid career. 🙂

google reader

I’ve tried a number of stand-alone and web-driven RSS readers in the past few months, but none really gave me what I wanted or presented it in a way that was compelling and simple and, well, just right.
Much to my surprise, I tried out Google Reader and was immediately struck by, “this is exactly what I wanted.” I added a few of the feeds I check most regularly, and I’ve been amazingly happy with the layout and simple feature set. I hope SurfControl doesn’t add this to the list of things denied outright (yes, web filters are evil; more on that in a future post).