it ain’t broken if we don’t see it

Brian Krebs, WashingtonPost.com, writes:

…far too many sites are compromised each month by hackers and scammers while their owners remain completely oblivious or in denial.

Logging and monitoring are hugely important, especially for catching break-ins and data theft. Data destruction is easy to see; data theft is just silent copying, and nothing looks different afterward.
IT and business are becoming more and more enamored with feeling secure rather than being secure. The attitude is, “We’ll look at the logs when something bad happens or we suspect something bad has happened,” which really means, “If we don’t look at the logs, nothing’s wrong, so let’s just go about our business.” Or a company will throw in an IDS/IPS device or log parser, but not devote the ongoing staff hours to properly understand the device, accurately monitor and parse its output, and actually investigate and acknowledge the alerts.
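Even a little automation lowers the bar for actually looking at the logs each day. Here is a minimal sketch for a *nix syslog host, just to show the spirit of it; the log path, the address, and the assumption of a working local mail command are all placeholders.

#!/bin/sh
# daily-log-check.sh -- pull out the entries most worth a human's eyes each morning.
LOG=/var/log/auth.log
REPORT=/tmp/log-report.txt

grep -iE "failed password|invalid user|refused connect" "$LOG" > "$REPORT"

# Only send mail when there is actually something to look at.
if [ -s "$REPORT" ]; then
    mail -s "Daily log review: suspicious entries" admin@example.com < "$REPORT"
fi
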
Data theft will not necessarily get better soon. Large-scale regulations like PCI and others are pushing the standards higher, but they are still ambiguous at times and can make companies look better on paper than they really are in practice. Breach-disclosure laws have so far mostly resulted in negative reinforcement for business and a feeding frenzy for the media, as companies and agencies now have to divulge incidents that have always been happening anyway. This makes it seem like data theft is on the rise, when in fact we’re just finally getting the problem out in the open. I don’t see this dying off for at least another 6 years. Once all the big businesses are shored up, we’ll see tons of smaller businesses getting hit, like those mentioned in the article above.
For a number of years yet, I foresee businesses stepping as lightly as possible on this issue: doing just enough to avoid negligence and satisfy regulations, but not enough to really have to admit to any problems or divulge them. “Yes, we log and monitor, but we don’t see anything, so everything is a-ok! I’m sorry you had your data stolen, but we do what we can, so better luck next time.”
While this may feel good today, this is not a scalable or sustainable approach.
From my vantage point in IT, I can also say that logging and monitoring, and even security in general, are not high on the lists of execs to spend money on, managers to raise issues about, or staffers to spend time on. Our #1 priority is making sure the network and systems are up for the company, and that alone can consume 100% of our time. Our #2 priority tends to be projects that either enhance the functionality (not necessarily the security) of the current network and systems, or projects directly tied to revenue-generating people, processes, or clients.
Security is not yet up there, let alone logging and monitoring and responding to those logs in an ethical fashion. The same is true of software and web application developers: functionality, deadlines, and the bottom line first, then maybe performance, with security added later (and too often never added at all).

the little things, the fundamentals

For want of a nail, the shoe was lost;
For want of the shoe, the horse was lost;
For want of the horse, the rider was lost;
For want of the rider, the battle was lost;
For want of the battle, the kingdom was lost;
And all for the want of a horseshoe nail.

For one missed log entry or one shortcut taken…

being ornery about the corporate ethics compass and security training

A Canadian article discussed the results of an IT security survey. A couple of blurbs caught my attention.

According to the 2006 Global State of Information Security survey, 53 per cent of Canadian companies surveyed said their reputation was driving their information security spending. The global average was 41 per cent.
“Poor information security that loses data such as customer profiles can seriously affect a company’s brand,” says Greg Murray of PricewaterhouseCoopers. “The cost of handling the public relations issues associated with losing customer identities can be devastating.”

Now, while companies are economic entities, and realistically this may be the honest truth of how execs look at IT security and its effects, I can’t help but think how unethical this attitude seems. In the absence of a government forcing disclosure of losses, these companies would not divulge the information. Likewise, if customers didn’t care or the company wouldn’t be affected financially, they wouldn’t disclose it. That attitude is also degrading to the security/IT staff at those companies: “I only do good because it helps me avoid getting into trouble.” It’s a classic example of negative reinforcement. I would prefer that we didn’t need that reinforcement and that companies acted ethically simply because that’s who they are. But that may be way too idealistic of me to expect… (Then again, avoiding negligence issues can work the same way, so maybe I’m being nitpicky about something I really should not be…quite likely in fact, so I will strike this whole paragraph, but leave it for future reference.)

Mr. Murray was surprised to find that 61 per cent of Canadian respondents surveyed have limited or no security training for the end-users of technology – their employees.

Training is a fun debate and can go both ways. Fundamental training should be necessary for employees. I’ve known way too many people who truly didn’t know that something like surfing web pages willy-nilly was bad, and they were genuinely receptive to the information; some of them may even have changed their behavior because of it. But much like teenage pregnancy, drug use, and various crimes, you can only inform the “general public” so much. Security will not suddenly become solid when all users are given excessive amounts of training in the workplace. I mean, if that were possible, perhaps we could have had a much different president these past 6 years if we had just informed the US public more? 😉

5 security steps for small businesses

Tate over at ClearNet Security made a post about a friendly debate over the top 5 things a start-up company (read: small company) can do to start out the right way with regard to a safer computing environment. I thought this would be a good exercise in determining what my own top 5 recommendations to a similar fictional company would be. Granted, limiting it to a top 5 instead of a top 6 or however many picks it takes to do security right is arbitrary, but the limit does help focus things a bit. This can also act as a general checklist for consultants or for any solutions a start-up outsources, especially start-ups without in-house IT staff. I also try to recommend free solutions as a starting point, especially for small companies without IT budgets.
1. Backups. This is the #1 thing to do to keep a business alive and running. My underlying assumption is that incidents will occur, and if you don’t have data backups, you will not survive many larger incidents. A requirement would be offsite backups, even if that just means the CEO’s home and maybe the CFO’s home. Everything else for security should be dropped until this is done. Backups can be as simple as some batch files using Robocopy to dump data onto FireWire or USB drives every night, with manual swapping of the cables every day or week. Desktop systems can be set to perform regular backups to central storage if need be. Test backups regularly, and test restore procedures regularly, both to ensure they work and to keep someone knowledgeable about the process. Make sure workers copy important data to central servers every night or every Friday, or at least to a location that is backed up. Even an elaborate file server and backup scheme is defeated if users keep their data on their own systems and those systems are not backed up themselves.
2. Network firewall on the Internet link. Put a network firewall on the Internet link and be draconian with the rules: default deny, and limited access elsewhere, even if it means nearly zero access from the outside. Small start-ups might be able to contract a local Linux expert or friend of the company to put in a largely free Linux solution. Something like SmoothWall/IPCop may be better, since a slightly tech-savvy worker can probably understand and work the web-based configs better than Linux iptables and such (a minimal iptables sketch of a default-deny setup is at the end of this post). But, if possible, invest in a Cisco PIX or Juniper NetScreen or Windows SMS/ISA solution and contract someone to set it up for you.
3. Desktop Antivirus. Evaluate some robust and lightweight antivirus products. For the most protection, I would not pick Norton or McAfee (most malware that is truly dangerous looks for and disables them anyway), but rather look into Kaspersky or F-Secure instead. For free options, AVG and ClamWin are decent enough. A good case can be made for network-based antivirus on the gateway in a smaller company, but most new desktop/laptop systems come with host-based AV these days, so half the job may already be done without the extra burden. Obviously, the apps should be set to regularly scan the systems, automatically clean or delete what they find, provide realtime scanning that stops virus execution, and update at least daily, every 8 hours if possible.
4. Patch Management. Turn on Windows Automatic Updates to force installation upon a subsequent reboot, and try to do the same with Office if at all possible. Updates should be applied as soon as possible, preferably once a week on a Wednesday or Thursday. Workers should also regularly run manual updates, even if that just verifies automatic updates are working fine.
5. Man, the dreaded last spot. Do I use physical security here, since losing equipment and the time to replace it can cost a small company dearly? I guess when it comes down to such a short list, I have to look at what will best help the company survive and prosper to the point where the luxuries of security can be afforded, so I will side with physical security. Make sure doors are locked properly and possibly invest in an alarm system. If the company is in a business park, get to know the security stance of the business park owners and possibly work with them on alarms or anything else they may provide for you. If possible, lock down all systems at the desktop and secure any server equipment behind another locked door, or at least out of sight behind some other door. The cost of these protections is far outweighed by the losses incurred in their absence.
I will cheat and put in a 5.5, since it deals not only with security but with insurance as well: inventory all systems and keep that inventory up to date. This can just be a spreadsheet with dates of purchase, serial numbers, hardware details, software licenses, etc. Starting this early helps. Inventory naturally morphs into baselining an environment: know what you have and what is normal in your environment, what systems are expected, what software is expected, what traffic levels you expect, what log entries are normal. That baseline effort then lets you quickly recognize when something is abnormal and needs investigating; even something as simple as the sketch below can capture a starting point.
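On a *nix server, a rough sketch of capturing that baseline might look like the following (the file locations are placeholders, and a Windows shop could get much the same effect with a spreadsheet plus the output of systeminfo):

#!/bin/sh
# baseline.sh -- snapshot what this box looks like today so we can diff against it later.
# Run as root so netstat's -p column can show the owning process.
OUT=/var/baseline/$(hostname)-$(date +%F).txt
mkdir -p /var/baseline
{
  echo "== system =="
  uname -a
  echo "== installed packages =="
  dpkg -l 2>/dev/null || rpm -qa
  echo "== listening services =="
  netstat -tlnp
} > "$OUT"

Diffing this week’s snapshot against last week’s is a poor man’s change detection.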
A really close next consideration is to acquire desktop/security help, either through some low-cost outsourcing or by hiring someone internally to manage systems, clean spyware, try out new software, help test new products, etc. This gives the company someone to turn to with slightly more authority than the average user, and it helps a budding IT professional cut his teeth on some real experience. There are plenty of IT professionals out there who would be glad to consult, either on the side of their daytime gig (be open to only getting support outside business hours) or by adding you to their already established clientele.
Lastly, if the small company insists on a wireless network, then I have to include wireless security as part of the list. The wireless network must not remain open and needs to be protected using WPA. Yes, this might be a hassle with visiting guests and potential clients, but the consequences of some high school kid driving by and mucking around in your network can be dire.
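Back to point 2: for the Linux route, a default-deny policy really is only a handful of rules. A minimal sketch, assuming the firewall box has eth0 facing the Internet and eth1 facing the LAN, with a lone web server at 192.168.1.10 (all of those values are made up):

#!/bin/sh
# default-deny.sh -- drop everything by default, then poke specific holes.
iptables -P INPUT DROP
iptables -P FORWARD DROP
iptables -P OUTPUT ACCEPT

# Let replies to connections we started come back in.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT

# The LAN may go out to the Internet.
iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT

# The only inbound hole: web traffic to the one public server.
iptables -A FORWARD -i eth0 -o eth1 -p tcp -d 192.168.1.10 --dport 80 -j ACCEPT

Everything not explicitly allowed above simply never gets answered, which is exactly the draconian stance a small shop should start from.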

weekly it stream of consciousness ramble: relics and creep

HostGator was apparently not alone. At least two other companies had reportedly also been hit with the attack, an exploit for a previously unknown–or “zero-day”–vulnerability in a popular Web-site management application known as cPanel. (SecurityFocus)

One thing that scares me about many companies is their propensity to build up a highly heterogeneous environment, with lots of little things purchased and installed or freely downloaded and implemented, sometimes without IT involvement. And one little thing like a third-party web-based app can get an entire server or network owned and jeopardize a company’s existence.
I had more of a purpose for this post, but I ended up turning myself in circles. Homogeneous environments vs heterogeneous environments, simplicity vs defense in depth, all-in-one devices vs separation of duties…
In the end, companies simply have to keep control of what they install and run in their networks, especially Internet-facing exposures, and maintain a process (with proper staff devoted to it) to keep up to date with patches and alerts for those exposures. While OS patches and “big” apps like Apache and OWA are typically tracked, far too many little things that slowly seep into the network landscape get overlooked. That ticket management system that was put in 2 years ago, or that survey “engine” on the corporate web site, or that PHP bulletin board that hasn’t had an update in 12 months. What about that port that was demanded open 3 years ago for a temporary FTP server that was never cleaned up? Does marketing really need that nifty new tool on the web site, or WebDAV turned on because that’s the only way their contracted, at-home employee knows how to update websites?
While I like to call some of those things “network relics,” I think I will also start applying a term, “network creep,” to all the various little things that slowly make their way onto or into the systems and networks that IT manages. This creep slowly expands a company’s exposure, and unless there is strong change management, follow-up, and staff hours to devote, the creep turns into relics.
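One cheap way to keep creep and relics visible is to periodically scan your own address space and diff the results against the previous sweep. A rough sketch, assuming nmap is installed and run as root; the address range and file paths are placeholders:

#!/bin/sh
# relic-sweep.sh -- rescan our own space and diff against the last run.
RANGE="192.0.2.0/24"
DIR=/var/log/relic-sweep
TODAY=$DIR/$(date +%F).txt
LAST=$DIR/last.txt

mkdir -p "$DIR"
# -sS SYN scan, -p- all TCP ports, -n skip DNS; grepable output is easy to diff.
nmap -sS -p- -n -oG - "$RANGE" | grep open > "$TODAY"

# Anything new since the last sweep is a candidate relic or a fresh bit of creep.
[ -f "$LAST" ] && diff "$LAST" "$TODAY"
cp "$TODAY" "$LAST"
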
Policy and processes (retirement of systems and apps…). Inventory and documentation. Standards. Logging and monitoring. Staff. Change management.
I’ll stop now before I get to rambling too much more.

beginning work with an ips system

I am now into my fifth month at my current job. A lot of my time has been spent getting used to the environment and culture of working here, with the majority spent supporting and working with our .NET/ASP application development team. This basically means I’ve been more involved in Windows systems administration than I’d like to be, especially for someone who is not pursuing .NET programming. Windows sysadmin work is not that difficult in the long run (you can make it as difficult as you want by adding scripting, etc.), but it is not all that fun or glamorous. I’d pretty much rather be doing anything else, though I will admit there is plenty of business demand for the role.
Anyway, starting this week I get to begin working on and taking control of our McAfee IntruShield IPS device. This device sits inline between our external firewall and our internal DMZ firewall and logs intrusion attempts. Right now it is passive and set to IDS mode only, as no one has had time to really sit down and configure it properly while minimizing the risk of blocking legitimate traffic. That will be my role here going forward.
I’m not the biggest fan of IPS devices. I believe a company like ours, which is small but has a good amount of money to spend on IT, is better served by installing an IDS and staffing to monitor it properly than by an IPS that automatically blocks traffic based on whichever rules are turned on.
However, this is still majorly exciting and almost as good as managing the firewall. This device straddles the two areas I would like to grow in: networking and security/insecurity. So that was some good news in the past few weeks in regards to my job, and I’m really looking forward to talking to our Accuvant guest this week and getting my fingers deeper into this device.
I will be very disappointed in the device if I am not able to see the actual packets and payloads for the various detections and alerts. Installing and playing with an IDS (Snort) at home has been on my extended list of things to do, but I have had some bigger fish to fry lately. So being able to do this at work is actually the first ray of sunshine I have had at this new job.
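For anyone else wanting that first taste at home, Snort’s sniffer mode doesn’t even need a ruleset. A couple of starter commands (the interface name is whatever your box actually uses):

# Watch packets scroll by: -v verbose, -d dump application-layer data, -e show link-layer headers.
sudo snort -dev -i eth0

# Same thing, but also log the packets to disk for later poking around.
sudo snort -dev -i eth0 -l /var/log/snort
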
UPDATE: I did some research on case studies for IntruShield and found one (PDF warning) that doesn’t name the company, but it does name the CSO. Turns out it’s the CSO of McAfee itself. While I can say “d’oh” at seeing a company use itself as a case study, I do like the idea of a product being used internally. In my short career, I’ve already felt the irony of a company that doesn’t use its own products or follow the paradigms it tries to sell.

passing the torch again

I started reading this article about Windows XP just to fill time, but by the time I got to the second page, I was noticing some subtle and poignant things being said.

The initial simplicity [of Windows XP] almost never survives contact with software installers. Most of them ignore Microsoft’s programming guidelines by dumping shortcuts and icons across the Start Menu, the desktop and the “tray,” that parking lot of tiny icons at the bottom-right corner. Good luck finding anything on the screen after you’ve let the likes of AOL Instant Messenger or RealPlayer have their way with XP.
With all that extra software, Microsoft needs to persuade other companies to play by its rules, but it’s had trouble getting even its own programmers to do that. The mere presence of Windows Vista can’t change this failure to communicate.

From device drivers to installed software, it all basically does whatever it wants to do, thanks to Microsoft’s approach to system architecture. I am fully convinced that Windows is a product of consumer usability, not of any intelligent security design or any intent to be solid, stable, and loved for decades. Now, whether that is good or bad is another story, as Microsoft has grown rich and huge on those choices.

The operating system has done little to ensure that programs move in and move out in an orderly manner; they can throw supporting files and data all over the hard drive, then leave the junk behind when software is uninstalled. As a result, something that should have been fixed in Win 95 — the way Windows slowly chokes on the leftovers of old programs — remains a problem.

This is all too true, but again, what alternative is there? And moving forward to Vista, how exactly will that fix everything? So many programs are bound to act funky or outright break with the new OS. People who have paid for these programs will clamor for support through upgrades (and thankfully for vendors, consumers have gotten used to purchasing those upgrades). But in the end, turning around the huge ship that is the Windows-based community is not going to be easy, and maybe not even possible with the Windows OS architecture.
Imagine Windows still running so many important things years from now, or even 20 years from now. The world is becoming more PC-literate, but you can bet that 99% of the next generation of users is growing up with Windows rather than other OS flavors, although I will grant that next-gen users will be better able to adapt to other OS options if they so choose. This means there is a very real threat that *nix servers and tools will slowly be bred out of existence (of note, putting *nix into the hands of developing countries could then be either a saving grace or further stratification…).
Hopefully Microsoft gets some things right with Vista, but somehow I really doubt it. XP was a major step for Microsoft, and it has lasted 5 years while the PC stabilized its place in our daily lives, young and old. I think Vista will look prettier, be larger and more complex, and have layers and layers of cool graphics and security apps, but it will all just cover the same buggy and outdated architecture underneath.
At least it still means job security. 🙂

payphone warriors

Now this is a really fun-sounding idea for a metro game: players attempt to control as many payphones in an area as possible by calling from each phone to a central scorekeeper. The link gives plenty of information. This isn’t necessarily something to be done in, say, my state of Iowa, but it would be amazingly fun in a very payphone-heavy metro area. What would be most interesting, though, is seeing how it is set up and run: the Asterisk setup behind the scenes, and how the payphone signatures are determined. I wonder if a game like this could be devised for DefCon? I wonder if payphone signatures can be spoofed such that a player can just adjust a variable and keep calling back from one phone?
Now what about expanding this to, say, the entire city of New York in a never-ending game where you can call in at any time? What about doing this for wireless hotspots or networks? Granted, you can spoof your IP and such, but what about needing to maintain a solid session with a central server from a wireless network and submit data about that network? And note that I’m not saying open, public wireless networks… This whole idea is similar to a capture-the-flag competition, only mixing physical movement with traversing the digital landscape. All the more reason to move to a more urban location. 😉

removed links

Just removing some links. First, Ubertechnica appears to no longer exist. I have long read Xatrix, but ever since they had some legal woes they’ve slowly eased up on updates. Looks like no one is maintaining the site anymore.

Since I have moved on from using SuSE extensively, I no longer need the SuSE Security page. The antiforensics section of Metasploit is looking a bit old, so there is no need to keep it on its own link. I can get there through other means if need be.

I’ve always hoped Erin would finish work on her site, amoebazone, especially the log part, but I guess development has stalled for other pursuits. I do still like the layout and design, though, which is one of the real reasons I am making notes as I remove sites: this one was here as a reminder of the design as much as out of wanting to see the completed work. Another largely personal site that predates real blog/journal apps is Thor’s site, Hammer of God. Dunno really why I kept it or even included it, but it will no longer be here.

Insidethebeltway seems to have disappeared. I really just don’t read any of the blogs from the RStack white hats. The Lost Olive offered me nothing either, other than an awesome 404 page.

security outside the box: car keypads

This is just a little bit old, but there are still plenty of cars that sport numbered keypads to unlock the driver’s side door. There are really only 5 keys, and thinking outside the box, one can quickly test that this is just password entry with no end bit or anything: it just sits and listens and waits for the proper combination, no matter what came before it. Turns out it only takes 3129 keypresses, max, to get the door to open, presumably by keying in one long sequence where every press completes a new 5-digit window (a De Bruijn sequence): 5^5 = 3125 possible codes, plus 4 presses to finish the last window, gives 3129. The article states this takes about 20 minutes. Just imagine reciting the cheat sheet into a recorder like an iPod and then listening to the sequence as you key it in.
The more I think about it, the more it makes sense that this whole idea didn’t last very long and not all that many cars used it or still use it.

the career it writers

I diss on the blogosphere a lot for being bad reporters of news, but great reporters of experience and opinion (which in a way is news as well). I guess the difference is journalists have a level of ethics to maintain whereas bloggers can basically do whatever the heck they want.
Anyway, one question in my head lately is about the career writers. There are bloggers and journalists in IT whom I sometimes see or read, and I frequently look at their bios or backgrounds just to see where they are coming from. Often I see they have 15-20 years of writing about IT for journals and papers, and 15+ books written or contributed to.
I don’t get this sometimes. Are they career writers? Do they actually do any IT work, either in an enterprise or at least at the consumer level? Or do they just play at home, talk to others more knowledgeable, and write about it? Those people kinda bug me…

linux as main box – part 2: the score

I’ve used Linux in the past (Red Hat, SuSE, Slackware, Knoppix, and various other live CDs), but I have never been able to make it a regular box that I use 95% of the time. Hopefully this will change.
But first, I want to just come out and say it: Linux is not ready for prime time. Not even Ubuntu. Unfortunately, Windows is far easier to wield and get things done on. It might be less secure, but this is the classic usability vs security trade-off. Thankfully, Ubuntu is not just for the uber-geek elite anymore and can be adopted by hardcore geeks and even casual geeks, but it is not ready for the average consumer or user, and it has a long way to go.
What better way to compare the two than by keeping score? Now, keep in mind Ubuntu is going to win in the end, as Linux will for me. I plan to stick with it and hammer away at it until I’m firmly on the “other side.” It might be painful, but this is just part of learning and becoming a better geek (read: IT professional).
The install, as stated before, was amazingly fast compared to any other OS I’ve installed. I literally thought I was still running the livecd portion of Ubuntu when I first rebooted (Ubuntu +1). However, the partition options leave a lot to be desired. While Windows keeps partitions simple, Linux has always been arcane with them, and knowing how many you need and how to carve them up is, in my opinion, the single biggest deterrent for new users trying out Linux. Right from the start, it is complicated and difficult and unknown, and many people put it down right there without really giving it a true try. Ubuntu is an all-or-“know-it-yourself” install: either it takes the whole disk or a pre-made partition, or you have to know what you’re doing. Sadly, I don’t, and many people won’t either (Windows +1).
So, last night I went about making sure I could do the typical things I want to do. I first updated Ubuntu, which, like Windows, prompted me with a nag screen saying there were updates. Nice! The updates were relatively quick considering there were 170+ of them, and of course required no reboot (Ubuntu +1).
Synaptic is really cool, and I’m happy with it. One bad point, though, is that you’re stuck with Ubuntu’s packages, and you need a little more knowledge to open up the universe and multiverse repositories for more downloads. But I have always liked having a central repository for many programs, all of which are free (Ubuntu +2, Windows +1 [how many people really catch the universe/multiverse repositories without extra work?]). My biggest complaint about Synaptic, though, is how easy it is to do something and say, “omg, wtf did I just do?” I did this by selecting some packages and not paying close attention to the required packages or the things that needed removal. After walking away to pop in a movie, I came back and hit “Apply,” only to see Ubuntu quickly remove some things. I have no idea what they were, but I hope they were not important. I have learned, however, that I really should do one thing at a time and scribble down what gets added and removed, at least until I’m comfortable with this process. For reference, here is how I opened up universe and multiverse:

sudo gedit /etc/apt/sources.list

# add these lines to sources.list:
deb http://us.archive.ubuntu.com/ubuntu/ dapper universe
deb-src http://us.archive.ubuntu.com/ubuntu/ dapper universe
deb http://us.archive.ubuntu.com/ubuntu/ dapper multiverse
deb-src http://us.archive.ubuntu.com/ubuntu/ dapper multiverse

# save, then refresh the package lists:
sudo apt-get update

And this is the second biggest issue people, myself included, have with Linux: installing software. Windows has a huge boost here with automatic installers that take care of everything. You don’t usually need to unzip things (and if you do, it’s easy). You don’t need to compile from source code. You don’t need to hunt for packages that work with your OS flavor (unlike Linux distributions, there’s really only one active Windows flavor at any time now, not counting Server). You don’t need to wonder what the executable is or how to run it; it appears automatically in your Start->Programs list. Ubuntu is not so helpful all the time. I installed about 10 different packages, from kismet and airsnort to lxdoom and tcpdump. Over half of them installed and were then promptly hidden from me: they were not in the Applications list, nor could I easily find them in the filesystem. Granted, most of the ones I have since found seem to be command-line apps, but this is a huge hole for most casual users. “I installed lxdoom to play it, now it doesn’t appear, what gives?” (Windows +1) At least Synaptic takes care of linked packages, the things you need before the thing you want; trying to track those down and align the planets just to install one program can otherwise be a huge headache in Linux. (Windows +1)
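For what it’s worth, the command line can answer the “where did it go?” question even when the menus don’t; for example, with kismet, one of the packages I installed:

# List every file the package installed, and pick out the executables.
dpkg -L kismet | grep bin

# Or search the repositories by keyword when you don’t know the exact package name.
apt-cache search wireless sniffer
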
So, an OS that is going to be a “Windows killer” had better do some basic things without fuss. Ubuntu’s wireless works, Firefox is installed by default, and Thunderbird is installed by default, though it is not the default mail program and does require being added to the Applications list (the equivalent of the Windows Start->Programs list). I installed GAIM without a problem and promptly got on my IMs without issue at all. (Ubuntu +1, Windows +1)
I then popped in a DVD. Totem, the default media player, threw an arcane error. OK, I didn’t want Totem anyway. So I installed mplayer. It also threw an error, even more arcane than the first. I then installed Ogle and Xine, both of which also could not read my DVDs. Wow. I did some research, and it turns out encrypted DVDs are just enough of a closed format that Ubuntu decided not to include the ability to play them out of the box, or even after installing new players. In fact, I couldn’t find the libraries I needed in Synaptic. D’oh. I found libdvdread3 just fine, but libdvdcss2 had to be downloaded from some guy’s FTP in Sweden. (Windows +1)

# use Synaptic to get libdvdread3, then run its helper script to fetch libdvdcss2:
sudo /usr/share/doc/libdvdread3/examples/install-css.sh

Whoa, wait a minute here…what version did I just download? What command did I have to run to make it work? I have to download some weird library that may or may not be 2 years old from some guy’s FTP site in Sweden? I did more searches and found more German and other foreign sites, none of which looked commercial. This is exactly the kind of thing we, as security people, work to avoid on Windows: downloading from sites that make us stop and get paranoid. (Windows +1)
After putting in the new library, though, all the players could play my DVDs without problem (I think I like the Xine interface best, but sadly it doesn’t fill my whole screen…which may be a graphics driver issue rather than the player…). However, this sort of hassle, and the *need* to Google up and understand uber-geek Linuxspeak to get things to work, is going to keep Ubuntu from being used by my parents and friends. (Windows +1)
So that is where I stand right now. I can do most of the things I want to do on a daily basis (email, web, IM, and accessing my external drives for media like music, and dvd playing [with effort]), but where Ubuntu makes up ground on Windows in the install and ease of deployment, it loses ground in the places Linux has always lost ground: packages, not doing the necessary things out of the box, and needing to put on the geek cap just to work around things. Does Windows necessarily do this better? Perhaps not, but at least 99% of the computer-using world is used to it.
The score is about what I expected, with Windows leading at this point, because this has all been the hard, up-front stuff with Linux and Ubuntu so far. Windows 8, Ubuntu 5.