a general cynical moment

[Update 3/19/09: I’m cleaning out some unfinished posts that I didn’t want to lose, so I’m just publishing them as is. This post is a bit of a rant from summer 2008, but I feel I wanted to make some points about how IT may talk all pretty about ‘aligning with business’ but really we’re probably always going to be stuck in some ‘silo’ of some fashion no matter what. Also, entities are simply not doing the simple security things correctly. This compounds the ‘silo’ problem… I wonder if it would help if ‘business aligned with security?’]

We got talking in our team meeting this morning at work, and it turned into a bit of a cynical start to the day. That is one thing about being in IT and being security-conscious (or being in security): you can become cynical and negative extremely quickly, and often. At least for many of us, we keep the venting in the back rooms.

We were talking about some of the breaches that have occurred in recent years and how they are still only slowly pushing proper security measures. Interestingly, it seems that most, if not all, of the media-covered breaches are the result of user stupidity or very simple mistakes on the part of the victim company or person. Perhaps the really talented hackers are not getting caught, and maybe a lot of the more subtle attacks are being buried under corporate bureaucracy and fear, but I truly think most incidents are born out of mistakes or sheer opportunity for the attacker.

This means that a depressing number of these breaches were preventable. And a depressing number of them leave us corporate goons highly frustrated, because we talk and talk and demonstrate and warn about the same issues. Not much of this stuff is new to anyone with even half an ounce of common sense.

Ask your employees who is responsible for data security, and I would be willing to bet that half or more will say IT. Another small slice will act smart and say everyone, but they’re just supplying the right answer without really believing or living it. Very few will answer and truly believe that it lies with everyone. So that puts the burden on IT, for the most part.

Companies complain when we work in a silo or a vacuum, or do things on our own that affect other people's jobs without their input, no matter how inane or useless that input may be. Which is weird, since we are supposed to do some things on our own, like, you know, security.

We can often complain about the lack of action or preventative planning in the upper ranks of a corporation. "It won't happen to us," is a common refrain, whether explicitly spoken or implicitly implied (I wonder if you can explicitly imply something…). But the one that really annoys me is the statement, "We already have adequate security." I really hate that, especially when you ask the IT guys whether we have adequate security and we immediately either give an "I-know-better" smirk or look suspicious, wondering what politico-business trap we're about to fall into based on our response. Top-down, there is a gap where eventually a C-level just doesn't know the nuts and bolts and lives in their own little reality. Not all of them, but that is a very easy cloud to fall into, especially if they feel they should lead by example and trust their employees without validating that trust with anything more than, "it's never happened yet!"

the lost battle for the desktop

[Update 3/19/09: I’m cleaning out some unfinished posts that I didn’t want to lose, so I’m just publishing them as is. This post was written nearly a year ago.]

update: Odd, there was just talk about this; maybe I was influenced in a roundabout way by this discussion at Slashdot: Should Users Manage Their Own PCs? (read the comments!)

Also more here.

There is increasing talk about worker angst with IT teams locking down computers and acting as dictators when it comes to adding software to their computers. Thin clients and terminals are suddenly becoming sexy again. Likewise, most office workers seem to have their own array of gadgets and devices that they want to use, IT policies be damned.

Rather than tackle that debate which swings both ways, I want to play devil’s advocate and assume the direction is going to be taken where employees have full rights on their own fat systems. Let’s say I work at an SMB that values employee happiness and creativity (software shop, video game shop, design group, etc). And the decision has been made that employees are responsible for the software on their own systems, although the company itself may front the cost of any needed software; pirating is not allowed.

What does this mean to security of that organization? I know plenty of security geeks will go into immediate defensive mode, but I’d rather delve into what approaches are needed in such a situation.

The assumptions and setting:

  • Users have administrative rights to their systems.
  • IT also has administrative rights.
  • Users won’t install pirated or illegal software; instead, legitimate purchases get comped by the org.
  • Servers are still the realm of the IT teams, so let’s just not think about them for now.

What are some issues that can arise in such an environment?

  • Systems may slow to a crawl as they become infected with crap upon crap.
  • Internal and external networks may slow to a crawl or become unusable due to worms, viruses, scanners, and bots; both internal-only congestion and externally targeted congestion.
  • Information may quickly get stolen, à la the malware that installs itself and steals your AIM/WoW/bank account and password, whether actively, on a trigger, or via keylogging.
  • IT may have to answer questions and provide support for non-standard programs across a huge range of possibilities.
  • Users may install tools that have malicious side effects, especially if they have a laptop that goes home. Things like BitTorrent and p2p apps tend to pop up on such systems.
  • Most systems will have one or several IM programs installed and in use, opening the user to phishing/spam, a potential avenue to send information beyond the corporate garden, and lost productivity if abused.
  • Users will use their personal webmail accounts, opening up the same avenues.
  • Development or creation processes that live on a user’s computer may turn out to be impossible to move to a server. “You want *what* installed on the web server?!”

And here are some measures to pursue. These are not in any specific order.

  • A strong perimeter with aggressive ingress and egress rulesets with active logging on egress blocks. Yes, many apps will just tunnel through port 80, but that doesn’t mean we should forget the floodgates.
  • Strong internal perimeter to protect the DMZ and the suddenly rather untrusted internal LANs. Isolate print servers, file servers, and others from userland, letting only what is absolutely necessary past.
  • Strong internal network monitoring to identify traffic congestion and unwanted communication attempts.
  • The staff to attend to the alerts this stronger network posture will require. With such an untrusted userland network, bad alerts can’t sit for very long, and there may be plenty of them.
  • Consistent and regular user training about security concepts.
  • Regular communication amongst employees and IT about how to properly solve various problems, use programs more intelligently, and so on. If one program can solve a problem but everyone just uses what they already know, opening communication may get everyone on a standard page. It certainly beats everyone trying the same 10 programs to solve the same problem. [update: I’m not sure what I was saying here…]
  • Foster an open environment where users can talk candidly with IT and security, without expecting laughter or a quick rebuke. This is going to be much like the TSA assuming every passenger is a threat.
  • Deploy an aggressive, automatic patching solution to keep the OS and major applications patched as much as possible.
  • Have a strong imaging solution and architecture in place. People mess up their computers now and then and require them to be re-imaged. People who control their own computers will mess them up even more.
  • Have strong network and file server anti-virus or malware scanning. Chances are pretty good that users will store their backup installs on your file server. Try to separate the screensaver crapware from the necessary stuff.
  • Be proactive in supporting the software inventory needs of your users. If a user has a piece of software they had the company purchase, keep an inventory or even a backup of the install disc and serial under lock and key. This is far better than letting users manage (or steal! or lose!) their own copies. A Photoshop disc left on a desk is a pretty easy crime of opportunity.
  • Plan to have strong remote management of users’ systems, especially when it comes to inventorying things such as accounts, installed software, running processes, resource consumption, and gathered logs (see the sketch after this list). You likely won’t parse these out regularly, but some deserve alerts, such as new user accounts appearing.
  • Proactively offer to assist users with any PC questions they may have. Often, users have lots of little annoyances they live with, but offering to help with the fixable ones can often go a long way towards satisfaction not just with IT but their job as well. If a system is running slow or they don’t understand why a window displays as it does, assist them with fixing it.
  • When assisting users, take extra effort to include willing users in your troubleshooting. This not only opens lines of communication, but also teaches them as you go. Maybe next time they’ll already have checked for that rogue process before you get to their desk!
  • It might be wise to evaluate DLP technologies. While administrative rights on the desktop mean many forms of malware will simply disable AV before it can intervene, many users are far less sophisticated when they purposely or accidentally move important data out of the safety of the corporate environment to an outside entity. DLP might be enough to stop all but the truly crafty and determined insiders, and that might be risk reduction enough to deal with the determined ones on a case-by-case basis.
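For the remote inventory bullet above, here is a minimal PowerShell sketch of the sort of thing I mean, assuming WMI is reachable on the user systems (the machine names are made up; treat this as a starting point, not a finished tool):

$machines = @('PC01','PC02','PC03')   # hypothetical machine list
foreach ($pc in $machines) {
    # local accounts: diff this output over time to alert on new accounts appearing
    Get-WmiObject Win32_UserAccount -ComputerName $pc -Filter "LocalAccount=True" |
        Select-Object @{n='Computer';e={$pc}}, Name, Disabled
    # installed software as WMI reports it (slow, but it gives you a baseline)
    Get-WmiObject Win32_Product -ComputerName $pc |
        Select-Object @{n='Computer';e={$pc}}, Name, Version
}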

Sadly, the reality is that a company willing to grant local administrative rights is likely too small to meet the needs listed above without some assistance.

responses to concepts for managers to understand

Mubix has posted his summary on things we wish our managers would learn, which I commented about the other day.

The #10 entry was about company buy-in and had only 1 vote, but I wonder if that single issue may drive a majority of the rest of the problems. It might not be that our managers don’t get these topics; they may be in the same boat as we are, feeling unsatisfied with current results.

If there is any bias, it might come from how we read the question and how far up the chain our manager is. If my manager were the CTO/CSO/CEO, I think I would answer more along the lines of #10. Maybe a better question would be, “What one concept would you want your company leaders to understand?” That would probably weed out the technical responses and broaden the basic-concepts ones.

Or maybe what would be your security-related mission statement (and maybe a few supporting statements in case you think of mission statements as “make the world a better place”) for your company?

from vulnerability to root in a few taps and clicks

SANS has published a story on an attack that bypassed a .NET/ASP web front end and then exploited a local privilege escalation. The tools mentioned can be found here: Churrasco (has the full description), Churrasco2 (updated for Windows Server 2008), and ASPXSpy (a .NET webshell). Note that McAfee AV does detect the file aspxspy.aspx as naughty.

…developers wonder why I don’t let their apps write locally…or publish directly since my replication removes rogue files automagically…
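That replication is nothing fancy, by the way. A hedged sketch of the idea (the paths are made up): mirror a known-good deployment source over the web root on a schedule, so a dropped webshell like aspxspy.aspx disappears on the next pass.

robocopy d:\deploy\site1 d:\wwwroot\site1 /MIR /R:1 /W:1 /LOG+:d:\logs\replication.log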

a security incident when you have no security posture

I didn’t expect to be quite as entertained by this story as I was. I apologize for not remembering where I got linked to this, but CSOOnline has the first part of a two-part story on how a company that suffered a data breach did everything wrong. These are the sorts of stories that need to be told. Repeatedly. I don’t care if the authors are anonymous and specific details are scrubbed to protect the guilty and victimized. This sort of stuff shares details, and that’s what we continue to need. We need details to learn from, and we need tangible illustrations of the risk to show others.

…They lacked the equipment to detect a breach and, even if they did, lacked the human resources to monitor such equipment. He told us his staff consists of one full-time employee and one half-time assistant who is shared with the help desk… [ed.: a company of 10,000 users, 127 sites…]

“What logs? Remember that each business unit is different, but here at corporate we don’t have logs. In fact, logging was turned off by the help desk because they got tired of responding to false alarms. Help desk reports to the IT director, not to security.”

Everything starts with a basic policy from senior management that says security is important. From there flows talented staff who aren’t going to just disable pesky alerts or be pulled in the IT operations/support direction 100% of the time. And so on…

it’s ok to break into things if you’re just demonstrating

Speaking of ethics, the BBC decided to do some of its own hacking for a show, Click.

The technology show Click acquired a network of 22,000 hijacked computers – known as a botnet – and ordered the infected machines to send out spam messages to test email addresses and attack a website, with permission, by bombarding it with requests.

Click also modified the infected computers’ desktop wallpaper.

If the BBC doesn’t get hurt by this, the lesson we can all learn is: Make sure you have a hobby in journalism/reporting/television. That way next time you get caught cracking into something, you can just say it is for research and part of a report you’re doing. Then we can all laugh and share a pint because it’s all good then!

Oh, and next time I accelerate my fist into your abdomen, just let it be known it was without criminal intent. Over and over. Maybe I’ll laugh during that too, to show I have no ill will in it.

powershell: executing remote scripts from a script

This is a complicated issue and may only make sense to me, but I’d like to document it for future reference. I’ll try to simplify as much as possible and stick to the crux of the matter: remotely executing powershell scripts from powershell scripts.

Pretend I have 3 web servers. On each server a powershell maintenance script runs perpetually (infinite loop). If I have a new web site to build, I edit a text file in a network folder. The maintenance scripts see this and execute a “createsite” script. Sometimes, because IIS may need to be stopped (downtime), I need these scripts to run in an orderly fashion, so one maintenance script is always a “master” of the others.

I’ve finally gotten sick of having a perpetual script running on each server (using resources and requiring an interactive login). What I want is one server which coordinates the execution of all my other little task scripts on the 3 web servers. Yup, I need to figure out remote execution!

Yes, Powershell v2 has decent remoting capabilities, but I can’t effectively leverage them quickly. We still use IIS6, my web servers have Powershell v1, and there is a lot of rewrite time for the scripts to update them properly. Instead, I’d like to quickly get going with this architecture with as little effort as possible.

I’ll use psexec and Powershell.

First, I need to make sure the account that Powershell will run under has a profile file set up. All of my scripts run out of d:\setup\scripts. If I want to start a remote powershell under a user and be able to relatively reference other scripts inside my first one, I need that user’s profile to start in d:\setup\scripts.

Create the file profile.ps1 in ..\documents and settings\script user\my documents\WindowsPowerShell\. The contents:

set-location d:\setup\scripts

This is the call I do on another server with an account that I designate as my web installer:

./psexec \\WEBSERVER1 -u DOMAIN\USER -p 'PASSWORD' /accepteula cmd /c "echo . | powershell -noninteractive -command `"& 'd:\setup\scripts\createsites.ps1'`""

Whoa, wait, what’s the "echo . |" thing in there? That allows me to see the progress of my script, and properly lets psexec work on the target machine so my calling script can continue on with life. I found that just calling a powershell instance led to powershell/psexec never executing properly.

Did I need that -u and -p declared? Strangely, I did, even though the script was running as that user. If this wasn’t declared, I don’t think the Powershell profile was loading properly.
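With that working, the coordinating script ends up as little more than a loop over that call. A minimal sketch, with assumed server names and nothing fancier than stopping on the first failure:

# run createsites.ps1 on each web server in order, one at a time,
# so IIS stops/starts never overlap
$servers = @('WEBSERVER1','WEBSERVER2','WEBSERVER3')
foreach ($server in $servers) {
    ./psexec "\\$server" -u DOMAIN\USER -p 'PASSWORD' /accepteula cmd /c "echo . | powershell -noninteractive -command `"& 'd:\setup\scripts\createsites.ps1'`""
    if ($LASTEXITCODE -ne 0) {
        Write-Output "createsites failed on $server (exit code $LASTEXITCODE); stopping."
        break   # leave the remaining servers untouched
    }
}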

Questions?

Why don’t I use functions in my maintenance script instead of a separate script for my major tasks? I have many other pieces beyond a “createsites” task, some of which I call separately anyway. I’d much rather manage smaller scripts than one large beast of one. I’m not a software developer. 🙂

Why not use Task Scheduler? Let’s just say I don’t want Task Scheduler running on production web servers. And I want all my web servers to be managed the same.

disclosure risks when you trust others to host your data

While I’m not all caught up on reading blogs, I hadn’t seen this yet, so thought I’d share. This report from TechCrunch of a Google Docs issue, where some docs were shared out beyond the intention of the authors, can be filed under “illustration of why you lose security control when using other people’s services.” While this issue may not have been leveraged by anyone, it is just one in the inevitable series of issues such services will create, especially as they try to break into the enterprise markets.

In a privacy error that underscores some of the biggest problems surrounding cloud-based services, Google has sent a notice to a number of users of its Document and Spreadsheets products stating that it may have inadvertently shared some of their documents with contacts who were never granted access to them.

It is not my choice to confuse cloud computing with Web 2.0 in the linked post…

One thing I’d love to see: your exec team hosts and shares highly confidential docs on Google detailing an upcoming, confidential takeover. Google decides to start serving your exec team AdSense ads or search results plastering up the name of the competitor you intend to purchase. Or the takeover mediators…

on the general ethics of tones in hacking tutorials

Mubix recently posted a how-to on OzymanDNS; basically how to create an SSH tunnel over DNS.
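If memory serves from that write-up, the moving parts boil down to a couple of commands. The domain below is a placeholder and my recollection is rough, so trust his post over this sketch:

# on a host you control that is the delegated nameserver for tunnel.example.com:
./nomde.pl -i 0.0.0.0 tunnel.example.com
# on the client, proxy SSH through DNS queries:
ssh -o ProxyCommand="./droute.pl sshdns.tunnel.example.com" -C user@localhost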

And he has now posted thoughts on whether tutorials like this are unethical (and on the situation he’s in himself as a Hak5 host, having mentioned how this can circumvent hotspot captive portals). I highly suggest reading both posts, as he makes great points (and I love me some simple tutorials).

My position is probably fairly easy. I’m fully in favor of such tutorials, but I do appreciate, and add to any advice I give, any information on whether something is potentially illegal or something you can get fired over if you do it at work. Sure, I hate those “only for educational purposes” blurbs in almost every 2600 article as much as anyone (know your audience!), but they are useful when someone truly doesn’t think about those consequences.

Sure, some teens watching Hak5 might turn into tomorrow’s black hats, but they may also turn into tomorrow’s security geeks because of the information they received in pushing systems to and beyond their limits, or challenging controls that are not fully secure, or simply trying out something new that sparks new ideas.

I appreciate that Mubix thought it over, and I do as well whenever I give advice. However, if we hold back every time that sort of ethical worry comes up, we’ll continue down the road of not sharing enough information, which I believe harms our collective knowledge and security.

upgrade your wi-spy for a discount

Thanks to Douglas Haider for posting that MetaGeek is offering a “turn-in/upgrade” deal for Wi-Spy users. If you send in your old Wi-Spy, you’ll get a discount on a newer one. That’s not bad, especially since I got mine back when they were still $99, I think.

However, I don’t use it nearly as much as other things I could buy at that price point (yay opportunity costs!).

Upgrading to the Wi-Spy 2.4x is $200, and might be worth it. Paying $400 to get a DBx, which really just adds 5 GHz monitoring, might be a bit of a stretch, especially since I have yet to encounter a wireless LAN operating in that range. A full product comparison is available.

So, a total of about $300 for a device like this is right about the borderline for me. It is nice and really awesome to have when you have a use for it, but otherwise tends to sit around doing nothing special. If the device itself were just a bit cheaper, I would consider this a no-brainer. Even still, I’m really considering the 2.4x upgrade…

another rambling non-technical postwhinerantsigh

I do Get It that IT needs to align with business. But that doesn’t mean I think everything is then rosy in the house and all the puppies are happy. It’s an easy thing to say, but a hard thing to adhere to (or easy, if you like statistics and can twist anything into a business value-add!).

My boss’ boss recently related a story about a VP who was tasked with turning around a company that had the right technology but the wrong business strategy. This included constantly evaluating whether the technology (and projects) is serving the strategy of the business.

That’s great, but to me it reinforces the idea that you only do enough in IT to accomplish the job, and that’s it. You let the rest languish and most likely don’t do any housekeeping. Housekeeping includes the things that make security work: logging, alerts, detections, testing to make sure things you put up 6 months ago still work, audit settings, patches and updates (that don’t add any new features you care about), etc.

Yes, that is a way to go. For example, you don’t need absolutely spotless event logs on your Windows servers. But it is also a way to foster a completely reactionary culture in regards to existing technology. I think that approach works better for new technologies and projects.

It just means that someone has to value security and housekeeping. And I’ll always come back to the idea that so few people value security personally, judging by the lack of security measures in their own homes, let alone in the businesses they own, until they suffer for it. It’s like finding your God only when you’re deeply fearing your own mortality (or feeling excessively guilty about something and needing an explanation).

the security concept you want your boss to understand fully

Mubix posted an excellent question via Twitter today. Twitter promptly decided to poop out on me…but even so, I thought it a question worthy of blogging about.

mubix: Polling the audience (serious answers please). If you could get your boss to understand one security concept fully, what would it be?

Take a few moments to think about that one. Grab a stress ball, sit back and sip some coffee, whatever it is you do when absorbing something, but just take a moment to think.

Lots of things come to mind. Trust no one! Audit and change management! Patch! Hire, retain, and train competent staff to do the heavy thinking! You can never have too much information (just bad consumption of it). Support the business securely.

I finally posted back the following:

@mubix Hard question, and worthy of a blog post. I’d say “You *will* have a security incident. Plan for it and plan to find it.”

I was hoping for something more profound, like “Wax on, wax off,” something that would encapsulate a whole zen-like frame of mind where all the security pieces fall into place. Alas, this was my contribution. At least I feel it states one of our fundamental laws of security, and it sets the tone to properly detect, monitor, check, audit, and respond to incidents.

a little late to the ie7 horrible interface party

I’ve casually used IE7 on a test work machine and on my gaming machine, i.e. not very much and certainly not enough to play around with the interface. Last evening at work we rolled it out to all desktop users. Holy sweet mother is that top bar a cluster of a mess! I normally wouldn’t mind it if I could fix it, but IE7’s customization is pretty much half-assed.

  • Optional menu bar? What are they smoking?
  • Can’t move the menu bar to the top where it belongs without a registry edit?
  • Can’t remove the Search box without a registry edit?
  • Can’t drag pieces up into the top bar?
  • The Home button is now broken away from the Back/Forward/Reload/Stop buttons?
  • Can’t edit or move the top bar?
  • Star (Favorites) buttons I can’t remove?

Again, I wouldn’t mind it if I were allowed to reset them all and move and disable what I want, but I don’t see a way to make this look decent at all! 🙁 I tend to be as minimalist as possible with my browser while still being functional: small top bars, only 2 rows, and nothing I don’t otherwise use regularly. I’m a computer user and thus fine using hotkeys or Menu bar dropdowns for occasional stuff. For tabbed browsing and a bar of Links that I only use on a work system, I’ll make do with 3 rows of junk on the top. IE7 has me stuck with 4 at the moment.

And while I’m not against registry edits, it is obvious Microsoft did not intend for these options, and I dislike adjusting a corporate browser away from the standard settings.
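For the record, the registry tweaks in question, as best I remember them. This is a hedged sketch; the key paths and value names are from memory, so verify them before touching a corporate image:

# move the menu bar above the address bar (believed to be ITBar7Position = 1)
New-ItemProperty -Path 'HKCU:\Software\Microsoft\Internet Explorer\Toolbar\WebBrowser' -Name ITBar7Position -Value 1 -PropertyType DWord -Force | Out-Null
# hide the built-in search box (believed to be NoSearchBox = 1 under a policy key)
New-Item -Path 'HKCU:\Software\Policies\Microsoft\Internet Explorer\InfoDelivery\Restrictions' -Force | Out-Null
New-ItemProperty -Path 'HKCU:\Software\Policies\Microsoft\Internet Explorer\InfoDelivery\Restrictions' -Name NoSearchBox -Value 1 -PropertyType DWord -Force | Out-Null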