defining information technology and its relation to data

[Note: I accidentally left this in my unpublished bin for a couple months. I can see why, as this is a bit unpolished and confusing, but I wanted to post up my thoughts on “data-close” and “data-distant” objects and how they relate to the changes in security and even IT consumption in general over the years.]

Last week I posted about Bruce Schneier’s latest essay on product suites and the course of security purchasing. I see Bejtlich has also posted, and has some really good comments going on it. Two thoughts kinda struck me.

First, Bejtlich says, “…what are the ‘crown jewels’? It’s the data, not the hardware and software.” Second (Bejtlich did not say this), trying to get an outsourcer to manage one’s security or even IT as a whole is a lot like Nicholas Carr likening IT to a utility like electricity.

So can some utility provider manage a company’s data? I’d have to say I don’t think so, unless the company is so cookie-cutter that its data offers zero differentiation from its competitors’.

From there, we can create a spectrum with data on one side and electricity on the other. Data, the applications that gather/hold/report the data, applications that interact with others to glue all that data together into something useful…on up to the very commoditized desktop systems, networking hardware, 1s and 0s on the wire, the electricity powering it, and the Internet access. I can describe this spectrum as “data-close” objects and “data-distant” objects.

I can also explain one aspect of the rise of web applications. Web applications can be pretty specific to a company, especially the internal apps. They are pretty “data-close.” Desktop systems are “data-distant.” The configuration or maintenance or even presence of a desktop machine is rather unimportant unless it needs a lot of fat applications to consume data. Since a web browser is in every OS (we’ll just assume this, since that’s really the case in any business endpoint system), we have now moved the data-consuming app closer to the data where it should be, leaving the guts of the desktop system to be “data-distant,” where it should be. Hard disk encryption continues this trend, since the hard disk is a bit more “data-close” than the rest of the desktop system.

risk mgmt and freak helicopters falling from the sky

From HardOCP I tonight read news that some kid wearing earphones died because he didn’t hear a helicopter falling toward him, and now the issue of safe headphone use has been raised. Huh? I normally wouldn’t post something like this, but the discussion potential here is both disturbing and hilarious (definitely fark fodder and sure to be revisited for the 2008 Darwin Awards, despite not entirely being the kid’s fault).

I’ve taken a few snippets from the discussion thread over on HardOCP:

Why do people look to freak accidents as examples for change?

I mean, if he had been mowing the lawn would they be arguing that lawn mowers are too loud[?]

An RIAA representative said that a fleet of 400 choppers have been acquired and you should expect similar actions taken against people suspected of violating their rights.

Yes, wearing headphones turned up too loud while wandering around is a personal choice; whether the threat is cars, other pedestrians, or bikers sneaking up from behind, people typically choose to give up their sense of hearing for that period of time. At the very least, a kid wearing headphones too loudly is only a danger to himself.

At the end of the day, a helicopter fell out of the sky and killed someone. Really, what the fuck are ya gonna do?

office lock ninja

Sometimes the obvious does make sense. I-Hacked.com shows how to do some office lock-picking if you see a key lying out in the open. Snap a picture of the key along with something to provide scale, cut the outline out of an aluminum soda can, insert, and open! (I think I’d add a tension wrench, which can also easily be made from common office items, for instance a paper clip or the clip part of a pen.)

marcus ranum caught cybering

I missed this the first time around, but I see Marcus Ranum has a couple postings on the Tenable blog. His first talks about cybercrime. The second talks about cyberterror. I think this link will end up holding whatever he posts.

I really like the first piece on cybercrime, although I think he misses one aspect that should be brought up: efficiency. Stealing data 30 years ago would have required reams of printed paper or boxes and boxes of tapes or discs. And that might have impacted just a few people. Today, a USB stick can contain millions of important records that are all worth money. Just like software pirating, these issues have always been around, only in the past they were so inefficient that we could accept those risks or mitigate them indirectly. Efficiency plus Ranum’s Automation make this a huge deal. Criminals can steal huge amounts of digital property that we attribute high value to.

re: twitter foray

So I’ve given Twitter a chance, and I think I’ll let this novelty slide. I find the following/followers thing a bit disconcerting. It is the equivalent of lots of people in an IRC channel, but everyone having rather extensive ignore lists so that half of what is said is lost on half the people present. Almost turns into a virtual ego trip or popularity measure, sometimes.

I think if I had more real life friends on Twitter it might be far more worthwhile to know what’s going on in their lives at 7:34pm at night, but it just feels like another tacky social tether to the computer. I’ll stick to IM and IRC for that, I think. 🙂

Update: I reserve my right to change my opinion on this. I do much of my work right now on a laptop and at a workplace where I can’t necessarily have something like a Twitter client running all the time as a distraction. Catching up on a full day of tweets after the fact is tedious. But now with the gaming rig built, my old machine will become an always-on desktop system, and running a Twitter client that pops up new messages as they come in may change my experience. I doubt it, since it offers nothing beyond what I get with IRC/IM other than fewer hardcore geeks and more links, but it could.

searching for a new portable music player

I am looking for a new portable music player. I have my trusty 4th gen 20gb ipod sitting in my car most (99%) of the time, hooked into my stereo. I have a nagging desire to get a second music player that I can load more music on, but also keep on my person rather than in my car.

  • I don’t mind keeping the 20gb ipod in my car permanently, so I don’t need my new one to be compatible with my stereo.
  • I have over 40gb of music.
  • I don’t use nor want to ever use iTunes or some other “marketplace” software. (die DRM!)
  • I use Ubuntu Linux and would much prefer to use Linux to manage my new device.
  • I currently use Winamp on Windows to manage my 20gb ipod. (I’m simple.)
  • I don’t plan to browse photos, cover art, or crap like that. Movies, maybe, but I don’t need that. My need is just a music player and simple shuffled playlists based on genre.

Does anyone have any suggested music players and Linux software (probably Amarok and its clones) that I should look at? Sticking to an ipod classic is definitely an option.

would you snoop on britney spears’ records if you could?

Another link to ongoing stories coming from the UCLA Medical Center where employees improperly accessed confidential medical information of celebrities and even co-workers. I consider this situation an important illustration that policy does not ultimately work. The article mentions 68 persons snooped on 61 personal records. That means this is not an isolated incident. The article also mentions the sharing of passwords. Whoa.

Human curiosity or even greed (if any info was sold) beat policy. I believe such impulses will always beat policy, in fact. These are crimes of opportunity, and technology/process should be limiting that opportunity. Yes, that might impact people’s ability to get some things done, but there is always that balance between getting things done any way you can and getting things done in a secure, trustworthy manner that limits unlawful opportunity.

However, in the end, someone has to have access to the information. Usually several someones, so they can make decisions or even perform clerical work. This is where audits, logs, policy, managerial oversight, and hiring practices come into play. Does someone need to be watching the audit logs and reporting possible violations? Maybe, maybe not, but that could certainly be a measure for an organization that really needs to provide a high sense (or real state) of security.
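The audit-log idea above can be sketched simply: flag anyone whose record accesses lack a documented reason. Everything here — field names, reason codes, the threshold — is invented for illustration; a real medical-records audit trail would be far richer than this toy.

```python
from collections import Counter

# Hypothetical audit-log entries: (employee_id, record_id, reason_code).
# A reason_code of "none" means no documented justification for the access.
audit_log = [
    ("emp01", "rec_vip", "treatment"),
    ("emp02", "rec_vip", "none"),
    ("emp02", "rec_coworker", "none"),
    ("emp03", "rec_vip", "billing"),
    ("emp02", "rec_other", "none"),
]

def flag_suspicious(log, max_unjustified=1):
    """Return employees with more than `max_unjustified` accesses lacking a reason."""
    unjustified = Counter(emp for emp, _, reason in log if reason == "none")
    return sorted(emp for emp, count in unjustified.items() if count > max_unjustified)

print(flag_suspicious(audit_log))  # ['emp02']
```

Even something this crude would have surfaced 68 people hitting 61 records they had no business reading — the hard part is having someone actually look at the report.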

automatic security tools and chinese p2p info leaking

Couple articles for security fodder.

The 7th Cyber Defense Exercise recently took place, which places networks run by various military departments under attack by the NSA in a controlled, scored exercise. I found this nugget an interesting observation:

The choices in software tools for responding to any attack really boiled down to “automatic” versus “custom,” says Eric Dean, a civilian programmer and instructor. He adds that while automatic tools that do most of their own work are certainly easier, custom tools that allow more manual tweaking are more effective. “I expect one of the ‘lessons learned’ will be the use of custom tools instead of automatics.”

And a classified Hong Kong “watch-list” was leaked out onto the Internet. It appears a user brought some classified data home and stored it on a computer running a popular P2P application (Foxy). That’s a nice series of poor decisions.

The blunder occurred after a newly-recruited immigration officer working at the Lok Ma Chau border point took home some old classified files to study without authorisation.

His computer contained the “Foxy” programme and when he connected to the internet, the files were distributed without his knowledge.

Both stories came to me by way of the Infosecnews service.

where did the top 50 game lists go?

I like gaming, PC games, consoles, etc. I’ve been a console player since my Atari 2600 and was a *huge* Nintendo nerd. I also had a subscription to Nintendo Power shortly after the magazine debuted.

One of the things I most liked in Nintendo Power was the ongoing Top 100, 50, or 25 games list (its length changed over the early years). These games were ranked based on reader-submitted votes and were not always the obvious choices. I found a few gems that way, most notably Crystalis, which appeared in the top 50 regularly for *years.* I finally rented a copy of this popular but rather rare game and absolutely loved it.

Modern mags don’t seem to have this anymore, or if they do it’s so small I miss it! I don’t care which games are selling the most, or which have had the highest editorial ratings for the past 6 months, or some aggregate score as rated by gamers on the web site. I would like to see ongoing reader-submitted lists of favorite games per platform every month.

Fine, I’ll concede this approach breaks with PC games (I could still vote for Doom 1), but in the console world which flushes the toilet every few years (not including backwards compatibility), this works nicely.

my tuesday rant on developers

Now, let me start out by saying that I’m not a racist or stereotyping kind of person. But I am using this as an example of the sort of developers I am supporting lately. This isn’t my trouble ticket (it’s for the desktop folks), but I did see this come in from a software (.NET) contractor of ours:

> This is Brahma
> My pc RAM capacity is 2.00 GB. But Speed is very slow ,could you
> make it Increases my pc RAM. It is very helpful for me.

These are the software developer contractors we have coming in and out through a revolving door pretty much. This is infuriating for me, systems/network support, for several reasons.

1) Most of these contractors know a little bit about coding .NET (for instance). But that’s it! DNS, IP, IIS, SQL Reporting Services, even SSL certificates are completely foreign concepts… I know they are being hired to fill a cube and poop out some code, but it really is frustrating to know most of them cannot relate to or understand what impact their code may have elsewhere or what their code depends upon. Seriously, I could only hope someday to have such a small sliver of responsibility as opposed to supporting every system, software, hardware, and process that involves (even in the most remote sense) electricity or the magical Internet! No, I’m not trying to trivialize real software developers who know their shit really well and have adapted to new languages over the years. But many a developer I have worked with couldn’t properly run their own IIS server, even though their code depends on it (or they make these wild dependencies that result in “But it works on my machine and now I have a deadline tomorrow…”).

2) Because they keep coming in and out, they have very little structure in their requests for support. It typically consists of “Make X work,” where X is as vague a description as you can get. By the time I teach one to make a proper ticket for me to do work quickly rather than a back and forth interrogation, they’re replaced and I get to start all over. This happens far too often: “I get an error on the site.” Me: “Uh, which site? What error? What were you doing? Was this working yesterday?”

3) I’m pretty adaptive in how I explain things to people; it’s something I’ve been complimented on professionally over the years. But, typically, foreign contractors with limited breadth of knowledge stymie me. I can explain something 5 different ways in various contexts and still get blank looks. I often have to get into that really negative zone where I have to be very direct with my words. Things that normally get me in trouble like, “I cannot do this until blah blah blah,” or, “You’re not doing this how we expect, please go talk to your supervisor/mentors,” or, “No, I can’t troubleshoot your code for you, I’m not the coder. What do you mean you don’t know what an application is in IIS?”

4) They all have varied backgrounds and may have an idea to implement XYZ. Sadly, this issue is compounded by the first two issues, and also because they are almost always short-timers. I wonder how many companies have implemented Crystal Reports because of a short-timer, and now regret it deeply.

p2p and the campus network

ComputerWorld has reposted a campus P2P network story from WPI (Worcester Polytechnic Institute).

I could have my own story on campus P2P.

When I started school at Iowa State University in the fall of 1996, they had recently tapped into a nearby backbone and were sitting on a sweet T3 connection or better (this is one reason I was addicted to Quake my first couple years there!). The local network was pretty damn nice too. I could open Network Neighborhood and immediately browse a listing of hundreds of systems in the residential network. Some had files shared, many did not, some didn’t know they were sharing things like c:\.

With trial and error, I could find the systems that actually had files shared. Music, warez, porn. And later on movies. At the time, mp3s were only just taking off.

Within a couple years, a couple guys down the hall started hosting a new website on their resnet connection. StrangeSearch (it appears to no longer exist at ISU) indexed all the files shared out on the network and provided a nifty, simple Google-like search box for whatever you wanted. So now I could browse files randomly, or I could look up specific things I wanted. I could even search for the biggest sharers so I didn’t have to trial-and-error on individual systems anymore.
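At its core, an indexer like StrangeSearch just maps shared filenames back to hosts, then answers substring searches and ranks the biggest sharers. This toy sketch shows the idea; the host names and files are made up, and the real site presumably crawled Windows shares across the resnet rather than holding a hand-built dict.

```python
# Toy index in the spirit of StrangeSearch: host -> list of shared filenames.
shares = {
    "resnet-101": ["metallica - one.mp3", "heat.avi"],
    "resnet-102": ["photoshop.zip", "heat.avi", "quake.exe"],
    "resnet-103": ["doom.wad"],
}

def search(term):
    """Return (host, filename) pairs whose filename contains the term."""
    term = term.lower()
    return [(host, f) for host, files in shares.items()
            for f in files if term in f.lower()]

def biggest_sharers():
    """Hosts ordered by number of shared files, descending."""
    return sorted(shares, key=lambda host: len(shares[host]), reverse=True)

print(search("heat"))        # [('resnet-101', 'heat.avi'), ('resnet-102', 'heat.avi')]
print(biggest_sharers()[0])  # resnet-102
```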

In fact, StrangeSearch and the ISU network were the entire reason I never had to use Napster or Kazaa back in the day. I completely skipped that development because everything I actually ever wanted I could find locally at LAN speeds. Movies, obscure music, the newest versions of Photoshop, cracked games, etc.

Now, this freedom is interesting. Back in the day, I didn’t have a huge musical collection; most everything came from friends or (gag) radio. But because I could browse stuff on the network randomly, including from people who had similar tastes, I was able to find new music, artists, and even movies quite easily (I had never heard of Heat, for instance, until I saw it on the network; I have since purchased it three times over, and it remains a favorite). In fact, I was opened to, and spent money on, people and things I likely would never have found before.

tjx details starting to come out in testimony, nothing juicy yet

The methods used in the TJX breach have been widely “known” for some time now; crack WEP, remotely connect upstream, sniff transactions. Prat Moghe has posted an organized list of details supposedly from actual testimony. There is nothing new yet, but this at least lends more weight to some facts.

Still, there are gaping questions not covered, or at least not covered yet. I posted a couple questions on the comments for the link. Here they are, plus a couple more.

1) What sort of protection was or was not in place to filter and detect fraudulent traffic from the store to the datacenter? My guess would be a leased line or site-to-site VPN that was wide open.

2) How did the attackers gain admin rights to the “RTS” server(s)? If it was just a little wave of a magic wand, then here is another breakdown with patching or HIPS protection.

3) How did the attackers install “custom sniffing” software on the “RTS” server(s)? Did this show up under installed software (gah!) or in a task listing? If so, this should ideally be monitored (yeah, ideally anyway), or some sort of tripwire set up.

4) Outbound FTP from the data center? I guess, but this could be blocked or at least alerted upon. I mean, how often would this bank of servers really initiate FTP connections or any connections to the Internet cloud?

5) I’m curious at what level the sniffing occurred. For instance, was it grabbed right off an unencrypted connection, or pilfered lower in the OS?
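The egress question in point 4 boils down to a simple allow-list check: datacenter servers should almost never initiate connections to the Internet, so anything outside a short allow-list deserves an alert. The addresses, hostname, and ports below are invented for illustration — this is the shape of the rule, not anyone's actual policy.

```python
# Hypothetical egress allow-list of (destination, port) pairs that datacenter
# servers are permitted to initiate. Everything else gets flagged.
ALLOWED_EGRESS = {("update.vendor.example", 443)}

def is_internal(dst):
    """Crude check for the sketch: treat 10.x.x.x as internal (RFC 1918)."""
    return dst.startswith("10.")

def check_egress(dst, port):
    """Return 'allow' for internal or allow-listed traffic, 'alert' otherwise."""
    if is_internal(dst) or (dst, port) in ALLOWED_EGRESS:
        return "allow"
    return "alert"

print(check_egress("10.1.2.3", 445))               # allow: internal traffic
print(check_egress("update.vendor.example", 443))  # allow: on the list
print(check_egress("203.0.113.9", 21))             # alert: outbound FTP to the Internet
```

A rule this dumb would have lit up on the attackers’ FTP exfiltration, which is the point: the servers in question have no business talking FTP to the Internet cloud at all.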

And I’m still just scratching the surface. Interestingly, none of the above is integral to actually making the payment transaction system work as needed. All of it is added on as security tightening. Kinda illustrates that the priority is getting things working, not getting things working securely.