Proof of execution in forensics is a fun topic, but it becomes a bit tedious to keep track of all the different places to look in various versions of Windows. So I wanted to quickly save a link out to this table from Adam Harrison (1234n6).
I find it crazy that I’ve not seen this before, but I got linked today over to the MITRE Ten Strategies of a World-Class Cybersecurity Operations Center free book (pdf). Holy crap this is awesome. The rather large first section talks about building a SOC and the various considerations that go into it. And then the top 10 strategies build on that foundation to further guide the growth of the SOC.
Every section has wonderful nuggets of truth like this one in strategy #5 (Favor staff quality over quantity):
Analysts must be free to analyze. It is indeed true that Tier 1 analysts have more structure in their daily routine for how they find and escalate potential intrusions. However, those in upper tiers must spend a lot of their time finding activities that just “don’t look right” and figuring out what they really are and what to do about them. Overburdening analysts with process and procedure will extinguish their ability to identify and evaluate the most damaging intrusions.
Honestly, this might be my second favorite technical book, up there with The Practice of System and Network Administration (Limoncelli).
I still have a few months left for 2018, but I feel like I’ve been pretty successful already with my goals on the year. This is really year 2 of me specifically tracking my career growth and learning. In 2017, I earned two offense/red team certifications, and this year I earned one defensive and one forensics certification, amongst other learning accomplishments. So, largely for my own benefit, here’s my summary of the important stuff from the year.
training and career goals for 2018
- keep doors of learning open for both blue (defense) and red (offense) sides of the field – This isn’t a goal so much as a lifestyle statement, but I feel like I’m on track here. Even as I plan to alternate learning year over year, I’m keeping both sides in mind every year. I ultimately want to make sure my offense, defense, and forensics skills can all test and improve each other.
- balance career growth opportunities along with actual learning – Going well on this! My enthusiasm has gone up quite a bit, and with the exception of the CCNA Cyber Ops cert, everything has been chosen for learning opportunities and not marketability. I think this pendulum will continue to swing permanently over towards learning as I get older and need certs and letters less.
- balance of work-driven and self-led growth learning opportunities. – Even without leaning on corporate support financially, I feel like I’m achieving this. Like other items, this is less an item to satisfy and more of a theme or lifestyle statement to keep at the top of my yearly goals. I also try to keep a balance of formal and informal learning tasks.
- Cisco CCNA Cyber Ops course/certification (2 exams: 210-250/210-255) – completed in March and lasts 3 years. Keeping this depends entirely on what Cisco wants to do with this line. Did I learn much from this? I actually did, but it was also all pretty basic to me and easy to approach, consume, and test on. I honestly would not have done this had it not been free. The biggest benefit is now knowing where this fits into my recommendations for other students and newbies, and it’s a pretty good cert for someone looking at an analyst/SOC role.
- SANS FOR508 (May 11-16 San Diego) + NetWars – completed in May. Absolutely loved my time on site in the course and studying later for my first SANS/GIAC endeavor. I purposely aimed at something challenging that was going to put me into some deeper waters (memory analysis), and I couldn’t be happier for it. Participating in NetWars was amazing, and set up my only remaining engagement this year: SANS CDI.
- GIAC GCFA certification exam passed – completed in September and lasts 4 years. I likely won’t need to sweat renewals for this for a while, as I have a backlog of SANS courses I want to take, and certs I’ll opt into testing for. Overall, loved this process, and having an exam as an excuse to study more really made the material sink in and click for me. This is also an example of me stepping a little bit outside my comfort zone, as I’ve never done forensics like this before. I have a deep Windows administration and security background, but many of these methods and materials were a new approach for me.
- Maintain CISSP – Completed, of course.
- Splunk .conf 2018 – Completed in October. Not only my first time at a Splunk event, but honestly, I think this is my first vendor-specific conference in my career. I really enjoyed this con, even if I didn’t actually learn a ton. But, I think I’ve learned how one should approach a con like this, i.e. come with questions to start a discussion with vendors and subject matter experts or fellow attendees as needed.
- BSidesIowa, SecureIowa, SecDSM – Kept up with the annual cons and the monthly SecDSM meetings this year so far. A bit of a softball in terms of goals, but I find it is important to keep a line item for cons, local and remote, to stay current on.
- SANS CDI Netwars ToC – Decided to opt into doing this as I may not get the chance again. Occurs in mid-December and I’m all set up to attend.
- Metasploit Unleashed Course (OffSec) – incomplete. I admit, this isn’t a big deal, and I’m just being stubborn at this point in keeping it on my TO-DO list. But it’s here, and some weekend I’ll just knock it away. (It’s not like this is updated and current anyway…)
- finish LinuxAcademy RHCSA/LFCSA courses – All of the completed items stole time away from this and reduced its priority. Even if I still don’t get to this in 2018, it’s going to be a thing in 2019 for me as well.
- SLAE-> CTP/OSCE (tentative, or just prep) – I knew it would be super aggressive and difficult to maintain sanity and also prep for this path, and I’m not surprised I have not even started it. It’s still on the list for possible late 2018 inclusion, or another lower priority in 2019.
- HTB VIP Progress/Habit – Completed. I got back into HTB with a vengeance after realizing my offense skills were rusty during the SANS NetWars event this past spring. My goal was to hit 50% completion in HTB, shake off the attacker rust, and just build a small habit to keep with it. But, after getting going, I met some folks on the platform and got help when I needed it to achieve 50% completion by July, and 100% completion by August.
- Burp Suite improvement/growth – Doing HTB got me good practice and experience with Burp, but I want to consider this only about 25% done, and something to continue working on.
- Web Hacking 101 book – Haven’t started it yet.
- Python (+scapy) improvement/growth – on hold, I still need to figure out how I want to tackle this
- PowerShell improvement/refresher – on hold, I still need to figure out how I want to tackle this
- CTF participation (as it fits in) – This was definitely the lowest priority of the year, so I feel even my minor work here completes it.
- survive at work (work topics) – Completed!
- incorporate Feedly, Pocket, Discord, Slack in day-to-day habits – I feel mostly completed on this one, with the very notable exception of the things piling up in Pocket.
- expand OneNote use – Successful in moving from EverNote to OneNote.
- work on better anonymity online/VPN service for personal use – I don’t feel I really started this.
A week ago I flew down to Orlando, Florida to attend Splunk .conf18. In thinking back on this, I have to say this is the very first vendor-specific conference I think I’ve ever attended in my 15 years in IT. Depending on who you ask, the con itself had 7,500-9,500 attendees in its largest event to date. That’s pretty impressive! I attended as many talks as I could, and I left pretty happy with the content I consumed. The talks and slides are all available online for consumption.
Day 0 – Sunday
My goals for this day were just to get to Orlando, get settled into the hotel, and do some recon of the grounds and environment. On the plane, I listened to some Darknet Diaries; I’m finally finding some time to do some podcasts! I took some time to hit the Boardwalk on the grounds and already got sick of the heat and humidity.
Day 00 – Monday
Goal today was to get registered for the con! The line was super quick, even at 10:30am with the masses to get checked in, get a badge, pick up the backpack/water bottle freebie, and then pick up the freebie hoodie. Beyond that, this day was pretty casual until the evening.
First Timer Orientation talk – This was a nice intro to the con, even though the room was moved and I didn’t hear about it until a co-worker texted me. I guess I need to click update notices in the event app! (Come on, I’m in security, I don’t click accept/download buttons unless I have to.) Also, this was the only talk that I attended with a drink-in-hand speaker. (I’m not a huge drinker or want others to drink, but to me, this still sets a tone and statement for how informal, partially or fully, a venue may be. This is why I like smaller cons over larger vendor ones.)
Welcome Soiree – This was a neat way to get people to the vendor floor: an evening event with free food and alcohol stations throughout the vendor floor. Scoped out vendors, Splunk experts, projects, and plenty of swag. And I will admit, I evaluate vendor booths on three things: 1) whether I know and like them as a product/company and want to say hi, 2) whether I want some of their swag or not (either for me or to give away to others), and 3) whether I want to buy their product (and I’m not a purchasing approver, so that’s pretty much no one). I had fun down here, though someone kept turning on music every now and then and it was ridiculously loud.
Day 1 – Tuesday
Visionary and Roadmap Keynote + Breakfast – For the morning keynotes, buses took us to ESPN Arena where we picked up breakfast bags before taking seats. After the talk, I don’t think the bus crews were ready for the flood of people, and organization broke down pretty hard on one side of the venue, but we all got back in decent time (albeit later than intended due to the overlong keynote).
Security Super Session: Splunk Security Vision and Roadmap – A strong, high-level look at Splunk and using it for security operations. Not much to say on this one. The diagrams are wonderful (and would be used in several talks I’d see over the course of the con) for designing your security operations around.
Find and Seek – Real-time Asset Discovery and Identity Attribution Using Splunk – I didn’t actually see this talk. Tuesday was the one day where I was all over the grounds for various talks, and required buses to get me places in time, and the buses were still a little chaotic. I was on time getting to this talk, but about 15 minutes after the start time, we were all still waiting outside the room. Thankfully, it was right next to a sandwich distribution station, so I just left with my lunch to eat elsewhere. I’ll have to catch this recording later.
Let’s Get Hands-On with Splunk Enterprise Security, Splunk Phantom, and Real Boss of the SOC Data – This was the one “laptop required” talk I attended, and honestly one could have been just fine sitting back and watching along. This session had several hundred people in it, and as such you have to expect them to move on and not wait for anyone, and move on they did! Thankfully, this is the introduction talk for a broader and slower workshop for security people to get from Splunk throughout 2019. As it was, I really enjoyed getting hands-on a bit with some practice data for finding attacks. The data itself was used in the BOTS competition the previous evening. While I’m new with Splunk, it’s these hands-on demos and doing actual things with the data that get me excited, rather than high-level, perfect-situation statements.
Threat Hunting and Anomaly Detection with Splunk UBA – I really liked this talk and speaker. While nothing about Splunk and anomalies and hunting were new to me, I really loved the best/worst practices examples. That’s the sort of detailed, technical stuff that I eat up, rather than non-filling high-level statements.
Pub Crawl – Similar to the soiree from the previous night, only with craft beer stations and less food overall. Other than the alcohol and snacks, I didn’t really need a second round through the vendor hall.
House of Blues – We also got invites to a party at the House of Blues. The music was just passable, but it was an excellent buffet, and I got a chance to sample the infamous Voodoo Shrimp (which was basically forgettable, to me). The best part was just getting another evening without a food bill!
Day 2 – Wednesday
Product and Technology Keynote – I’m not a huge breakfast person, and I found out you can watch the keynotes online, so I didn’t even bother heading out to see this one live. I opted to stay near the hotels and not fight lines for a latte.
Hacking Your SOEL: SOC Automation and Orchestration – I love technical talks, less so high level ones. But if there is one talk that I’d recommend that is high level about SOEL, and SOAR, and SOC automation, I’d point people to this one. The speaker just plain made sense of all of this. Sure, it was high level, but also detailed enough to formulate a roadmap for the future on the topic. One of the more solid talks I attended.
Attack Surface Reduction: Using Splunk to Spot the Security Flaws in your Network – The description for this was probably reflective of a longer talk that got cut down. This talk ended up being basically a firewall review 101 session, but using Splunk to view your logs for activity on firewall rules under review. I did learn just one thing from this: monitor for sessions that hang, i.e. no endpoint listens on the target port anymore. I probably would have thought of that eventually, but it’s important to keep that situation in mind. The rest was really pretty newbie material.
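That "hanging sessions" check could be sketched roughly like the snippet below. This is purely a hypothetical illustration, not anything from the talk itself: the log fields and inventory format are invented, and in practice you would do this with a Splunk search or lookup against your actual firewall logs rather than a standalone script.

```python
# Hypothetical sketch: flag firewall-permitted flows whose destination
# host/port no longer has a live listener (a likely-stale rule).
# The log and inventory shapes here are invented for illustration.

def find_stale_flows(firewall_logs, listening_services):
    """Return permitted flows whose destination no longer listens.

    firewall_logs: iterable of dicts like
        {"dest_ip": ..., "dest_port": ..., "rule": ...}
    listening_services: set of (ip, port) tuples with a known live listener
    """
    stale = []
    for flow in firewall_logs:
        if (flow["dest_ip"], flow["dest_port"]) not in listening_services:
            stale.append(flow)
    return stale

# Example data: one rule still in use, one pointing at a decommissioned DB.
logs = [
    {"dest_ip": "10.0.0.5", "dest_port": 443, "rule": "allow-web"},
    {"dest_ip": "10.0.0.9", "dest_port": 1433, "rule": "allow-old-db"},
]
listeners = {("10.0.0.5", 443)}  # 10.0.0.9:1433 no longer listens

for flow in find_stale_flows(logs, listeners):
    print(f"rule {flow['rule']}: nothing listens on "
          f"{flow['dest_ip']}:{flow['dest_port']}")
```

The value isn’t the code itself; it’s the reminder that allowed-but-dead traffic is a signal worth alerting on during rule reviews.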
Which brings me to one of my main challenges: finding the right level of talk for the topic. For instance, I’m a newbie with Splunk, but I’m very deep into security concepts, both defense and offense. I would love to have known this talk was aimed at security newcomers, as I would have avoided it. This would apply to some of the threat hunting and SOC automation talks, which sometimes felt like they were just saying the same high-level things over and over without a ton of deeper substance (i.e. for people less senior than me). This might not be a con issue, as it might just be my inaccuracy with using the con properly, i.e. fewer talks, more 1-on-1 and breakout discussions.
Cops and Robbers: Simulating the Adversary to Test your Splunk Security Analytics – Came into this very interested, but also skeptical on why the heck I’d want to spend time automating attacks like I’m some QA team. But this talk made a great case for why you do this, and how you approach it, particularly with Phantom and some other tools. Looks very cool for use on an internal testing team that evaluates not only internal response and controls, but also can test security products and even do some training exercises with your Splunk teams.
WMI – The Hacker’s Chocolate to their Powershell Peanut Butter – Probably the deepest technical talk I saw at the con, dealing with attackers using WMI, WinRM, and Powershell in modern attacks, often going fileless, and how you could use Splunk and general logging to hunt these compromises down. I really enjoyed it, and it was a great reflection of the Splunk security research arm.
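One classic example of the kind of hunt this talk covered is spotting PowerShell launched with an encoded command in process-creation logs. The sketch below is my own hypothetical illustration, not the talk’s content: the event dictionary fields (`image`, `command_line`, `host`) are invented stand-ins for what you’d actually pull from Sysmon or 4688 events in Splunk.

```python
import base64
import re

# Hypothetical sketch: scan exported process-creation events for
# PowerShell launched with -enc/-EncodedCommand, and decode the
# Base64 UTF-16LE payload for analyst review. Event fields invented.

ENC_FLAG = re.compile(r"-e(nc(odedcommand)?)?\s+(\S+)", re.IGNORECASE)

def find_encoded_powershell(events):
    hits = []
    for ev in events:
        if "powershell" not in ev.get("image", "").lower():
            continue  # only care about PowerShell process launches
        m = ENC_FLAG.search(ev.get("command_line", ""))
        if m:
            try:
                # PowerShell encodes the command as Base64 of UTF-16LE text
                decoded = base64.b64decode(m.group(3)).decode("utf-16-le")
            except Exception:
                decoded = "<undecodable>"
            hits.append({"host": ev.get("host"), "decoded": decoded})
    return hits

# Example data: one encoded PowerShell launch, one benign process.
encoded = base64.b64encode("whoami /all".encode("utf-16-le")).decode()
events = [
    {"host": "ws01",
     "image": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
     "command_line": f"powershell.exe -NoP -enc {encoded}"},
    {"host": "ws02",
     "image": "C:\\Windows\\System32\\notepad.exe",
     "command_line": "notepad.exe report.txt"},
]

for hit in find_encoded_powershell(events):
    print(f"{hit['host']}: {hit['decoded']}")
```

In a real environment this logic lives in a Splunk search over your endpoint logs, but the decoding step is the same idea: turn the opaque encoded blob into something a human can triage.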
Monitoring and Mitigating Insider Threat Risk with Splunk Enterprise and Splunk UBA – As a Splunk newbie, I wanted a mix of talks on some of their products and how I can wrap my security team around them and my own priorities and goals. This was a good talk about implementing insider threat detection using Splunk UBA. I’ll likely revisit this again as we start our own projects on this in the coming quarters.
Search Party! At Universal Islands of Adventure – Such an absolutely fun time having the park to ourselves to avoid lines and endless children in order to ride Hogwarts Express, Harry Potter’s Forbidden Journey, and the Jurassic Park river ride. The Express was super fun, the Forbidden Journey ride absolutely awesome, and the Jurassic Park ride a fun mess that stopped 3 times and ended up taking about 30 minutes to get through. The walk around the park was fun, though the back half through the Marvel and Comic Book zones was pretty unexciting compared to the other areas. Really wish we had more than 2-3 hours, but fun and free nonetheless!
Day 3 – Thursday
Guest Keynote: Steve Wozniak – I don’t really have a huge desire to listen to Woz; smart dude with lots of money and the ability to opine about technology. Fine. To make sure people made it to this talk, it was not broadcast like the other keynotes, so I just opted to skip it.
Overall, the food stations and snacks were far skimpier on this day. I still never had to visit the main food tents, but I definitely had to look for food myself otherwise.
“MAKE IT RAIN!” How to Save Money Monitoring, Managing, and Securing Your Cloud Using the Splunk App for AWS – By now, I know that I should expect high-level statements when I see CEO, CTO, or other high-level manager titles in the speaker list for a talk. And then a talk like this comes around to prove me wrong. (I’ve honed my stance on this to apply only to Splunk as a company itself when its higher-level managers speak.) This talk was an actionable demonstration of tying some important AWS logs into Splunk and showing how that is valuable for operations and even security. A slightly short talk, but really nice to sit through as someone new to Splunk, new to AWS, and subsequently new to doing them at one time.
From Threat Modeling to Automated Response – Identifying the Adversary and Dynamically Moving to Incident Response – Yet another talk about threat hunting and TTPs and adversary profiling. A good talk, but I don’t think it included anything that I didn’t already know.
If there’s anything in my year that will define it, it’ll be the prevalence of Kill Chains, Threat Profiling, and Threat Hunting. I can’t escape the same ol’ statements about them. I had it throughout the Cisco CCNA Cyber Ops course, the SANS FOR508 course, multiple talks at Splunk .conf, and beyond. I’ve long had a post waiting about how and why threat hunting is such a big deal these days (it comes down to getting internal value and blending offense into the internal blue teams, plus trying to make sense of the new breeds of security tools that don’t just alarm on bad, but require human decision-making to piece together multiple things…).
Blueprints for Actionable Alerts – This apparently is a version of a talk done for several years, and it kinda feels like it. For some strange reason, I didn’t get much out of this, though on the surface I should have. It’s really a discussion in figuring out how to tackle an environment with 4000 alerts in a day, and reducing that piece by piece to be manageable and useful. I think everyone sort of does this their own way, which all sort of dance around the same gameplan.
Splunk P30X: Become a Lean, Mean, Splunkin’ Machine in 30 Days – Probably the best and most useful talk I attended at the con. The point is to have an actionable, lunch-hour plan to tackle and do various Splunk activities to culminate in being able to pass the Fundamentals exam at the end. I loved the actionable approach to this, as well as the follow-up activities the authors are releasing to support it that I can directly consume. Not only the 30-day plan, but also additional materials for newbies. Wonderful talk!
Day 4 – Friday
Nothing much exciting here, just a full day of getting back home.
I loved the overall experience and benefits of going; it was fun, I got to visit a fun park, and so on. This could double as a family vacation if you bring the family along. Next year, the con is in Vegas, and I’ll admit that has less appeal to me as a venue/area.
If I go again, and have others with me, I’ll lobby somewhat hard to get signed up for one of the competitions they hold, either Boss of the NOC or Boss of the SOC, where teams pore over and parse out data to answer questions about operations or security incidents, respectively.
A revival of sorts on content from BHIS on getting into the Infosec industry, including A Career in Information Security FAQ Part 1. Pretty good stuff! But this section really stuck out to me:
The customer service, tech support, help desk, etc., these jobs are crucial to forming a solid background in computer science. Learn how to solve problems effectively. Learn how to discern between useful web search results and wastes of time. Employers don’t want to hire you for what you know. I generally believe that anyone (some computer background) can be trained to accomplish digital tasks. I can’t train you to manage your time well. We can’t train people to be nice, treat others like human beings, or to be steady under pressure. And truly, those are the skills that will put you at the front of the line. It worked for me and everyone else at BHIS too.
I would include other skills such as asking questions, being curious, being tenacious, looking at ways to break and fix things, and having a quick mind to solve puzzles.
And to be honest, that whole post is a wonderful bit of encouragement and advice for anyone to read, newbie or jaded veteran. Things like, “That motto is ‘Fail Fast, Fail Often, and Fail Forward’. When you are working on solving a problem spend more time failing and less time analyzing the problem from a distance,” and “One of the most critical skills in information security is the ability to go off script.” That’s gold right there, alone!
Addendum: I do want to point out the question towards the bottom of the post about the biggest hurdles in first getting started. And it might be obvious, but it bears constantly repeating that the two biggest items are 1) experience, and 2) imposter syndrome symptoms. The former is just something you get past after a few years of work. The second is a lifelong personality and internal compass issue where we just have to come to terms with the scope of infosec and how no person can begin to swallow that whole ocean. Learn what you can, balance your life, fail fast, move forward, get better, succeed.
I don’t often find fairly general articles to have enough interesting nuggets and quotes to bother saving, but sometimes they just flow so well and include plenty of head-nodding things to agree with, all with wording that I appreciate. One such article came across from Dark Reading, Think Like an Attacker, How a Red Team Operates. Dark Reading seems to like limiting the ability to read articles, so I don’t mind being a bit liberal in pulling out quotes I like.
“The whole idea is, the red team is designed to make the blue team better,” explains John Sawyer, associate director of services and red team leader at IOActive. It’s the devil’s advocate within an organization; the group responsible for finding gaps the business may not notice. I just love that sound bite. I want that to be my elevator job description.
“The main function of red teaming is adversary simulation,” says Schwartz. “You are simulating, as realistically as possible, a dedicated adversary that would be trying to accomplish some goal. It’s always going to be unique to the target. If you’re going to get the maximum value out of having a red teaming function, you probably want to go for maximum impact.” The early part of the article does a great job of succinctly comparing pen testing and red teaming while also illustrating how these have changed as time has moved on. Old school pen testing has shifted to be called red teaming as a way to further differentiate as pen testing has become commoditized.
The team ends up chaining together a small series of attacks – low-level vulnerabilities, misconfigurations – and use those to own the entire domain without the business knowing they were there, he says. Typically, few employees know when a red team is live.
Red and blue teams may work together in some engagements to provide visibility into the red team’s actions. For example, if the red team launches a phishing attack, the blue team could view whether someone opened a malicious attachment, and whether it was blocked. After a test, the two can discuss which actions led to which consequences. Beyond actually enjoying it, this is my whole value proposition for my interest in offense and red teams: It makes my defense better. Which makes me get better on offense. Which makes my defense get even better… Getting a root shell or DA credential is the addiction, the satisfaction is passing on the information to make improvements.
More and more companies are starting to realize if they limit themselves to the core fundamentals of security, they’re waiting for something bad to happen in order to know whether their steps are effective, says Schwartz. Red teaming can help them get ahead of that… Many companies are building red teams in-house to improve security; some hire outside help.
The main reason for building a red team internally is that it grows and improves along with defenses. As security improves, so do the skills of red teamers. Offensive experts and defenders can attack one another, playing a cat-and-mouse game that improves enterprise security, he continues. Internal teams are also easier to justify from a privacy perspective.
Overall, the pros argue a full red team can help you prepare for modern attackers who will scour your business for vulnerabilities and exploit them – but unlike real attackers, the red team will help you stop them.
“The difference between a red team and an adversary is, the red team tells you what they did after they did it,” Schwartz says.
That’s such a strong ending to this article, that I had to pull a bunch out right there. Wonderful!
Rapid7 has released the second edition of their now-annual “Under the Hoodie” report, which is a compilation of information and statistics compiled across Rapid7’s penetration testing teams. There’s probably nothing terribly surprising in here, but it’s always nice to have some raw numbers and anecdotes in your pocket for various conversations. Here are a few interesting tidbits or quotes I wanted to pull out.
“Relying entirely on an automated solution or a short list of canned exploits is likely to meet with failure, while a more thorough, hands-on approach nets significant wins for the attacker.” This statement has importance for internal security testing, third-party testing, and also for defenses. The first two can be obvious, but the last one about defense helps frame models, for instance the impact of an internal threat or an attacker specifically targeting a company rather than just automating a search for opportunistic moments. It also implies, between the lines, that an attacker who puts in hands-on effort and isn’t time-boxed like a pen tester can see success.
“Furthermore, these results imply that if the penetration tester is not detected within a day, it’s unlikely the malicious activity will be detected at all.” Detection is a big deal. I’d also throw in the practice of threat hunting to find successful attackers who have gotten past the outer layer of defense and alarms. I recently deleted a draft about the whys and hows of the rise of threat hunting/intelligence (I posited it was a combination of the reduction in AV/IPS signature success, the complexity of environments, the rise of offense-friendly staff looking for offensive things to do, and other factors…). Prevention is important, but solid and effective detection matters.
“The number one issue that causes the most consternation among penetration testers is solid network segmentation. If they cannot traverse logical boundaries between environments, it can be extremely difficult to leverage a set of ill-gotten workstation credentials to escalate to domain-wide administrative privileges; even if a powerful service account has been compromised, if there’s no route between targets, the pentester must effectively start over again with another foothold in the network.”
Other factors that cause frustration for pen testers are multi-factor authentication for accounts, least privilege practices on accounts, strong patching and vulnerability management practices, and awareness to spot and report phishing campaigns, social engineering, and other low-tech attacks. What’s fun is how these 5 items are disciplines that blend security with other, very different departments: The network team for segmentation, systems/developers for 2FA/MFA, systems for patching, IAM for least access, and everyone for awareness. You can’t just boost one area of the company (or just security itself).
Recently went through and cleaned out dead links in my Feedly news feeds. Not only did this kick in plenty of nostalgia, but also reminded me that I should update the sidebar links on my blog! While going through these, I sat back and thought about how time-consuming this process is, how annoying it is to update WordPress themes (just give me a raw txt file that I can put code in rather than wrestle with weird interpretations and random carriage returns!), and for what personal purpose this even mattered.
In short, I need to sit back and think about what exactly I am doing with this blog site and how to make it better for me. Moving to hosted WordPress has helped with site maintenance, but has made other things more difficult. In the past, I always edited files by hand and coded things directly, but these days I tend to use the WYSIWYG, but it’s not usually quite what you see…it’s more like wrestling with a slippery eel to get things to look the way I want, rather than the way the themes want. This makes updating the sidebar annoying. At best.
There are really four parts to my blog: the posted content, the sidebar link list, comments to posts, and the links at the top that spider out to other things about me, with this blog page being the nexus point where they converge.
The sidebar links
The extensive sidebar list of links has been part of the site’s identity since the beginning, but it’s also an old school relic.
The list is somewhat save-and-forget, except for some of the most-used items. The rest, I honestly forget are here. For some, it’s still just better to use Google to get the latest, greatest.
These links are also best used by me, and probably not clicked on by anyone else ever. The list is roughly doubled up in: feedly, podcast subs, youtube subscriptions, twitter follows, discord server memberships.
I do know that clicking links will place referral pings to the targets…maybe. It’s one of those ways to get noticed, but I’m not sure blogs and/or comments are “noticed” anymore or really followed at all. A blog used to be your focal point online that other things revolved around, but these days the social sites have supplanted them. There is also so much flow these days, that I don’t ever really “catch up” on blogs I’ve missed. They’re much like IRC or Twitter; you pop in and maybe look at the recent buffer, but the rest of the log is in the past and there’s no reason to spend that time reading backwards.
The bottom line: the link sidebar is a relic with questionable value to me, and is annoying to update.
The comments are easily forgotten, since I don’t get many and don’t expect many. The problem is the lack of two-way discussion. Comments on blogs are often post-and-forget, never looking back for an update without specific effort to do so. It’s far better to follow and tweet to someone on Twitter these days, or in extreme cases, find someone on a discord/slack/IRC.
In the past, prior to all the social networks, blog comments were useful to expand your exposure. Comment on someone else's blog, put your own link in the comment, and likely get a poke or comment back in return. Again, though, today that is better done on Twitter/Discord and by posting content that is actually useful.
To be fair, comments are cool, akin to a Like, but dialogue these days is best had elsewhere.
The bottom line: Ultimately, comments are an afterthought these days on all but the most popular blog sites, like Krebs' blog.
The blog contents
But that does bring up the question of why I should ever update the blog at all. I honestly don't look back on many things. The two biggest reasons: 1) it shows off my interest, and 2) it lets me organize and solidify thoughts. I may not reference the post itself ever again, but the act of writing something out helps ingrain the information and thoughts.
It's not something I really do for anyone except myself, and it's a way to demonstrate my interest, enthusiasm, and participation in the greater communities.
The posts I most often re-reference are the personal ones, like my yearly goals and results, or links to really informative checklists and processes; the things I struggle to place in the sidebar, only to forget them!
The bottom line: I still like maintaining the blog and it does have personal value to me.
The personal link nexus
I can’t see this going away anytime soon except maybe on a github page with a similar list of links. The whole point is to act as a point of convergence for my “stuff.” A place to find my Twitter link, LinkedIn page, Github page, and just a little bit about me (that age-old bio or About page that I feel is still necessary to tell your story properly).
Being able to control this convergence is still an interesting deal, as it lets me decide whether I want my personal name attached to a particular screenname somewhere, but as I get older, I also care less except with my own personal threat models.
As a bonus, I still love my personal domain.
The bottom line: I still plan to use this personal domain and resident site to be my nexus here, and I think I’ll expand the links a bit to include Github and maybe some other spots.
The links on the sidebar could be put into a GitHub repo instead of this site, and probably be more easily updated, too.
I could use github to also save backups of things like my podcast opml, feedly feeds export, and so on. Things that are not sensitive or inherently private.
A GitHub repo is at least easier to update. And while it might not fix anything about my list of links and its usage, maybe it'll help me pare the list down a bit. Better yet, if I have a Feedly export, why bother with the blog/news lists at all?
There is also the option of a private GitHub repo for a few other things. I definitely don't want to make it a huge "backup" of things, since that's what file-sharing services are for, but at least some of my online presence and "home" page can be tailored a bit in private.
I recently watched a talk about a tool I've known about for a while but just hadn't gotten to on my to-do list. I briefly used the tool's output on a HackTheBox.eu target with much success. And after watching the talk at SecDSM, I've gotten excited again about employing it at work someday.
BloodHound, by researchers at SpecterOps, is a tool that exposes Active Directory permissions and relationships, with the goal of achieving Domain Admin (DA) or high-value access in AD to pwn the domain entirely and win the game. This might sound unexciting if you only think about accounts, groups, and group memberships. But BloodHound goes deeper and wider by looking at the actual underlying AD object permissions and how those objects relate to the various computers in the domain.
During the talk linked above, one of the best parts is near the end when they discuss metrics. I really loved these metrics, which effectively measure how much exposure the domain has and how much effort an attacker must exert to pwn the environment via AD permissions. They also illustrate opportunities to detect the attackers.
- Users with Path to DA (target: 5%) – The lower, the better, as you really don’t want to think that every user that could be compromised could lead to the end of the domain.
- Computers with path to DA (target: 5%) – Same story here, you don’t want to think most systems are just a few hops away from DA. Even a single malware/phishing success is dire!
- Average Path to DA Length (target: 5) – The longer the better, as you want attackers to go through as many steps as possible to get DA.
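These metrics boil down to simple graph computations over an attack graph. Here's a minimal sketch, assuming a toy graph where every node name and edge is made up for illustration (BloodHound itself stores this data in Neo4j and queries it with Cypher; plain BFS over a dict is just the simplest way to show the idea):

```python
from collections import deque

# Hypothetical AD attack graph: an edge A -> B means an attacker
# who controls A can move to B (all names are made up).
graph = {
    "alice": ["WKSTN01"],
    "WKSTN01": ["bob"],
    "bob": ["SRV01"],
    "SRV01": ["DOMAIN ADMINS"],
    "carol": ["WKSTN02"],
    "WKSTN02": [],
    "DOMAIN ADMINS": [],
}

def path_length_to(graph, start, target):
    """BFS shortest-path length from start to target, or None if unreachable."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

users = ["alice", "bob", "carol"]
lengths = {u: path_length_to(graph, u, "DOMAIN ADMINS") for u in users}
reachable = [d for d in lengths.values() if d is not None]

# Metric 1: percentage of users with any path to DA (lower is better)
pct_with_path = 100 * len(reachable) / len(users)

# Metric 3: average path-to-DA length over users that have one (longer is better)
avg_path_len = sum(reachable) / len(reachable)
```

The computer-focused metric is the same computation with computer nodes as the starting points instead of user accounts.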
Part of the reason it took so long to take my GCFA exam was splitting my study time with Hack The Box progress. Earlier this year I bought VIP access to HTB, and I wanted to keep my offensive skills practiced while learning new ones. I did more than I expected, and after making some friends smarter than I am, was able to far surpass my goals and expectations, achieving 100% completion and the top Omniscient ranking sometime in mid-August. I still have to go back and properly learn some of the things I found way too difficult to do alone (let's face it, the best people are the best because they learn through teams, and the best red teams have multiple people): namely binary exploitation and reversing. I get how they work, but I need my hand held way too much right now! I also still have the "optional" sections to complete (Fortress and Endgame), and I'd like to dive into RastaLabs and Offshore, probably in 2019.
Really, my main goal was to keep the skills and processes I developed in the PWK fresh, while also learning new and more advanced tricks, tools, and techniques. And I feel like I've succeeded in that aspect. These skills get dusty and rusty if not practiced regularly, either on one's own or in the course of work duties.
This past Friday I had the pleasure to sit for the GCFA (GIAC Certified Forensic Analyst) exam and pass with a 94% score. Quite the relief after a summer of (somewhat slowly) making study progress. In May, I attended the SANS FOR508 training at SANS West (San Diego). Shortly after, I took a bit of a break, and since then have slowly studied and gotten ready for my exam attempt. I’ve blogged about the course before, so I’ll try not to rehash anything. The course was my first SANS experience, and this exam was thus my first GIAC exam experience as well.
Did you take the practice exams? Yes I did. In late August I took the first practice and scored an 83% with only about 9 minutes remaining at the end. At this point I was pretty nervous, but I also was not quite done with my study plans, either. A week later I took the second practice and scored an improved 93% with 30 minutes to spare. They were definitely helpful for seeing the exam format, getting familiar with the interface, and getting a feel for the question style. The real exam felt extremely similar; while the questions were not duplicated, they felt written by the same author(s) in the same style as the practice ones. For the second practice, I turned on the ability to see explanations for both correct and wrong answers, while on the first attempt I didn't know that option was present and just saw my missed answers. Also, I limited myself to my books and my digital index with no spreadsheet search functions; just scrolling and eyeballing. I also had paper nearby to write down any concepts I missed, or those that I got correct but struggled with, for review later.
Would you recommend the practice exams? Yes! I probably could have passed if I had skipped them, but they did absolute wonders for allowing me some feedback on where I stood and gave me a chance to gain confidence and familiarity with the question styles. The practice also gave me two chances to test out my index, hone it, and become even more familiar with the books, adding to my efficiency in an exam where time is precious. Most importantly, this whole study process helped me grasp and “get” the content so much better than just the course alone.
Did you have your own index for the exam? Of course! My goal with the index was to use it to not necessarily answer every question for me, but to give me enough information to come to a probable conclusion, and to then point me to the correct places in the materials to confirm that answer. My true place for answers is the books, and I wanted to provide enough context to be able to look up the appropriate information in the right place when I came across a term or subject in the exam. My index ended up being about 45 pages landscape, with 1536 rows at 8 point font. Having it top-bound was wonderful (about $13 printed online at Fedex/Kinkos).
When creating my index, I started out with a spreadsheet tab for each book. I had four columns: SUBJECT, TERM, DESCRIPTION, BOOK-PAGE. In retrospect, the SUBJECT column was never used by me, and I’ll leave it out on future exams. For the spreadsheet tabs, I’d leave the notes in chronological order. On a separate MASTER tab, I would regularly copy/paste the other contents into it and sort by the TERM column to see my MASTER index. This MASTER tab was what I would later print out.
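That merge-and-sort step could even be scripted rather than done with manual copy/paste. A minimal sketch, assuming per-book entries as simple rows (the entries below are made-up examples; the columns match the TERM, DESCRIPTION, BOOK-PAGE layout described above):

```python
# Hypothetical per-book index "tabs": (TERM, DESCRIPTION, BOOK-PAGE) rows,
# kept in chronological order as they were written down.
book1 = [
    ("Volatility", "memory analysis framework", "1-52"),
    ("MFT", "Master File Table structure", "1-88"),
]
book2 = [
    ("Prefetch", "evidence of execution artifact", "2-14"),
]

# MASTER tab: all rows combined, sorted case-insensitively by TERM.
master = sorted(book1 + book2, key=lambda row: row[0].lower())

for term, desc, page in master:
    print(f"{term}\t{desc}\t{page}")
```

Because duplicate terms each get their own row (per the approach above), a plain sort keeps repeated entries adjacent in the printed MASTER list.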
If a term appeared more than once, it would get more than one entry. I didn’t want to squish BOOK-PAGE numbers into a single row at all. For multiple page mentions in a row, I’d make highlighter arrows in the books to prompt me to look ahead if the topic continued. If a topic had multiple terms or an acronym, I’d include all of them in their own entries. I would try not to do the whole “See Topic X.” I did early on, but hated it, and went away from that later (the one time I came across such an entry during the exam, I cursed myself). The goal was to go from Index to Books, not Index to Index to Index. I tried to be complete enough in general in the Index, but invariably questions would ask for very detailed specifics. And I didn’t want to solely trust myself to transpose the terms correctly, so I didn’t try to be exhaustive; as said earlier, get to the books efficiently! I also indexed terms on the blue and red posters. (Both of which I used in the exam, though much of the information can, in fact, be found in the books.)
I initially limited myself to a single line of description per term, but eventually I acquiesced and allowed myself multiple lines (hold Shift when pressing Enter while in entry mode to add a newline inside a cell). My index would have been longer and even more immediately useful had I not decided that pretty late.
I also used sticky tabs at the top of the books to mark key pages and sections. This way I had the option to skip my index altogether if I knew what general section I wanted to flip to. I used them a lot, too, not just during the exams, but when studying as well! I honestly think doing this saved my butt.
To be honest, I’m a natural information organizer. If I were more of a social person, I’d probably be a project manager! I’m also a note-taker, so doing this index was a loving exercise, rather than a chore. It also helps to remember that this index is a one-time use item. It doesn’t need to be perfect or pass muster for inspection by an editor. Everyone has their own level of perfection they need, but I know my index isn’t without mistakes, has holes, and maybe has more or less than it should. But that’s why I wanted to make sure it led me to the books as much as needed; trust myself, but verify the answer!
What was your study plan? After the 6-day course in San Diego, I probably took a good two weeks off. After that, I started going through the course books again. My goal was to read every word of the books (slides and notes). And yes, that took a while. I would highlight orange every tool mentioned in the books, and write it into a separate notebook of mine (my own personal list of tools). I would highlight key topics and statements with a green highlighter. After about two books, I actually started adding key terms, concepts, tools, and topics into a spreadsheet to begin my actual index. I then went back and caught up the first two books with a quicker pass.
Once done reading the books, I accessed the On Demand content to listen to the lectures again, follow the slides, and follow along in my books. This essentially was another pass through the material, and a second full pass to populate my index with things I missed or wanted to flesh out. For instance, I didn’t decide to put full command examples until my second pass. While winding down the On Demand materials, I also started going back and doing the lab exercises again, at least as much as I could (some tools expired). (I did *not* actually include the exercise workbook notes into my index, and I wish I had done so.) Doing all of those above really helped cement the material in my head, but also caused me to really actually *get* it, if that makes sense. Context fell into place, reasons for various things, and it just all feels natural and confident now.
In the day or two before the exam, I limited myself to just flipping through the books. I took the early part of that week off, and doing this allowed me to get familiar with the tabbed sections again, for quick reference and flipping to my tabs.
How was your actual exam experience? Pretty good! I got in early and got going pretty well. The exam itself is a brutal slog of 3 hours, and I definitely made plenty of use of that time to be as sure of my answers as I could be. Even with my index, there were a few questions that had me somewhat stumped or utterly unsure where to look for the information. Thankfully, in other cases where my index didn't have the proper information, my knowledge of the books would lead me to the right sections. The exam questions were some of the best-written questions I've seen: to the point, clear, proper English, though you still have to read them carefully to pick up on any twists or tricks afoot. Honestly, the questions and answers were wonderful and did nothing to detract from the experience or my ability to demonstrate mastery of the topics.
Is there anything with the materials brought on-site for the exam that you’d do differently? Without getting too specific, I think it would have been useful to better document or print screenshots of the output of the tools mentioned. Not all of them, since there’s a ton! But any of them are fair game for questions. Ideally, it should be enough to have used any and all tools during the labs or self-study when re-doing the labs. But that does take effort, as the labs themselves will not use every plugin and tool mentioned in the books. I also am not sure how one would consume such print-outs efficiently while taking the exam, so maybe I was better off without them!
Just posting that I recently got approved and set up to attend SANS CDI in Washington, DC this December. Back in May I qualified for the Netwars Tournament of Champions, so I will be attending for that sole purpose. I figured this chance may not come up again, so I’d jump on it and have a good time!
Many, many idealistic years ago I used to collect books of neat things, sayings, zen koans, and various other things to find serenity in life. Always very helpful. I used to keep many of these as a rotating email signature back in the 90s, which I honestly rarely ever enabled on messages.
Also very helpful are good lists of advice from people older and wiser. And I wanted to keep this wonderful, achievable list from Warren Buffett and apply it to all people, not just the young:
Advice for all the young people:
- read and write more
- stay healthy and fit
- networking is about giving
- practice public speaking
- stay teachable
- find a mentor
- keep in touch with friends
- you are not your job
- know when to leave
- don’t spend what you don’t have
I see lots of OSCP reviews, and I usually don't post or point to many since, well, there are so many, and students should start learning to Google early on. But this one by m0nk3h is amazingly detailed and quite useful for anyone hoping to take the OSCP or in the process of it. It goes above and beyond the simple PWK-and-exam-experience review, and also includes tips, tricks, and useful commands in one shot.
In fact, scrolling back in his history, I like lots of his posts and their formats; good stuff! (And, like everyone who attempts the OSCP exam, I was curious about his background to see how it compares or may have had an impact.) I particularly like the SpecterOps course review, as I'd love to take that course someday.
It’s been a pretty busy year so far, and I’ve been remiss in posting things. I think one influence has been my use of Pocket. Instead of posting things here, I put things I want to check out later on into Pocket. Of course, then I just don’t get back to Pocket!