Common Ground Part 3: Execution and the People Factor

This is part three of a blog series titled Common Ground. In Part One, I discussed the background and evolution of red teaming, diving into how it applies to the information security industry and some of the limitations faced on engagements. In Part Two, I discussed various components of the planning phase that help ensure a valuable and mature exercise. In this part, I will discuss how a red team can execute a thorough operation. I will steer clear of the technical components of the network red team and instead focus on the most important outcome of a red team assessment: communication improvement and bias identification. I encourage you to read my disclaimer in Part One before continuing.

Decision Cycles

If you have been through any strategy classes or military training, there is a good chance you have heard of the heavily used (and arguably overused) concept of decision cycles. In school, I loved the numerous military strategy and history classes I took, but at the time I often thought the concepts were defined to the point of absurdity (and I am sure I will get some fun comments from friends now that I am writing about them here). Later in my career, when I found myself playing adversary against major industry blue teams, I fell back on some of those military strategy lessons to help achieve our objectives in the wargame on their network. Throughout my first several engagements, I learned that those lessons could be applied heavily toward a major goal of the assessment: improvement of process.

Decision cycles are a set of steps taken by an entity in a repetitive manner to plan a course of action, take that action, learn from the results, and implement new decisions. In theory, they are foundational to everything we do day-to-day, even if we do not realize it or formally follow one. Formal examples:

  • Plan-Do-Check-Act (PDCA): heavily used in quality assurance
  • Observe-Orient-Decide-Act (OODA): a military decision cycle developed by strategist John Boyd

John Boyd, the strategist who formulated and studied the OODA loop, also theorized about how these cycles could be used to defeat an enemy. He believed that if you could work through your decision cycle faster than your adversary, you would be able to react to events more quickly and, therefore, gain a strategic advantage. In developing this concept, he covered numerous factors that affect the speed of decision cycles and methods of intentionally slowing your adversary's decision cycles. At each step in the cycle, many factors impact the decision maker and affect the outcome.

Here are the phases broken down:

  • Observation: During the observe phase, the entity uses the unfolding circumstances combined with current interactions within the environment to formulate their perception of what is happening. They combine this info with outside information (enrichment data) to form the basis of their understanding.
  • Orientation: The observations are filtered through experience, training, culture, and prior analysis to build a working mental model of the situation. Boyd considered this the most important phase because it shapes everything that follows.
  • Decision: Based on that model, the entity selects a course of action from the available options.
  • Action: The chosen course of action is carried out, and the results feed back into the next round of observation.
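To make the tempo argument concrete, here is a minimal back-of-the-envelope sketch that treats each team's decision cycle as a fixed-length loop and measures how long, on average, an event waits before the next completed cycle can act on it. The cycle lengths and event times are made-up values, not data from any engagement.

```python
# Toy model of Boyd's tempo argument: the side with the shorter decision
# cycle acts on the same events sooner. All numbers are hypothetical.
import math


def average_response_lag(event_times, cycle_length):
    """Average delay between an event and the end of the next full
    observe-orient-decide-act cycle in which it can be acted upon."""
    lags = []
    for t in event_times:
        next_cycle_end = math.ceil(t / cycle_length) * cycle_length
        lags.append(next_cycle_end - t)
    return sum(lags) / len(lags)


if __name__ == "__main__":
    adversary_actions = [3, 7, 12, 20, 33]  # hours into the engagement
    fast_cycle = 2                          # hours per decision cycle
    slow_cycle = 8                          # hours per decision cycle

    print("fast team avg lag:", average_response_lag(adversary_actions, fast_cycle))
    print("slow team avg lag:", average_response_lag(adversary_actions, slow_cycle))
```

Nothing about the model is sophisticated, but it captures the point: the team that completes its cycle more often stays, on average, much closer behind the adversary's last move.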

Affecting Decisions In Each Phase — Case Study

With knowledge of this process and the goal of intentionally slowing down the blue team's decision cycles, the red team can plot certain actions to identify not only technical vulnerabilities but also human ones. Below are some possible ways that the red team can alter the decision cycles of the blue team. They are small examples that can be expanded greatly into a strategic playbook of sorts.

  • Observation

By recognizing the “people aspect” of an engagement, the red team is better equipped to use technical expertise in the environment to operate without a valid response and, therefore, achieve success. Understanding strategy and the psychology of influence helps hammer home the impact when a blue team has not operationally exercised its incident response plans.

Incident Response Cycle

While not technically a decision cycle, the incident response process, or “kill chain,” is a cyclical process that is recognized and applied across the industry. The process is defined in detail in NIST SP 800-61 R2, which models it as a cycle of preparation; detection and analysis; containment, eradication, and recovery; and post-incident activity. As a network adversary, recognizing the process your blue team opponent is operating within allows you to better predict their actions and plot their potential steps, which decreases the time required for your own decision making and increases your strategic advantage.

Outside of the technical response conducted as part of the exercise, there is an assumed layer within each step of the process: the communications, command, and control (C3) that occurs within the organization being exercised. Within this C3 layer, significant time delays can be introduced that prevent a rapid or thorough response, essentially crippling the technical responders. The red team can exercise this process during an engagement by including the communication plans in the exercise and ensuring that the blue team conducts a complete response, as they would against a real adversary. Common issues we see with C3 in an organization:

  • Delegation — Certain decisions must be delegated down to prevent time lag.

Another aspect of the process is the ability of adversaries and red teams to increase the lag time by attacking targets that intentionally increase the amount of communication required of network defenders. By spreading laterally through multiple organizations, subsidiaries, international components, or business lines, the red team can force collaboration and response across organizational or international boundaries, which is inherently slow, if it happens at all. Testing this multinational communication also forces decision makers to consider incorporating international and privacy laws into their response plans.
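As a rough sketch of how each extra C3 hop compounds that lag, consider the toy model below; the roles and delay figures are made-up placeholders, not numbers from any real organization.

```python
# Toy model of C3 delay: every approval hop before containment adds time.
# Role names and delay figures below are hypothetical placeholders.

APPROVAL_DELAY_HOURS = {
    "analyst": 0.5,           # local decision made under delegated authority
    "ir_manager": 2,          # brief the incident response manager
    "legal": 12,              # legal review across business lines
    "regional_ciso": 8,       # hand-off to a subsidiary's security lead
    "external_counsel": 24,   # cross-border privacy review
}


def hours_until_containment(approval_chain):
    """Total hours before containment can begin, given who must sign off."""
    return sum(APPROVAL_DELAY_HOURS[role] for role in approval_chain)


delegated = ["analyst"]
escalated = ["analyst", "ir_manager", "legal", "regional_ciso", "external_counsel"]

print("delegated authority:", hours_until_containment(delegated), "hours")
print("full escalation:", hours_until_containment(escalated), "hours")
```

Under these placeholder numbers, the delegated path contains the incident in under an hour while the fully escalated path takes nearly two days, which is exactly the kind of lag the red team can surface by forcing a response across organizational lines.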

With the red team targeting the human factor as well as the technical, and with the organization working through these issues in advance, the organization can dramatically decrease the disparity between its mean time to detection and its mean time to response.

Bias

When analyzing decision cycles and the incident response process, one must recognize that bias plays a big role in an organization's ability to defend itself at all levels. As a red team, recognizing the various biases that could be used as weapons against an organization can be extremely useful. As a blue team member, you must recognize that these biases exist and practice working through them.

If these biases impact your organization during a red team assessment, they can be identified through a rigorous debriefing process where both teams share journals of activity for time comparisons. This allows for improvement and for recognition of the risk that bias poses to decision making.
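One way to make that time comparison concrete during the debrief is to line up the two journals and compute the lag for each red team action. The sketch below uses fabricated journal entries and a deliberately simplified structure, but it shows the basic comparison.

```python
# Join the red team's action journal with the blue team's detection journal
# by action name and report how long each action went unnoticed.
# All timestamps and action names are fabricated examples.
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

red_journal = {
    "initial-access":   "2016-07-01 09:15",
    "lateral-movement": "2016-07-01 14:40",
    "data-staging":     "2016-07-02 11:05",
}

blue_journal = {
    "initial-access":   "2016-07-01 16:30",
    "lateral-movement": "2016-07-02 10:10",
    # "data-staging" never appears: a miss worth discussing in the debrief
}


def detection_lag_hours(action):
    """Hours between the red team action and the blue team's logged detection,
    or None if the action was never detected."""
    if action not in blue_journal:
        return None
    red_time = datetime.strptime(red_journal[action], FMT)
    blue_time = datetime.strptime(blue_journal[action], FMT)
    return (blue_time - red_time).total_seconds() / 3600


for action in red_journal:
    lag = detection_lag_hours(action)
    status = "missed entirely" if lag is None else f"detected after {lag:.1f} hours"
    print(f"{action}: {status}")
```

Walking through this comparison together, action by action, is where biases and communication breakdowns tend to surface.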

For more information on cognitive bias and its impact on decision making, check out this article on MindTools. Also, this Harvard Business Review article is awesome: Why Good Leaders Make Bad Decisions.

Outcome

As shown throughout this post, one major motivation for conducting a red team assessment is working through a full response and breach scenario to practice making decisions. Blue teams should recognize that their personalities are in scope, and red teams should learn to use the psychological aspects of conflict to their advantage in addition to the technical vulnerabilities they uncover. Rigorous debriefing and teamwork can benefit all stakeholders involved, allowing for a fluid and rapid response during a real breach.

Blog Series Wrap Up

That wraps up my brain dump and blog series about red teaming. While these posts were not overly technical in nature, I hope the series serves people well and encourages a proactive discussion on how analytical techniques can help organizations improve, both organizationally and technically. There might be another post or two like this down the line if these were helpful, particularly on debriefing. I will state it again: any form of this analysis conducted to better the organization is useful, even if it does not fit a strict definition. As the terms have evolved, there are subsets of study and room for additional applications in the industry.

Finally, since I understand the relationship between red and blue teams can be contentious, I offer this final advice:

To the red team: you are providing a service to your customer, the blue team. You should be as familiar with defensive techniques as you are with offensive ones. Prepare well, conduct a thorough assessment, and do not get wrapped up in politics.

To the blue team: the red team is on the same team. While they may cause you more work, it is much more fun practicing response in your real environment than in a training lab. Share info and teach the red team so that they can better train you!

Originally published at http://www.sixdub.net on July 5, 2016.
