Common Ground Part 3: Execution and the People Factor
This is part three of a blog series titled Common Ground. In Part One, I discussed the background and evolution of red teaming, diving deep into how it applies to the information security industry and some limitations faced on engagements. In Part Two, I discussed various components of the planning phase that help ensure a valuable and mature exercise. In this part, I will discuss how a red team can execute a thorough operation. I will steer clear of the technical components of the network red team, instead focusing on the most important outcomes of a red team assessment: communication improvement and bias identification. I encourage you to read my disclaimer in Part One before continuing.
Decision Cycles
If you have been through any strategy classes or military training, there is a good chance you have heard of the heavily used (and arguably overused) concept of decision cycles. In school, I loved the numerous military strategy and history classes I took, but at the time I too often thought the concepts were defined to the point of absurdity (and I am sure I will get some fun comments from friends now that I am writing about them here). Later in my career, when I found myself playing adversary against major industry blue teams, I fell back on some of those military strategy lessons to help achieve our objective in the wargame on their network. Throughout my first several engagements, I learned that those lessons could be applied heavily toward a major goal of the assessment: improvement of process.
Decision cycles are a set of steps taken by an entity in a repetitive manner to plan a course of action, take that action, learn from the results, and implement new decisions. In theory, they are foundational to everything we do day-to-day, even if you do not realize it or formally follow one. Formal examples:
- Plan-Do-Check-Act — Heavily used in quality assurance
- Observe-Orient-Decide-Act (OODA) — Used by the US Military. A diagram of this process as formulated by Boyd is below
- Observation-Hypothesis-Experiment-Evaluation (Scientific Method) — Used to evaluate hypotheses and make adjustments in scientific research
John Boyd, the person who formulated and studied the OODA cycle, also theorized about how these loops could be used to defeat an enemy. He believed that if you could work through your decision cycles faster than your adversary, you would be able to react to events more quickly and, therefore, gain a strategic advantage. In this concept, he covered numerous aspects that affect the speed of decision cycles and methods of intentionally slowing your adversary's decision cycles. At each step in the cycle, many factors impact the decision maker and affect the outcome.
Here are the phases broken down (a small code sketch follows the list):
- Observation: During the observe phase, the entity uses the unfolding circumstances combined with current interactions within the environment to formulate their perception of what is happening. They combine this info with outside information (enrichment data) to form the basis of their understanding.
- Orient: During this phase, the decision maker takes their perception of the situation and aligns their thoughts with the actions that are occurring. They use the culture of their organization, knowledge of past events, and the constant flux of information to begin shaping a potential decision.
- Decide: In this simple phase, the decision maker weighs the information from the previous two phases and chooses the best course of action.
- Act: Having made the decision, the entity now executes that decision and utilizes any outcome as part of a feedback loop going back into the observe phase to mold a changing environment.
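To make the feedback loop concrete, below is a minimal Python sketch of one entity cycling through the four phases. Every name in it (the toy environment, the candidate actions, the scoring heuristic) is invented for this post, not taken from any real tool:

```python
# Toy model of Boyd's OODA loop. The point is the feedback:
# the outcome of Act becomes part of the next Observe.

def observe(environment, enrichment):
    """Combine unfolding events with outside (enrichment) information."""
    return {"events": list(environment["events"]), "context": enrichment}

def orient(perception, history):
    """Weigh perception against institutional memory to rank options."""
    candidates = ["monitor", "contain", "escalate"]
    # Toy heuristics: active alerts push "contain" forward, and actions
    # that have been taken before rank higher (organizational habit).
    def score(action):
        alert_bonus = 2 if action == "contain" and any(
            "alert" in event for event in perception["events"]) else 0
        return alert_bonus + history.count(action)
    return sorted(candidates, key=score, reverse=True)

def decide(candidates):
    """Pick the best available course of action."""
    return candidates[0] if candidates else "wait"

def act(environment, decision):
    """Execute the decision; its outcome feeds the next observation."""
    environment["events"].append(f"outcome of {decision}")

environment = {"events": ["alert: odd login"]}
history = []
for _ in range(3):  # three passes through the loop
    perception = observe(environment, enrichment=["threat intel feed"])
    ranked = orient(perception, history)
    choice = decide(ranked)
    act(environment, choice)
    history.append(choice)

print(history)  # e.g. ['contain', 'contain', 'contain']
```

Boyd's insight maps directly onto this structure: anything that slows or corrupts any one function slows the whole loop.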
Affecting Decisions In Each Phase — Case Study
With knowledge of this process and the goal of intentionally slowing down the blue team's decision cycles, the red team can plot actions that identify not only technical vulnerabilities but human ones as well. Below are some possible ways the red team can alter the decision cycles of the blue team. They are small examples that can be expanded upon greatly to build a strategic playbook of sorts; a sketch of such a playbook follows the list.
- Observation
- Limit the amount of information the decision maker can gain from the environment or overwhelm the decision maker with too much competing information.
- Plant false flags pointing to multiple adversaries to produce offensive counter-information and force the decision maker to rely on outside information that is actually not relevant.
- Disable network security sensors to deny defenders a perception of the environment.
- Orient
- Identify known personalities in the decision making process and leverage personality flaws or biases to hamper a proper decision.
- Prevent the thorough analysis of information by shifting or changing TTPs frequently.
- Identify cultural traditions and norms and utilize them as part of your attack path (it is normal for people to log in after hours, so log in after hours).
- Study past breaches in the environment and identify errors that you can piggyback off of.
- Decide
- Act
- Recognize that a decision has been executed and re-orient yourself before their feedback loop completes.
- Nullify their actions in a rapid fashion so they feel the need to repeat the same actions.
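Here is a sketch of the playbook idea mentioned above: the tactics encoded as data, selectable by the OODA phase the red team wants to disrupt. This is a hypothetical structure, not a published tool or format:

```python
# Hypothetical red team playbook: OODA phase -> candidate disruption
# actions, mirroring the examples in the list above.
PLAYBOOK = {
    "observe": [
        "overwhelm the SOC with competing, low-value information",
        "plant false flags pointing to multiple adversaries",
        "disable or blind network security sensors",
    ],
    "orient": [
        "shift TTPs frequently to defeat pattern analysis",
        "blend the attack path into cultural norms (e.g. after-hours logins)",
        "piggyback on errors identified from past breaches",
    ],
    "act": [
        "re-orient quickly once a response action is recognized",
        "rapidly nullify response actions to force repetition",
    ],
}

def plan_disruption(target_phase, count=2):
    """Return up to `count` candidate actions against one OODA phase."""
    return PLAYBOOK.get(target_phase, [])[:count]

print(plan_disruption("orient"))
```

Even in this toy form, writing the options down per phase forces the red team to think about which part of the defender's loop each action actually attacks.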
By recognizing the "people aspect" of an engagement, the red team is better equipped to use technical expertise in the environment to operate without an effective response and, therefore, achieve success. Understanding strategy and the psychology of influence helps hammer home the impact when the blue team has not operationally exercised its incident response plans.
Incident Response Cycle
While not technically a decision cycle, the incident response process, or "kill chain," is a cyclical process that is recognized and applied across the industry. This process is defined in detail in NIST SP 800-61 R2, and their figure of the process is below. As a network adversary, recognizing the process your blue team opponent is operating within allows you to better predict their actions and plot their potential steps, which decreases the time required for your decision making and increases your strategic advantage.
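As a small illustration of that prediction, a red team could map what it observes in the environment to the defender's likely phase in the NIST cycle. The "tells" below are invented examples, not a real detection mapping:

```python
# Hypothetical mapping from red team observations to the defender's
# likely phase in the NIST SP 800-61 incident response cycle.
NIST_IR_PHASES = [
    "Preparation",
    "Detection & Analysis",
    "Containment, Eradication & Recovery",
    "Post-Incident Activity",
]

OBSERVABLE_TELLS = {
    "mass password resets":           "Containment, Eradication & Recovery",
    "new sensor agents deploying":    "Preparation",
    "ticket chatter about IOCs":      "Detection & Analysis",
    "lessons-learned meeting invite": "Post-Incident Activity",
}
assert all(phase in NIST_IR_PHASES for phase in OBSERVABLE_TELLS.values())

def likely_phase(observation):
    """Guess where the blue team is in their cycle from one observation."""
    return OBSERVABLE_TELLS.get(observation, "unknown")

print(likely_phase("mass password resets"))
```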
Outside of the technical response conducted as part of the exercise, there is an assumed phase within each step of the process: the communications, command, and control (C3) that occurs within the organization being exercised. Within the C3, significant time delays can be introduced that prevent a rapid or thorough response, essentially crippling the technical responders. The red team can exercise this process during an engagement by including the communication plans in the exercise and ensuring that the blue team conducts a complete response, just as they would against a real adversary. Common issues we see with C3 in an organization (a toy timing model follows the list):
- Delegation — Certain decisions must be delegated down to prevent time lag.
- Leadership Lag — Not all non-technical leaders will understand how to respond. They need practice in order to react more quickly during a real breach.
- Out-of-Band C2 — During response efforts, using a compromised network for coordination allows the adversary to instantly be inside your decision cycles by witnessing them in real time. The adversary will also be able to apply deception or disruption operations.
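To illustrate how these C3 hops compound into delay, here is a toy timing model. The hops and the delay figures are invented for illustration; the point is how much of the total lag delegation removes:

```python
# Toy model: communication hops between first detection and containment,
# with invented delay figures, to show how C3 lag compounds.
from datetime import timedelta

c3_hops = [
    ("analyst -> IR lead",             timedelta(minutes=15)),
    ("IR lead -> legal review",        timedelta(hours=4)),
    ("legal -> executive approval",    timedelta(hours=8)),
    ("approval -> containment action", timedelta(minutes=30)),
]

total = sum((delay for _, delay in c3_hops), timedelta())
print(f"Containment lags first detection by {total}")       # 12:45:00

# Delegating the containment decision to the IR lead removes the two
# slowest hops entirely:
delegated = c3_hops[0][1] + c3_hops[3][1]
print(f"With delegation, the same path takes {delegated}")  # 0:45:00
```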
Another aspect of the process is the ability of adversaries and red teams to increase the lag time by attacking targets that intentionally increase the amount of communication required of network defenders. By spreading laterally through multiple organizations, subsidiaries, international components, or business lines, the red team can force collaboration and response across organizational or international boundaries, which is inherently slow, if it happens at all. Testing this multinational communication also forces decision makers to consider incorporating international and privacy laws into their response plans.
With the red team targeting the human factor as well as the technical, and working through these issues in advance, the organization will dramatically decrease the disparity between their mean time to detection and their mean time to response.
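As a concrete (and entirely fabricated) example of that disparity, here is how the two metrics could be computed from exercise timestamps:

```python
# Sketch: computing the detection/response disparity from exercise
# timestamps. The incident data below is fabricated for illustration.
from datetime import datetime
from statistics import mean

# (compromise, detection, response) per incident
incidents = [
    (datetime(2016, 7, 1, 9), datetime(2016, 7, 1, 11), datetime(2016, 7, 2, 15)),
    (datetime(2016, 7, 3, 2), datetime(2016, 7, 3, 8),  datetime(2016, 7, 4, 10)),
]

def hours(delta):
    return delta.total_seconds() / 3600

mttd = mean(hours(detect - comp) for comp, detect, _ in incidents)
mttr = mean(hours(respond - detect) for _, detect, respond in incidents)
print(f"MTTD {mttd:.1f}h, MTTR {mttr:.1f}h, disparity {mttr - mttd:.1f}h")
# MTTD 4.0h, MTTR 27.0h, disparity 23.0h
```

In this made-up data, detection is quick but response lags more than a day behind it; that gap, not detection alone, is what exercising the C3 issues above is meant to shrink.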
Bias
When analyzing decision cycles and incident response processes, one must recognize that bias plays a big role in sufficiently defending an organization at all levels. As a red team, recognizing the various biases that could be used as weapons against an organization can be extremely useful. As a blue team member, you must recognize that these biases exist and practice working through them.
If these biases impact your organization during a red team assessment, they can be identified through a rigorous debriefing process in which both teams share journals of activity for time comparisons. This allows for improvement and recognition of the risk of bias in decision making.
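A minimal sketch of that time comparison, using fabricated journal entries: merging both teams' journals into one timeline makes the detection lag, and any bias-driven delay, easy to see:

```python
# Sketch of the debrief time comparison: merge both teams' journals
# into one timeline. All entries below are fabricated for illustration.
from datetime import datetime

red_journal = [
    (datetime(2016, 7, 1, 9, 0),  "RED",  "phish opened, beacon established"),
    (datetime(2016, 7, 1, 9, 40), "RED",  "lateral movement to file server"),
]
blue_journal = [
    (datetime(2016, 7, 1, 11, 5), "BLUE", "alert triaged: unusual beacon traffic"),
    (datetime(2016, 7, 2, 8, 30), "BLUE", "file server activity investigated"),
]

for timestamp, team, entry in sorted(red_journal + blue_journal):
    print(f"{timestamp:%m-%d %H:%M}  {team:<4}  {entry}")
# The gap between each RED action and the matching BLUE entry is the
# detection lag to discuss in the debrief, and a place to look for bias.
```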
For more information on cognitive bias and its impact on decision making, check out this article on MindTools. Also, this Harvard Business Review article is awesome: Why Good Leaders Make Bad Decisions.
Outcome
As shown throughout this post, one major motivation for conducting a red team assessment is working through a full response and breach scenario to practice making decisions. Blue teams should recognize that their personalities are in scope, and red teams should learn to use the psychological aspects of conflict to their advantage in addition to the technical vulnerabilities they uncover. Rigorous debriefing and teamwork can benefit all stakeholders involved, allowing for a fluid and rapid response during a real breach scenario.
Blog Series Wrap Up
That wraps up my brain dump and blog series about red teaming. While these posts were not overly technical in nature, I hope this series serves people well and encourages a proactive discussion on how analytical techniques can help organizations improve, both organizationally and technically. There might be another post or two like this down the line if these were helpful, particularly on debriefing. I will state it again: any form of this analysis conducted to better the organization is useful, even if it does not fit a strict definition. As the terms have evolved, there are subsets of study and room for additional applications in the industry.
Finally, since I understand the relationship between red and blue teams can be contentious, I offer this final advice:
To the red team: you are providing a service to your customer, the blue team. You should be as familiar with defensive techniques as you are with offensive ones. Prepare well, conduct a thorough assessment, and do not get wrapped up in politics.
To the blue team: the red team is on the same team. While they may cause you more work, it is much more fun practicing response in your real environment than in a training lab. Share info and teach the red team so that they can better train you!
Originally published at http://www.sixdub.net on July 5, 2016.