In 2002 the United States Armed Forces conducted a war game exercise called Millennium Challenge. The challenge comprised a series of exercises, computer simulations, and tabletop activities designed to test and validate the capabilities and weaknesses of the US military as it transitioned to network-centric warfare, with more effective command and control of future weapons and tactics. An opposing force (red team) was assembled and commanded by Paul K. Van Riper, a retired US Marine Corps Lieutenant General, who during the challenge identified mechanisms to evade detection and conduct war in a manner that exploited critical vulnerabilities in the US (blue team) plan. His innovative approach led to a fairly disastrous outcome within the context of the challenge. So how is this relevant to us as cyber security practitioners?
Adversarial thinking, the practice of questioning the validity of claims, is nothing new. In the early days of the Catholic Church, the role of devil’s advocate was employed to validate claims of miracles and to ensure that cheap tricks or other forms of mysticism were not part of an elaborate act, leading to a higher standard of quality and validity among saints. In the modern justice system, the role of a defender of the genuinely guilty is not to prove innocence but to ensure proper processes are followed and to test the prosecution’s case: strengthening a sound case to beyond reasonable doubt, or demonstrating doubt in a less than robust one. We can apply the same adversarial concepts to our own work in cyberspace to strengthen our defences.
Red teams, when employed by a blue team against its projects or organisation, need a few qualities to set themselves up for success:
- Inform, don’t dominate. Amongst penetration testers and other disciplines that take a “red team approach”, I have seen an individual’s desire to “dominate” another human being take control, resulting in demoralised blue teams locked into counterproductive paths.
- Keep it real. A red team’s thought process and approach must be informed by reality. Simply dropping in an “offensive” thinker who does not vary their approach from what the blue team would apply will railroad the exercise towards a predetermined outcome. Equally, outlandish and unqualified statements will drive the red team activity down the wrong path and leave the blue team focusing on things that are inconsequential. Being informed about how adversaries actually operate, including the cultural attributes of an adversarial environment, will provide useful insights to the blue team.
- Know and understand the plan. Red teams must be “inside and aware.” Whilst it is easy to keep a red team isolated, a collaborative approach ensures the red team is fully aware of the blue team’s activities, expediting their adversarial actions. Whilst it’s easy to say “this isn’t fair, because a real adversary would never have the same volume of information”, the reality is that obfuscation never actually works: over an extended period of time, a time frame projects or organisations cannot afford, the same information would be readily gained.
- Keep it professional. Red teams must not get invested in their own approach or plan. This quality plays into the “inform, don’t dominate” rule: getting caught up in the semantics of the red team exercise can be counterproductive, and instead of finding outcomes, arguments ensue over the reality of a scenario. As a red teamer, your job is to inform and drive plans, not block them.
- Get creative. Whilst quality assurance and compliance standards are typically seen to provide a similar function, and have lifted the game in benign environments, the reality is that when malice or contested environments come into play, relying on compliance or expecting a scripted event to occur is an easy way to set yourself up for failure.
So how did General Van Riper and his red team improve the blue team in Millennium Challenge? The reality is, they didn’t. The rules were changed, to Van Riper’s frustration. One could deduce that the shock of seeing American superiority undermined was such a blow to the cultural psyche that the subsequent wargame was heavily constrained, and the guidance that could have been gleaned was lost. When used appropriately, a devil’s advocate can challenge assumptions and thought processes with the intent of making things better; however, it is the responsibility of both parties to ensure such contention is executed effectively, and that the lessons from the process are applied and managed.