Red Teaming at Scale to Uncover Your Big Unknowns

During the global war on terror, a group of commissioned and non-commissioned officers in the United States military participated in a unique training event. Soldiers of various ranks, all with different specialties, assembled in a remote location where they were stripped of rank and other identifying markers. They changed clothes, adopted new names, and modified the very rhythm of their daily lives. And from there, they began planning a simulated attack on their own forces.

The exercise, an ongoing training opportunity called “Mirror Image” conducted by the Terrorism Research Center, is part of a broader philosophy called red teaming. The simulation helped participants explore their predispositions and organizational weaknesses. Changing routines helped them better understand enemy motivations and anticipate possible insurgent attacks. It also revealed how biases and expectations distorted their perception of reality.

In the business world, cybersecurity professionals use red teaming to test their organization’s defenses before a real attacker does. But other teams can apply the concept to test resilience, blind spots, and continuity in the face of a crisis.

Organizations should conduct red teaming exercises at scale to manage risk holistically. The idea is to understand potential outcomes across multiple strategic options, using scenarios to identify blind spots and model threats so weaknesses can be eliminated before adversaries exploit them. It is a valuable tool in both policy-making and decision-making.

Make the Most of Your Red Teaming Exercise

Here are some actions to take to ensure your organization gets the most out of red teaming:

Ask where the enemy is going. An effective red team exercise reveals exploitable weaknesses in your security systems and processes. The entire point is to find failures. This isn’t always easy for professionals to accept and encourage. Foster an environment of openness that allows teams to explore threats and how adversaries will try to overcome the defenses in place. Ultimately, red teaming provides growth opportunities that improve threat response when the time comes.

Evaluate the response, not just the defense. Red teaming is more than a penetration test. Knowing where your gaps are is crucial, but knowing how your team reacts to a crisis is far more valuable. While defense, mitigation, and deterrence are essential, sometimes defenses fail or plans go awry. It’s useful to model a scenario where your defenses or mitigation strategies fail, so your company can practice its response and be prepared when something similar happens in real life.

Model information flow. Critical information doesn’t always get to the right people in a crisis. According to a recent survey, 51% of threats that disrupted business continuity or resulted in harm or death in 2022 could have been avoided if all functions shared risk intelligence and viewed it with a common approach and platform.

Test your assumptions. Many defensive measures are built on a set of assumptions about risks and their likelihood. However, our study reveals disagreement within companies and departments over which events threaten business continuity. Security gaps are more likely to occur when teams aren’t on the same page. In such situations, people overlook problems or assume someone else will handle an issue. Red teaming clears up questions about who is responsible for what and how to weigh risks.

Test your processes. Red teaming can be used to test physical defenses and cybersecurity, but it is also a useful tool for assessing processes. In security, an overly complicated or ill-defined process can be just as harmful as inadequate barriers or cameras. Tabletop exercises and wargames force organizations to walk through their processes and see how well they hold up.

Explore alternative futures. Structured analytic techniques help business organizations and security professionals apply imagination to forecast alternative futures for their decisions. This kind of exercise is not about prognostication but about expanding one’s critical mindset to understand the many variables that affect the organization and the possibilities ahead. This red team approach gives organizations insight for decision-making by recognizing the complexity of their choices and the impact of those choices on the company and its security.
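The information-flow modeling described above can be sketched in code: treat the organization’s sharing channels as a directed graph and check which decision-makers a piece of risk intelligence can actually reach. The team names and channels below are illustrative assumptions, not a prescribed structure — a minimal sketch of the idea:

```python
# Hypothetical sketch: model crisis information flow as a directed graph
# and surface decision-makers that field intelligence never reaches.
# All team names and reporting channels are illustrative assumptions.
from collections import deque

# Who routinely shares risk intelligence with whom (assumed org chart).
channels = {
    "field_security": ["regional_ops"],
    "regional_ops": ["security_ops_center"],
    "cyber_team": ["security_ops_center"],
    "security_ops_center": ["crisis_lead"],
    "legal": [],  # legal receives nothing in this model -- a gap to surface
}

def reachable_from(source, graph):
    """Return every team a report originating at `source` can reach."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

decision_makers = {"crisis_lead", "legal"}

for team in ("field_security", "cyber_team"):
    missed = decision_makers - reachable_from(team, channels)
    if missed:
        print(f"{team} intelligence never reaches: {sorted(missed)}")
```

Running this against a real org chart rather than a tabletop guess is the point: the gaps it prints are candidates for the cross-functional sharing the survey data above suggests is missing.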

Adopt a Holistic Viewpoint

The concept of red teaming traces back to the Catholic Church’s office of the devil’s advocate, but the US military greatly expanded it during the Cold War to assess Soviet intentions and capabilities. That’s where it got its name: the US adversary of the era was the Soviet Union, aka the Reds. More recently, cybersecurity teams have adopted red teaming to expose weaknesses in their systems and prevent threat exposure.

The fact is that enterprises face a wide variety of threats — lawsuits, activists, insider threats, and even workplace violence. Yet nearly every Google search result for red teaming today relates to cybersecurity. As a result, few people know whether their crisis plans are up to date or how a crisis will test them and their teams.

Red teaming is a holistic, multidisciplinary effort that arms teams with practical risk-mitigation tools across the entire threat landscape. At a minimum, risk-focused teams can use it to test defenses against a wide range of threat actors and to identify unseen security gaps created by biases and assumptions.

However, its actual value is how it shapes the ways organizations prepare for crises and unforeseen events. Red teaming opens a window to how your organization will perform under duress, making it a valuable exercise to recognize and gauge real and potential risks.

Most importantly, red teaming is a mindset, not just a set of tools or a way to put security on offense. Red teamers are the contrarians in the room, willing to say what others will not in order to challenge the status quo. That is the essence of red teaming, and any security professional can adopt that attitude to assess problems in their organization, prevent failure, or mitigate vulnerabilities. The mindset of a red teamer is what shapes organizations for the better.
