Wednesday, April 1, 2020

Splattered Swan: Collateral Damage, Friendly Fire, and Mis-fired Mega-systems

As Curly from the "Three Stooges" said,
"I'm a victim of circumstances!"
There is a type of "...Swan" that is neither surprising nor extreme in its aggregate effect, but is extremely surprising to a particular entity that was considered to be outside the scope of the main process. A bomb is supposed to kill enemy soldiers, not your own.

I call this type the "Splattered Swan".

Context: Rethinking "Black Swans"

This post is the ninth in the series "Think You Understand Black Swans? Think Again". The "Black Swan event" metaphor is a conceptual mess.

Summary: It doesn't make sense to label any set of events as "Black Swans". It's not the events themselves; rather, it is the combination of generating mechanisms, our evidence about them, and our methods of reasoning that makes events unexpected and surprising.


Definition 

A "Splattered Swan" is a process where:
  • The generating process involves a very powerful force (i.e. penal, constraining, or damaging force) with less than perfect aim.
  • The evidence are official rules, specifications, or scope, or experience that is limited what is intended;
  • The method of reasoning are based on the assumption that the aim will be perfect and error-free, or that errors will be "well behaved".
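
As a minimal sketch, these three components can be written down as a simple data structure. The field names and example values below are my own shorthand for illustration, not established notation:

```python
from dataclasses import dataclass

@dataclass
class SwanProcess:
    """The three components that make a process some kind of '...Swan'."""
    generating_process: str  # the mechanism that produces outcomes
    evidence: str            # what we can observe about that mechanism
    reasoning: str           # how we draw conclusions from the evidence

# A Splattered Swan, per the definition above (wording is illustrative).
splattered_swan = SwanProcess(
    generating_process="very powerful force with less-than-perfect aim",
    evidence="official rules, specs, and scope; experience limited to the intended",
    reasoning="assumes aim is error-free, or that errors are 'well behaved'",
)
```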
 

Main Features

A Splattered Swan arises when a very powerful system is prone to misfiring in very bad ways, causing damage to entities that are considered "safe" or "out of bounds" by normal reasoning. That these outcomes are extreme or surprising is largely due to a failure to understand the total system and the ways it can fail.

Two key features of Splattered Swans are 1) critical error conditions are excluded from reasoning on principle, and 2) those errors are potentially severe, even the first time. Many systems adapt by trial and error, but that only works if the magnitude of errors (i.e. aiming error) is relatively small and the magnitude of collateral damage is also relatively small. Consider World War II bombers aiming to kill enemy troops located near allied troops. Even though they had bomb sights, the bombers were notoriously inaccurate. With ordinary large bombs, the risk of "friendly fire" (i.e. killing your own troops) is high. If the bomber is carrying a single atomic bomb, the risk of "friendly fire" becomes extremely high: you get only one chance to aim and drop, there is no feedback from previous attempts, and the damage process is extreme. In the other direction, if there are several bombers and the first ones drop flares instead of bombs, then the cost of error is small and the opportunity for corrective feedback can dramatically reduce the risk of "friendly fire".
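
This contrast can be made concrete with a small simulation. The following is a minimal Monte Carlo sketch, not a ballistics model; every number in it (aiming error, blast radii, distance to friendly troops) is a notional assumption chosen only to illustrate the point:

```python
import random

FRIENDLY_DISTANCE = 300.0  # friendly troops this far from the aim point (m); assumed
AIM_SIGMA = 250.0          # std. dev. of aiming error (m); "notoriously inaccurate"

def friendly_fire_risk(blast_radius, flare_passes, trials=100_000):
    """Estimate P(the live bomb lands within blast_radius of friendlies).

    Each flare pass is assumed harmless and to halve the remaining aiming
    error, as a crude stand-in for corrective feedback."""
    hits = 0
    for _ in range(trials):
        sigma = AIM_SIGMA / (2 ** flare_passes)  # feedback tightens the aim
        impact = random.gauss(0.0, sigma)        # impact point along one axis
        if abs(impact - FRIENDLY_DISTANCE) < blast_radius:
            hits += 1
    return hits / trials

# One atomic bomb: one chance to aim, no feedback, extreme damage radius.
print("atomic, no feedback:    ", friendly_fire_risk(1000.0, flare_passes=0))
# Conventional bomb, no ranging passes: smaller radius, raw aiming error.
print("conventional, no flares:", friendly_fire_risk(50.0, flare_passes=0))
# Conventional bomb after two flare passes: feedback has tightened the aim.
print("conventional, 2 flares: ", friendly_fire_risk(50.0, flare_passes=2))
```

Under these made-up numbers, the one-shot atomic case puts friendlies inside the blast radius almost every time, while cheap corrective passes drive the conventional risk toward zero.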

Another important feature of Splattered Swans is the blind spots created by the "official" or "intended" definition of the system of interest. These blind spots can lead analysts and decision-makers to never even consider the possibility of collateral damage or unintended consequences.
 

One Example

"Offensive cyber" , a.k.a. "hack back" is an example from the domain of cyber security.  There are many flavors of offensive cyber, but the most extreme involve damaging the targets, either physically or digitally or both.  Such extreme attacks might also be considered acts of war, a.k.a. "cyber war".  Putting aside the ethics or advisability of offensive cyber, there is immense potential for collateral damage.  First, it might be hard or impossible to attribute a given attack to the "real"  threat agents or groups (a.k.a. "Black Hat").  They might operate through affiliates, mask or disguise their tools and infrastructure, and might even intentionally implicate a different agent or group in the "Indicators of Compromise" and other forensic evidence.  Even if you can correctly identify the attacking group, it may be hard to attack them in a way that doesn't also do harm to socially-important entities or resources (e.g. cloud computing resources, networks, etc.).  Finally, in corner-case situations there is also a non-zero potential for self-harm, where an offensive cyber attack backfires on the "White Hat" attacker.

From a planning and ongoing management viewpoint, it is much harder to anticipate and control the side effects of cyber attacks than of physical attacks.
 

How to Cope with Splattered Swans

It is relatively simple to cope with Splattered Swan systems. Don't take the "official" or "intended" system as a strict definition of what behaviors or outcomes are possible. Use Scenario Planning or "What If?" analysis to look outside the "official" or "intended" scope and identify the potential for collateral damage.

Then look for ways to introduce error-correcting feedback or mitigations for the collateral damage. Another good mitigation is to reduce the intensity of the damage/punishment process.
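
As a minimal sketch of such a "What If?" pass, one could enumerate entities outside the official scope and flag combinations of the two key features above: severe damage with no corrective feedback. The scenarios, scores, and threshold below are hypothetical illustrations, not a real planning methodology:

```python
# Each scenario: (entity outside the "official" scope,
#                 damage intensity on a 0-10 scale, has corrective feedback?)
scenarios = [
    ("misattributed third party",      9, False),
    ("shared cloud infrastructure",    7, False),
    ("our own systems (blowback)",     8, False),
    ("intended target, staged attack", 7, True),
]

for entity, intensity, has_feedback in scenarios:
    # The two key features of a Splattered Swan: severe damage,
    # and no error-correcting feedback before it lands.
    if intensity >= 7 and not has_feedback:
        print(f"SPLATTERED SWAN RISK: {entity}")
        print("  -> add feedback stages, or reduce the intensity of the process")
    else:
        print(f"manageable: {entity}")
```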
