"I was only following corporate algorithms"
Testimony given at a future war crimes trial (riff on the Nuremberg defense)
United Airlines forcibly removed a man from an "overbooked" flight. The incident was captured on video by other passengers, and the story went viral on social networks. United then flubbed its response to the incident, adding fuel to the anger. The story went global overnight, sparking massive outrage (hundreds of millions of views in China, an important market for United). The next day, United's stock got hammered, losing roughly $1.4 billion in market value by midday. What happened? This incident is a good example of how rigid algorithmic and authoritarian decision making can create corporate disasters in an age dominated by social networking.
Here's how algorithmic decision making created the incident on United:
- United boards a full flight from Chicago to Louisville. A United flight crew headed to Louisville arrives at the gate at the last moment. A corporate scheduling algorithm decides that the deadheading flight crew has priority over paying passengers and that four passengers need to be removed to make room for them (the flight wasn't overbooked).
- United asks for volunteers. A corporate financial algorithm authorizes gate employees to offer passengers up to $800 to take a later flight (offering a bigger incentive wasn't an option). No passenger takes them up on that offer.
- United now shifts to removing passengers from the flight non-voluntarily. To determine who gets removed from the aircraft, United runs a customer value algorithm. This algorithm calculates the value of each passenger based on frequent flyer status, the price of the ticket, connecting flights, etc. The customers with the lowest value to United are flagged for removal from the flight (it wasn't a random selection).
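That last step, the customer value ranking, can be sketched as a simple scoring function. To be clear, the field names, weights, and tiers below are invented for illustration; United's actual algorithm is not public:

```python
# Hypothetical sketch of a customer-value ranking used to flag passengers
# for involuntary removal. All weights and field names are assumptions;
# the real algorithm is proprietary.

def customer_value(passenger):
    """Score a passenger's value to the airline (higher = keep)."""
    status_weight = {"none": 0, "silver": 100, "gold": 250, "platinum": 500}
    score = passenger["fare_paid"]                  # price of the ticket
    score += status_weight[passenger["status"]]     # frequent flyer status
    score += 50 * passenger["connections"]          # connecting flights
    return score

def select_for_removal(passengers, seats_needed):
    """Flag the lowest-value passengers -- not a random draw."""
    ranked = sorted(passengers, key=customer_value)
    return ranked[:seats_needed]

passengers = [
    {"name": "A", "fare_paid": 120, "status": "none",   "connections": 0},
    {"name": "B", "fare_paid": 480, "status": "gold",   "connections": 1},
    {"name": "C", "fare_paid": 95,  "status": "none",   "connections": 0},
    {"name": "D", "fare_paid": 300, "status": "silver", "connections": 2},
]
print([p["name"] for p in select_for_removal(passengers, 2)])  # ['C', 'A']
```

The point of the sketch is that the selection is deterministic: the cheapest tickets with no status and no connections always lose.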
Here's how authoritarian decision making (common in modern air travel) made things worse. Note how this type of decision making escalates the problem rapidly.
- The United flight crew approaches the four passengers identified by the corporate algorithm and tells them to deplane. Three of the designated passengers get off the flight as ordered. One refuses. Since disobeying flight crew instructions is not tolerated in post-9/11 air travel, the incident is escalated to the next level.
- United employees call the airport police to remove the passenger. The airport police arrive to remove the "unruly" (he disobeyed orders) passenger. The passenger then disobeys the order from the airport police to deplane. Disobeying a police officer's order rapidly escalates to violence. The police remove the passenger by force (video of this is shared on social media).
- The CEO of United Airlines rapidly responds: "While I deeply regret this situation arose, I also emphatically stand behind all of you, and I want to commend you for continuing to go above and beyond..." In short, the CEO praises his employees for following the corporate algorithms and for not backing down when their authority to remove a passenger was questioned (which resulted in more negative backlash on social networks).
What this means for Organizations
As you can see, United was designed to fail in a world connected by social networking, and it is not alone. Let's recap. United employees blindly followed the decision making of algorithms up to the point of telling seated passengers to deplane. The authoritarian decision making that followed was just as rigid and unyielding. Disobeying the orders of the flight crew led to the police. Disobeying the police led to forced removal. Finally, the public failure of this process led United's CEO to praise employees for their rigid adherence to algorithmic and authoritarian decision making. It was inevitable. It's also not a unique situation. We're going to see much more of this in the future as algorithms and authoritarianism grow in America. Here's how organizations are likely to respond:
- Human decision making escape valves. Smart organizations will build escape valves into the algorithms used to dictate employee behavior. In other words, if an algorithmic process is going terribly wrong, it is OK for an employee to find a non-standard way to solve it. One way to jumpstart the process is to build a "rapid response team" that can swoop in electronically (via smartphone) to coach onsite employees on ways to respond (and authorize extremely non-standard responses on the spot).
- Avoid authoritarian escalation. Once initiated, authoritarian decision making can result in violent escalation. That's bad news in a socially networked world. How will this play out? Bad organizations will increasingly try to find ways to blame the victim. For example, by saying the plane can't take off until the man leaves, or by deplaning all of the passengers in order to remove the non-compliant one by force in private. Smarter organizations will respond by finding ways to completely decouple business decisions from authoritarian escalation. For example: radically increasing the voluntary incentives as needed, or asking other passengers to help out a passenger in trouble ("this man is a doctor and needs to get back to see patients; can anyone help us keep him on the flight?").
- Admit it and fix it. If a corporate algorithm yields a terrible result, admit the failure. Admit it didn't work to both your customers and your employees. Algorithms don't have feelings; they won't cry if you talk trash about them. Also, don't punish employees for raising the flag on a broken algorithm. Further, organizations that know what their algorithms are (or even that they exist) and how to fix them will do much better than those that don't. It should be easy to spot the difference.
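The "radically increase the voluntary incentives" idea amounts to running a simple ascending auction instead of stopping at a hard $800 cap. A minimal sketch, where the starting offer, step size, cap, and passengers' acceptance prices are all invented for illustration:

```python
# Hypothetical sketch of an escalating voluntary-incentive auction:
# keep raising the offer until enough volunteers accept, rather than
# hard-stopping at $800 and switching to forced removal.
# All dollar amounts and the acceptance model are assumptions.

def run_incentive_auction(reserve_prices, seats_needed,
                          start=400, step=200, cap=5000):
    """Raise the offer in steps until seats_needed passengers accept.

    reserve_prices: each passenger's minimum acceptable compensation
    (unknown to the airline in reality; modeled explicitly here).
    Returns (final_offer, volunteer_indices), or (None, []) if the
    cap is reached without enough volunteers.
    """
    offer = start
    while offer <= cap:
        volunteers = [i for i, price in enumerate(reserve_prices)
                      if price <= offer]
        if len(volunteers) >= seats_needed:
            return offer, volunteers[:seats_needed]
        offer += step
    return None, []

# Four seats needed; nobody budges at $800, but the auction keeps going.
offer, who = run_incentive_auction([900, 1200, 1500, 2100, 3000],
                                   seats_needed=4)
print(offer, who)  # 2200 [0, 1, 2, 3]
```

The design point: a market mechanism finds the clearing price, so the business decision (free up four seats) never has to hand off to police authority.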
Writing on a gloriously sunny day in New England