Humans in System Design

The majority of systems we consider to be innovative are not unique. I was reminded of this when Uber announced two recent experiments: color-coded in-car signs and virtual stops. Several years ago we designed those elements into systems at SHIFT, a company I built (and shut down).
That we conceived of the same ideas as Uber is meant not to illustrate our brilliance but to illustrate that some discoveries are inevitable and that Uber is a brilliant executor. The brilliance is always in execution.

There is a single state of perfection waiting to be uncovered and often nature uncovers it long before we do. Discovery comes with exploration and brilliance lies with execution.

The innovation is not in the discovery of an optimal system but in building a path and the technology to execute it. In other words, being able to say you know what the problem is doesn’t make you a genius. And neither does solving it with a method that people won’t adopt. Elon Musk wasn’t the first to discover that trashing multi-million-dollar rockets is inefficient.

This thinking applies to every system. It centers on creating a pathway that allows a system to evolve with users (or customers) to surmount the ultimate barrier to maximum efficiency: human skepticism.

I use this way of thinking in vertical retail businesses, in studying airlines (which are the most fun to explore) and in evolving my nutrition. I’ve spent a lot of time today trying to understand the role of physical places in the purchase of consumer products.

This way of thinking treats humans as objects and asks how humans think and what they value at and below the surface. This is the most important skill in systems design. This paper is not about creating emotional connections or, in fact, about creating great companies. If you can’t understand the context of phrases like “The humans were the problem” then stop now.

I’ve written versions of this thinking in the context of various past work. This is my first attempt to articulate this way of thinking more generally.

In this illustration I use Uber as a frequent example mainly because of its clear systems design focus and because many people are familiar with it. I use SHIFT as an example because I am familiar with it.

What is a system?

All things that interact with the real world follow system logic. Each object has fixed attributes (financial costs, binary skill sets, opportunity costs, waste generated, etc.) and those objects are sequenced using priorities to deliver an objective.
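A rough sketch of that model in Python. The objects, attributes and priorities here are illustrative, not pulled from any real system:

```python
from dataclasses import dataclass

@dataclass
class SystemObject:
    """An object in the system with fixed attributes."""
    name: str
    financial_cost: float      # dollars to deploy
    opportunity_cost: float    # value of the next-best use
    waste_generated: float     # arbitrary waste units
    skills: frozenset          # binary skill set: a skill is present or it is not

def sequence(objects, priority):
    """Order objects by a chosen priority so the best objects
    (under that priority) are deployed first."""
    return sorted(objects, key=priority)

fleet = [
    SystemObject("van", 40.0, 10.0, 3.0, frozenset({"drive", "haul"})),
    SystemObject("sedan", 25.0, 8.0, 1.0, frozenset({"drive"})),
]

# Sequencing the same objects under a different priority yields a different system.
by_cost = sequence(fleet, priority=lambda o: o.financial_cost)
by_waste = sequence(fleet, priority=lambda o: o.waste_generated)
```

The objects rarely change. What changes, and what the provider actually chooses, is the priority used to sequence them.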

Determining the priorities is a function of understanding the objective. To understand the objective one must deeply study a customer’s need. And smart system designers look beyond the customer’s stated need and focus on the customer’s actual need, the one she can’t express because she doesn’t know how to articulate her caveman instincts. It is in this area that a company takes the most risk and, if successful, effects the most lasting change. The decision of how to present the actual need is also what creates industry whackos. Imagine the guy telling us we have to have robots in our home tomorrow. Or the politician preaching the benefits of light rail.

How a company publicizes its interpretation of the customer’s need is not always 100% honest.

In order for a system to effect the most change on an environment over a long period of time, it is sometimes necessary to tell people what they want to hear to gain their trust. Long-term companies understand that doing this is a short-term strategy designed to create a following. Your product must meet customers at their emotional point today in order to be in a position to give them what they really need in the long term. Customers will not change their priorities to match your belief that they really need something other than what they want.

For example, in transportation, customers surveyed will say that they want to get from A to B in the most efficient, cheapest and safest way possible. The elements that can be prioritized change order based on how the provider chooses to deliver the service. That order is the secret sauce of execution.

1 — speed/ease of start delivery (start of trip)

2 — cost (trip price)

3 — qualitative experience (nice car, nice driver)

4 — true efficiency (speed of trip vs norm, which they are incapable of measuring)

All with the divisor of safety, a binary. In other words a great experience that’s unsafe is equal to zero.

A divisor is the thing without which all of the other things don’t matter. It is true or it is not. If not, all things are zero. Often it’s a product’s core promise. Chicken’s divisor is disease. Cheap, delicious chicken with salmonella is bad.
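Here is how I think about a divisor in code. A minimal sketch, with made-up weights that roughly follow the transportation ordering above:

```python
def experience_score(speed, cost_value, quality, efficiency, safe):
    """Weighted blend of the four priorities, gated by a binary divisor.
    The weights reflect the ordering above and are purely illustrative."""
    weights = {"speed": 0.4, "cost": 0.3, "quality": 0.2, "efficiency": 0.1}
    blended = (weights["speed"] * speed
               + weights["cost"] * cost_value
               + weights["quality"] * quality
               + weights["efficiency"] * efficiency)
    # The divisor: if the trip is unsafe, everything else is worth zero.
    return blended if safe else 0.0

# A great experience that's unsafe is equal to zero.
assert experience_score(1.0, 1.0, 1.0, 1.0, safe=False) == 0.0
```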

The customer’s indication of what they want is an aspirational indicator. They believe they want one thing when in fact their rating of the experience will reflect a different priority. It is why SUV sales continue to rise in the US even during periods of high gas prices or in periods when consumers indicate they value environmentally aware products.

Great systems understand this aspirational tendency and shift priorities over an arc of time in order to develop the customer trust that will allow the company to deliver the product the customer actually needs over the long term.

All systems conform to this structure. There is always a customer, a provider, a perfect design and a path. The systems that don’t conform to this structure are often led by idealists who spend more time convincing people they should want what they are selling than they do building systems that account for and respect customers’ current preferences.

For example, take employee scheduling. The most efficient scheduling of employee resources uses a blend of qualifications, demand and cost with a divisor of availability to determine what resources should be deployed at any given time. The business at its core wants profitability. The cheapest employee that can respond to demand for X task is the best resource to deploy. The human element, however, makes this systemic approach impossible to execute perfectly. People want to be happy. Simply because an attorney can lift a box doesn’t mean she should. So we hire classes of workers so that some may be happier doing work they enjoy, not just work they are qualified for. This is deliberate inefficiency.
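A sketch of that scheduling logic, with the deliberate inefficiency layered on top. The fields and worker classes are hypothetical:

```python
def pick_worker(task, workers):
    """Deploy the cheapest qualified worker. Availability is the divisor:
    an unavailable worker is excluded outright, no matter how cheap."""
    candidates = [w for w in workers
                  if w["available"] and task in w["qualifications"]]
    return min(candidates, key=lambda w: w["hourly_cost"], default=None)

def pick_worker_by_class(task, workers):
    """Deliberate inefficiency: prefer the class of workers hired for this kind
    of task, even when a more expensive generalist (the attorney) could do it."""
    in_class = [w for w in workers if task in w.get("hired_for", set())]
    return pick_worker(task, in_class or workers)
```

The second function is strictly less efficient than the first on paper, and that is the point: the gap between them is the price of keeping people happy.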

Or take a system I crafted at SHIFT, a transportation company that was actually a complex system company, both operationally and technologically. The most important operational KPI at our company was percentage of time a vehicle is available for member use. Many other factors, most outside of our control, influenced demand. The only object we could 100% control was how available our system was to users.

To maximize that availability we had to organize complex operational steps to do all of the things required for a mostly electric vehicle fleet to function. And for that system to have objects available in the right place at just the right time. That meant moving cars from A to B as demand physically shifted from minute to minute. It meant having drivers in the right place. It meant drivers needed to follow the most efficient route. It also extended to all of the required elements to deliver a well-perceived qualitative experience. Vehicles needed to be clean. Radios needed to be reset. They needed to be safe to meet a safety divisor.

They also needed to be charged to a level a member would perceive as sufficient for her driving needs. This need illustrates where the human element begins to break a system. A person generally believes he needs much more than he actually will use. So an electric vehicle fleet’s charge state must be artificially elevated to be considered ready. So we monitored two metrics related to charge state: % of vehicles in ready state and % of vehicles in perceived ready state. They were wildly different numbers.

For example, a vehicle class with an average trip of two miles needed only five miles range to handle almost all edge cases the system could predict. But even vehicles with 20 miles’ range triggered customer fears and inserted chaos into the system.
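A sketch of the two charge-state metrics, assuming illustrative thresholds: five miles for the system’s ready state, and a much higher, made-up number for the level members actually trusted:

```python
def availability_metrics(fleet_ranges_miles,
                         system_ready_miles=5,      # covers nearly every predicted trip
                         perceived_ready_miles=40): # illustrative: where members stop worrying
    """Return (% of vehicles in ready state, % in perceived ready state)."""
    n = len(fleet_ranges_miles)
    ready = sum(r >= system_ready_miles for r in fleet_ranges_miles) / n
    perceived = sum(r >= perceived_ready_miles for r in fleet_ranges_miles) / n
    return ready, perceived

# A fleet that is entirely ready can still look mostly "not ready" to members.
print(availability_metrics([6, 12, 22, 35, 80]))  # (1.0, 0.2)
```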

Elon Musk understood that range anxiety was the electric vehicle’s biggest flaw. Without mass adoption, battery production costs could not fall with scale. To gain mass adoption one must consider why someone would not buy an electric car. While 95% of a person’s vehicular uses are local, we would not buy a car that restricted our travel in the 5% of cases when we want to go a long distance. So in order to facilitate adoption of a mostly local product, Tesla overbuilt its Supercharger network to solve a use case we rarely truly need. It was a solution to an emotional problem.

In both cases, humans were the problem.

What causes inefficiency?

Uber is among the most efficient systems ever conceived. Considering its investment in engineering my suspicion is that it could be much more efficient than it is. It has intentionally held back its abilities because of its customers’ current priorities and lack of trust in systems. People believe they make better decisions than machines.

The two largest impediments to maximum efficiency in Uber’s system are a) people making routing decisions and b) the chaos of the pickup. People come in the form of customers and drivers. In all systems people are the biggest risk to maximum efficiency.

People overvalue their intelligence and create chaos. Systems unaware of this, such as airplane boarding, use brute force to calm chaos. Consider the example of a screaming gate agent at an airport. Boarding processes are not efficient, but they would be more efficient if we would all sit down, shut up and wait to be called. But we can’t. We experience what I call “life seizures.” Everything is so overwhelming that we cannot simply look at the system’s instructions (e.g. a boarding pass) and wait for that instruction to be triggered. We are scared we won’t have room to store our luggage, a problem created by airline humans’ hunger to charge fees and passenger humans’ fear that the airline will lose our bags if space constraints force us to check them, a problem we have a less than one percent chance of experiencing.

We tolerate the bad qualitative experience because while we say we value the experience, we really value total trip time efficiency and price. Our likelihood to return to an airline is based on whether we left and arrived on time and whether the flight was cheap compared to our budget. No matter how many complaints we make, we still buy tickets on Spirit Airlines.

In Uber’s system what customers value (or complain about) the most is the speed of the pickup. A quick pickup does not get you where you are going as quickly and efficiently as possible. It does what it says: it gets you on your way.

At SHIFT we realized this, which is why our member promise was “be on your way in five minutes.” Uber is focused on the same. Those who own cars value control. Driving themselves somewhere in 25 minutes will be perceived as better than waiting ten minutes for a ten-minute ride. If we had said “be there faster than doing it yourself,” customers would have rated us higher in surveys but ultimately prioritized control.

For others without cars, especially those in areas covered by public transit, the two greatest barriers to movement are the distance to the nearest transit stop and the waiting time for the next fixed route mode of transit to arrive. This boils down to frustration with the time it takes to be on one’s way.

Both customer groups will say they value efficiency of trip and cost. But today, because they are unaware of their own priorities, their wallet vote goes to the company that gets them moving quickest.

Regulations are another source of inefficiency, though they are usually deployed after the system exists. Regulations are designed to limit efficiency to protect the basic needs of people. They intentionally enforce limits on efficiency (divisors, like in a math equation) to influence a system’s prioritization of its elements. Unfortunately regulatory regimes have evolved into systems of their own, designed to deliver maximum profits to historical incumbents.

Regulations are designed to prioritize human needs over the tendency of efficient systems to sideline the basic needs of the humans who provided the service before the system existed. How many innovations are stopped by regulations?

Humans in the delivery system

At SHIFT we wanted employees to drive (versus independent contractors) so we could tell them what to do. We needed drivers because law and technology didn’t yet allow us to do the job without people. Simplified, the person’s role was to receive instructions that involved turning a steering wheel and pushing some levers to carry out those instructions safely. Do nothing but that. Be nice and talk (or not) depending on what the customer wanted. Use a system to indicate whether the customer was or was not talkative.

Because the person had to be there for regulatory and technological reasons, the main risk the driver posed was interfering with the customer’s perception of the experience’s efficiency by being a negative contributor to the qualitative experience. In other words the most efficient ride delivered by an annoying driver will be remembered by the customer as a bad experience.

But drivers insert an additional element of chaos: they make routing decisions. Even if a GPS guide says turn left, if the driver thinks that going straight is better, he will do it. This is chaos. The driver may not know that, for example, in a multi-passenger system there is a 70% chance that a pickup four blocks away will be requested within 30 seconds and that this right turn is the one with the least unpredictable delay (because there is a timed stop light).
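One way a routing system might weigh that kind of tradeoff, sketched with made-up numbers for delay, delay variance and pickup probability:

```python
def choose_turn(options):
    """Score each candidate turn by expected delay, a penalty for delay variance
    (the timed stoplight is predictable; the unprotected left is not), and a credit
    for being positioned near a likely upcoming pickup. Coefficients are illustrative."""
    def cost(opt):
        return (opt["expected_delay_s"]
                + 2.0 * opt["delay_stddev_s"]          # punish unpredictability
                - opt["pickup_probability"] * 60.0)     # reward a likely pickup nearby
    return min(options, key=cost)

options = [
    {"name": "left",     "expected_delay_s": 20, "delay_stddev_s": 25, "pickup_probability": 0.1},
    {"name": "straight", "expected_delay_s": 15, "delay_stddev_s": 30, "pickup_probability": 0.0},
    {"name": "right",    "expected_delay_s": 30, "delay_stddev_s": 5,  "pickup_probability": 0.7},
]
print(choose_turn(options)["name"])  # "right": predictable delay plus a 70% chance of a pickup
```

The driver sees only the delay in front of him. The system sees the variance and the next pickup, which is exactly why his improvisation is chaos.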

The other element of chaos in this model is the passenger. The passenger, prioritizing pick up speed over all, does not want to work to be picked up. So the driver must find the passenger on a corner, in front of a door or even worse, wait for the passenger to leave her apartment. This creates the dreaded call from an Uber driver. This is chaos.

The biggest risk to an efficient mobility system is a person deviating from the prescribed or most efficient path. In most dynamically routed mobility systems this inefficiency exists on the driver and the passenger side of the system.

Aligning priorities

People have a hard time articulating what it is they actually value. While people say they want to get somewhere as quickly and cheaply as possible, what they actually want today is to be on their way in five minutes with no effort.

Which of those two they value more is a per-person variable that a system can learn in order to adapt how it performs for those people specifically. Priorities change as trust in the system increases.

A system has two variables of control: a) speed of response and speed of delivery and b) the objects used (for example how nice the car is and whether or not the trip will be shared). When a system knows a user’s priority (learned more efficiently from past behavior than from explicit direction from the customer), it can respond with the right object for that customer. The optimal object will have different attributes, one of which is cost. As cost is adjusted to reflect the customer’s tolerance, the selection of the delivery object will impact only one of two variables: response time or delivery time.
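A sketch of that selection logic, assuming a learned per-person priority and a price tolerance. The fields and numbers are illustrative, not a real dispatch model:

```python
def choose_object(user, objects):
    """Pick the delivery object (solo or shared trip, nicer or cheaper car) that best
    matches what this user's past behavior says they actually prioritize."""
    affordable = [o for o in objects if o["price"] <= user["price_tolerance"]]
    if user["learned_priority"] == "response_time":   # "be on my way now"
        return min(affordable, key=lambda o: o["response_min"])
    else:                                             # "get me there fast"
        return min(affordable, key=lambda o: o["delivery_min"])

user = {"learned_priority": "response_time", "price_tolerance": 12.0}
objects = [
    {"name": "solo",   "price": 14.0, "response_min": 3, "delivery_min": 12},
    {"name": "shared", "price": 8.0,  "response_min": 7, "delivery_min": 18},
]
print(choose_object(user, objects)["name"])  # "shared": solo is outside this user's tolerance
```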

A customer’s perception of satisfaction directly correlates to his belief that he was in control as much as possible. He is, in his own thinking, the smarter of those involved in the transaction: computer, person, himself.

Well-designed systems evolve as customers build trust and cede control. You can see the shift in Uber’s system dynamics at play now. Price decreases in cities lower driver supply in its current system. Lower driver supply means longer wait times if not coupled with other system changes. The Uber system is shifting its priorities. Uber Pool (and Lyft Line) introduced efficient multi-passenger routing. This meant longer wait times but brought lower prices and optimal routing, increasing driver efficiency and earnings even at a lower price. At scale this lowers the number of cars on the road, decreasing traffic. The increase in trip time in a shared ride will ultimately balance out, if not outperform, the same trip in a single-passenger ride. It uses decreased cost to lure in users who will later realize the trip is more efficient than anything they could have otherwise done. It’s training users to trust the system.

Still, the least efficient element in this system is the pickup and dropoff. And Uber is testing a solution to it with a model we also designed at SHIFT: virtual stops, where customers walk a short distance to a pickup point that is more efficient and predictable for the driver. It is the perfect solution that many have discovered but only Uber has executed at scale. Over time I expect this feature to find its way into every element of the Uber experience, even beyond its limited multi-passenger experiment in Seattle. It is the only solution to customer chaos short of transporter beams.
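A sketch of the virtual-stop idea: snap a requested pickup to the nearest predictable stop within a short walk, otherwise fall back to the exact location. The coordinates and walking threshold are made up:

```python
import math

def snap_to_virtual_stop(pickup, stops, max_walk_m=250):
    """Move a requested pickup to the nearest virtual stop within a short walk.
    Distances use a flat-earth approximation, fine for a few city blocks."""
    def meters(a, b):
        dlat = (a[0] - b[0]) * 111_000
        dlon = (a[1] - b[1]) * 111_000 * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    nearest = min(stops, key=lambda s: meters(pickup, s))
    return nearest if meters(pickup, nearest) <= max_walk_m else pickup

stops = [(37.7815, -122.4040), (37.7830, -122.4090)]
print(snap_to_virtual_stop((37.7818, -122.4045), stops))  # snaps to the first stop
```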

The path to perfection

We build systems to make it seem like customers are in control. Some systems (Lyft) prioritize the human element of their customer experience to create long-term emotional loyalty with their customers. In the long term this focus will impede the company’s ability to deliver a product beyond a niche of people who truly prioritize relationships with people over efficiency. Those people enjoy lives of mass inefficiency. In systems thinking this is considered a flaw, but this flaw is part of what makes us human. And many great companies will serve them. Those companies are not systems companies.

There are at times companies that create almost fanatical customer followings by focusing on emotional customer connections. Long term, we will see which of these companies used this approach to trick us all into allowing them to develop systems companies versus emotional companies. The emotional companies are not systems companies. Systems companies can use emotional connections to keep customers engaged while they develop perfect systems. Both are great companies, but usually systems companies are larger.

How successful a system can be long term depends on a provider’s ability to create an experience that feigns control when in fact the system is in control, allowing for human interaction only in the most egregious forms of system collapse. Building these types of systems is a long-term endeavor requiring significant trust building and implicit, almost deceptive, user training.

For successful systems companies, the path to success is measured over decades. As people realize their behaviors have changed dramatically and marvel at how they’ve unintentionally evolved, patient systems designers will have known all along how it happened.


This post originally appeared at Zach Ware's Notebook.
