Illustration by Nan Lee

Work Culture

Do product designers need a code of ethics?


Published on May 20, 2019

In 2014, Harry Campbell began driving for Uber and Lyft to diversify his income. Shortly after signing up, he began to notice subtle gamification elements creeping into the driver-facing app.

When he tried to log off, the app triggered a full-screen popup. “You’re just $6 away from making $40 in net earnings,” read the alert. Campbell, an engineer by trade, recognized the trick as a version of the Ludic Loop, a system that continually dangles a goal just out of reach.
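The article doesn’t say how Uber actually generated that prompt, but the Ludic Loop mechanic it describes is easy to sketch: round the driver’s current earnings up to the next milestone and report the gap. This is a hypothetical reconstruction, assuming milestones at $10 increments.

```python
import math

def next_milestone_nudge(net_earnings: float, step: float = 10.0) -> str:
    """Illustrative 'goal just out of reach' prompt: round earnings up
    to the next milestone and report how close the driver is."""
    target = math.ceil(net_earnings / step) * step
    if target == net_earnings:
        # Already on a milestone, so dangle the next one instead.
        target += step
    gap = target - net_earnings
    return f"You're just ${gap:g} away from making ${target:g} in net earnings"

# A driver who has earned $34 sees the exact prompt Campbell describes.
print(next_milestone_nudge(34))
```

Note that the loop never terminates: hit $40 and the same function starts dangling $50, which is precisely what makes the pattern so effective at keeping drivers on the road.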

Over the next few years, Campbell watched as more and more subtle psychological inducements crept into the app. After one update, the app started automatically accepting shared rides for him while on a trip. After another update, drivers found themselves competing against each other for limited places in prestigious driver tiers.

Each little design tweak tapped into a cognitive vulnerability in Campbell’s brain, ramping up the pressure on him to stay in his car and drive. “It all reminds me of the old pinball games I used to play when I was a kid and how much I enjoyed watching the score go up,” wrote Campbell on his blog.

It’s easy to think of these features as components of an application, but they’re really so much more than that. They’re the product of a designer’s work, the product of a conscious decision made by a human being. And that frustrates people like designer Mike Monteiro. Monteiro believes designers have abstracted themselves from their work and, perhaps more importantly, the consequences of their actions.

“Designers have been running fast and free with no ethical guidelines,” said Monteiro in an interview for Slate. “And that was fine when we were designing posters and sites for movies. But now design is interpersonal relationships on social media, health care, financial data traveling everywhere, the difference between verified journalism and fake news. And this is dangerous.”

Without design philosophies or ethical guidelines, we get by on the vague assumption that designers are probably acting correctly. Recently, we’ve seen what can happen when designers are free to work, unfettered and unguided, to maximize revenue. Because social networks need to maximize the time users spend in their apps, they now employ purposefully addictive design elements lifted straight from casinos, producing genuinely addictive software.

As consumer behavior expert Nir Eyal explains in his book Hooked: “The technologies we use have turned into compulsions, if not full-fledged addictions. It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.”

Given the immense power that designers now wield, Monteiro and many of his contemporaries believe designers need an ethical framework or philosophical system to guide their behavior.


A brief introduction to product design

When we talk about designers in this context, we’re really talking about product designers, a job title that’s completely changed over the past five or ten years. Product design has moved from building physical products to designing mental stimulus, subsuming adjacent disciplines like experience design, information architecture, interaction design, user interface (UI) design, and user experience (UX) design.

Modern product designers design the system (how things work behind the scenes), the processes (how users use the systems), and the interface (how the product looks) of products. They’re digital architects, building the tools and services that underpin the modern economy.

Fifty years ago, the effect of poor design was minimal. A confusing concert poster may have reduced ticket sales and shoddy training manuals may have dented customer service performance—but the affected audience was typically small and the actual implications fairly minor.

In comparison, modern designers can tinker with the very fabric of society. A single keystroke change to Facebook's algorithm can influence the news that tens of millions of people see.

Unethical or just bad design?

Poor design can mean one of two things: bad design or unethical design. Bad design is the simpler of the two. It describes products built without due care for their users, which end up operating in disruptive ways.

In a post on User Testing, a design director at IDEO recalls a medical reporting prototype presented by her client. It was designed for use by nurses during surgeries and would be held with two hands like a tablet. But it didn’t work. The IDEO team discovered that before almost every surgery, the nurse would hold the patient’s hand to calm their nerves. With one hand out of action, the two-handed tablet was useless.

Bad design like this is painfully ubiquitous—email apps send loud notifications in the middle of the night; traffic signs are packed with so much information that they’re unreadable; and long, time-consuming forms reset with the simplest of errors. But bad design isn’t particularly interesting as it’s rarely the result of conscious design choices. Instead, it’s usually caused by carelessness or ignorance on the part of the designer.

The second category, unethical design, is far more complicated. With unethical design, the creators of a product purposefully target vulnerabilities in human psychology to induce users to act in a certain way.

Unethical design isn’t quite as common as bad design, but it is growing. “The problem is the hijacking of the human mind: systems that are better and better at steering what people are paying attention to, and better and better at steering what people do with their time than ever before,” said technology ethicist Tristan Harris in an interview for Wired. “These are things like 'Snapchat streaks,' which is hooking kids to send messages back and forth with every single one of their contacts every day. These are things like autoplay, which causes people to spend more time on YouTube or on Netflix. These are things like social awareness cues, which by showing you how recently someone has been online or knowing that someone saw your profile, keep people in a panopticon.”

The transition from technology to compulsion to addiction is not accidental. It is the result of conscious design decisions.


The emergence of ethics in design

For the past few years, we’ve been buried under an avalanche of B2C technology scandals—NSA spying in 2013, Russian election meddling in 2016, Cambridge Analytica in 2018—and that’s kept the spotlight firmly on consumer-facing technology. But things are changing.

Journalists are starting to dedicate more column inches to the technology and tools that underpin our economy. And that’s important because the influence of B2B technologies extends far beyond the individuals who use them. In 2014, a faulty voice analysis algorithm led to the UK government falsely accusing 7,000 foreign students of cheating on English language tests and unjustly cancelling their visas.

This increase in mainstream media coverage bolstered the internal conversation on design ethics. Leading figures in the design space, like Mike Monteiro and Valley tech whistleblower Tristan Harris, are pushing back against exploitative products and advocating wholesale philosophical change across the whole of design.

Large companies like Dropbox are paying attention, too. According to product manager Devangi Vivrekar, Dropbox’s designers are guided by a shared design philosophy. “We have an overarching set of what we call product principles that ladder up to our mission. The five product principles are: foster focus; connect the dots; build inclusive spaces; make it human; and get to simple.”

These principles help designers guide their decisions and ultimately produce features that support their users’ goals. For example, Dropbox has just rolled out a suggested content panel that analyzes your account activity and selects a list of files and folders that it thinks will be helpful for your current work.

“Content suggestions are designed to help users cut down on the time they spend searching for files on Dropbox,” explains Vivrekar. By automatically locating data, Vivrekar believes Dropbox reinforces one of its guiding principles—connect the dots—and helps users piece together everything they need. “We allow users to jump back into their most relevant files faster by offering insights adapted to the way they work.”

“Our ecosystem of integrations helps connect the dots even further,” continues Vivrekar. “Once a user has found the file they want, they can open that file with a Dropbox Extension such as a document editor or e-signature tool, which gives them the ability to kick off a workflow right from Dropbox.”

In the next few sections, we’ll look at two different ethical frameworks designers are already implementing and then discuss how you can borrow some of their ideas.

Humane Design

In late 2011, Google acquired Tristan Harris’ contextual news browsing startup, Apture. After the sale, Harris was thrust into the beating heart of the tech giant. Working as a product manager, Harris distributed a memo called “A Call to Minimize Distraction & Respect Users’ Attention” to a handful of close colleagues. The memo spread like wildfire throughout Google, eventually reaching the upper echelons of management, who rewarded Harris with a new philosophically focused job title: design ethicist.

The move gave Harris time and space to investigate ethics in the tech industry. He explored things like how LinkedIn exploits our need for social reciprocity to expand and strengthen its network and how Snapchat uses streak-tracking to encourage near constant communication between its users. Harris left Google in 2016 to explore design ethics in more detail through a new non-profit called the Center for Humane Technology. Through the Center for Humane Technology, Harris developed a new design philosophy called Humane Design, which encourages designers to understand their users' vulnerabilities and create products that treat those vulnerabilities with compassion.

Practically, Humane Design means building products that eliminate detours and distractions from user tasks, minimize misinterpretations, enhance interpersonal relationships, allow users to disconnect with minimal effort, respect user schedules, and a number of other key points.

While Humane Design is still in its infancy, you can already see shoots of growth. Slack, for example, makes a conscious effort to respect its users' schedules. If you send a colleague a message in the middle of the night, Slack won’t send them a push notification unless you specifically ask it to. By defaulting its notifications to office hours, Slack’s designers have made a conscious decision to build a product that is better for their customers at the expense of screen time.
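Slack hasn’t published its scheduling logic, but the design decision described above can be sketched in a few lines: hold push notifications that arrive during a quiet window unless the sender explicitly overrides the default. The window boundaries and function names here are illustrative assumptions, not Slack’s actual implementation.

```python
from datetime import datetime, time

# Hypothetical quiet window: 10:00 PM to 8:00 AM.
QUIET_START = time(22, 0)
QUIET_END = time(8, 0)

def should_push_now(sent_at: datetime, sender_override: bool = False) -> bool:
    """Return True if a push notification should fire immediately.

    Messages sent during quiet hours are held until morning unless the
    sender explicitly asks for an immediate notification.
    """
    if sender_override:
        return True
    t = sent_at.time()
    # The quiet window wraps past midnight: 22:00 -> 08:00.
    in_quiet_hours = t >= QUIET_START or t < QUIET_END
    return not in_quiet_hours

late_night = datetime(2019, 5, 20, 2, 30)
print(should_push_now(late_night))                         # held until morning
print(should_push_now(late_night, sender_override=True))   # sent immediately
```

The Humane Design move is in the default: the respectful behavior requires no action from either party, while the interruptive behavior demands a deliberate choice.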

For designers to pursue Humane Design, the Center for Humane Technology recommends designers purposefully ask themselves how they can respect the timing, frequency, and duration of a product’s use to align with the user’s ideal life. If a design feature—for example, obtrusive push notifications or autoplayed content—does not respect their schedule, Humane Design recommends removing it.


Second-order thinking

On July 10, 2017, design director Mike Monteiro published an influential essay titled A Designer’s Code of Ethics, in which he argues that designers ought to feel responsible for the work they put into the world.

“We cannot be surprised when a gun we designed kills someone,” writes Monteiro. “We cannot be surprised when a database we designed to catalog immigrants gets those immigrants deported. When we knowingly produce work that is intended to harm, we are abdicating our responsibility. When we ignorantly produce work that harms others because we didn’t consider the full ramifications of that work, we are doubly guilty.”

By claiming responsibility for their work and its consequences, Monteiro is, without explicitly naming it, advocating second-order thinking, an idea eloquently explained in Howard Marks’s book The Most Important Thing.

First-order thinking is simplistic. It deals solely with the immediate outcome of an action. If you make it easy to send a direct message to a colleague through your project management app (action), you make it easier to collaborate (first-order outcome).

Whereas first-order thinking considers only the immediate consequences, second-order thinking considers subsequent interactions and consequences. In the project management example, making it easy to send a direct message might increase a colleague’s interruptions throughout the day (second-order outcome), which, in turn, might decrease their productivity (third-order outcome).

In order to apply second-order thinking to your own design, intelligence agent turned financier Shane Parrish recommends three specific practices.

First, when considering a potential course of action, always ask yourself “And then what?” This question prompts second-level thinking and forces you to go beyond the initial first-level design choices.

Second, imagine yourself five minutes, five months, and five years in the future, and predict the consequences of your decisions. Projecting your thoughts into the future at varying points helps frame your second-level thinking in both the short and long term.

Third, create consequence charts to predict first-, second- and third-level consequences. These charts probably won’t be exhaustive or entirely accurate, but they’ll give you a holistic review of each design choice on the table.
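Parrish doesn’t prescribe a format for consequence charts, but they are naturally tree-shaped: each action fans out into first-order effects, which fan out into second-order effects, and so on. Here’s a toy chart for the direct-messaging example above, with entries that are illustrative rather than exhaustive.

```python
# A consequence chart modeled as a tree of nested dicts.
consequence_chart = {
    "action": "make direct messages one click away",
    "consequences": [
        {
            "effect": "easier collaboration",  # first-order
            "consequences": [
                {
                    "effect": "more interruptions per day",  # second-order
                    "consequences": [
                        {
                            "effect": "lower deep-work productivity",  # third-order
                            "consequences": [],
                        },
                    ],
                },
            ],
        },
    ],
}

def print_chart(node: dict, order: int = 1) -> None:
    """Walk the chart, labeling each effect with its order."""
    for child in node["consequences"]:
        print(f"{'  ' * order}order {order}: {child['effect']}")
        print_chart(child, order + 1)

print(consequence_chart["action"])
print_chart(consequence_chart)
```

Laying the chart out this way makes the article’s point concrete: the design choice looks positive at depth one and only reveals its costs at depths two and three.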

When you apply these three techniques, you often find that many decisions are first-level negative but second-level positive.

“Second-order thinking takes a lot of work,” concludes Parrish. “It’s not easy to think in terms of systems, interactions, and time. However, doing so is a smart way to separate yourself from the masses.”

Building a compassionate future

Hopefully, future generations will look back on these years as a turning point, a moment in human history when we stopped exploiting human vulnerabilities and started building technology that protects our minds and promotes our values.

At Google, the company where Harris first conceptualized his Humane Design philosophy, ethically motivated features are starting to trickle into its products. Gmail, for example, has a feature called ‘Send and Archive’ that archives an email thread when you reply to it, instead of keeping it in your inbox. With the thread archived, users are less likely to get distracted by old emails, which helps them stay focused on their current task.

But there’s still a long way to go. Companies big and small need to think both about the design ethics of specific tools but also the ecosystem in which they exist.

“Our role in this is really interesting because Dropbox currently is not the platform all apps live on,” muses Vivrekar. “But our goal is to evolve into a hub where people connect all their tools and get their best work done.” Dropbox Extensions, which integrates DocuSign, Vimeo, HelloFax, and a host of image editing tools, is a step in that direction. “I think there’s a really exciting opportunity for Dropbox to show what ethical design looks like for a platform,” Vivrekar continues. “We’re putting a bunch of apps together in the same arena but we’re not pitting them against each other to compete and mine for users’ attention. It’s more about giving the user the control to pull in the right app at the right time to get their work done.”

Building an ethical future requires leadership across the technology landscape. It requires everyone from founders to junior designers to make design decisions that promote what’s best for the user and not solely what’s best for the company.