Chapter 34: Concluding thoughts
Perhaps you’ve read this entire book and you’ve come to the conclusion, ‘Yeah, but I wouldn’t do any of that stuff. I’m honest. I’ll always respect users and never manipulate or deceive them.’
If you’re a designer, a business stakeholder, or any kind of product decision-maker, this is a dangerous way of thinking. It deters you from deeply considering the consequences of product design, particularly in situations where you’re under pressure to deliver results.
Instead, I think it’s useful to always think of UI design as an act of persuasion. If the user’s needs and the consequences of your work are not fully considered, then your efforts to persuade are liable to start down a slippery slope and become manipulation or deception. Design is a balancing act between business objectives and user needs. Even seemingly neutral decisions have consequences – if you present one feature prominently in your product, you present other features less prominently as a result. Sometimes these trade-offs are not as harmless as they may initially seem.
I’m reminded of the horror movie cliché where one of the characters explains, to great dramatic effect, that the evil thing pursuing them doesn’t feel compassion, can’t be reasoned with, and will never stop – whether it’s a Terminator, a shark, or some guy in a hockey mask. It’s quite a good technique to get the audience on the edge of their seats for the rest of the movie. But here’s the thing: software is quite similar. Software is very good at following the same instructions over and over again, it doesn’t feel compassion, and it’s generally very bad at reflecting on the implications of its actions on the wider world.
Once a user interface has been programmed to behave in a certain way, it will continue to do this with every single human it comes in contact with, whether it’s a thousand people a day, a million, or more. The scale is almost unlimited, so the impact of every tiny design decision is magnified and should be considered accordingly.
If you give a human an unkind script to follow in a customer service or sales role, they’ll become aware of the implications. In a team, some of them will eventually deviate from the script, apply a little compassion, complain to management, or just leave the job. Software, by contrast, will never choose to help, or be kind, or go the extra mile for a vulnerable user, unless it’s explicitly programmed to. Software creates a barrier between the people inside a business and the users outside. Users become anonymous numbers in a spreadsheet, or pixels on a line graph that business stakeholders are trying to move upwards and to the right. Humanity is stripped out, and that makes it much easier to do unfair and harmful things through deception and manipulation.
It’s worth thinking about the words of Nobel Prize winner Richard Thaler here: ‘Whenever anyone asks me to sign a copy of the book Nudge I sign it “nudge for good”, which is a plea, not an expectation, because it is possible for actors in both the public and private sector to nudge for evil.’1
Let’s work together to make nudging for good the norm, not the exception. There should be no room for manipulation and deception in our products and markets.