The Labor Illusion and Ethically Deceptive Design
The first baking mixes, back as early as the 1930s, were truly all in one. Powdered eggs and everything. Add water, bake it. But they didn’t sell well. It took a psychologist running focus groups to figure out why.
The finding: American women wanted to feel more involved in the cake-baking process, so cake mixes that required them to add their own eggs would sell better.
https://www.bonappetit.com/entertaining-style/pop-culture/article/cake-mix-history
At least for certain products or experiences — like a homemade cake — people want to feel engaged, and they want to see or experience effort going into things that are traditionally handcrafted, labor-intensive, and so on.
(Okay, this is why I don’t blog so much. I read all this too fast in my rush to bang it out, and it turns out the above story is at least partly a myth. Dammit.)
This applies to the digital products we work with every day in much the same way. People anthropomorphize most sufficiently complex systems, so they assume computers behave as people would. They expect to see visible effort, such as a hard task taking additional time to perform.
Without that, you lose the user’s trust in the system. If the apparent effort doesn’t match expectations, the system seems broken, wrong, or deliberately lying to you.
This is because of something called the labor illusion:
A ubiquitous feature of even the fastest self-service technology transactions is the wait. Conventional wisdom and operations theory suggests that the longer people wait, the less satisfied they become; we demonstrate that due to what we term the labor illusion, when websites engage in operational transparency by signaling that they are exerting effort, people can actually prefer websites with longer waits to those that return instantaneous results—even when those results are identical.
And it’s all for the usual reasons, that human brains weren’t made to handle computers, but to deal with people, and animals, and move dirt and trees around. It’s related to why we see faces in clouds.
This is all front of mind for me because Cheddar recently interviewed me, and others, about this topic:
My part of the story is that I discovered this by accident during usability testing long ago. Always test, because you might find something odd, or even something that is, at the time, entirely new and unexpected.
A few years ago, my team built a mobile-phone plan advisor, a tool that helps people find the best service plan for their needs. People often picked the wrong plan, so they ran out of minutes, got the wrong services, or wasted money. It wasn’t actually that hard to compute the answers, and the technology our developers used was pretty slick, so we could return results almost instantly. But in testing, we found that no one trusted the information. We eventually discovered that users assumed a fast response meant a lie: a canned answer pushing whatever the company wanted to sell.
We simply added a delay indicator, with a bit of randomness in the timing so it didn’t seem fake, and people immediately became engaged with the results, leaning into the screen, eagerly awaiting their personalized answer.
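The mechanics of such a delay are simple. Here is a minimal sketch in Python, assuming a result that has already been computed; the function name and the delay values are illustrative, not from the original tool:

```python
import random
import time

def present_with_labor_illusion(result, base_delay=1.5, jitter=1.0):
    """Hold back an instantly computed result for a randomized interval,
    so the wait reads as genuine effort rather than a canned answer.

    base_delay and jitter are in seconds; the random component keeps
    the wait from feeling suspiciously identical every time.
    """
    # Wait somewhere between base_delay and base_delay + jitter seconds.
    time.sleep(base_delay + random.uniform(0, jitter))
    return result
```

In a real product the delay would be shown with a progress or “working…” indicator rather than a silent pause, since the point is visible effort, not just slowness.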
Others figured this out in the same timeframe; or maybe they just read my blogging or heard me speak about it :) Think about the delay while airline sites “search for the best deal,” for example. That’s a fake delay. A lot of things having to do with trust are artificially padded delays, not real ones.
Fake? That means lies!
But is it ethical? I hadn’t considered that before, but the question occurred to me while being interviewed for the article. We value authenticity, so is this okay?
Um… it depends. Just as when considering dark patterns, it’s about the end result and what it makes people do. And no, not your intent: you might mean well and still make a mistake.
And even this analysis isn’t that simple. Take my little plan optimizer. Helping people choose telecom services empowers them, and saving them money is a good thing. I sat in meetings about this tool where executives were unhappy that we would see lower total average revenue because we weren’t screwing anyone with tricks and confusion.
So, we’re good, right? Well, we’re also using it to make sure the customer buys today, and from our company. Is that ethical? That’s a much grayer area. Think hard when designing anything.
Computers Are For People
But one thing I have started to realize is that computers are for people. They work in very, very inhuman ways, so we’re already doing a lot to interpret their input and output into human ways of thinking and perceiving.
Authenticity can be taken too far, so things like delays that “trick people” are not always evil; they are more a matter of modifying the computer to act human. Never try to change people’s expectations. Change your system to work for people.