Musings

A web log by Ben Makuh


Creating Problems with Software Solutions

Software companies should have a healthy fear that their programmed panaceas might make the world a strictly worse place to be.

We are probably all familiar with the phrase, "If all you have is a hammer, everything looks like a nail." Have you ever played out in your head what that would actually look like? Imagine trying to solve all your problems (putting in screws, chopping vegetables, changing the oil in your car) with a hammer. Consider, though, that the phrase is also true of itself: it carries the implicit assumption that this world of problems needs to be solved with a variety of tools, but nevertheless that every problem should be solved.

I have been reflecting lately upon the instinct of so many software engineers, product managers, product designers, and really anyone who simply uses digital technology: namely, we begin to recognize problems or "friction" in our daily lives that could be eliminated with software. Here's how it goes. Envision yourself calling up a taxi dispatcher to request a ride to the airport, and in the middle of trying to enunciate your address extremely clearly over a telephone line it hits you that so much of this process could be so much easier. Your exact location can be determined by the very phone you're calling the taxi company with. Your payment info is on that phone as well. In fact, nothing really necessitates that phone call other than the reality that there isn't any other way to request a ride. So you begin picturing the app in your head and creating a mental checklist of your requirements for it. You know immediately that this idea is good enough that other people would find it useful as well, and honestly it's good enough that you could pitch it to a venture capitalist and easily get them to back you.

Imagine that you're the manager of a couple hundred employees spread out across an entire floor of a building downtown. Communication is painful: you get thousands of emails per day, ranging from the relatively inconsequential request for time off to an outline for the Q3 sales strategy. In between are hundreds of cold-call emails from other companies trying to sell you something, newsletters for things you theoretically care about, real junk mail, and itineraries for your upcoming flights. You're smart, so you've set up plenty of filters to send emails this way and that depending on what each one is and who's sending it, but your inbox can still only be described as a black hole. Suddenly it comes to you: there's a whole category of emails in there that could only be described as intra-office communication, and it actually makes up the bulk of what comes through every day. What if that communication could be split out into a different application that was slightly less transactional, slightly less formal, and a little more conversational? What if you could group those communications not by topic or by timestamp, but by the person you're communicating with? It could be like Gmail's innovative concept of threaded emails, but the thread would capture everything you've ever said for future reference and you could do away with the formalism of an entire email. You could have public channels where everyone can see the thread, as well as private channels between you and one or more other people. This idea has legs, you realize. I need to build this.

In the course of your utopian dreaming about how to solve the most irritating aspects of life, though, have you considered the ways in which this thing might go sideways, not just for you but for everyone whose lives it affects and shapes? That's really just a fancy way of saying, "Have you thought through the ethical ramifications of what you're doing?" What if your app creates new, worse problems than the ones you already had? Are you trying to solve the problem with a hammer? Is it even a problem that needs to be solved?

The answer, of course, can be "yes." There's a beauty in the way that real problems can be solved and people can make a living off of solving them. There's also a sense in which every solution can and will create new problems, and so that fact cannot by itself prevent you from ever doing anything. It's the classic time-travel trope: "What if I go back in time and step on a butterfly and somehow that causes me to never have been born?" The protagonists of such stories typically decide to do what must be done, but to be as careful as they can. My contention in this article is that the software industry as a whole1 tends to avoid this sort of ethical reflection. Jump first; figure out how to land on the way down.

By and large, the financial structures underneath software do not economically incentivize this sort of ethical reflection upon the business and its decisions. VCs want a return on their investment, and they want it as soon as possible. If new problems are created in the course of building a business, well, that's fine as long as there's an eventual profit. Plus, all those new problems are future business opportunities! Ethical reflection is not the kind of thing that you can simply put on your task list and check off once you've devoted a few hours to it: it's an ongoing mode of working that yields more fruit the longer you participate in it.

The net ethical effect of a business is also not something you can push into the positive simply by offsetting all the crappy things you're doing with a couple of good deeds on the side. If your company is largely responsible for the breakdown of civil discourse on the internet and in society, and for the proliferation of news that looks real but is in fact fake, then it is irrelevant whether you encourage your employees to "give back to the community" by giving them volunteer time off. It does not "balance out" your misdeeds to build social causes into your platform that people can thumb up. You have made the business more complex and challenging to reason about in an ethical sense, but you have not, in fact, done anything about the underlying rot.

The worst thing is when you realize that the underlying rot is core to the business, and that doing something about it would undercut everything. When you realize that you're making your money by building personalized echo chambers (like a sort of digital Mirror of Erised) that show people what they want to see but draw them out of reality, it's extremely challenging to do anything about it. Not only can you not let the profit margins drop, you actually have to keep them going up quarter by quarter.

When should that ethical reflection have been done? Before you have built something too big to fail. "But no one can know the future!" you contend. "There's no way you can know all the possible side effects of a software application!" It's true, oh so true. But you can slow down and live the examined life. You actually can think about these things before you present your pitch deck. You can think about these things before you design your prototype. You can think about these things before you send your feature requirements to developers. You can think about these things before you write a single line of code. You can think about these things before you ship the feature.

You can say, "I was wrong," and change course. It's just easier before it grows into a unicorn.

In the Hebrew Torah, there's a fascinating account of the ancient people of Israel after they have been rescued from 400 years of hard slavery and are on their way to the Promised Land where their ancestors once lived. They are trudging through the hot, arid, Middle Eastern desert when they stop at an imposing mountain and God tells his people what the good life, the good society, in this new land should look like. He calls it "shalom," meaning "peace" or "harmony."

There's a problem, though: the land to which they are returning is now inhabited by other people. They must drive those people out of the land before they can inhabit it again.2 Now I would imagine that to a population of recently freed slaves walking through an absolute oven of a desert, the solution to that problem would be, "Let's get this done as quickly as possible!" They do not see the big picture, though, and that solution would create worse problems than the ones they're already facing. God tells them,

I will not drive them out from before you in one year, lest the land become desolate and the wild beasts multiply against you. Little by little I will drive them out from before you, until you have increased and possess the land.3

Sometimes our quick solutions have undesirable consequences. "Out of the frying pan, into the fire," as they say. The God of Israel is not bound by their limited frame of reference, however, and in an act of kindness he tells them not only that they are going to go the slow route, but why. Moving fast often breaks things in deeper, more worrying ways than we realize. Slowing down is maddening, especially when we are chasing a vision of the Promised Land, but it is often wise.

It is worth pushing back on our instinct to solve everything by programming something. Without going so far as disavowing technology altogether, we should stop assuming that it is automatically wise to throw new apps, smarter gadgets, more intelligent gizmos at a problem. Not every feeling of friction needs to be productized. Not everything that looks like a problem to be solved is a problem to be solved. Sometimes things can just be, and the friction and frustration of life can simply be part of the deal. Let's leave space for the problems that really truly do need to be solved, and the products that really truly should exist.

Notes

  • 1 There are obvious exceptions to this blanket statement.
  • 2 The ethical questions around driving people out of the land are well worth considering, but tangential to this article.
  • 3 Exodus 23:29-30, ESV.