“Work,” wrote Bertrand Russell in 1935, “is of two kinds: first, altering the position of matter at or near the Earth’s surface relatively to other such matter; second, telling other people to do so.”
The great philosopher’s words remained more or less true for half a century. But in recent decades, a third kind of work has emerged. This third kind, to use Russell’s lingo, consists of altering the position of icons on one or another glowing screen relatively to other such icons. In other words, using a computer.
As this third kind of work has skyrocketed, the need for the other kinds has plummeted. Machines now man our factories, steward our knowledge, and fight our wars. Most labor – even skilled labor – is no longer the sole province of intelligent apes. For instance, a Google neural network called GoogLeNet can now spot metastatic breast cancer in biopsy slides about as well as a trained pathologist. Before long, perhaps, massage therapist will be the only job open to the industrious hominid.
But if the need for human labor is being supplanted, our work schedules haven’t noticed. According to the Organisation for Economic Cooperation and Development (OECD), Americans have toiled at or around 35 hours a week in every year since 1980. Let that sink in for a moment. The combined advancements in technology, efficiency, and knowledge over nearly four decades were insufficient to generate any additional leisure time for the average worker.
So what’s going on here? Simple economics. In this universe, cold capitalism does not reward technological progress with vacation time. Russell’s pin-factory example illustrates the point. When an invention doubles production across the pin industry, he observes, the workers don’t get a break, but rather “still work eight hours, there are too many pins, some employers go bankrupt, and half the men previously concerned in making pins are thrown out of work.”
When technology improves, the invisible hand of industry doesn’t usher in utopia; it forces massive layoffs. And yet, one might argue, technological progress doesn’t only displace jobs; it often generates them. Coding whiz, business analyst, digital marketer: these jobs were spawned by improvements in silicon. With the old work relinquished to machines, humans have found new work to fill the void. But it seems clear this situation is merely temporary. In a generation or two, the best analyst – in every sense of the word best – will be a computer.
I used to be a financial analyst myself, so I know an AI could do that job. Most of my time was spent moving numbers from one file to another at the whims of senior management – whims that changed on an hourly basis. It was a well-oiled machine designed to render my work obsolete at least five times a day.
Of course, it’s not only paid labor that’s being blanketed by technology. Our relaxation time – the very thing computers should create in spades – is no longer relaxing at all. A steady stream of data surges in. Email updates, Facebook notifications, breaking news, casual texts, romantic possibilities – even the occasional phone call – compete for our attention.
This is work in sheep’s clothing. Psychologists have widely studied a phenomenon called decision fatigue, in which the quality of our choices degrades after we’ve made one too many of them. Given this unhappy truth about the human psyche, each alert represents little more than a brain drain. The trivialities add up. Come nighttime, the mind is scarcely able to stagger to the couch and select the first thing in the Netflix queue.
I’m not against Netflix, of course. Technology, without doubt, is responsible for most of our creaturely comforts. And yet we aren’t taking full advantage. Despite quantum leaps in computation, we work the same hours we did 35 years ago. Machines continually ease our workload, but instead of kicking back, we keep employees in the office and out of the sun.
And even when we’re not working, we’re working nonetheless. Thanks to our intrusive digital denizens, our free time is never free for long.
This is a curious state of affairs, but not an irredeemable one. For now at least, humans still reign over computers. Perhaps it’s time we started acting like it.
Russell, B. (1994). In Praise of Idleness. New York, NY: Routledge. (Original work published 1935)