Note: The following essay was originally written during the 2008-09 financial crisis, updated in May 2020, with a further update added in July 2021.
As the financial crisis of 2008 edged into 2009 and beyond, the job market looked bleaker than it had in decades - possibly since the Great Depression of the 1930s. College graduates couldn't find jobs. Neither could factory workers, office toilers, folks with professional degrees and licenses, or even thoroughly experienced skilled workers.
How could this be? Why couldn't the richest nation in the world (a status that might not last much longer) keep all of its people employed?
Despite the seeming inability of economists, political leaders, business analysts, and sundry experts to come up with believable reasons for the 2008-09 financial crisis, much less long-term solutions, the answer was, and is, actually quite simple: There aren't enough worthwhile jobs for everyone. As the system has functioned in the modern era, there haven't even been enough marginal jobs for every person who wants or needs one. As for rewarding jobs that pay a living wage, forget it.
More than a decade later, in 2020, the U.S. and the rest of the world are facing a far greater economic debacle, built upon the public health tragedy brought by the coronavirus. By late May, the U.S. alone had suffered 100,000 deaths from the virus. This time, the economic crisis came on the heels of record-low unemployment figures.
This isn't a new phenomenon. Far from it. The seeds of this disastrous course have germinated for decades, as the logistics of work changed, while the system that provides workers failed to keep pace. As a result, the number of people who'd like to be working but can find nothing continues to escalate. Each financial debacle has merely added fuel to a fire that was already well-stoked.
Saying there aren't enough jobs definitely does not mean there aren't enough tasks to be done. In fact, the nation's infrastructure, the public's health, and social welfare have long approached crisis status. Help has been needed, not tomorrow or next year, but immediately. Yet, the system has been unable, or unwilling, to organize all those activities into a logical and cohesive structure - one that takes into account the tangible abilities, valuable experience, useful intentions, and true desires of the people who might be able to accomplish those tasks.
Clearly, an entirely new approach is needed, to reevaluate what constitutes work, and how undone tasks might be appropriated fairly and usefully. Don't expect movement in that direction anytime soon, though. Despite its deficiencies, the modern-day relationship between employers and employees remains so tautly established, so firmly entrenched, that it would probably take a disaster to loosen its grip. (See update below for commentary on that very disaster.)
Some observers at the time thought the national and global financial calamity that started in 2008 just might evolve into the disaster that triggers real change. That didn't happen. Then, when the Covid-19 crisis emerged and grew so rapidly, there was no valid rapid-response program in place to deal with either the medical or the economic aspects.
Until such radical change in the workplace takes place, and we can start to look at work with fresh insights, tangible steps might be taken to make a difference in the short term. To an extent, that was already happening in some quarters, before the virus changed everything.
Step One should be obvious, yet has failed to establish an adequate foothold. In a world where computers can help organize all sorts of trivial information and coordinate so many aspects of daily life, why aren't they being used intensively to match people with needed tasks? Those tasks might be trivial or tedious, demanding or challenging; either way, they need to be done. Loads of "apps" have appeared in recent years that promise to assist in such matchmaking, but they constitute only a bare beginning.
Thus far, little more than baby steps have been taken. Sure, the Internet has made it possible for people to apply to dozens, hundreds, thousands of potential employers in an instant. Algorithms purport to make valid decisions about hiring practices, digitally declaring who should, and who should not, be hired. But they haven't altered the basic principles of evaluating and hiring. They certainly aren't dealing effectively with potential workers who are unwanted, ill-equipped, uninspired, and viewed with suspicion.
Instead, they serve as a veneer spread across a system whose roots date back a century or more. Full-scale computerization could send people onto truly compatible career paths, with considerably less fuss than the usual hit-or-miss methods. It could correlate people who have specific skills, knowledge, desires, and interests with organizations that have equally specific needs.
Based upon logic, a future-focused system could make careers possible for many potential workers who simply don't fit into the traditional networks of job applicants - and never will.
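The kind of correlation described above can be illustrated with a toy sketch. This is purely hypothetical - the worker names, skill sets, and the simple overlap-counting approach are all invented for illustration, and a real matching system would be vastly more sophisticated - but it shows the basic idea of ranking candidates by how well their skills cover a task's needs:

```python
# Hypothetical sketch: rank workers for a task by skill overlap.
# All names and skill sets below are invented for illustration.

def match_workers(task_skills, workers):
    """Return worker names sorted by how many of the task's
    required skills each worker covers (best match first)."""
    scored = []
    for name, skills in workers.items():
        overlap = len(task_skills & skills)  # shared skills
        if overlap:
            scored.append((overlap, name))
    # Sort by overlap, descending; ties broken by name, descending.
    return [name for overlap, name in sorted(scored, reverse=True)]

workers = {
    "Alice": {"writing", "editing", "research"},
    "Bob": {"data entry", "spreadsheets"},
    "Cara": {"research", "spreadsheets", "writing"},
}

# A task needing writing and research matches Alice and Cara, not Bob.
print(match_workers({"writing", "research"}, workers))
```

A real system would weigh far more than skill overlap - availability, location, experience, pay expectations - but even this crude version does something the traditional resume pile cannot: it surfaces every qualified person, not just the ones who happened to apply.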
Even more important, intensive usage of the Internet, including imaginative expansion of the social networks, in helpful directions, could make it possible to provide work to people who need money right now - not tomorrow, not at the end of the pay period, but today. Especially in difficult economic times, plenty of people with tangible skills and experience are out there, in dire need of some earnings. Many lost their last jobs months, even years, ago. They may have given up looking for a replacement position. But their needs haven't shrunk at all; if anything, they've increased.
Look around, and you see how many jobs are now being done - at least partly - using computers. Does everyone engaged in such tasks have to be present in a physical office, showing up each morning at 9:00 and leaving at 5:00? Of course not. Countless duties could be performed by high-tech temporary labor - by people with skills, who are currently out of the traditional job market, whether by choice or chance.
Apologists for the system might respond that plenty of workers are now doing their jobs from home, at least part of the time. True enough; but that's just a tiny beginning, compared to what's feasible. Those who work from home are still hired (and fired) in the time-honored manner, paid in the long-established way, and scrutinized carefully by their supervisors, whether physically present or not. What should matter is the end result: either it's correctly done or it's not. Instead, traditional work patterns continue to emphasize the milieu of work, more than the successful achievement of it.
Whenever a project can be broken up into separate tasks, each of those could conceivably be assigned to a person who has no career plans, who threatens no one's job security or longevity, who wants nothing more than to complete the task at hand and be paid for it - right away, not at some future date.
What's the biggest obstacle? Beyond the obvious logistical issues involved in laying out work assignments in such a way, the big barrier is bosses. They want to exercise their cherished right to stand over each and every worker. This couldn't happen if work tasks were spread out.
As an example, Task A might be done in-house, in the usual way, by a career worker. Task B could be accomplished at home, by a stay-at-home mom or dad who needs the money and wishes to exercise formerly-learned skills. Task C might be in the hands of a student, working at an Internet cafe, eager to put in a few hours of even tedious computer work, in order to be rewarded with some much-needed cash. Task D could be done by a fellow who lost his job months ago, is down to nothing, but willing and eager to sit at a computer - perhaps in a public library or other institution - in order to earn money to survive another day.
Of course, there's one additional obstacle: the Internal Revenue Service. Even though the amounts involved would be small, tax agents aren't likely to overlook the possibility that some of these quickie workers might not pay the appropriate income tax for their earnings.
Mid-2021 Update: Like most essays in this "Words On Words" group, this one was initially written long before the 2020 pandemic and lockdown. In addition to the public health aspect, that crisis resulted in tens of millions of workers losing their jobs, at least temporarily, many of them over a period of several weeks. These opinions also appeared online prior to the election of Donald Trump, who presented himself as a champion of the working class, but whose administration demonstrated far greater inclination to lavish favors upon the wealthy and well-connected.
Even before the Covid-19 pandemic emerged early in 2020, several noteworthy aspects of American worklives were starting to change. One of the most consequential was the rise of the "gig" economy, which allowed workers to earn money outside a traditional employer/employee relationship. Semi-independent, freelance work was nothing new. Plenty of artists, writers, musicians, and other "creative" persons had been working that way for years, even decades. What made the difference here was the emergence of ride-sharing, under the auspices of prominent companies (notably, Uber and Lyft), which served as intermediaries more than conventional employers. Drivers could choose their own hours, work as much (or as little) as they liked, with pay based upon the number and duration of rides provided. (Eventually, critics sought to compel Uber, in particular, to treat its drivers as employees with benefits rather than independent contractors, but that's a story for another essay.)
Within a surprisingly short time, simple ride-sharing expanded into a network of services, allowing customers to order groceries, ready-to-eat meals, and many other essentials, having them delivered by a ride-share driver.
During the early months of the pandemic, which began in earnest around March 2020, millions of American workers lost their jobs, and quickly realized that prospects for being hired elsewhere were virtually nonexistent. Disruption hit the entire employer/employee setup hard. Retained workers found their hours cut. Businesses shut their doors - hopefully temporarily, but increasingly often, permanently.
Before long, observers of the workaday world began to realize that a considerable amount, if not most, of the "knowledge" work done by millions of employees could be accomplished just as well remotely - from one's home - as it could by people sitting in cubicles at a central location.
Early in 2021, as the pandemic appeared to be easing at last (due mainly to vaccination), another disruptive effect came along. Restaurants and other "hospitality" organizations, in particular, began to experience serious worker shortages. In many parts of the country, few workers - whether thoroughly experienced or simply looking for work - were expressing any interest in a stint as a waitress, bus boy, hotel clerk, and so forth. This phenomenon stood in glaring contrast to the customary employer/employee relationship, where bosses almost invariably held the upper hand. Now, job seekers could turn down situations that failed to promise sufficient pay or satisfactory conditions. Even in cities and states that had raised the minimum wage, a startling number of unemployed folks were saying "no" to job offers that failed to please them.
What will happen when (or if) the pandemic really does end is largely beyond the imagination of those of us who study work and labor. What can be said without hesitation, though, is that in the months and years ahead, work is going to look a lot different than it did just a few years back.