Friday, January 14, 2022

Omicron Is Making America’s Bad Jobs Even Worse

Even on a good day, service jobs are pretty hard. Your schedule is constantly changing, you’re on your feet, you’re at the mercy of the general public, and the pace of your shifts swings between crushing boredom and frenetic activity. You’re probably not guaranteed any particular number of hours in a given week, and you can be cut from the schedule or called in to work at the last second. For all that, you’re paid too little to cover the basic needs of an American adult: median wages of $12 to $14 an hour, depending on the occupation, according to data from the Bureau of Labor Statistics.

So far, Omicron has not provided service workers with any good days. As the highly transmissible, immunity-evading coronavirus variant surges across the country, it has filled hospitals, infected record numbers of people, and made everyday life a nightmare for workers in stores, restaurants, gyms, schools, health-care facilities, and so many other workplaces. Many workers are currently sick or have been exposed to the virus, and changing isolation and quarantine guidelines make it unclear how long they should stay home, or whether their employer will even allow that. Tests to confirm infection are expensive and scarce. In workplaces with Omicron outbreaks, there may not be enough available workers to continue operating the business for days or weeks at a time, which means everyone loses their shifts—and their paychecks—in “soft lockdowns” that workers must navigate with little institutional or governmental support. For businesses that remain open, understaffing and supply shortages make workers’ interactions with customers even more tense and dangerous.

Before the new variant reared its head, people were already leaving the service sector in droves. Now the Omicron surge is laying bare how little workers have retained of the scant protections extended to them earlier in the pandemic, and just how little safety and stability this kind of work provides to the people who do it. Omicron is making many of America’s bad jobs even worse.

Some elements of this current crisis were put in place and allowed to fester over the past two years, but many of them spring from the fundamentally precarious nature of service jobs. Understaffing and low pay, for example, have been chronic issues across shift-work occupations for years, according to Daniel Schneider, a sociologist at Harvard and a co-founder of the Shift Project, which surveys tens of thousands of hourly workers at large employers, including Dollar General, Starbucks, and Macy’s. Lowering labor costs makes these businesses more profitable, Schneider told me, but it also makes them brittle, even under the best circumstances. There may be a “kind of tipping-point dynamic here,” he said, “where, yeah, these jobs have always been precarious, they’ve always been bad, but the confluence of those conditions—more difficult customer management and even fewer people on the job—is almost a multiplier on the hazard of this work.”

One of the most obvious issues is service workers’ widespread lack of access to paid sick leave, according to Schneider. Before the pandemic, more than half of the workers surveyed by the Shift Project completely lacked paid sick leave. As of November, that number had barely moved. This is the case even though in March 2020, the federal government passed the Families First Coronavirus Response Act (FFCRA), which mandated two weeks of paid sick leave for workers whose employers did not already provide it. Even at its best, this patchwork of policies had enormous deficiencies, Schneider said: The FFCRA excluded anyone who worked for a company with 500 or more employees, which disqualified workers at big-box stores, supermarkets, chain pharmacies, department stores, fast-food restaurants, and large e-commerce companies. It also left out many of the people who do poorly paid and largely invisible work in workplaces that put them at particularly extreme risk, such as hospitals and care homes, including many janitorial, laundry, and cafeteria workers.

[Read: The real reason Americans aren’t isolating]

Some of the big companies not affected by the FFCRA chose to implement leave policies and other pandemic-specific benefits of their own, such as hazard pay and testing programs, thanks at least in part to public pressure to protect workers. Walmart, Amazon, and CVS, for instance, made headlines by extending 10 days of paid leave to anyone who tested positive for COVID-19. But Schneider said this was only ever a tiny minority of employers, and for every large company that made these changes, many more didn’t provide any additional benefits at all. “What we’re seeing is large companies really try their best to do the least possible,” Schneider said. “There is really an effort by firms to avoid requirements to do things and instead to just be asked to voluntarily do things.”

That effort clearly has contributed to the tipping-point dynamic: Cases have surged at the exact same time that many protections for workers, including the FFCRA, have expired, and the relatively small number of employers who voluntarily granted extra sick leave and other benefits have largely rolled back those programs. Amazon, for example, requires employees to submit test results in order to qualify for any COVID-19 sick leave, but a number of the company’s workers told NBC News that they’re now on their own to secure testing, after the company closed down employee testing facilities that provided that service free of charge earlier in the pandemic. (In response to NBC, an Amazon spokesperson said that the company is looking into the reported issues and focusing on getting workers vaccinated.) Many companies have similar testing requirements for service workers to access leave. Without results, taking time off for illness is unpaid for many workers. And making $12 to $14 an hour, vanishingly few service workers have the financial stability necessary to take any amount of unpaid leave, if their employer would even allow it.

The federal public-health apparatus has effectively endorsed these rollbacks. In late December, the CDC reduced isolation guidelines for infected Americans who aren’t severely ill from 10 days to five. Anthony Fauci hailed the move for helping Americans “get back to the workplace, doing things that are important to keep society running smoothly,” but many experts have criticized the agency over a lack of strong evidence that it’s safe for workers to return to in-person jobs so quickly. Requiring a negative test after infection would make these guidelines safer, but the revised rules don’t require that. In the weeks since the announcement was made, Delta, Amazon, Walmart, CVS, and Walgreens have all cut their paid-leave policies for COVID-19 infections down to the equivalent of five workdays. And they’ve been slow to add any testing requirement to their own guidelines.

[Read: America’s COVID rules are a dumpster fire]

The story has largely been the same for any other benefits or protections extended to service workers during the pandemic, Schneider said. Enhanced federal unemployment benefits expired months ago; companies that provided hazard-pay wage bumps have almost all rolled those back; and even many simple precautions to protect people who work with the general public, such as local mask mandates, have been repealed. Just this week, the Supreme Court blocked the Biden administration’s vaccine-or-test mandate, which would have required large employers to verify that all of their employees are either vaccinated or regularly tested in order to ensure the safety of their workplaces.

As protections and support wash away, many service jobs themselves have become more difficult. Supply and staffing shortages at stores and restaurants mean that service and selection may not be exactly the same for customers as they were before the pandemic—tiny disappointments that spark episodes of verbal abuse or violent rage toward workers. A swirl of infections, winter storms, and supply-chain disruptions has left America’s grocery stores, for instance, scrounging for goods in recent weeks. “We’re essentially asking this least well compensated and most precariously employed workforce to take on the everyday management of a polarized and angry and dangerous public,” Schneider said. This was the case before Omicron, and even if the variant’s wave is as short as many hope it will be, its interruptions will have effects visible in additional shortages (and their attendant frustrations) for months, at least.

[Read: American shoppers are a nightmare]

Schneider said no one has a totally satisfying answer as to why retail stores and restaurants have had such a hard time staffing up in the past six months. After all, he pointed out, many of the people who would usually fill those jobs had no safety net before the pandemic either. But a few theories add up to explain much of the problem. Long-term downward trends in immigration to the United States, and especially low immigration levels in the past two years, might have choked off an important source of low-wage workers. Increased difficulty in finding adequate and affordable child care is another reason, especially for the many families that may have relied on older relatives who have been lost to the pandemic. And some people have simply left the retail and food-service industries altogether, switching to other kinds of work. “A better way to think about the labor-shortage problem is that we have a pay-shortage problem,” Ben Zipperer, an economist at the Economic Policy Institute, a left-leaning think tank, told me. Workers who took less-than-ideal jobs after mass layoffs might be more willing to stick with them, rather than keep looking for a better role, if the conditions of so many of those jobs weren’t so bad.

There is little reason to believe that the Omicron wave won’t make these jobs even harder to fill. “We haven’t solved any of the kind of fundamental problems of the labor market that make things worse during a pandemic,” Zipperer said. Incredibly popular policies, such as increasing the federal minimum wage, have largely stalled out, even though Zipperer thinks that the pandemic is an ideal time to rally the political will to make something like that happen.

Schneider didn’t feel much more optimistic about what Omicron might do to the lives of service workers, or about the signals those in power have been sending about how they intend to handle the situation. “It doesn’t feel like there is any real appetite by anybody to return to substantial policy that might protect workers,” he told me. Instead, we’ve committed to riding out this wave, no matter how bad it gets. The hope, Schneider said, is that it’s fast.



from Business | The Atlantic https://ift.tt/3fneyXJ
via IFTTT

Wednesday, November 24, 2021

The Trust Recession

Illustration by Albert Tercero: a four-panel cartoon of two figures in business attire, in which one attempts a trust fall and the other walks away.

Manufacturer inventories. Durable-goods orders. Nonfarm payrolls. Inflation-adjusted GDP. These are the dreary reportables that tell us how our economy is doing. And many of them look a whole lot better now than they did at their early-pandemic depths. But what if there’s another factor we’re missing? What if the data points are obscuring a deepening recession in a commodity that underpins them all?

Trust. Without it, Adam Smith’s invisible hand stays in its pocket; Keynes’s “animal spirits” are muted. “Virtually every commercial transaction has within itself an element of trust,” the Nobel Prize–winning economist Kenneth Arrow wrote in 1972.

But trust is less quantifiable than other forms of capital. Its decline is vaguely felt before it’s plainly seen. As companies have gone virtual during the coronavirus pandemic, supervisors wonder whether their remote workers are in fact working. New colleagues arrive and leave without ever having met. Direct reports ask if they could have that casual understanding put down in writing. No one knows whether the boss’s cryptic closing remark was ironic or hostile.

Sadly, those suspicions may have some basis in fact. The longer employees were apart from one another during the pandemic, a recent study of more than 5,400 Finnish workers found, the more their faith in colleagues fell. Ward van Zoonen of Erasmus University, in the Netherlands, began measuring trust among those office workers early in 2020. He asked them: How much did they trust their peers? How much did they trust their supervisors? And how much did they believe that those people trusted them? What he found was unsettling. In March 2020, trust levels were fairly high. By May, they had slipped. By October—about seven months into the pandemic—the employees’ degree of confidence in one another was down substantially.

Another survey, by the Centre for Transformative Work Design in Australia, found bosses having trust issues too. About 60 percent of supervisors doubted or were unsure that remote workers performed as well or were as motivated as those in the office. Meanwhile, demand for employee-surveillance software has skyrocketed more than 50 percent since before the pandemic. And this spring, American employees were leaving their jobs at the highest rate since at least 2000.

Each of these data points could, of course, have multiple causes. But together they point in a worrisome direction: We may be in the midst of a trust recession.

Trust is to capitalism what alcohol is to wedding receptions: a social lubricant. In low-trust societies (Russia, southern Italy), economic growth is constrained. People who don’t trust other people think twice before investing in, collaborating with, or hiring someone who isn’t a family member (or a member of their criminal gang). The concept may sound squishy, but the effect isn’t. The economists Paul Zak and Stephen Knack found, in a study published in 1998, that a 15 percent bump in a nation’s belief that “most people can be trusted” adds a full percentage point to economic growth each year. That means that if, for the past 20 years, Americans had trusted one another like Ukrainians did, our annual GDP per capita would be $11,000 lower; if we had trusted like New Zealanders did, it’d be $16,000 higher. “If trust is sufficiently low,” they wrote, “economic growth is unachievable.”

If you can rely on people to do what they say they’re going to do—without costly coercive mechanisms to make them dependable—a lot of things become possible, argued Francis Fukuyama in his 1995 book, Trust. In the late 19th century, it was “highly sociable Americans” who developed the first large-scale corporations, effectively pooling the ideas, efforts, and interests of strangers. In the late 20th, some of the earliest iterations of the internet emerged from the same talent for association. Throughout nearly all of America’s history, its economy has benefited from a high degree of trust.

But leaks in the trust reservoir have been evident since the ’70s. Trust in government dropped sharply from its peak in 1964, according to the Pew Research Center, and, with a few exceptions, has been sputtering ever since. This trend coincides with broader cultural shifts like declining church membership, the rise of social media, and a contentious political atmosphere.

[David Brooks: America is having a moral convulsion]

Data on trust between individual Americans are harder to come by; surveys have asked questions about so-called interpersonal trust less consistently, according to Pew. But, by one estimate, the percentage of Americans who believed “most people could be trusted” hovered around 45 percent as late as the mid-’80s; it is now 30 percent. According to Pew, half of Americans believe trust is down because Americans are “not as reliable as they used to be.”

Those studies of suspicious Zoom workers suggest the Trust Recession is getting worse. By October 2021, just 13 percent of Americans were still working from home because of COVID-19, down from 35 percent in May 2020, the first month the data were collected. But the physical separation of colleagues has clearly taken a toll, and the effects of a long bout of remote work may linger.

Why? One reason is: We’re primates. To hear the anthropologists tell it, we once built reciprocity by picking nits from one another’s fur—a function replaced in less hirsute times by the exchange of gossip. And what better gossip mart is there than the office? Separate people, and the gossip—as well as more productive forms of teamwork—dries up. In the 1970s, an MIT professor found that we are four times as likely to communicate regularly with someone sitting six feet away from us as with someone 60 feet away. Maybe all that face time inside skyscrapers wasn’t useless after all.

Trust is about two things, according to a recent story in the Harvard Business Review: competence (is this person going to deliver quality work?) and character (is this a person of integrity?). “To trust colleagues in both of these ways, people need clear and easily discernible signals about them,” wrote the organizational experts Heidi Gardner and Mark Mortensen. They argue that the shift to remote work made gathering this information harder. Unconsciously, they conclude, we “interpret a lack of physical contact as a signal of untrustworthiness.”

This leaves us prone to what social scientists call “fundamental attribution error”—the creeping suspicion that Blake hasn’t called us back because he doesn’t care about the project. Or because he cares about it so much that he’s about to take the whole thing to a competitor. In the absence of fact—that Blake had minor dental surgery—elaborate narratives assemble.

Add to the disruption and isolation of the pandemic a political climate that urges us to meditate on the distance—ethnic, generational, ideological, socioeconomic—separating us from others, and it’s not hard to see why many Americans feel disconnected.

What has suffered most are “weak ties”—relationships with acquaintances who fall somewhere between stranger and friend, which sociologists find are particularly valuable for the dissemination of knowledge. A closed inner circle tends to recycle knowledge it already has. New information is more likely to come from the serendipitous encounter with Alan, the guy with the fern in his office who reports to Phoebe and who remembers the last time someone suggested splitting the marketing division into three teams, and how that went.

[Read: The pandemic has erased entire categories of friendship]

Some evidence suggests that having more weak ties can shorten bouts of unemployment. In a famous 1973 survey, the Stanford sociologist Mark Granovetter discovered that, among 54 people who had recently found a new job through someone they knew, 28 percent had heard about the new position from a weak tie, versus 17 percent from a strong one. When the weak ties fall away, our “radius of trust”—to borrow Fukuyama’s term—shrinks.

That’s a problem for individual employees, as much as they may appreciate the flexibility of working anywhere, anytime. And it’s a problem for business leaders, who are trying to weigh the preferences of those employees against the enduring existence of the place that employs them. They don’t want to end up like IBM. It saved $2 billion making much of its workforce remote as early as the 1980s, only to reverse course in 2017, when it recognized that remote work was depressing collaboration. Microsoft CEO Satya Nadella recently wondered whether companies were “burning” some of the face-to-face “social capital we built up in this phase where we are all working remote. What’s the measure for that?”

A trust spiral, once begun, is hard to reverse. One study found that, even 20 years after reunification, fully half of the income disparity between East and West Germany could be traced to the legacy of Stasi informers. Counties that had a higher density of informers who’d ratted out their closest friends, colleagues, and neighbors fared worse. The legacy of broken trust has proved extraordinarily difficult to shake.

It’s not hard to find advice on how to build a culture of trust: use humor, share your vulnerabilities, promote transparency. But striking the right tone in today’s pitched political climate, often over Zoom, possibly under surveillance, is no easy feat.

Even so, it may be instructive for companies trying to navigate this moment to remember why they were formed in the first place. By the late 19th century, it was evident that some jobs were too crucial to leave to a loose association of tradespeople. If the mill had to be running full steam at all hours, you needed to know who could handle the assembly line, who could fix a faulty gasket, and above all who would reliably show up day after day. Then you needed those people legally incorporated into one body and bound by the norms, attitudes, and expectations baked into the culture of that body.

Not so incidentally, those first corporations went by a particular moniker. They were called “trusts.” And without that component underpinning all the industrial might and entrepreneurial ingenuity, you have to wonder if they could ever have been built at all.


This article appears in the December 2021 print edition with the headline “The End of Trust.”



from Business | The Atlantic https://ift.tt/3xhH3OH
via IFTTT

Thursday, October 14, 2021

The Men Who Are Killing America’s Newspapers

The Tribune Tower rises above the streets of downtown Chicago in a majestic snarl of Gothic spires and flying buttresses that were designed to exude power and prestige. When plans for the building were announced in 1922, Colonel Robert R. McCormick, the longtime owner of the Chicago Tribune, said he wanted to erect “the world’s most beautiful office building” for his beloved newspaper. The best architects of the era were invited to submit designs; lofty quotes about the Fourth Estate were selected to adorn the lobby. Prior to the building’s completion, McCormick directed his foreign correspondents to collect “fragments” of various historical sites—a brick from the Great Wall of China, an emblem from St. Peter’s Basilica—and send them back to be embedded in the tower’s facade. The final product, completed in 1925, was an architectural spectacle unlike anything the city had seen before—“romance in stone and steel,” as one writer described it. A century later, the Tribune Tower has retained its grandeur. It has not, however, retained the Chicago Tribune.

To find the paper’s current headquarters one afternoon in late June, I took a cab across town to an industrial block west of the river. After a long walk down a windowless hallway lined with cinder-block walls, I got in an elevator, which deposited me at a modest bank of desks near the printing press. The scene was somehow even grimmer than I’d imagined. Here was one of America’s most storied newspapers—a publication that had endorsed Abraham Lincoln and scooped the Treaty of Versailles, that had toppled political bosses and tangled with crooked mayors and collected dozens of Pulitzer Prizes—reduced to a newsroom the size of a Chipotle.

Spend some time around the shell-shocked journalists at the Tribune these days, and you’ll hear the same question over and over: How did it come to this? On the surface, the answer might seem obvious. Craigslist killed the classifieds, Google and Facebook swallowed up the ad market, and a procession of hapless newspaper owners failed to adapt to the digital-media age, making obsolescence inevitable. This is the story we’ve been telling for decades about the dying local-news industry, and it’s not without truth. But what’s happening in Chicago is different.

In May, the Tribune was acquired by Alden Global Capital, a secretive hedge fund that has quickly, and with remarkable ease, become one of the largest newspaper operators in the country. The new owners did not fly to Chicago to address the staff, nor did they bother with paeans to the vital civic role of journalism. Instead, they gutted the place.

Two days after the deal was finalized, Alden announced an aggressive round of buyouts. In the ensuing exodus, the paper lost the Metro columnist who had championed the occupants of a troubled public-housing complex, and the editor who maintained a homicide database that the police couldn’t manipulate, and the photographer who had produced beautiful portraits of the state’s undocumented immigrants, and the investigative reporter who’d helped expose the governor’s offshore shell companies. When it was over, a quarter of the newsroom was gone.

The hollowing-out of the Chicago Tribune was noted in the national press, of course. There were sober op-eds and lamentations on Twitter and expressions of disappointment by professors of journalism. But outside the industry, few seemed to notice. Meanwhile, the Tribune’s remaining staff, which had been spread thin even before Alden came along, struggled to perform the newspaper’s most basic functions. After a powerful Illinois state legislator resigned amid bribery allegations, the paper didn’t have a reporter in Springfield to follow the resulting scandal. And when Chicago suffered a brutal summer crime wave, the paper had no one on the night shift to listen to the police scanner.

[Read: What we lost when Gannett came to town]

As the months passed, things kept getting worse. Morale tanked; reporters burned out. The editor in chief mysteriously resigned, and managers scrambled to deal with the cuts. Some in the city started to wonder if the paper was even worth saving. “It makes me profoundly sad to think about what the Trib was, what it is, and what it’s likely to become,” says David Axelrod, who was a reporter at the paper before becoming an adviser to Barack Obama. Through it all, the owners maintained their ruthless silence—spurning interview requests and declining to articulate their plans for the paper. Longtime Tribune staffers had seen their share of bad corporate overlords, but this felt more calculated, more sinister.

Illustration by Ricardo Rey: a stack of Chicago Tribune newspapers, bundled together with yellow “Crime Scene Do Not Cross” police tape.

“It’s not as if the Tribune is just withering on the vine despite the best efforts of the gardeners,” Charlie Johnson, a former Metro reporter, told me after the latest round of buyouts this summer. “It’s being snuffed out, quarter after quarter after quarter.” We were sitting in a coffee shop in Logan Square, and he was still struggling to make sense of what had happened. The Tribune had been profitable when Alden took over. The paper had weathered a decade and a half of mismanagement and declining revenues and layoffs, and had finally achieved a kind of stability. Now it might be facing extinction.

“They call Alden a vulture hedge fund, and I think that’s honestly a misnomer,” Johnson said. “A vulture doesn’t hold a wounded animal’s head underwater. This is predatory.”

When Alden first started buying newspapers, at the tail end of the Great Recession, the industry responded with cautious optimism. These were not exactly boom times for newspapers, after all—at least someone wanted to buy them. Maybe this obscure hedge fund had a plan. One early article, in the trade publication Poynter, suggested that Alden’s interest in the local-news business could be seen as “flattering” and quoted the owner of The Denver Post as saying he had “enormous respect” for the firm. Reading these stories now has a certain horror-movie quality: You want to somehow warn the unwitting victims of what’s about to happen.

Of course, it’s easy to romanticize past eras of journalism. The families that used to own the bulk of America’s local newspapers—the Bonfilses of Denver, the Chandlers of Los Angeles—were never perfect stewards. They could be vain, bumbling, even corrupt. At their worst, they used their papers to maintain oppressive social hierarchies. But most of them also had a stake in the communities their papers served, which meant that, if nothing else, their egos were wrapped up in putting out a respectable product.

The 21st century has seen many of these generational owners flee the industry, to devastating effect. In the past 15 years, more than a quarter of American newspapers have gone out of business. Those that have survived are smaller, weaker, and more vulnerable to acquisition. Today, half of all daily newspapers in the U.S. are controlled by financial firms, according to an analysis by the Financial Times, and the number is almost certain to grow.

What threatens local newspapers now is not just digital disruption or abstract market forces. They’re being targeted by investors who have figured out how to get rich by strip-mining local-news outfits. The model is simple: Gut the staff, sell the real estate, jack up subscription prices, and wring as much cash as possible out of the enterprise until eventually enough readers cancel their subscriptions that the paper folds, or is reduced to a desiccated husk of its former self.

[John Temple: My newspaper died 10 years ago. I’m worried the worst is yet to come.]

The men who devised this model are Randall Smith and Heath Freeman, the co-founders of Alden Global Capital. Since they bought their first newspapers a decade ago, no one has been more mercenary or less interested in pretending to care about their publications’ long-term health. Researchers at the University of North Carolina found that Alden-owned newspapers have cut their staff at twice the rate of their competitors; not coincidentally, circulation has fallen faster too, according to Ken Doctor, a news-industry analyst who reviewed data from some of the papers. That might sound like a losing formula, but these papers don’t have to become sustainable businesses for Smith and Freeman to make money.

With aggressive cost-cutting, Alden can operate its newspapers at a profit for years while turning out a steadily worse product, indifferent to the subscribers it’s alienating. “It’s the meanness and the elegance of the capitalist marketplace brought to newspapers,” Doctor told me. So far, Alden has limited its closures primarily to weekly newspapers, but Doctor argues it’s only a matter of time before the firm starts shutting down its dailies as well.

This investment strategy does not come without social consequences. When a local newspaper vanishes, research shows, it tends to correspond with lower voter turnout, increased polarization, and a general erosion of civic engagement. Misinformation proliferates. City budgets balloon, along with corruption and dysfunction. The consequences can influence national politics as well; an analysis by Politico found that Donald Trump performed best during the 2016 election in places with limited access to local news.

With its acquisition of Tribune Publishing earlier this year, Alden now controls more than 200 newspapers, including some of the country’s most famous and influential: the Chicago Tribune, The Baltimore Sun, the New York Daily News. It is the nation’s second-largest newspaper owner by circulation. Some in the industry say they wouldn’t be surprised if Smith and Freeman end up becoming the biggest newspaper moguls in U.S. history.

They are also defined by an obsessive secrecy. Alden’s website contains no information beyond the firm’s name, and its list of investors is kept strictly confidential. When lawmakers pressed for details last year on who funds Alden, the company replied that “there may be certain legal entities and organizational structures formed outside of the United States.”

Smith, a reclusive Palm Beach septuagenarian, hasn’t granted a press interview since the 1980s. Freeman, his 41-year-old protégé and the president of the firm, would be unrecognizable in most of the newsrooms he owns. For two men who employ thousands of journalists, Smith and Freeman are remarkably little known.

If you want to know what it’s like when Alden Capital buys your local newspaper, you could look to Montgomery County, Pennsylvania, where coverage of local elections in more than a dozen communities falls to a single reporter working out of his attic and emailing questionnaires to candidates. You could look to Oakland, California, where the East Bay Times laid off 20 people one week after the paper won a Pulitzer. Or to nearby Monterey, where the former Herald reporter Julie Reynolds says staffers were pushed to stop writing investigative features so they could produce multiple stories a day. Or to Denver, where the Post’s staff was cut by two-thirds, evicted from its newsroom, and relocated to a plant in an area with poor air quality, where some employees developed breathing problems.

But maybe the clearest illustration is in Vallejo, California, a city of about 120,000 people 30 miles north of San Francisco. When John Glidden first joined the Vallejo Times-Herald, in 2014, it had a staff of about a dozen reporters, editors, and photographers. Glidden, then a mild-mannered 30-year-old, had come to journalism later in life than most and was eager to prove himself. He started as a general-assignment reporter, covering local crime and community events. The pay was terrible and the work was not glamorous, but Glidden loved his job. A native of Vallejo, he was proud to work for his hometown paper. It felt important.

[Margaret Sullivan: The Constitution doesn’t work without local news]

A month after he started, one of his fellow reporters left and Glidden was asked to start covering schools in addition to his other responsibilities. When the city-hall reporter left a few months later, he picked up that beat too. Glidden had heard rumblings about the paper’s owners when he first took the job, but he hadn’t paid much attention. Now he was feeling the effects of their management.

It turned out that those owners—New York hedge funders whom Glidden took to calling “the lizard people”—were laser-focused on increasing the paper’s profit margins. Year after year, the executives from Alden would order new budget cuts, and Glidden would end up with fewer co-workers and more work. Eventually he was the only news reporter left on staff, charged with covering the city’s police, schools, government, courts, hospitals, and businesses. “It played with my mind a little bit,” Glidden told me. “I felt like a terrible reporter because I couldn’t get to everything.”

He gained 100 pounds and started grinding his teeth at night. He used his own money to pull court records, and went years without going on a vacation. Tips that he would never have time to investigate piled up on a legal pad he kept at his desk. At one point, he told me, the city’s entire civil-service commission was abruptly fired without explanation; his sources told him something fishy was going on, but he knew he’d never be able to run down the story.

Meanwhile, with few newsroom jobs left to eliminate, Alden continued to find creative ways to cut costs. The paper’s printing was moved to a plant more than 100 miles outside town, Glidden told me, which meant that the news arriving on subscribers’ doorsteps each morning was often more than 24 hours old. The “newsroom” was moved to a single room rented from the local chamber of commerce. Layout design was outsourced to freelancers in the Philippines.

Frustrated and worn out, Glidden broke down one day last spring when a reporter from The Washington Post called. She was writing about Alden’s growing newspaper empire, and wanted to know what it was like to be the last news reporter in town. “It hurts to see the paper like this,” he told her. “Vallejo deserves better.” A few weeks after the story came out, he was fired. His editor cited a supposed journalistic infraction (Glidden had reported the resignation of a school superintendent before an agreed-upon embargo). But Glidden felt sure he knew the real reason: Alden wanted him gone.

Illustration by Ricardo Rey: a clear zip-lock bag with a forensic "Evidence" label, containing a crumpled page from a newspaper

The story of Alden Capital begins on the set of a 1960s TV game show called Dream House. A young man named Randall Duncan Smith—Randy for short—stands next to his wife, Kathryn, answering quick-fire trivia questions in front of a live studio audience. The show’s premise pits two couples against each other for the chance to win a home. When the Smiths win, they pass on the house and take the cash prize instead—a $20,000 haul that Randy will eventually use to seed a small trading firm he calls R.D. Smith & Company.

A Cornell grad with an M.B.A., Randy is on a partner track at Bear Stearns, where he’s poised to make a comfortable fortune simply by climbing the ladder. But he has a big idea: He believes there’s serious money to be made in buying troubled companies, steering them into bankruptcy, and then selling them off in parts. The term vulture capitalism hasn’t been invented yet, but Randy will come to be known as a pioneer in the field. He scores big with a bankrupt aerospace manufacturer, and again with a Dallas-based drilling company.

By the 1980s, this strategy has made Randy luxuriously wealthy—vacations in the French Riviera, a family compound outside New York City—and he has begun to school his children on the wonders of capitalism. He teaches his 8-year-old son, Caleb, to make trades on a Quotron computer, and imparts the value of delayed gratification by reportedly postponing his family’s Christmas so that he can use all their available cash to buy stocks at lower prices in December. Caleb will later recall, in an interview with D Magazine, asking his dad why he works so hard.

“It’s a game,” Randy explains to his son.

“How do you know who wins?” the boy asks.

“Whoever dies with the most money.”

Even in the “greed is good” climate of the era, Randy is a polarizing character on Wall Street. When The New York Times profiles him in 1991, it notes that he excels at “profiting from other people’s misery” and quotes a parade of disgruntled clients and partners. “The one central theme,” the Times reports, “seems to be that Smith and its web of affiliates are out, first and foremost, for themselves.” If this reputation bothers Randy and his colleagues, they don’t let on: For a while, according to The Village Voice, his firm proudly hangs a painting of a vulture in its lobby.

Around this time, Randy becomes preoccupied with privacy. He stops talking to the press, refuses to be photographed, and rarely appears in public. One acquaintance tells The Village Voice that “he’s the kind of guy who divests himself every couple of years” to avoid ending up on lists of the world’s richest people.

Most of his investments are defined by a cold pragmatism, but he takes a more personal interest in the media sector. With his own money, he helps his brother launch the New York Press, a free alt-weekly in Manhattan. Russ Smith is a puckish libertarian whose self-described “contempt” for the journalistic class animates the pages of the publication. “I’m repulsed by the incestuous world of New York journalism,” he tells New York magazine. He writes a weekly column called “Mugger” that savages the city’s journalists by name and frequently runs to 10,000 words.

Randy claims no editorial role in the Press, and his investment in the project—which has little chance of producing the kind of return he’s accustomed to—could be chalked up to brotherly loyalty. But years later, when Randy relocates to Palm Beach and becomes a major donor to Donald Trump’s presidential campaign, it will make a certain amount of sense that his earliest known media investment was conceived as a giant middle finger to the journalistic establishment.

How exactly Randall Smith chose Heath Freeman as his protégé is a matter of speculation among those who have worked for the two of them. In conversations with former Alden employees, I heard repeatedly that their partnership seemed to transcend business. “They had a father-figure relationship,” one told me. “They were very tight.” Freeman has resisted elaborating on his relationship with Smith, saying simply that they were family friends before going into business together.

Freeman’s father, Brian, was a successful investment banker who specialized in making deals on behalf of labor unions. After serving in the Carter administration’s Treasury Department, Brian became widely known—and feared—in the ’80s for his hard-line negotiating style. “I sort of bully people around to get stuff done,” he boasted to The Washington Post in 1985. The details of how Smith got to know him are opaque, but the resulting loyalty was evident.

After Brian took his own life, in 2001, Smith became a mentor and confidant to Heath, who was in college at the time of his father’s death. Several years later, when Heath was still in his mid-20s, Smith co-founded Alden Global Capital with him, and eventually put him in charge of the firm.

People who know him described Freeman—with his shellacked curls, perma-stubble, and omnipresent smirk—as the archetypal Wall Street frat boy. “If you went into a lab to create the perfect bro, Heath would be that creation,” says one former executive at an Alden-owned company, who, like others in this story, requested anonymity to speak candidly. Freeman would show up at business meetings straight from the gym, clad in athleisure, the executive recalled, and would find excuses to invoke his college-football heroics, saying things like “When I played football at Duke, I learned some lessons about leadership.” (Freeman was a walk-on placekicker on a team that won no games the year he played.)

When Alden first got into the news business, Freeman seemed willing to indulge some innovation. The firm oversaw the promotion of John Paton, a charismatic digital-media evangelist, who improved the papers’ web and mobile offerings and increased online ad revenue. In 2011, Paton launched an ambitious initiative he called “Project Thunderdome,” hiring more than 50 journalists in New York and strategically deploying them to supplement short-staffed local newsrooms. For a fleeting moment, Alden’s newspapers became unexpected darlings of the journalism industry—written about by Poynter and Nieman Lab, endorsed by academics like Jay Rosen and Jeff Jarvis. But by 2014, it was becoming clear to Alden’s executives that Paton’s approach would be difficult to monetize in the short term, according to people familiar with the firm’s thinking. Reinventing their papers could require years of false starts and fine-tuning—and, most important, a delayed payday for Alden’s investors.

So Freeman pivoted. He shut down Project Thunderdome, parted ways with Paton, and placed all of Alden’s newspapers on the auction block. When the sale failed to attract a sufficiently high offer, Freeman turned his attention to squeezing as much cash out of the newspapers as possible.

Alden’s calculus was simple. Even in a declining industry, the newspapers still generated hundreds of millions of dollars in annual revenues; many of them were turning profits. For Freeman and his investors to come out ahead, they didn’t need to worry about the long-term health of the assets—they just needed to maximize profits as quickly as possible.

[Read: Local news is dying, and Americans have no idea]

From 2015 to 2017, he presided over staff reductions of 36 percent across Alden’s newspapers, according to an analysis by the NewsGuild (a union that also represents employees of The Atlantic). At the same time, he increased subscription prices in many markets; it would take a while for subscribers—many of them older loyalists who didn’t carefully track their bills—to notice that they were paying more for a worse product. Maybe they’d cancel their subscriptions eventually; maybe the papers would fold altogether. But as long as Alden had made back its money, the investment would be a success. (Freeman denied this characterization through a spokesperson.)

Crucially, the profits generated by Alden’s newspapers did not go toward rebuilding newsrooms. Instead, the money was used to finance the hedge fund’s other ventures. In legal filings, Alden has acknowledged diverting hundreds of millions of dollars from its newspapers into risky bets on commercial real estate, a bankrupt pharmacy chain, and Greek debt bonds. To industry observers, Alden’s brazen model set it apart even from chains like Gannett, known for its aggressive cost-cutting. Alden “is not a newspaper company,” says Ann Marie Lipinski, a former editor in chief of the Chicago Tribune. “It’s a hedge that went and bought up some titles that it milks for cash.”

Even as Alden’s portfolio grew, Freeman rarely visited his newspapers. When he did, he exhibited a casual contempt for the journalists who worked there. On more than one occasion, according to people I spoke with, he asked aloud, “What do all these people do?” According to the former executive, Freeman once suggested in a meeting that Alden’s newspapers could get rid of all their full-time reporters and rely entirely on freelancers. (Freeman denied this through a spokesperson.) In my many conversations with people who have worked with Freeman, not one could recall seeing him read a newspaper.

A story circulated throughout the company—possibly apocryphal, though no one could say for sure—that when Freeman was informed that The Denver Post had won a Pulitzer in 2013, his first response was: “Does that come with any money?”

In budget meetings, according to the former executive, Freeman hectored local publishers, demanding that they produce detailed numbers off the top of their head and then humiliating them when they couldn’t. But for all the theatrics, his marching orders were always the same: Cut more.

“It was clear that they didn’t care about this being a business in the future. It was all about the next quarter’s profit margins,” says Matt DeRienzo, who worked as a publisher for Alden’s Connecticut newspapers before finally resigning.

Another ex-publisher told me Freeman believed that local newspapers should be treated like any other commodity in an extractive business. “To him, it’s the same as oil,” the publisher said. “Heath hopes the well never runs dry, but he’s going to keep pumping until it does. And everyone knows it’s going to run dry.”

On March 9, 2020, a small group of Baltimore Sun reporters convened a secret meeting at the downtown Hyatt Regency. Alden Global Capital had recently purchased a nearly one-third stake in the Sun’s parent company, Tribune Publishing, and the firm was signaling that it would soon come for the rest. By that point, Alden was widely known as the “grim reaper of American newspapers,” as Vanity Fair had put it, and news of the acquisition plans had unleashed a wave of panic across the industry.

But there was still a sliver of hope: Tribune and Alden agreed that the hedge fund would not increase its stake in the company for at least seven months. That gave the journalists at the Sun a brief window to stop the sale from going through. The question was how.

In the Hyatt meeting, Ted Venetoulis, a former Baltimore politician, advised the reporters to pick a noisy public fight: Set up a war room, circulate petitions, hold events to rally the city against Alden. If they did it right, Venetoulis said, they just might be able to line up a local, civic-minded owner for the paper. The pitch had a certain romantic appeal to the reporters in the room. “Baltimore is an underdog town,” Liz Bowie, a Sun reporter who was at the meeting, told me. “We were like, They’re not going to take our newspaper from us!”

The paper’s union hired a PR firm to launch a public-awareness campaign under the banner “Save Our Sun” and published a letter calling on the Tribune board to sell the paper to local owners. Soon, Tribune-owned newsrooms across the country were kicking off similar campaigns. “We were in collective revolt,” Lillian Reed, a Sun reporter who helped organize the campaign, told me. When the journalists created a Slack channel to coordinate their efforts across multiple newspapers, they dubbed it “Project Mayhem.”

In Orlando, the Sentinel ran an editorial pleading with the community to “deliver us from Alden” and comparing the hedge fund to “a biblical plague of locusts.” In Allentown, Pennsylvania, reporters held reader forums where they tried to instill a sense of urgency about the threat Alden posed to The Morning Call. The movement gained traction in some markets, with local politicians and celebrities expressing solidarity. But even for a group of journalists, it was tough to keep the public’s attention. After a contentious presidential race and amid a still-raging pandemic, there was a limited supply of outrage and sympathy to spare for local reporters. When the Chicago Tribune held a “Save Local News” rally, most of the people who showed up were members of the media.

Meanwhile, reporters fanned out across their respective cities in search of benevolent rich people to buy their newspapers. The most promising prospect materialized in Baltimore, where a hotel magnate named Stewart Bainum Jr. expressed interest in the Sun. Earnest and unpolished, with a perpetually mussed mop of hair, Bainum presented himself as a contrast to the cutthroat capitalists at Alden. As a young man, he’d studied at divinity school before taking over his father’s company, and decades later he still carried a healthy sense of noblesse oblige. He took particular pride in finding novel ways to give away his family fortune, funding child-poverty initiatives in Baltimore and prenatal care for women in Liberia.

Bainum told me he’d come to appreciate local journalism in the 1970s while serving in the Maryland state legislature. At the time, the Sun had a bustling bureau in Annapolis, and he marveled at the reporters’ ability to sort the honest politicians from the “political whores” by exposing abuses of power. “You have no way of knowing that if you don’t have some nosy son of a bitch asking a lot of questions down there,” he told me.

Bainum envisioned rebuilding the paper—which, by 2020, was down to a single full-time statehouse reporter—as a nonprofit. In February 2021, he announced a handshake deal to buy the Sun from Alden for $65 million once it acquired Tribune Publishing.

But within weeks, Bainum said, Alden tried to tack on a five-year licensing deal that would have cost him tens of millions more. (Freeman has, in the past, disputed Bainum’s account of the negotiations.) Feeling burned by the hedge fund, Bainum decided to make a last-minute bid for all of Tribune Publishing’s newspapers, pledging to line up responsible buyers in each market. For those who cared about the future of local news, it was hard to imagine a better outcome—which made it all the more devastating when the bid fell through.

What exactly went wrong would become a point of bitter debate among the journalists involved in the campaigns. Some expressed exasperation with the staff of the Chicago Tribune, who were unable to find a single interested local buyer. Others pointed to Bainum’s financing partner, who pulled out of the deal at the 11th hour. The largest share of the blame was assigned to the Tribune board for allowing the sale to Alden to go through. Freeman, meanwhile, would later gloat to colleagues that Bainum was never serious about buying the newspapers and just wanted to bask in the worshipful media coverage his bid generated.

But beneath all the recriminations and infighting was a cruel reality: When faced with the likely decimation of the country’s largest local newspapers, most Americans didn’t seem to care very much. “It was like watching a slow-motion disaster,” says Gregory Pratt, a reporter at the Chicago Tribune.

Alden completed its takeover of the Tribune papers in May. It financed the deal with the help of Cerberus—a private-equity firm that owned, among other businesses, the security company that trained Saudi operatives who participated in the murder of the journalist Jamal Khashoggi.

Three days later, Bainum—still smarting from his experience with Alden, but worried about the Sun’s fate—sent a pride-swallowing email to Freeman. After congratulating him on closing the deal, Bainum said he was still interested in buying the Sun if Alden was willing to negotiate. Freeman never responded.

Illustration by Ricardo Rey: a red street-corner newspaper dispenser bearing "The Baltimore Sun" logo lies on its side, its glass window smashed and newspapers spilling out, surrounded by numbered yellow evidence markers from a murder scene

Shortly after the Tribune deal closed earlier this year, I began trying to interview the men behind Alden Capital. I knew they almost never talked to reporters, but Randall Smith and Heath Freeman were now two of the most powerful figures in the news industry, and they’d gotten there by dismantling local journalism. It seemed reasonable to ask that they answer a few questions.

My request for an interview with Smith was dismissed by his spokesperson before I finished asking. A reporter at one of his newspapers suggested I try “doorstepping” Smith—showing up at his home unannounced to ask questions from the porch. But it turned out that Smith had so many doorsteps—16 mansions in Palm Beach alone, as of a few years ago, some of them behind gates—that the plan proved impractical. At one point, I tracked down the photographer who’d taken the only existing picture of Smith on the internet. But when I emailed his studio looking for information, I was informed curtly that the photo was “no longer available.” Had Smith bought the rights himself? I asked. No response came back.

Freeman was only slightly more accessible. He declined to meet me in person or to appear on Zoom. After weeks of back-and-forth, he agreed to a phone call, but only if parts of the conversation could be on background (which is to say, I could use the information generally but not attribute it to him). On the appointed afternoon, I dialed the number provided by his spokesperson and found myself talking to the most feared man in American newspapers.

When I asked Freeman what he thought was broken about the newspaper industry, he launched into a monologue that was laden with jargon and light on insight—summarizing what has been the conventional wisdom for a decade as though it were Alden’s discovery. “Many of the operators were looking at the newspaper business as a local advertising business,” he said, “and we didn’t believe that was the right way to look at it. This is a subscription-based business.”

Freeman was more animated when he turned to the prospect of extracting money from Big Tech. “We must finally require the online tech behemoths, such as Google, Apple, and Facebook, to fairly compensate us for our original news content,” he told me. He had spoken on this issue before, and it was easy to see why. Many in the journalism industry, watching lawsuits play out in Australia and Europe, have held out hope in recent years that Google and Facebook will be compelled to share their advertising revenue with the local outlets whose content populates their platforms. Some have even suggested that this represents America’s last chance to save its local-news industry. But for that to happen, the Big Tech money would need to flow to underfunded newsrooms, not into the pockets of Alden’s investors.

Before our interview, I’d contacted a number of Alden’s reporters to find out what they would ask their boss if they ever had the chance. Most responded with variations on the same question: Which recent stories from your newspapers have you especially appreciated? I put the question to Freeman, but he declined to answer on the record.

Freeman was clearly aware of his reputation for ruthlessness, but he seemed to regard Alden’s commitment to cost-cutting as a badge of honor—the thing that distinguished him from the saps and cowards who made up America’s previous generation of newspaper owners. “Prior to the acquisition of the Tribune Company, we purchased substantially all of our newspapers out of bankruptcy or close to liquidation,” he told me. “These papers were in many cases left for dead by local families not willing to make the tough but appropriate decisions to get these news organizations to sustainability. These papers would have been liquidated if not for us stepping up.”

This was the core of Freeman’s argument. But while it’s true that Alden entered the industry by purchasing floundering newspapers, not all of them were necessarily doomed to liquidation. More to the point, Tribune Publishing—which represents a substantial portion of Alden’s titles—was profitable at the time of the acquisition.

There’s little evidence that Alden cares about the “sustainability” of its newspapers. A more honest argument might have claimed, as some economists have, that vulture funds like Alden play a useful role in “creative destruction,” dismantling outmoded businesses to make room for more innovative insurgents. But in the case of local news, nothing comparable is ready to replace these papers when they die. Some publications, such as the Minneapolis Star Tribune, have developed successful long-term models that Alden’s papers might try to follow. But that would require slow, painstaking work—and there are easier ways to make money.

In truth, Freeman didn’t seem particularly interested in defending Alden’s reputation. When he’d agreed to the interview, I’d expected him to say the things he was supposed to say—that the layoffs and buyouts were necessary but tragic; that he held local journalism in the highest esteem; that he felt a sacred responsibility to steer these newspapers toward a robust future. I would know he didn’t mean it, and he would know he didn’t mean it, but he would at least go through the motions.

But I had underestimated how little Alden’s founders care about their standing in the journalism world. For Freeman, newspapers are financial assets and nothing more—numbers to be rearranged on spreadsheets until they produce the maximum returns for investors. For Smith, the Palm Beach conservative and Trump ally, sticking it to the mainstream media might actually be a perk of Alden’s strategy. Neither man will ever be the guest of honor at the annual dinner for the Committee to Protect Journalists—and that’s probably fine by them. It’s hard to imagine they’d show, anyway.

About a month after The Baltimore Sun was acquired by Alden, a senior editor at the paper took questions from anxious reporters on Zoom. The new owners had announced a round of buyouts, some beloved staffers were leaving, and those who remained were worried about the future. When a reporter asked if their work was still valued, the editor sounded deflated. He said that he still appreciated their journalism, but that he couldn’t speak for his corporate bosses.

“This company that owns us now seems to still be pretty—I don’t even know how to put it,” the editor said, according to a recording of the meeting obtained by The Atlantic. “We don’t hear from them ... They’re, like, nameless, faceless people.”

In the months that followed, the Sun did not immediately experience the same deep staff cuts that other papers did. Reporters kept reporting, and editors kept editing, and the union kept looking for ways to put pressure on Alden. But a sense of fatalism permeated the work. “It feels like we’re going up against capitalism now,” Lillian Reed, the reporter who helped launch the “Save Our Sun” campaign, told me. “Am I going to win against capitalism in America? Probably not.”

To David Simon, the whimpering end of The Baltimore Sun feels both inevitable and infuriating. A former Sun reporter whose work on the police beat famously led to his creation of The Wire on HBO, Simon told me the paper had suffered for years under a series of blundering corporate owners—and it was only a matter of time before an enterprise as cold-blooded as Alden finally put it out of its misery.

Like many alumni of the Sun, Simon is steeped in the paper’s history. He can cite decades-old scoops and tell you whom they pissed off. He quotes H. L. Mencken, the paper’s crusading 20th-century columnist, on the joys of journalism: “It is really the life of kings.” At the Sun’s peak, it employed more than 400 journalists, with reporters in London and Tokyo and Jerusalem. Its World War II correspondent brought firsthand news of Nazi concentration camps to American readers; its editorial page had the power to make or break political careers in Maryland.

But for Simon, that paper exists entirely in the past. With Alden in control, he believes the Sun is “now a prisoner” that stands little chance of escape. What most concerns him is how his city will manage without a robust paper keeping tabs on the people in charge. “The practical effect of the death of local journalism is that you get what we’ve had,” he told me, “which is a halcyon time for corruption and mismanagement and basically misrule.”

When Simon called me, he was on the set of his new miniseries, We Own This City, which tells the true story of Baltimore cops who spent years running their own drug ring from inside the police department. By the time the FBI caught them, in 2017, the conspiracy had resulted in one dead civilian and a rash of wrongful arrests and convictions. The show draws from a book written by a Sun reporter, and Simon was quick to point out that the paper still has good journalists covering important stories. But he couldn’t help feeling that the police scandal would have been exposed much sooner if the Sun were operating at full force.

Baltimore has always had its problems, he told me. “But if you really started fucking up in grandiose and belligerent ways, if you started stealing and grifting and lying, eventually somebody would come up behind you and say, ‘You’re grifting and you’re lying’ … and they’d put it in the paper.”

“The bad stuff runs for so long now,” he went on, “that by the time you get to it, institutions are irreparable, or damn near close.”

Take away the newsroom packed with meddling reporters, and a city loses a crucial layer of accountability. What happens next? Unless the Tribune’s trajectory changes, Chicago may soon provide a grim case study. For Baltimore to avoid a similar fate, Simon told me, something new would have to come along—a spiritual heir to the Sun: “A newspaper is its contents and the people who make it. It’s not the name or the flag.”

He may get his wish. Stewart Bainum, since losing his bid for the Sun, has been quietly working on a new venture. Convinced that the Sun won’t be able to provide the kind of coverage the city needs, he has set out to build a new publication of record from the ground up. In recent months, he’s been meeting with leaders of local-news start-ups across the country—The Texas Tribune, the Daily Memphian, The City in New York—and collecting best practices. He’s impressed by their journalism, he told me, but his clearest takeaway is that they’re not nearly well funded enough. To replace a paper like the Sun would require a large, talented staff that covers not just government, but sports and schools and restaurants and art. “You need real capital to move the needle,” he told me. Otherwise, “you’re just peeing in the ocean.”

Next year, Bainum will launch The Baltimore Banner, an all-digital, nonprofit news outlet. He told me it will begin with an annual operating budget of $15 million, unprecedented for an outfit of this kind. It will rely initially on philanthropic donations, but he aims to sell enough subscriptions to make it self-sustaining within five years. He’s acutely aware of the risks—“I may end up with egg on my face,” he said—but he believes it’s worth trying to develop a successful model that could be replicated in other markets. “There’s no industry that I can think of more integral to a working democracy than the local-news business,” he said.

The Banner will launch with about 50 journalists—not far from the size of the Sun—and an ambitious mandate. One tagline he was considering was “Maryland’s Best Newsroom.”

When I asked, half in jest, if he planned to raid the Sun to staff up, he responded with a muted grin. “Well,” he told me, “they have some very good reporters.”



from Business | The Atlantic https://ift.tt/3FE0AMS
via IFTTT

Thursday, October 7, 2021

What Really Happens When You Return an Online Purchase

Illustration of waves cut from cardboard boxes, the largest crested with a shipping label, by Jason Fulford and Tamara Shopsin

Consider the dressing room. The concept began its mass-market life as an amenity in Gilded Age department stores, a commercial sanctuary of pedestals and upholstered furniture on which to swoon over the splendid future of your wardrobe. Now, unless you’re rich enough to sip gratis champagne in the apartment-size private shopping suites of European luxury brands, the dressing room you know bears little resemblance to its luxe progenitors.

Over the course of several decades and just as many rounds of corporate budget cuts, dressing rooms have filled with wonky mirrors and fluorescent lights and piles of discarded clothes. At one point in your life or another, as you wriggled your clammy body into a new bathing suit—underpants still on, for sanitary purposes—you have probably experienced the split-second terror of some space cadet trying to yank the door open (if you’re lucky enough to have a door). Maybe you have heard your own panicked voice croak, “Someone’s in here!”

Through the 1990s and into the 2000s, even as stores became dingy and understaffed, the dressing room try-on remained a crucial step in the act of clothing yourself. But as online shopping became ever more frictionless—and the conditions in the fitting room ever less desirable—Americans realized that it might just be better to order a few sizes on a retailer’s website and sort it out at home. Estimates vary, but in the past year, one-third to one-half of all clothing bought in the United States came from the internet. More shopping of almost every type shifts online each year, a trend only accelerated by months of pandemic restrictions and shortages.

This explosive growth in online sales has also magnified one of e-commerce’s biggest problems: returns. When people can’t touch things before buying them—and when they don’t have to stand in front of another human and insist that a pair of high heels they clearly wore actually never left their living room—they send a lot of stuff back. The average brick-and-mortar store has a return rate in the single digits, but online, the average rate is somewhere between 15 and 30 percent. For clothing, it can be even higher, thanks in part to bracketing—the common practice of ordering a size up and a size down from the size you think you need. Some retailers actively encourage the practice in order to help customers feel confident in their purchases. At the very least, many retailers now offer free shipping, free returns, and frequent discount codes, all of which promote more buying—and more returns. Last year, U.S. retailers took back more than $100 billion in merchandise sold online.

[Read: American shoppers are a nightmare]

All of that unwanted stuff piles up. Some of it will be diverted into a global shadow industry of bulk resellers, some of it will be stripped for valuable parts, and some of it will go directly into an incinerator or a landfill.

It sounds harmful and inefficient—all the box trucks and tractor trailers and cargo planes and container ships set in motion to deal with changed minds or misleading product descriptions, to say nothing of the physical waste of the products themselves, and the waste created to manufacture things that will never be used. That’s because it is harmful and inefficient. Retailers of all kinds have always had to deal with returns, but processing this much miscellaneous, maybe-used, maybe-useless stuff is an invention of the past 15 years of American consumerism. In a race to acquire new customers and retain them at any cost, retailers have taught shoppers to behave in ways that are bad for virtually all involved.

The retail-logistics industry is split into two halves. Forward logistics—the process of moving goods from manufacturers to their end users—is the half most consumers regularly interact with. It includes postal workers, your neighborhood UPS guy, and the people who stock shelves at Target or pick items and pack boxes at Amazon warehouses. “Pick packing and shipping individual things to satisfy customer orders is a madness, but it’s a straightforward madness,” Mark Cohen, the director of retail studies at the Columbia University School of Business and the former CEO of Sears Canada, told me. The other half—reverse logistics—isn’t straightforward at all.

“Reverse logistics is nasty,” Tim Brown, the managing director of the Supply Chain and Logistics Institute at Georgia Tech, told me. The process of getting unwanted items back from consumers and figuring out what to do with them is time- and labor-intensive, and often kind of gross. Online returns are collected one by one from parcel carriers, brick-and-mortar stores, a growing number of third-party services, and sometimes directly from customers’ homes. Workers at sorting facilities open boxes and try to determine whether the thing in front of them is what’s on the packing list—to discern the difference between the various car parts sold on Amazon, or the zillion black polyester dresses available to order from H&M. They also need to figure out whether it’s been used or worn, if it works, if it’s clean, and if it or any of its components are economically and physically salvageable.

[Read: Where Amazon returns go to be resold by hustlers]

Sometimes, the answers to those questions are clear. “Consumers say they’re returning XYZ, but they really return a dead rat and a cinder block,” Brown said. That kind of fraud accounts for 5 to 10 percent of returns. Usually, though, the situation is ambiguous. How used do jeans have to be for them to be considered used? Does a mere try-on count, if they’ve been removed from their packaging?

We can dispense now with a common myth of modern shopping: The stuff you return probably isn’t restocked and sent back out to another hopeful owner. Many retailers don’t allow any opened product to be resold as new. Brick-and-mortar stores have sometimes skirted that policy; products that are returned directly to the place where they were sold can be deemed close enough to new and sold again. But even if mailed-in products come back in pristine, unused condition—say, because you ordered two sizes of the same bra and the first one you tried on fit fine—the odds that things returned to a sorting facility will simply be transferred to that business’s inventory aren’t great, and in some cases, they’re virtually zero. Getting an item back into a company’s new-product sales stream, which is sometimes in a whole different state, can be logistically prohibitive. Some things, such as beauty products, underwear, and bathing suits, are destroyed for sanitary reasons, even if they appear to be unopened or unused.

Perfectly good stuff gets thrown away in these facilities all the time, simply because the financial math of doing anything else doesn’t work out; the items are too inexpensive to be worth the effort, or too much time has passed since they were sold. Fast fashion—the extremely low-cost, quick-churn styles you can buy from brands such as Forever 21 and Fashion Nova—tends to tick both boxes, and the industry generates some of the highest return rates in all of consumer sales. Imagine a dress that sold for $25 and was sent back without its plastic packaging at the end of the typical 30-day return window. Add up the labor to pick, pack, and dispatch the item; the freight both coming and going; the labor to receive and sort the now-returned item; the cardboard and plastic for packaging; and the sorting facility’s overhead, and the seller has already lost money. By one estimate, an online return typically costs a retailer $10 to $20 before the cost of shipping. And in the space of a month, the people who might have paid full price for the dress have moved on to newer items on the seller’s website. At that point, one way or another, the dress has got to go.
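The back-of-the-envelope math in that example can be made explicit. In the sketch below, the only figure taken from the article is the $10-to-$20 per-return processing estimate; the shipping and packaging line items are illustrative assumptions, not reported costs:

```python
# Illustrative arithmetic for a $25 fast-fashion dress that comes back.
# Only the processing estimate ($10-$20 per return) comes from the text;
# the other line items are assumed figures for the sake of the sketch.

SALE_PRICE = 25.00  # fully refunded when the dress is returned

costs = {
    "processing (pick, pack, receive, sort, overhead)": 15.00,  # midpoint of $10-$20
    "outbound shipping": 5.00,    # assumed
    "return shipping": 5.00,      # assumed
    "packaging materials": 1.00,  # assumed
}

total_cost = sum(costs.values())

for item, dollars in costs.items():
    print(f"{item}: ${dollars:.2f}")
print(f"total sunk into the round trip: ${total_cost:.2f}")

# The refund wipes out the $25 of revenue, so the seller is out the
# entire round-trip cost -- which already exceeds the sale price.
assert total_cost > SALE_PRICE
```

Under these assumptions the seller loses more than the dress ever sold for, before even deciding whether to liquidate or landfill it, which is why the math so often favors the trash heap.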

[From the March 2021 issue: Ultra-fast fashion is eating the world]

Many products survive their initial return, and even get sold again—just not to the retailer’s customers. Stores like Neiman Marcus and Target, which carry a bunch of different brands, are often able to return excess product to those brands for at least a partial refund. That might mean a pallet of polo shirts goes back to Ralph Lauren, or Hanes eats part of the loss on a new line of socks that didn’t sell. At that point, the brand or wholesaler taking back the product has to decide whether it should be thrown away or sold.

Or, when someone returns a computer to Best Buy, for example, the company can try to sell it elsewhere, even if it’s just for parts. Perhaps its outer case would be discarded and its processor and video card removed and off-loaded, along with thousands of others, to a middleman who flips them to repair services or retailers that sell refurbished parts. Bulk sales of intact merchandise supply much of the inventory in domestic deep-discount retailers such as Big Lots, according to Brown, and are also why so many people in countries without American stores wear American clothes. Unwanted clothing and other goods are sold off thousands of pounds at a time in shipping containers; the buyers discard what they can’t resell and ship the rest overseas to wholesale it as fresh merchandise.

This is why it’s difficult to accurately estimate what portion of returned merchandise is discarded, or even how much waste it adds up to, though we do know that billions of pounds of returns are thrown away in the U.S. every year. Joel Rampoldt, a managing director at the consulting firm AlixPartners, told me that most people in the industry believe that about 25 percent of returns are discarded, although the proportion varies widely depending on the product (clothing tends to be easier to resell than electronics that may contain user data, for example). There are so many points in an object’s life where it could go to the trash heap instead of to a person who will use it, and once it’s off the books—especially if it’s out of the country—American retailers are no longer keeping track. These practices are essentially unregulated; companies do whatever they deem most profitable.

Now is usually when people start wondering why more returns aren’t just donated. Don’t lots of people in the U.S. need winter coats and smartphones and other crucial tools of everyday life that they can’t afford? Wouldn’t providing those things be good PR for retailers? Wouldn’t it be a tax write-off, at the very least? Donation would be the morally sound move. But companies have little incentive to act morally, and many avoid large-scale domestic donations because of what is politely termed “brand dilution”: If paying customers catch you giving things to poor people for free, the logic goes, they’ll feel like the things you sell are no longer valuable.

Some of the largest retailers, such as Amazon and Target, have begun to quietly acknowledge that it doesn’t even make sense for them to eat the cost of reverse logistics to get back many of the things they sell. They’ll refund you for your itchy leggings or wonky throw pillows and suggest that you give them away, which feels like an act of generosity but, more likely, is really just farming out the task of product disposal.

The birth of the returns problem is almost always pinned on Zappos. In the mid-2000s, the company persuaded millions of Americans to buy shoes online—a turn of events that, at the time, seemed extremely unlikely—by marketing its fast, free shipping and free, no-questions-asked return policy as ardently as it did its products. The easy-returns tactic was hardly new in retail (Nordstrom, among others, was long famous for being so lenient that the store would take back things it didn’t sell in the first place in order to keep customers happy). But the free-returns model had never before been applied at such a large scale to online sales, where the logistics of giving buyers so much latitude is much more costly. Zappos’s success helped shape how people understood online shopping to work. “It’s so baked into consumer expectations, and consumers are very irrational about the cost of shipping and returns,” Rampoldt told me. “To some extent retailers have created that, and now they’re stuck with it.”

[From the January/February 2020 issue: Stop believing in free shipping]

Businesses often lose money in the pursuit of customers, hoping to make back the initial loss in the long run by creating durable economies of scale, which Zappos has successfully done—Scott Schaefer, the company’s vice president of finance, told me that it’s profitable, and has no need or desire to tighten its shipping and returns policies. But Zappos’s strategy had ramifications far beyond its own sales figures. By changing consumer behavior, it inadvertently pushed lots of other businesses to adopt the buy-it-all, return-it-later policies that have now become the industry standard, especially as e-commerce spending consolidates among a few mega-companies like Amazon, Target, and Walmart. Retailers of that size are better able to absorb the cost of return shipping and junked product than smaller businesses are. But many of those smaller businesses must adopt similar policies anyway to hold on to their customers.

Alarmingly, the problem almost never comes up in business education. “There’s very, very, very, very little academic work in reverse logistics,” Brown said. Meanwhile, “forward logistics and supply chain is taught in every business school in the country.” People are taught to sell.

And stores don’t want to talk about returns. Seven of the eight that I contacted for this story, which specialize in everything from cheap dog toys to luxury fashion, declined to comment at all. The issue is a nonstarter in almost every way: No company wants to draw attention to customers who are disappointed in their purchases. If a retailer admits that it wants to cut back on its generous policies, it risks headlines painting it as stingy. And once people start thinking about returns, they might start asking where all that returned product goes, which is a whole other can of public-relations worms.

This avoidance runs deep—public companies have to disclose a litany of financial details to shareholders every year, but regulatory agencies don’t require them to include return rates or specify their financial impact, so they don’t. When everyone’s mouths are shut, the size of the problem becomes very difficult to discern.

Schaefer, from Zappos, said that the centrality of returns to the business’s sales model means that the price of service has long been baked in. “I could be significantly more transactionally profitable if I cut off and said no returns,” Schaefer told me. “But I would easily lose all of my customers and all my customer trust.” Because Zappos doesn’t carry fast fashion, it has an advantage over some other apparel retailers; much of its return volume comes back unworn and is reintegrated into its regular inventory.

[Read: The neurological pleasures of modern shopping]

But even some of the biggest retailers in the world now see rampant returns as an existential threat. In recent years, many have started using third-party software to find and ban their highest-volume returners from sending things back, and sometimes from buying anything at all. Amazon, Sephora, Best Buy, Ulta, and Walmart, among many others, close shoppers’ accounts or bar them from stores if their returns seem atypical or potentially fraudulent. Details on what these companies consider aberrant behavior are scant, but Mark Cohen oversaw one of the first such policies, at Sears Canada in the mid-2000s. In its sweep, he said, Sears found 1,400 people who were engaged in what he called “recreational shopping”—buying things nearly every week and returning all or almost all of them. What’s more, many of these people even employed the tactic with big-ticket items such as tractors, lawn mowers, and refrigerators.

Third-party businesses have also sprouted up to wrestle returns into some kind of submission. If you shop online with any regularity, you’ve probably interacted with a post-purchase retail-logistics company such as Narvar, even if you didn’t realize it. These companies notify buyers when things have shipped or are about to arrive, clean up the tracking information into something understandable at a glance, and collect and organize data about why and how often certain products come back. Other companies promise to intervene in the physical logistics of moving $100 billion in online returns back to sellers. Roadie, for example, will pay gig workers to ferry returns back to sorting facilities in their own cars, ostensibly in situations where drivers are already heading that way. Happy Returns lets shoppers drop off their unwanted, unpackaged goods at “return bars” inside local businesses—drugstores, stationery shops, FedEx offices—which in theory minimizes the hassle, and thus speeds things up. Happy Returns then sorts and sends the items back to retailers, creating some measure of greater efficiency.

But returns don’t look like a problem that can ever be solved completely. As the places where people used to buy clothes or stationery or kids’ toys in person are pushed out of business, online shopping becomes even more of a necessity. And Americans will probably continue to buy more than they intend to keep, even if it means an extra trip to the UPS store. Prices will go up to account for how expensive it is to send all this unwanted stuff back and forth, and companies will make nonbinding sustainability pledges that attract positive headlines while still shoveling things into landfills. They will do so until that is no longer legal, or no longer profitable for the largest and most powerful retailers, at which point they’ll force their customers to get used to something else.

When surveyed about their preferences, big majorities of Americans under 40 say that they’d happily pay more to patronize businesses that aren’t wasteful or harmful to the environment. That is the right answer when another human asks you whether you care about the future of the planet. But the receipts tell a different story so far: Those same shoppers do a far larger portion of their shopping online than their older counterparts do, and they’re also more likely to place big orders, buying items in multiple sizes and colors, with the intention of sending some back. That’s the slick thing about shopping now. So much of it takes place in the same manner as returns—in the privacy of your own home, no human interaction or judgment required.


This article appears in the November 2021 print edition with the headline “Unhappy Returns.”



from Business | The Atlantic https://ift.tt/3FqbXbe

Friday, September 24, 2021

Where’s the Cheap Beef?

Grocery prices are rising. Meat prices are rising more than most other grocery prices. Beef prices are rising more than most other meat prices.

But on the ranch, these are not prosperous times. Even as ground chuck costs more than $5 a pound at Walmart, ranchers complain that they are receiving less for their animals than it costs to feed them.

Rising food prices are likely depressing President Joe Biden’s softening approval numbers. The U.S. economy has added almost 5 million non-farm jobs since Inauguration Day. Yet Biden’s approval rating has dropped into the mid-40s. In a recent Fox News poll, 82 percent of respondents described themselves as “extremely” or “very” concerned about the cost of living. More than scenes of chaos in Afghanistan, the numbers at the supermarket checkout may be weighing Biden down.

On September 8, the White House unveiled an analysis of the problem—and an ambitious plan for action: $500 million in loan guarantees to smaller and regional beef processors.

[Annie Lowrey: The inflation gap]

What’s going on here is bigger than beef. It’s a test of a theory about the U.S. economy—and about a philosophy of government. The theory, expressed most powerfully in a 2019 book by Thomas Philippon, The Great Reversal, is that the U.S. economy is in thrall to a few dominant corporations. In industry after industry, Philippon argued, a few companies have gained the power to keep prices high, wages low, and competitors out. The philosophy of government that follows from this theory is that the government should vigorously police competition, not only by means of traditional antitrust enforcement but also through a broader program of market regulation and intervention.  

Market regulation went out of style in the 1970s, a victim of its internal contradictions. As academic critics such as Robert Bork argued back then: If, say, a supermarket gains market share from its mom-and-pop competitors by offering a wider selection at lower prices, you can understand why Mom and Pop don’t like it. But how is it “pro-competition” if the government intervenes to protect Mom and Pop from competitors who are doing a better job of meeting customer needs?

That argument prevailed for most of the past half century. The Biden administration is seeking to change course—and beef is where it’s starting.

To understand the choices facing the Biden administration, here are the two warring explanations of what’s going on with beef.

[Read: Bidenomics really is something new]

The first explanation is a classic story of supply and demand. The beef industry has been hammered over the past two years by a series of supply shocks. COVID closed many processing plants. Then, when the plants reopened, they had to work less efficiently, with workers spaced farther apart from one another. Like many other employers, meatpackers have had difficulty hiring enough labor at pre-pandemic wages, so they have had to pay more, which raises their costs.

Meanwhile, U.S. cattle herds have been ravaged by drought across the American West. The 2020 drought was bad; the 2021 drought has been worse. More than one-third of American cattle have grazed under drought conditions in 2021, sometimes—as in Montana and Washington State—extreme-drought conditions. The aggregate national herd has shrunk in numbers, and the animals that have come to market have weighed an average of 15 pounds less than those sold a year earlier, according to U.S. Department of Agriculture statistics.

Drought has also pushed the price of cattle feed to dizzying heights, raising beef prices even higher. The feed crisis explains some of the woes of small ranchers. Many cattle spend their early months on a ranch eating grass, then are shipped to a feedlot where they are fattened with corn and other grains. If the feed costs more, the rancher earns less.

Over the past year and a half, surging demand slammed into this constrained supply. Throughout the coronavirus pandemic, the federal government has pumped enormous purchasing power into consumers’ wallets. This extra money—plus consumer cutbacks on other kinds of spending—has enabled consumers to increase their spending at the grocery store; they spent $84 billion more in 2020 relative to 2019.

If this supply-and-demand explanation is correct, then the right policy for government is: Do nothing. Higher prices will encourage ranchers to raise more cattle. Higher prices will enable meatpackers to pay higher wages. Higher prices will induce consumers to substitute other foods for beef. Supply and demand will equilibrate, as they always do. And this time, the high prices can serve another function, too: warning consumers of the pocketbook impact of drought-causing climate change.

[Read: How meat producers have influenced nutrition guidelines for decades]

But there’s another story to tell, and it’s the story the Biden administration is telling. Meatpacking is becoming a more concentrated industry. Just four companies process more than 80 percent of America’s beef. Even as prices moved down in the early 2010s and up again in the early 2020s, the Big Four packers have been able first to increase, then to maintain, their level of profitability. In less concentrated food industries, notably eggs, prices did not rise nearly as much in 2020–21 as did prices of meat, and especially beef.

Without denying the supply-and-demand explanation altogether, the Biden administration wants to act to multiply competition in the meatpacking industry. It proposes committing $500 million in loan guarantees and direct subsidies to support smaller players against the Big Four. It hopes that more competition will raise the prices that packers pay to ranchers and cut the prices consumers pay at the store.

That’s maybe a forlorn hope. A single large meatpacking plant can cost $200 million, and take many months to approve and build. So $500 million will not buy much additional capacity. Worse, from a Biden administration perspective, meatpackers faced by intensified competition have another option besides paying more to ranchers or charging consumers less: They can squeeze their own costs by, for example, automating workers out of jobs.

The architects of the Biden plan are uneasily aware that it rests on a lot of hopes, guesses, and optimistic assumptions. When pressed on the unlikelihood that their plan will deliver any near-term relief to either ranchers or consumers, they reply that the more fundamental goal of their plan is to improve the resiliency of the U.S. food system. Because meatpacking in general—and beef packing most of all—is so concentrated in a few huge plants, small shocks can disrupt the nation’s supply of meat.

In August 2019, a fire badly damaged one of the seven largest meatpacking plants in the United States, near Holcomb, Kansas. At a stroke, the U.S. lost the ability to process 30,000 head of cattle per week. In May 2021, a cyberattack temporarily closed all of the U.S. processing operations of JBS, the largest meatpacker in the world. That attack disrupted one-fourth of the U.S. beef supply.

Multiplying the number of smaller, if perhaps less efficient, suppliers can provide some cushion against such shocks in the future. That’s the hope anyway, and President Biden has talked a lot about it. But how would that hope work in the real world? The Big Four came to dominate beef packing as they do precisely because theirs is an industry where larger size translates into lower costs and greater efficiencies. The Biden administration is not talking about turning the Big Four into the Big Five. It’s talking about supporting a lot of smaller competitors. What’s to stop the Big Four from undercutting them and driving them out of business far in advance of a crisis in which the extra resiliency might prove useful? When I put this question to officials involved in the Biden plan, they admitted that the question worried the president too.

There is one way that the resiliency project can work: if the additional capacity can somehow persuade consumers to pay higher prices. Craft breweries do not compete with Anheuser-Busch on price; they compete on taste. Smaller meatpackers could likewise compete as alternatives that are more humane to animals—or that deliver organic or grass-fed meat. But that means entering the market at the top, not undercutting from below. And because the main obstacles to this kind of niche competition are regulatory, allowing the niche competitors to grow will demand a deregulatory agenda of a kind very different from what the Biden administration seems to have in mind for meatpacking.

Instead, there’s a real risk that the initial commitment of $500 million in aid and loan guarantees to small packers will expand into continuing intervention in the marketplace to keep smaller competitors in business in the face of the higher efficiency and lower prices of the big packers.

As the saying goes, there’s no taking the politics out of politics. Rage at the big meatpackers burns especially hot among ranchers in Montana and the Dakotas. These ranchers are located far away from the feedlots of the Corn Belt to the south, and they feel themselves especially disadvantaged by the industry’s present structure. They even have their own industry group, which broadly supports the Biden administration’s plans. Montana has a Democratic senator right now; North Dakota had one from 2013 to 2019. Unsurprisingly, a Democratic presidential administration listens more carefully to the views of ranchers in states that sometimes vote Democratic than to those from states that less often do.

Yet it would be a mistake to interpret beef policy as merely an expression of regional politics. What’s being proposed for beef is an experiment in stricter marketplace regulation. If it works—or at least seems to work—for beef, it can be tried elsewhere. But what if it doesn’t work? We’ll be back where we were before the 1970s, when “pro-competition” often turned out to mean “a helping hand to the least capable competitors.” “Resiliency” is an appealing slogan. But what if it translates into plainer English as higher taxes and higher prices?



from Business | The Atlantic https://ift.tt/3u9JrWf