Derek Thompson

Derek Thompson is a senior editor at The Atlantic, where he oversees business coverage for TheAtlantic.com.

Thompson has written for Slate, BusinessWeek, and the Daily Beast. He has also appeared as a guest on radio and television networks, including NPR, the BBC, CNBC, and MSNBC.

Twitter Just Filed a Secret IPO—Why 'Secret'?


Twitter is going public, the company announced today (via Twitter). It's submitted its S-1 document, which contains all sorts of juicy information about the company's business and risks ... but secretly.

So, the company is going public privately? Who said they could do that? Well, Obama said.

Last year, the president signed the Jumpstart Our Business Startups (JOBS) Act, which let small companies -- those making less than $1 billion in annual revenue -- keep their filings private until three weeks before their road show. Twitter isn't projected to make $1 billion in revenue until next year.

"Keeping its IPO filing secret until the last minute could help Twitter avoid the overheated anticipation that Facebook had to deal with ahead of its disastrous IPO," Zach Seward explained in Quartz. Indeed, Facebook's IPO was totally disastrous, as you might recall, and much of the disaster was precipitated by last-minute changes to the S-1.

As Khadeeja Safdar explained for The Atlantic in a long investigative piece, Facebook cut its earnings forecast weeks before its public debut, which scared away most of the support from large institutional investors. It's very likely that this is why the stock flopped on its opening day -- the big money had lost interest.

This provision of the JOBS Act is supposed to protect smaller companies from a similar fate and encourage more firms to go public by making the road to IPO less fraught. At the same time, Twitter's S-1 is going to come out eventually (publicly announcing a private filing seems like the first step), at which point the madness will continue as regularly scheduled: Twitter will tell institutional investors how great it is; analysts will set their targets and phone in recommendations; and CNBC will titillate retail investors with the loudest analysts around.
 

 

1 Simple Rule for Advertising on 9/11

Don't do it. That's the rule. So simple.

Don't offer a $9.11 golf special.

Don't tweet a picture of your product framing the ghostly lights of Ground Zero.

 

AT&T September 11 Marketing Campaign

Don't offer free coffee and mini muffins for 30 minutes.

Advertising is hard. This is easy. Don't use a national tragedy as a news peg for your product or service. "Sorry for the deaths of 3,000 people, please give us money for something unrelated" is the polar opposite of clever adjacency. It is always offensive, and it never works. This is not a winnable challenge for copywriters.

On a day when most Americans are enveloped by visuals and memories of a horrible, horrible day, companies would be well-advised to adhere to the converse of "never forget." Please, marketing departments of America: Stop trying so hard to make us remember.

The Dow Jones Industrial Average Is Adorable, Should Never Change


The Dow Jones Industrial Average announced today that it is removing Alcoa, Bank of America, and Hewlett-Packard from its 30-company index. Goldman Sachs, Nike, and Visa are in.

Everybody's snarking on this move, most eloquently Neil Irwin:

[The Dow] is weighted not based on the size or importance of the company, but by its per-share price. So IBM, with its $184 per-share price, counts more than four times as much as Coca-Cola, at $39 a share, even though the two have about the same stock market capitalization.
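To see how lopsided that weighting gets, here is a minimal sketch, in Python, comparing a price-weighted index with a cap-weighted one. The share prices come from Irwin's quote above; the market caps are rough, assumed round numbers standing in for "about the same stock market capitalization."

```python
# Price weighting (the Dow's approach) vs. market-cap weighting (the S&P 500's).
# Share prices are from the quote above; the market caps are rough assumptions.
stocks = {
    # name: (share price in $, market cap in $ billions -- assumed)
    "IBM": (184, 200),
    "Coca-Cola": (39, 175),
}

price_total = sum(price for price, _ in stocks.values())
cap_total = sum(cap for _, cap in stocks.values())

for name, (price, cap) in stocks.items():
    price_weight = price / price_total  # weight in a price-weighted index
    cap_weight = cap / cap_total        # weight in a cap-weighted index
    print(f"{name:9s} price-weighted: {price_weight:.0%}  cap-weighted: {cap_weight:.0%}")

# IBM ends up with roughly 83% of this two-stock "Dow" despite a similar market cap.
```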

Okay, yes, the DJIA is an overrated symbol. But it's a fun overrated symbol. The economy is impossibly complicated, and even sophisticated attempts to summarize its health and evolution are, inevitably, oversimplifications. The Dow, which limits its particular oversimplification to the publicly traded stock of 30 companies, is constantly hauling new companies into the boat before casting them overboard. In 2008, Bank of America and Kraft replaced Altria and Honeywell. Four years later, Bank of America and Kraft have themselves been replaced. This is useful, how?

I'd like to propose that the Dow Jones Industrial Average become the Dow Jones Industrial Archive. Let's just pick 30 stocks and never change them again. It wouldn't be the best reflection of an evolving economy. In fact, it would be a terrible reflection of an evolving economy. But that's okay, because we have the Standard & Poor’s 500 index and Wilshire 5000, which are much better simplifications for traders and readers.

Instead, the Dow Jones Industrial Archive would be a beautiful reflection of the economy we thought we had, but are instead leaving behind.

When the Dow Jones Industrial Average launched in 1896, it smelled like a turn-of-the-century factory farm -- nothing but oil, iron, cows, and cotton. You know. America.

Today, practically all of these companies -- Tennessee Coal & Iron, American Cotton Oil, Distilling & Cattle Feeding -- have been gobbled up by conglomerates that you have and haven't heard of. Only GE remains in the index, and only the U.S. Leather trust is essentially defunct.

But I like that. We had an industrial-dominated economy, and now we don't. New time, new index. DJIA II for the auto and aerospace economy. DJIA III for the computer/financial economy. And so on.

The value of creating oversimplified theories of the economy is planting a stake in the ground and learning some time later why and how you were wrong. The Dow isn't a very good index. It would be a better time capsule.


      Can Your Language Influence Your Spending, Eating, and Smoking Habits?


      Yes, I know. That headline. It looks like the most egregious form of causal inference. Americans don't save money because of ... our grammar? How utterly absurd. But bear with me.

In the 1930s, linguists proposed that the way we read, write, and talk helped to determine the way we see the world. Speakers of languages that had the same word for orange and yellow had a harder time actually distinguishing the colors. Speakers of the Kuuk Thaayorre language, which has no words for left and right, must orient themselves by north, south, east, and west at all times, which enhances their awareness of geographical and astronomical markers.

      Last year, economist Keith Chen released a working paper (now published) suggesting speakers of languages without strong future tenses tended to be more responsible about planning for the future. Quick example. In English, we say "I will go to the play tomorrow." That's strong future tense. In Mandarin or Finnish, which have weaker future tenses, it might be more appropriate to say, "I go to the play tomorrow." 

Chen wondered whether speakers of languages with weak future tenses would be more thoughtful about the future because they consider it, grammatically, equivalent to the present. He mapped strong and weak future-tense languages across Europe and correlated the data with future-oriented behaviors like saving, smoking, and using condoms.

Remarkably, he discovered that speakers of languages with weak future tenses (e.g., German, Finnish, and Estonian) were 30 percent more likely to save money, 24 percent more likely to avoid smoking, 29 percent more likely to exercise regularly, and 13 percent less likely to be obese than speakers of languages with strong future tenses, like English.

If your B.S. antennae are standing straight up (as mine were), you might be more interested in this next part. Chen next compared speakers born and raised within the same countries, controlling for factors like age and number of children. He found the same results: Speakers of weak-future-tense languages demonstrated dramatically, and statistically significantly, more responsible future-oriented behaviors -- even within countries like Switzerland, which is a motley blend of strong-future languages (like French) and weak-future languages (like German).
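For readers curious what that kind of within-country comparison looks like in practice, here is a minimal, hypothetical sketch -- not Chen's actual code or data -- that regresses a savings indicator on a weak-future-tense dummy plus country fixed effects and demographic controls, using fabricated numbers purely for illustration:

```python
# Hypothetical illustration of a within-country comparison with controls.
# The data below are fabricated; nothing here reproduces Chen's results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "country": rng.choice(["Switzerland", "Belgium", "Singapore"], size=n),
    "weak_future_tense": rng.integers(0, 2, size=n),  # 1 = speaks a weak-future language
    "age": rng.integers(18, 80, size=n),
    "num_children": rng.integers(0, 4, size=n),
})

# Fabricated outcome with a built-in effect, so the regression has something to find.
latent = -0.5 + 0.4 * df["weak_future_tense"] + 0.01 * df["age"]
df["saved_this_year"] = rng.binomial(1, 1 / (1 + np.exp(-latent)))

# Logit of saving on language type, with country fixed effects and controls.
model = smf.logit(
    "saved_this_year ~ weak_future_tense + age + num_children + C(country)", data=df
).fit(disp=0)
print(model.params["weak_future_tense"])  # positive => weak-future speakers save more
```

The C(country) term is what does the within-country work: it soaks up anything shared by everyone in a country, so the language coefficient is identified only by comparing neighbors who speak different languages.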

The whole paper -- from its credulous claims to its surprising conclusions -- is summed up rather elegantly in this video by Fractl, the creative team for eBay Deals.

The correlation was savaged by some economists and linguists as facile or worse. But others re-ran the data and found, to their astonishment, that Chen seemed to be right. His paper was published (along with thanks to some of his fiercest critics) in the American Economic Review this year.

"One important issue in interpreting these results is the possibility that language is not causing but rather reflecting deeper differences that drive savings behavior," Chen concluded. Languages map to large groups of people, but so do religion, culture, family values, and a common history. Are Germans frugal because their language protects them from hyperbolic discounting, or is it just that, well, they're Germans?

      That question doesn't have a satisfying answer, but this paper, as wild as it seems, isn't a radical departure from the literature. "Overall, my findings are largely consistent with the hypothesis that languages with obligatory future-time reference lead their speakers to engage in less future-oriented behavior," Chen wrote.

      What does it mean? I have no idea, and Chen himself has responded to the criticism of his work with an honorable blend of erudition and shrugging disbelief. But I suppose that if you suffer from issues like crippling procrastination, as I do, it couldn't hurt to learn Estonian -- or, perhaps more simply, to write inspirational notes to yourself exclusively in the present tense. The future might be a different country, but it doesn't have to feel that way.

      A Recovery Unlike Any Other Recovery: America's Strange, Shrinking Government

Decades from now, some lazy historian is in danger of mistaking Obama for an anti-government radical.

Basically, he's going to look at the president's record of job creation and see that Obama presided over a historically bizarre period of private-sector job creation combined with public-sector job destruction.

Adding last week's meh jobs report to the pile of meh jobs reports, Bill McBride finds that this private-sector employment recovery has been stronger than those under George W. Bush or his father. These are private-sector jobs only. (Obama's record in BLUE.)

The jobless recovery has felt so jobless, then, not only because the drop was so steep, but also because the public sector has had an unprecedentedly bad post-crash performance. In every other recovery going back three decades, the public sector has grown. Often, it's grown by millions of jobs in the years following the recession. In this recovery, however, the public sector has shrunk by hundreds of thousands of workers. (The BLUE line is this recovery.)

The public-sector collapse was not Obama's plan. It is not his fault, really. It is scarcely his responsibility, since the vast majority of government jobs -- and of government jobs lost -- have been at the state and local level. The recession devastated state tax revenue, and states cannot borrow from international lenders the way Washington can. So most had no choice but to cut workers. The stimulus delayed, but did not indefinitely defer, the blood-letting. And, since Congress refused to extend support even as interest rates clung to historically low levels, it's been brutal.

If we had held government employment perfectly steady since Day One of Obama's presidency -- not one more government job, nor one fewer -- job creation would have seen a 25 percent boost. Instead, Obama's legacy will include a historically strange post-recession collapse in government employment -- and a powerful lesson in the limits of presidential power.

      The Dubious Future of the American Car Business—in 14 Charts

      America's amazing car recovery explains the U.S. economy, I wrote this morning. But how does the U.S. economy explain America's amazing auto recovery?

      Here's the answer, gleaned from the highlights of a GM research presentation shared with The Atlantic, with annotations from me to guide you along.

      The Car Market Has Been Amazingly Resilient

      1. The overall recovery stinks.

      2. The jobs recovery stinks more.

      3. But auto sales have kept pace with past recoveries, even if overall growth has not. The car industry has accounted for 15 percent of post-2009 GDP growth -- pretty astonishing, considering total vehicle output is just 3 percent of the economy.

      Cars Are Getting More Expensive

      1. The total cost of operating a car keeps going up.

      2. The inflation-adjusted price of an average new vehicle is near all-time highs.

      3. Meanwhile, inflation-adjusted gas prices have scarcely been higher, even though we're coming out of a deep recession. (So, it's not just the price of a car that's going up; it's the price of running a car.)

      As a Result, the U.S. Car Market Is Getting Older and Richer

      1. The median household income of a new car- or truck-buyer is 74 percent higher than the overall median household income.

2. The people most willing to buy new cars are over 55. Incredibly, the 65+ cohort might be the most bullish post-recession car market in the country.

      Do Young People Care About Cars?

1. The share of licensed drivers is rising or steady in every age cohort except the 16-24 group, where it's simply plummeting. Why?

      2. Cars are expensive. Not just buying them (remember, average real car prices are near all-time highs), or paying for gas (gas prices are pushing all-time highs, too), but insuring them, too, especially for young people when you consider how little they earn.

      3. The growth in student loans in the last decade among younger Americans has crowded out other loans, like for cars and houses. In this way, young people have traded investments in habitation and transportation for investments in education -- in their own human capital.

      3 Reasons to Be Optimistic

      1. Despite income stagnation, there is still a solid number of households with the means to afford a new car.

2. Americans can hold on to their old cars for only so long, and, like a pinched hose springing back open, there seems to be plenty of pent-up activity ready to be unleashed.

      3. The historic trend line of auto industry growth suggests there's still running room for the largest manufacturers.

      Overdrive: How America's Amazing Car Recovery Explains the U.S. Economy


      There was a time, not so long ago, when cars supposedly personified the American character. Our aggression, our style, our rugged independence. In the last 30 years, the automobile has faded slightly in the American imagination, but today the car industry does, in fact, explain the American economy.

      It is a surprisingly durable, fantastically productive juggernaut, whose success relies on the old, the rich, and foreign trade -- and less on American workers.

      ***

      To begin this story, let's appreciate the big picture. The car economy, a small but mighty sliver of American industry, has been on a roll. Since 2009, car production has nearly doubled, accounting for between 15 and 20 percent of our whole recovery.

Ford and GM said this past August was the best month for car sales in seven years. JD Power says it might be the best month on record. But man cannot live on cars alone, and neither can countries. "We're not big enough to tow the whole boat," Sean McAlinden, chief economist at the Center for Automotive Research, told me. "It's lonely out there all by yourself."

For decades, housing has been the engine that moves recoveries. When people would clean up their debt, they'd swing a house. And, at least since 1950, a house meant a garage, and a garage meant a new car. As Jordan Weissmann has written, cars and houses accounted for more than half of the recovery in the 1970s, a third of the "Reagan Recovery" in the early 1980s, and a sixth of the recoveries in the early 1990s and 2000s.


      But this time, cars are leading houses, thanks to a surprising source: older Americans. "[Demand] is coming from an increased buying rate of people over 55," McAlinden said, "which is scary because we don't have a lot of repeat sales left in us."

      Young people are essentially locked out of the car market, just as they have been locked out of the housing market -- and the labor market. Average vehicle prices are as high as ever, but wages are low, and unemployment for young people has typically been twice as high as for the overall population. There is also evidence that cars have fallen from their cultural perch, squeezed by urbanization among young people and the growth of a new, expensive, social, mobile technology -- the smartphone.

      Young vs. old might not be the most important binary for car companies. That would be rich vs. poor. The U.S. is beginning to look like the aristocratic auto market we're used to seeing in Europe, McAlinden said, where the top 25 percent buys most of the new cars and the bottom 75 percent only buys old and used. "Seventy-five percent of households here are relying on used cars, thinking 'I hope that rich guy is done,'" he said.

      Plutocracy in the car market isn't unique, but rather illustrative. There is “no such animal as the U.S. consumer,” three Citigroup analysts concluded in the heart of the real estate boom in 2005. Instead, we have the rich and the rest. As Don Peck wrote in his summer 2011 cover story for The Atlantic, for many industries, "the rest" just don't matter.

      All the action in the American economy was at the top: the richest 1 percent of households earned as much each year as the bottom 60 percent put together; they possessed as much wealth as the bottom 90 percent; and with each passing year, a greater share of the nation’s treasure was flowing through their hands and into their pockets. It was this segment of the population, almost exclusively, that held the key to future growth and future returns.

The last two years have done nothing to make those Citigroup analysts look anything less than prophetic. Middle-income jobs (like, say, auto-parts workers) made up 60 percent of jobs lost in the recession, but lower-wage occupations have accounted for about 60 percent of jobs gained in the recovery. The auto recovery, like the U.S. recovery, is built on a fragile assumption: The rich can be rich enough for the rest of us.

      So Much Work, So Where Are the Jobs?

The amazing car comeback has not translated into equally amazing jobs. Auto manufacturing employment is up since 2009, but whereas the motor industry has accounted for 15 to 20 percent of economic growth, it's accounted for just 2 to 3 percent of job growth. "We're at record productivity levels," McAlinden said. But productivity gains across the industry haven't trickled down into better pay or equally rising employment. Instead, car companies are making more with less. According to CAR figures shared with The Atlantic, total motor vehicle output has grown 75 percent faster than total industry employment.

      Why? First, the auto industry is seeing record levels of overtime. Second, car companies are relying on logistics and trucking companies -- not typically counted in auto manufacturing categories -- to work in and around the factories today to assist with assembly and sequencing of parts. These workers, even if unionized, tend to be paid less than members of the auto union. Third, car companies are importing more finished parts from Mexico -- hatchbacks, body panels, electronics -- which means cheaper Mexican workers have replaced Americans.

Data shared by Yen Chen, a senior economist at CAR, shows auto imports from Mexico on an absolute tear since 2009, far outstripping China, Canada, and Japan.

"We suspect that the record levels of parts imports is a big reason why employment is stuck in the rut," McAlinden said. The parts sector in Mexico employs 540,000 people, compared to about 480,000 in the U.S. Remarkably, Mexico's parts industry is already "bigger" than America's. As we've seen with other global industries, American employment has been restrained by large companies moving more labor along their supply chain to cheaper countries. American companies simply don't need that many more Americans.

      ***

The modern auto recovery is, over all, a sensational story. We need growth, and we're getting more of it from cars than perhaps any other industry.

      But unpacking this story reveals a more frightening picture of American industry and productivity. In the mid-20th century, a strange and wonderful blip of good fortune for the American middle class, unions concentrated in the manufacturing sector helped millions of American families achieve healthy and rising wages, thanks to collective bargaining and a burgeoning industry that wasn't yet automated or globalized. But that story is over. It has been replaced by a new American story where one of the country's most iconic industries scarcely needs more American workers to do all the work it needs.

      College Enrollment Plummeted in 2012, but for Very Good Reasons

      The number of college students fell for the first time in six years, according to new Census figures released this week. The half-a-million-student drop is "a huge decline," Census Bureau statistician Julie Siebens told me. This sounds like bad news. And while you won't find a bigger proponent of higher education than the Atlantic Business channel, I'd argue it's actually a sign of good news.

      It means the labor market is -- slowly, but surely -- getting better.

The Census doesn't ask people why they are or aren't enrolling in college. So rather than survey answers, we have to rely on a bit of intuition. But here's what we know. College is cheaper, in a way, during recessions, because the opportunity cost of leaving the work force to go to school is lower when there aren't any jobs out there, anyway. As a result, college enrollment typically accelerates during bad economic times. Since the peak of the housing bubble in 2006 (incidentally, the last year enrollment fell), college attendance has grown by 3.2 million students -- or 18 percent.

Now we're seeing the reverse take place. The economy has improved steadily. Youth unemployment has fallen. And twentysomethings are going back to work. Ninety percent of the overall decline in enrollment was from students over 25 -- that is, students in their prime working years. Steadily falling college enrollment would be bad news indeed. But a one-year correction suggests that students on the job-or-school bubble could be plowing their productivity into salaried jobs, Siebens said.

      The rest of the report is a mixed bag. On the one hand, college is getting significantly more diverse. As the Wall Street Journal reported, the share of non-Hispanic white students declined to 58% from 67% in the last six years. But rising college costs and fears of student loan burdens are likely depressing college attendance, too. In the long run, that's not a good story because, as we've written a million times, learning is earning power in the modern economy.

      Jeff Bezos Is Exactly 50% Wrong About What's Killing The Washington Post


      Amazon CEO Jeff Bezos might not know how to revive his latest purchase, The Washington Post, but he says he knows what's ailing the newspaper:

      He said the newspaper faced two business problems: the Rewrite Problem and the Debundling Problem.

      In the former, the newspaper could spend weeks or months on a project that a Web site like the Huffington Post could rewrite “in 17 minutes.”

      In the latter, whereas people once bought a paper and read and passed sections of it around, the Web has debundled the paper so that people can read one story and move on to a different site. 

      Not to completely rewrite Matt Yglesias, but the Rewrite Problem isn't much of a problem -- especially compared to the Debundling Problem.

      Just think about the causality and chronology. The Washington Post's readership and revenue collapse is new. Very new. Since 2008, daily circulation has fallen from about 670,000 to 470,000. But the Rewrite Problem is old. Reader's Digest, which became the single biggest magazine in the world thanks to rewriting, is old. Book reviews are old. Columns and news stories that piggy-back on other people's exhaustive reporting? They're old, too. All journalism, even the most investigative reporting, relies somewhat on the Rewrite "Problem." Reporters are all little dwarves standing on the shoulders of giants.

What's new is the Internet and its debundling of the business model that supported all of this news-gathering, reporting, and rewriting in the first place. Newspapers are bundles of paper, but they're bundles of businesses, too. Marketing in the classifieds, sports, cars, and real estate sections has always subsidized the capital-N News on the front page. (Serious News, after all, has never paid for itself.) But readers abandoned broadsheets, since they could get their news from a million free sources, few of which required allegiance in the form of a paid subscription. Advertisers moved with them -- in particular, advertisers in the back pages that supported journalism in the front pages. The Washington Post used to claim a quasi-monopoly on local newspaper advertising. Today, it doesn't even have the zip code's most famous political coverage.

      The Debundling Problem is a real problem, but it's an observation without an obvious solution. (The Internet isn't going anywhere, and Jeff Bezos is the last person to hope otherwise.) The Post is a behemoth with tons of traffic, annual Pulitzers, brilliant reporters, and a diseased business. It's not obvious that its news-gathering would be best-served by an instinct to make profits, at all. It's easy to imagine The Post thriving as a large and growing money-loser for a price-insensitive bajillionaire. And this philosophy would arguably produce the best possible newspaper even if it didn't make the best possible business.*

But if Bezos wants to keep his margins low (even when they're negative margins), the best way to match less revenue with spending is, of course, to spend less money. For this, the Post already has a model in WorldViews, The Fix, Wonkblog, and The Switch -- that is, whip-smart, quite-young, decently affordable journalists doing targeted reporting, value-add analysis, and, well, lots and lots of rewriting.

      *Updated.

      Why Sequels Will Never Die: Hollywood's Summer of 'Flops' Was Actually Its Best Year Ever


It was, you might have heard, a summer of discontent for movie studios, as audiences endured (or, rather, ignored) flop after flop, like R.I.P.D., Turbo, The Lone Ranger, and White House Down. Does this represent the death of something more than Ryan Reynolds's leading-man dreams? Is it the death of the entire modern Hollywood business model? "The summer-blockbuster strategy itself may have tanked," Catherine Rampell writes in the New York Times Magazine.

      But Hollywood's summer blockbuster strategy -- essentially: adapt books, make sequels -- didn't really "tank." In fact, it was the biggest summer in history, in nominal dollars.

      Here were the biggest movies of the summer ...

      ... and the top-grossing films went sequel, sequel, adaptation/sequel, sequel, sequel, sequel, adaptation of a recent best-selling novel. That, folks, is a business model. Of course, there were some $100 million surprises (The Heat, We're the Millers) but that's just how the strategy is supposed to work. The dependable sequel-adaptation two-step subsidizes efforts to produce original ventures that might break $100 million.

      Rampell makes a deeper critique: That the summer season has become an overcrowded bazaar of crap, a victim of its own success and excess. But as a business, the invention of the summer "movie season" has been a thrilling success.

Americans barely go to the movies anymore. The typical adult sees one movie every two months. Sixty years ago, we saw five times as many.


      The collapse of movie audiences, which far pre-dates Jaws and summer blockbusters, requires studios to heavily market their films since Americans' default position on movies these days is not to see them. Studios have cannily created a summer of tent-pole features to focus audience attention on a handful of months when we're taught to expect to go to the movies. Iron Man III would probably make a billion dollars if it were released on a Tuesday morning in March. But lesser films might benefit from debuting in a season when audiences are predisposed to going to the movies.

      Making films, as Rampell points out, is a risky business. It's hard to know what 100 million people want to see each July. The sequel/adaptation strategy assumes that people want to see stories they already are familiar with. The fact that this strategy isn't risk-free (see: The Hangover Part III) doesn't change the fact that it's been hugely risk-mitigating. If anything, the failure of R.I.P.D. and Lone Ranger -- adaptations of more esoteric fare than Batman or Star Trek -- might only make movie studios double down on their previous plans to make movies from only the most familiar stories. Forget Batman vs. Superman, or Hunger Games II. Get ready for Star Wars XXVII.

      The Myth of Part-Time America


      Liberals and conservatives agree: "Welcome to Part-Time America!"

      The numbers are out to scare you. "Of the 963,000 jobs created in the past six months, according to the Bureau of Labor Statistics’ Household Surveys, 936,000 of them are part-time," Harold Meyerson writes. The predictions will scare you more. The Wall Street Journal foretells a time when Obamacare's employer mandate will force small businesses across the country to move swaths of workers to part-time rolls, narrowly ducking the law's penalty.

      The scare stories are good scares, but they're not good stories, since they start at the end and forget the beginning and middle. The truth about Part-Time America is that a part of America has always been working part-time, and there's not much evidence that we're seeing a terrifically new phenomenon.

      The first thing you would expect from a true Part-Time America is a growing number of part-time workers. Instead, there are fewer Americans working less than full-time for economic reasons now than just about any time in the last three years. Here is your jagged line of part-time work, meandering its way downward (WSJ).

Zoom out four decades, as the San Francisco Fed does, and you get another graph making an argument against Part-Time America as a new phenomenon. Part-time work rises when the economy stinks. It falls when the economy improves. That shouldn't surprise anybody. But in Part-Time America, you would expect the share of part-time workers to be historically high and rising after the recession. Again, it's not. "The level of part-time work in recent years is not unprecedented," the researchers find.

Another look. Here is the ratio of part-time workers to unemployed workers. In Part-Time America, you would expect that ratio to be historically out of whack today, as well. It's not.

      Across the Internet, economic writers are calling out: Is there no responsible way to terrify my readers about the awful future of part-time work? And yes, there is a responsible way. It begins with marriage demographics.

      Although we're not seeing historically unprecedented levels of part-time work in the aggregate figures, single men and single women are more likely to be working part-time than any year in the last four decades. You can blame it on the rise of single moms and dads, who have to balance work and childcare. You can blame it on the rise of retail and food service jobs, which are more likely to staff part-timers. You can even try to blame it on Obamacare, although the evidence isn't as friendly there.

      Just because right now Part-Time America is an overblown crisis manufactured to fit headlines rather than statistics doesn't mean the stats won't eventually catch up to the headlines. The future of work in America is fraught and uncertain. But there is a difference between prophecy and analysis.

      How Roommates Replaced Spouses in the 20th Century

      When I graduated from college, I moved in with roommates. So did almost all of my college friends. And almost all of their college friends.

      Fifty years ago, this would have been utterly bizarre.

In 1968, just six percent of young people -- between 18 and 31 -- lived with platonic roommates, according to Pew Research. The vast majority (85 percent!) of Americans who had moved on from their homes and college dorms shacked up with spouses.

      In the last half-century, home life for young people has undergone an amazing shift. Friends and Craigslist strangers have essentially replaced spouses as the de facto first living-partners for American 20-somethings. Here's my graph, drawn with Current Population Survey numbers (via Pew Research):

Put more quirkily: In 1968, if you wanted to throw a dinner party for all your twentysomething friends who had moved out of their parents' home, and you invited three girls who lived together, you would have to invite fourteen married couples to make the party proportionally representative of your generation's living arrangements (okay, not a priority for most dinner parties, but still). Roommate was a synonym for spouse, basically.

What changed? More school and fewer spouses, for starters. The share of adults with a bachelor's degree has nearly doubled since 1968, and most of that change is because of women, who flooded into higher education and the workforce. As more women earned degrees, the average marriage age steadily inched up into the upper 20s. Meanwhile, birth control, in particular the normalization of "the pill," sharply reduced the risk of unintended pregnancy, allowing women to invest with confidence in their work lives and delay marriage without worrying about an unwanted pregnancy derailing their careers. For lower-income women, meanwhile, marriage rates have declined for more debatable reasons.

      That's a rough explanation for why fewer couples are getting married in their early 20s, which is itself a rough explanation for the growth of platonic roommates, but surely there's more. Any other ideas?

      A Record-High Number of Young People Are Still Living With Their Parents: Why?


      Another month, another record number of young people living at home long after their teenage years are over.

      This time, it's the Wall Street Journal reporting that, despite the improved economy pulling unemployment down for the last three years, the share of young adults living with their parents is still rising. Still! More than a third of Americans between 18 and 31 are currently living with their parents, according to the Current Population Survey.

      Seriously. What's going on here, if it's not just the economy?

We can begin to find the answers in the mammoth new Pew Research report, released just this month, which found a record 21.6 million "Millennials" living at home. The answer boils down to three variables, which I'll sum up as: economics, bachelor's degrees, and bachelors.

      We have to start with economics. The share of young people living in the basement was basically unchanged for four decades before the recession. Then the recession hit, and millions of young people who would have otherwise had jobs didn't. 

      Last year, Millennials without a job were 55 percent more likely to be living with their folks than employed young people.

But when you look at the shift since 2007 in the graph below, something might seem funny to you. Sure, the recession figures are high, but the pre-recession figures are high, too. One in two 18- to 24-year-olds were living at home before the crash? And one in seven late-twentysomethings?

      Why were so many young adults apparently living at home when unemployment was about 4 percent?

      It comes down to a very sneaky definition of "home." In the Current Population Survey that provides these figures, "college students in dormitories are counted as living in the parental home." Dormitories! This might strike you as absurd -- and it certainly strikes me as questionable -- but it's Labor Day Weekend, and I'm not going to waste it fighting with the folks at CPS, so there it is. Dorms = your parents' place, according to the government.

This is a huge deal for the Millennials-living-at-home figures, because college enrollment increased significantly during the recession -- 39% of 18- to 24-year-olds were enrolled in college in 2012, compared with 35% in 2007 -- and college enrollees are much more likely to be living at home (er, in dorms) than young people who skip college, drop out, or finish early.

      So a huge part of the explanation for the ostensible boom of stay-at-home kids is actually good news: more bachelor's degrees.

Finally: More bachelors. Unmarried young people are six times more likely to be living at their parents' place than married couples (understandably). Over the last 50 years, the share of twentysomethings who are married has utterly collapsed, and most of that decline came before 2007. The recession made marriage even less attractive to many young people.

Higher unemployment, more people going to college, and more single people explain most of the change. But the research found an increase within all three groups as well. Maybe they were all affecting each other. Or perhaps a fourth factor -- general unease about the future? a gradual normalization of twentysomethings living with their parents? -- is at play.

      But the most important takeaway is that, although the Great Recession was nothing but a tragedy, the rise in young people living at home isn't quite as tragic. It's partially a reflection of more young people going to school and saving money before starting a family of their own.

      If You Love Netflix, Thank Cable

      Kevin Spacey gave a great, great speech recently about the TV entertainment business. "We have learned a lesson that the music industry didn't learn," he said. "Give people what they want. When they want it. In the form they want it in. At a reasonable price. And they'll more likely pay for it rather than steal it."

      It's an awesome clip. Seriously. Give this man a TV show. Or, better, a TV network. 

      But the reaction has been less awesome. "Everyone in the tech industry is passing around this video of Kevin Spacey talking about how Netflix (and other tech companies) will blow up the traditional TV industry," Nicholas Carlson and Jay Yarow write at Business Insider.

      I'm sorry to return here to my regularly scheduled role as the grumpy uncle of TV economics, but ... no. Kevin Spacey is not talking about that. He's talking about writing and filming great stories, putting them on digital platforms that allow audiences to watch them on big screens and small screens, live and delayed, binged whole or nibbled slowly.

      He's not talking about blowing up the traditional TV industry. That's not what he wants.

      Think about what Kevin Spacey really wants: a network with enough money to make his great, expensive series. That requires a company with a dependable revenue stream to take a bet on a risky project. Netflix has a dependable revenue stream from nearly 30 million paid subscribers. It buys TV rights off media companies who don't (yet) charge utterly usurious rates to Netflix, because they're making the vast majority of their revenue from live programming and syndication on pay-TV -- that is, from the cable bundle, doubly financed by cable bills and adjacent advertising.

Netflix could afford to pay full price for "House of Cards" precisely because it was getting a discount on everything else. Its business model thrives because of -- not in spite of -- the traditional TV industry.

The 100 million households paying for cable are subsidizing the entertainment on Netflix. This subsidy allows Netflix to charge an affordable-enough monthly rate that it can attract a truly mass audience. Just about everything that you love about Netflix (its affordability, its variety, its ability to take risks) is made possible by just about everything you hate about cable, whose high cost and refusal to offer a la carte pricing create high margins for entertainment companies, which auction the scraps to Netflix, Amazon, and other Internet video companies.

The instinct among some tech writers to implicitly root for Netflix over the traditional cable industry is understandable. Netflix is cheap and easy to use. Cable is expensive and remote controls are terrible. But Netflix's affordability and its willingness to take risks are both made possible by the same traditional TV business they're threatening.

      Why the Fast-Food Worker Strikes Are Doomed

      It's amazing when you stand back and think about it. The fastest-growing industry in the world's most powerful economy can scarcely pay its workers enough to live.

      No wonder thousands of fast-food workers have gone on strike to protest their measly wages and non-existent benefits, demanding a $15-per-hour minimum wage, which would more than double some of their hourly income. These aren't teenagers, after all. A quarter of fast-food workers are raising a child. Forty percent are older than 25. 

      But sympathy and optimism are two separate things. And when it comes to the fast-food worker strikes and dramatically raising wages for food service employees, I'm only feeling the former.

      Let's begin by acknowledging that it's still early, you can't rush to measure the success of a national protest movement, and workers have already achieved some small, store-specific victories, "from scheduling changes, to raises, to the restoration of a tip jar."

      But let's also be realistic. The strikes would have a much better shot at inspiring a change in franchise- and corporate-level policy if fast-food chains perceived one of two threats: (a) a threat to the steady supply of food-service workers who want to be employed at any wage and (b) a threat from consumers demanding higher wages for their fast-food clerks by not buying burgers and fries at McDonald's.

Instead, the big picture doesn't reveal either of these pressure points. Fast-food jobs aren't merely scattered among the most despondent corners of the economy. They're growing fastest in some of the richest and most-educated metros. Bridgeport, Conn., Salt Lake City, Raleigh, Chapel Hill, and Washington, D.C., are among the areas with the most growth in food-service work between 2010 and 2013.

      Study this graph. It shows the post-1990 change in three big industries' share of total employment. All three historically attract young, low-educated workers -- the bread and butter of fast-food employment. Manufacturing's share of all jobs fell by nearly 50 percent. Retail, which absorbed a lot of the slack from manufacturing's slowdown in the 1970s and 1980s, has flat-lined. Food services? Shot up by 25 percent.

This graph doesn't tell you everything you need to know about why low wages in food services are probably here to stay. But it does suggest that the collapse of middle-income stalwarts like manufacturing has left a glut of young low-skill workers who are rushing in to fill local service-sector needs at big-box stores and fast-food chains. And that, to me, suggests another thing: that there are more people willing to do these jobs than there are people willing to strike.

      This isn't to say that intervention is impossible. Congress could decide in September to double the minimum wage or triple the Earned Income Tax Credit. But it won't. And without some sort of third-party intervention, it is hard for me to see how the higher-wage movement succeeds on the streets.

      The Maddening, Unmoving Economic Gap Between Blacks and Whites

      President Obama's powerful address today, on the 50th anniversary of Martin Luther King Jr.'s "I Have a Dream" speech, included a beautiful turn of phrase linking social and economic progress for blacks in America.

      "For what does it profit a man, Dr. King would ask, to sit at an integrated lunch counter if he can't afford the meal?" Without equality of economic opportunity,  legal equality can only accomplish so much.

      Social and economic progress since King's speech has been slow but spiky. Here's a brief tour.

We'll start with the good. The life-expectancy and high-school completion gaps between blacks and whites have shrunk tremendously, according to Pew Research ...

      ... while blacks experience the same disadvantage in poverty and homeownership that they witnessed in the 1960s ...

      ... meanwhile, in the last half-century, the median income of families with black heads-of-household has grown, but the gap between black families and white families has expanded.

      What you don't see in these pictures is the all-important college story, where blacks still lag. "Today, white adults 25 and older are significantly more likely than blacks to have completed at least a bachelor’s degree," Pew tells us. On the one hand, the black completion rate as a percentage of the white completion rate has increased from 42% then to 62% now. On the other hand, whites are still far more likely to graduate from a bachelor's program by 25. This college advantage -- reinforced through dual-earner households -- translates into higher family incomes, higher home-ownership, and (as a result) higher wealth for whites. There is a reason why so many discussions of social mobility begin and conclude with education.

      For more on this topic, see Brad Plumer's wonderful post on Wonkblog.

      How Goliaths Beat Themselves: Microsoft's Mobile Failure and the Innovator's Dilemma

Some of the best pre-mortems for Steve Ballmer, the outgoing CEO of Microsoft, have chalked the company's problems up to the "innovator's dilemma." But which innovator's dilemma are they talking about, exactly?

      The first and most simplistic definition of the dilemma, which was coined by Harvard's Clayton Christensen, says that firms fail when their core product is undermined by a "disruptive technology."

      A second, less simple -- but, I think, more useful -- definition is that firms led by excellent managers fail precisely by doing the thing excellent managers are supposed to do: listening to their customers. No good business teacher will tell you, "never listen to your idiot customers." Listening to users is how you make stuff better! But there's a catch.

Since customers cannot possibly tell you what they don't know (how many of you were crying out for something like the Samsung Galaxy S4 in 1998?), great managers have to anticipate and bravely invest in technologies that not only have pathetic current customer appeal but also might cannibalize their core business. Cut to Microsoft: Redmond HQ was brilliant at iterating on a core product in the late 1990s -- Windows 95, Windows 98, Windows 2000 -- but less brilliant at foreseeing the mobile trend that would eclipse Windows-run desktops. As the graph leading this article (via Benedict Evans) shows: Apple and Android are devouring the market for computers.

      This leads to a third, more complex, and even more useful lesson from the innovator's dilemma. Managers find it very hard to keep resources focused on the pursuit of disruptive technologies because successful companies tend to be better at organizing around -- and honing expertise in -- the product lines they already make money from. Success creates its own centripetal force that makes it hard to invest in a future outside the model.

      From the book, itself:

      Despite their endowments in technology, brand names, manufacturing prowess, management experience, distribution muscle, and just plain cash, successful companies populated by good managers have a genuinely hard time doing what does not fit their model for how to make money. Because disruptive technologies rarely make sense during the years when investing in them is most important, conventional managerial wisdom at established firms constitutes an entry and mobility barrier that entrepreneurs and investors can bank on. It is powerful and pervasive. 

      You could scarcely write a better analytical description of Microsoft in the late 1990s if you tried. Take it from one guy who did try. Kurt Eichenwald's Vanity Fair essay on the company could be a footnote to The Innovator's Dilemma:

      Microsoft failed repeatedly to jump on emerging technologies because of the company’s fealty to Windows and Office. “Windows was the god—everything had to work with Windows,” said Stone. “Ideas about mobile computing with a user experience that was cleaner than with a P.C. were deemed unimportant by a few powerful people in that division, and they managed to kill the effort.”

      This prejudice permeated the company, leaving it unable to move quickly when faced with challenges from new competitors. “Every little thing you want to write has to build off of Windows or other existing products,” one software engineer said. “It can be very confusing, because a lot of the time the problems you’re trying to solve aren’t the ones that you have with your product, but because you have to go through the mental exercise of how this framework works. It just slows you down.”

      In his last chapter, Christensen writes that the most gratifying conclusion from his book is learning that "thinking smarter," "managing better," and "working harder" aren't answers to the innovator's dilemma, and thank heavens, because most people that make it to the top of a major company are pretty smart thinkers, good managers, and hard workers. Instead, successful companies become victims of their own self-sustaining success. 

      Microsoft is a very successful company. It's worth as much as Amazon and Verizon put together. Since 2009, revenue is up 33 percent, net income is up 50 percent, and total assets have doubled. Steve Ballmer has basically succeeded in growing Microsoft as far as it could go ... and that's exactly how he failed.

      How to Fit Every New Word in the Oxford Dictionary Into 1 Article

      MEMO

      FROM: Word Selection Committee of the Oxford Dictionary

      TO: Staff

      SUBJECT: Re: today's new words

      Dear Staff,

      I know what you're thinking: "Grats, idiots. You've destroyed the English language."

You don't like our new batch of words. You unlike our new batch of words. The Oxford Dictionary isn't supposed to girl crush on Urban Dictionary. We're supposed to be a gateway for the future of language, not some linguistic omnishambles for Generation Twerk. When trends like the Internet of things, MOOCs, and space tourism crop up, the Oxford Dictionary is supposed to stick with tradition, not bandy about some vapid list of last season's most fashionable acronyms (FIL? BYOD?), like we're some A/W catalog previewing next season's chandelier earrings for click and collect shoppers. (Even as I'm typing that sentence, I barely know what it means!) And lord, you're thinking, if some Jersey Shore girl in a pixie cut with double-denim jorts and flatforms taking a selfie on her phablet is this generation's William Shakespeare, you're gonna straight up vom your street food.

      I'll admit, guac is a "new" word like bitcoin is a "real" currency.

      But let me respond first by saying: Apols. Lately, we've been feeling a bit of FOMO about all the buzzworthy verbiage orbiting outside our hallowed pages. While initially it seemed a bit dappy to add nonsense like LDR and other ghastly abbrevs just because teens don't have time to spell things out on Facebook Chat, the thing is, we can't have our blondie cake pop and eat it, too.

      It's not this dictionary's job to request a digital detox just because Web diction has shaved a fauxhawk into the English language. Rather, it's our job to highlight the words that blend into the way we actually talk today. It's kinda like linguistic balayage, if I truly understood what the heck balayage actually was.

      So yes, our language is suffering from a food baby of derp these days. But it's our job to adapt to the geek chic hackerspace -- even if babymoons strike you as a dumb excuse for me time; even if pear cider remains an unacceptable alternative to beer; and even if  emoji represents everything a good dictionary should be against.

TL;DR: Srsly, this is the future of language. Squee.

       

      What Makes Employees Work Harder: Punishment or Pampering?

      You're a boss. You have a bunch of employees you consider under-motivated, unproductive, and, well, sorta lazy. You wonder: How do I make them talk less? Focus more? Try harder? Think more creatively? Just be better?

You turn to social science and economic research and find that the conclusions largely divide into two strategic approaches. For lack of more official-sounding terms, I'm going to call them the Feel-Good strategies vs. the Feel-Bad strategies.

The Feel-Good boss is drawn to the research suggesting that just about everything you can think of that makes workers feel good about themselves also makes them better workers. Four-day workweeks mean more meaningful work hours. Shorter work days translate into better focus. More vacation time means heightened creativity. More short breaks mean sharper attention to detail. More "toys" mean more creative, playful employees, who are necessary for out-of-the-box thinking.

      And then the dismal science comes along and finds that, oh, actually, scaring the bejeezus out of workers turns out to be pretty effective too. The Feel-Bad boss is vindicated.

      The Great Recession, a Great Motivator

      Let's go back to 2008. The economy tanked. Productivity spiked. Why?

      There are two basic explanations: the "weakest link" explanation and the "motivation" explanation. When employers have to let people go, they start with the least productive workers -- the weakest links. That makes the remaining team more productive, right? Theoretically. But also, if you think every day is an audition to keep your job, you might be motivated to work harder.

One explanation dramatically overshadows the other when accounting for our productivity boom in the bust, according to economists Edward P. Lazear, Kathryn L. Shaw, and Christopher Stanton, the authors of a new paper, "Making Do With Less: Working Harder During Recessions." Actually, the answer is right there in the title. Employers made do with less because people worked harder during the recession. Fully 85 percent of the increase in productivity came from workers' "increased effort."

Some pictures to hammer the point home. Here's their look at productivity per worker between December 2007 and June 2009, the official beginning and end of the Great Recession. The recession visibly increased measured effort.

In particular, workers in states with high unemployment changes (BLUE LINE in the graph below) became more productive than workers in states with low unemployment changes (RED DASH line) -- a reversal of what you'd observe in 2006, during better times.

      In econo-speak: "an increase in the unemployment rate makes finding an alternative job more difficult, which reduces the relative cost of effort." In human: People worked harder in states where finding a job was harder, since they were totally freaked out about being unemployed. Fascinatingly, the economists found that the least productive workers had the highest gains in measured effort -- possibly because they felt the most scrutinized in areas with high unemployment. 

      Big Brother, Inc.

      Scrutiny works in mysterious ways. The knowledge that your industry (or state, or macroeconomy) is under siege makes workers feel watched. But what about workers who are, quite literally, being watched?

      That was the subject of another recent paper on software that watches workers to make sure they don't steal stuff: Cleaning House: The Impact of Information Technology Monitoring on Employee Theft and Productivity. The New York Times unpacks:

      The researchers measured the impact of software that monitors employee-level theft and sales transactions, before and after the technology was installed, at 392 restaurants in 39 states ... The savings from the theft alerts themselves were modest, $108 a week per restaurant. However, after installing the monitoring software, the revenue per restaurant increased by an average of $2,982 a week, or about 7 percent.

      What happened here? Yes, theft declined. That's to be expected. But also, revenue spiked. Productivity increased. Turning casual-dining restaurants into casual-dining panopticons made everybody work harder, perhaps by cutting down on procrastination or encouraging waiters to sell more drinks and appetizers to customers. 
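Back-of-the-envelope arithmetic on the figures quoted above makes the point starkly: the indirect productivity effect dwarfs the direct anti-theft savings. (The roughly $42,600 weekly baseline below is simply implied by the study's own numbers, not a figure the paper reports.)

```python
# Rough arithmetic on the monitoring study's quoted figures (per restaurant, per week).
theft_savings = 108          # direct savings from theft alerts
revenue_gain = 2982          # average revenue increase after installing the software
revenue_gain_share = 0.07    # "about 7 percent"

implied_weekly_revenue = revenue_gain / revenue_gain_share  # ~$42,600 implied baseline
indirect_vs_direct = revenue_gain / theft_savings           # ~28x the direct savings

print(f"implied weekly revenue: ${implied_weekly_revenue:,.0f}")
print(f"revenue gain vs. theft savings: {indirect_vs_direct:.0f}x")
```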

Productivity research remains something of a spray-and-pray field. There is research suggesting that practically everything that workers enjoy, from cat pictures to naps, amazingly makes them better at their jobs, while there's also research that the *worst things ever*, like Great Recessions and employer monitoring, make us more diligent. Perhaps some of these conclusions can live together: Being well-rested at work gives you the mental space to be creative, but the threat of consequences also makes you more assiduous on a minute-to-minute basis. There are no absolutes here yet -- except, I suppose, for one. If you're looking for a job, any job, try North Dakota.
