    Tech, Tick, Nology

    The tech industry used to think big. As early as 1977, when personal computers were expensive and impractical mystery boxes with no apparent utility or business prospects, the young Bill Gates and Paul Allen were already working toward a future in which we would see “a computer on every desk and in every home.” And in the late 1990s, when it was far from clear that they would ever make a penny from their unusual search engine, the audacious founders of Google were planning to organize every bit of data on the planet — and make it available to everyone, free.

    These were dreams of vast breadth: The founders of Microsoft, Google, Facebook and many of the rest of today’s tech giants were not content to win over just some people to their future. They weren’t going after simply the rich, or Americans or Westerners. They planned to radically alter how the world did business so the impossible became a reality for everyone.

    Whatever happened to the tech industry’s grand, democratic visions of the future?



    A dish from Munchery, a meal-delivery service. “We have an internal mission of making real, good food accessible to everyone, everywhere,” said Tri Tran, a co-founder. Credit: Munchery
    We are once again living in a go-go time for tech, but there are few signs that the most consequential fruits of the boom have reached the masses. Instead, the boom is characterized by a rise in so-called on-demand services aimed at the wealthy and the young.

    With a few taps on a phone, for a fee, today’s hottest start-ups will help people on the lowest rungs of the 1 percent live like their betters in the 0.1 percent. These services give the modestly wealthy a chance to enjoy the cooks, cleaners, drivers, personal assistants and all the other lavish appointments that have defined extravagant wealth. As one critic tweeted, San Francisco’s tech industry “is focused on solving one problem: What is my mother no longer doing for me?”

    No, no, say the start-ups that, today, look as if they’re targeting the rich. The nature of the tech business is that costs come down. Through repeated innovation and delivery at scale, the supercomputers of the 1960s became the PCs of the 1980s, which in turn became the smartphones of the 2010s. The rich subsidize the rest of us — were it not for the suckers who spent more than $10,000 on early versions of the Mac, Apple might not have survived to build the iPhone, in turn begetting an era of affordable pocket supercomputers.


    This is the basic defense of the new wave of on-demand start-ups: If their rosiest visions of growth come true, they’ll achieve a scale that will let them reduce prices, and in that way offer services that could radically alter how even ordinary people conduct their lives.

    It is a plausible vision — but an unlikely one. To achieve the scale that will enable the start-ups to reach a wider audience, everything for these companies will have to go right, and success will have to feed on itself. That happens rarely in the tech world.

    Two companies that are archetypes of today’s on-demand business recently allowed me to investigate their economic models for a look at how they might achieve mass scale. One is Shuddle, a start-up that is creating a ride service for children — an Uber to take your tots to school and soccer. The other is Munchery, which delivers restaurant-quality food to your door (you can think of it as an on-demand personal chef). Both firms resisted the notion that they were building services for the wealthy and explained in detail how they planned to serve the masses and lower their prices.

    “The first time you roll out a service, it’s fairly expensive,” said Tri Tran, Munchery’s co-founder and chief executive. “But we have an internal mission of making real, good food accessible to everyone, everywhere, and if we only catered to the upper middle class or people who are really affluent, then we will not accomplish that goal.”

    He conceded that his prices weren’t low enough to make Munchery an option for everyone, but the business model, he said, would soon allow for greater access. Munchery, which began in 2010, operates in the San Francisco Bay Area, Seattle, New York and soon Los Angeles. Today, a typical adult-size Munchery entree costs around $11 or $12, and a child’s meal around $6. With a $3 delivery fee, a dinner for a family of four might cost around $37. That compares to about $25 for dinner at, say, Chipotle, not counting the time and money it takes to get there.
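    As a back-of-the-envelope check, here is a minimal sketch of that family-dinner math, using only the approximate prices quoted above (these are the article’s rough figures, not an official Munchery calculator):

```python
# Rough sketch of the family-dinner math described above.
# Prices are the approximate figures quoted in the article, not official rates.

adult_entree = 11.00   # low end of the $11-$12 range for an adult entree
kids_meal = 6.00       # roughly $6 for a child's meal
delivery_fee = 3.00    # flat delivery fee

munchery_total = 2 * adult_entree + 2 * kids_meal + delivery_fee
chipotle_total = 25.00  # rough estimate for a comparable Chipotle dinner

print(f"Munchery dinner for a family of four: ~${munchery_total:.2f}")      # ~$37.00
print(f"Premium over fast-casual: ~${munchery_total - chipotle_total:.2f}")  # ~$12.00
```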

    But like many e-commerce businesses, Munchery enjoys certain cost advantages over its physical counterparts that Mr. Tran says will lead to lower prices. It buys high-quality ingredients in bulk, operates a single kitchen in an out-of-the-way part of a city and relies on advanced cooking technology to cut down on labor.

    Mr. Tran promises that within a couple of months, chicken, beef and fish dishes will sell for less than $10 a portion and pasta dishes for $7. In the long run, Mr. Tran is aiming for prices that are competitive with those of fast-food chains and that will make cooking at home seem expensive.

    “If you buy the same quality ingredients that we do and cook it yourself, just the ingredients alone will cost you more,” he said.

    Foodies might scoff at this idea. But sociologists have found that for many low- and middle-income families, cooking every day takes too much time, planning and money. If Munchery can make a non-junk-food dinner at prices comparable to junk, without much time, wouldn’t that be a useful service to people who aren’t millionaires?


    “If there are great, commercially driven companies out there that can do things like food distribution at scale, and we can piggyback on their success, that could be a huge win,” said Hannah Calhoon, the director of Blue Ridge Labs, an organization that aims to build tech products for low-income communities.

    You can make a similar case for Shuddle, which was created by Nick Allen, a founder of the ride-sharing service Sidecar, who said he was trying to solve a problem of modern parenting — the parents are working, the children need to be ferried among home, school and their activities, and all the ways to do so require lots of time and money. Though I did not use Shuddle for my own children — the service is available only to children who have outgrown car seats, so mine are too young — I spoke to several parents who described it in rhapsodic terms.

    “It’s amazing to have someone else drive your kids while you’re making dinner, so everyone’s eating dinner at a logical hour,” said Rana DiOrio, a Bay Area mother of three who has been using Shuddle a few times a week for several months. Ms. DiOrio, the chief executive of a children’s book publishing company, said she found the service cheaper than alternatives like hiring a babysitter for an hour to do the driving, but she acknowledged that she was relatively well off and that the price was not one that could serve everyone.

    Today, most parents pay $12 to $15 a ride — more than for on-demand services like Uber, and much more than public transportation, which of course isn’t available everywhere.

    The cost is partly a result of complexity. Shuddle puts drivers through a more extensive screening than its ride-share competitors do, including requiring that they have previous child care experience. As a result, almost all of its 250 drivers are female. To allocate drivers efficiently in low-density areas like the suburbs, parents must schedule rides ahead of time, and to help coordinate the rides, children must carry basic cellphones. The company uses software to track how carefully its drivers are driving.

    But Mr. Allen has a plan to sharply reduce prices: car-pooling. As Shuddle grows, it will learn enough about the ride habits of local families to put multiple children in cars or vans together, which would significantly lower the price.

    “I’d love to be able to get your kid to school every day for $5 — basically almost as low as taking the city bus,” Mr. Allen said.
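    A rough illustration of why pooling matters is sketched below; the pooled-trip cost is an assumption made for the illustration, not a figure from Shuddle, and only the $12-to-$15 solo fare comes from the reporting above.

```python
# Hypothetical sketch of how car-pooling lowers the per-child price.
# The solo fare comes from the range quoted above; the pooled-trip cost
# is an assumption for illustration, not Shuddle's actual economics.

solo_fare = 15.00         # top of the $12-$15 per-ride range quoted above
pooled_trip_cost = 20.00  # assumed total cost of a pooled trip (longer route, more stops)

for riders in (1, 2, 3, 4):
    # A lone rider pays the solo fare; pooled riders split the larger trip cost.
    per_child = solo_fare if riders == 1 else pooled_trip_cost / riders
    print(f"{riders} rider(s): ~${per_child:.2f} per child per ride")

# With four children sharing a trip, the per-child price approaches the
# $5 target Mr. Allen mentions.
```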
    After hearing the start-ups out, I remain unsure if they will ever get to the point where they can serve the masses.


    American Innovation Lies on Weak Foundation


    The iPhone in your pocket has more computing power than the Voyager spacecraft that left the solar system two years ago. High-tech cancer drugs are being approved every month. A few years into the future, Google’s Calico project promises to extend our life span.

    It’s easy, indeed, to be excited about the scientific and technological prowess of American companies.

    Apple, Procter & Gamble, 3M — American businesses dominate the list of the most innovative companies in the world. And new companies trumpeting new products, financed by dynamic venture capital operations, continue to emerge from Silicon Valley, the Boston region, New York, Northern Virginia and elsewhere.

    But talk to a scientist in a research lab almost anywhere and you are likely to hear that the edifice of American innovation rests on an increasingly rickety foundation.

    Investment in research and development has flatlined over the last several years as a share of the economy, stabilizing at about 2.9 percent of the nation’s gross domestic product in 2012, according to the National Science Foundation.

    That may not be far from the overall peak. But other countries are now leaving the United States behind. And even more critically, investment in basic research — the fundamental building block for innovation and economic advancement — steadily shrank as a share of the economy in the decade to 2012, the last year for which there are comprehensive statistics.

    The trend poses two big challenges. The first concerns government budgets for basic research, the biggest source of financing for scientific inquiry. That funding fell in 2013 to substantially below its level 10 years earlier and, as one of the most politically vulnerable elements in an increasingly straitened federal budget, looks likely to shrink further.

    The second, equally important challenge concerns the future of corporate research. Evidence suggests that American corporations, constantly pressured to increase the next quarter’s profits in the face of powerful foreign competition, are walking away from basic science, too.

    “Companies’ R&D,” Ashish Arora of Duke University’s Fuqua School of Business told me, “is moving away from the R toward the D.”

    This bodes ill for American progress.

    The number of American patent applications keeps rising. Yet increasingly divorced from the scientific advances on which technological progress ultimately rests, the patenting rush looks less and less like fundamental innovation.

    A research paper by Professor Arora and Sharon Belenzon from Fuqua, and Andrea Patacconi of the Norwich Business School at the University of East Anglia, tracks American corporations’ loss of interest in scientific research.

    The R&D of publicly held companies increased to 2 percent of sales in 2007 from 1 percent of sales in 1980. The share of businesses holding patents increased to just less than 30 percent from 20 percent during the period. Yet the share of companies whose researchers published in scientific journals shrank. Publishing original research took a much-diminished role in corporations’ overall R&D efforts.

    American corporate labs are the stuff of legend. Researchers dream of the halcyon days of Bell Labs and its eight Nobel Prize winners, who brought us the transistor and Unix. Others reminisce about Xerox PARC, which came up with the graphical user interface that propelled the personal computer into just about every home and office.

    Those days are long gone. Today, laments Mark Muro of the Brookings Institution, investment in innovation has been balkanized, split between government-financed basic research, squeezed by skimpy budgets, and a corporate R&D effort constrained by its focus on the very short term.

    What happened? The researchers at Duke and East Anglia reject the argument that tightening regulations have pushed companies to cut their research budgets. Corporate investment in basic research, they note, is waning in Europe, too. This is not exclusively an American dynamic.

    They also doubt that science has somehow become less valuable, an argument proposed by prominent economists like Robert Gordon of Northwestern University. Citations of recent scientific research are as common in corporate patents today as they were in the 1980s, suggesting science remains critical to companies’ innovation.

    Still, something has clearly changed: Investors may value corporate patents as much as ever, but the stock market places a lower valuation on original research than it did three decades ago. Corporate executives, their compensation tied overwhelmingly to short-term gains in the market value of their companies, may be responding accordingly.

    Science has always been risky. Xerox was not the main beneficiary of the graphical user interface. Apple and Microsoft were. Bell Labs might not have poured much money into discovering cosmic microwave background radiation without the backing of a deep-pocketed telephone monopoly.

    Harassed by international competition in our more cutthroat era, companies have less incentive to create knowledge that may or may not be profitable. Instead, they are encouraged to patent more intensely, to protect what profitable knowledge they already have.

    Can innovation survive this realignment?

    Ultimately a dearth of research threatens productivity growth, which slowed sharply over the last decade to less than half the average of the previous 10 years. It leaves young scientists in the lurch, without projects on which to apply their knowledge. It discourages young Americans from pursuing scientific careers.

    Congressional efforts to bolster corporate research by making the R&D tax credit permanent are unlikely to help much. Already extended umpteen times since it came into being in 1981, the credit “delivered, at most, a modest stimulus to domestic business R&D investment from 2000 to 2010,” according to the Congressional Research Service.

    Congress might do better by getting out of the way. Efforts by congressional Republicans to reshape federal research policy through the reauthorization of the America Competes Act, reducing funding in crucial areas like climate change and geoscience, among other provisions, could do real damage to the nation’s research effort.

    As large companies have cut their research budgets, small, science-driven businesses have stepped in to pick up some of the slack. Many were spawned by the Bayh-Dole Act of 1980, which encouraged scientists and universities to commercialize the discoveries they made on the federal dime.

    They have opened a path for corporate innovation. Rather than invest in their own science, big companies may now more easily buy innovative start-ups and develop their most promising discoveries into profitable technology.

    And yet this ecosystem is vulnerable, reliant on a dwindling pot of public money that underwrites most university-based research.

    “The buy-up strategy would be fine if we had confidence that the university system and start-ups were picking up the slack,” Professor Arora told me. “But we still don’t understand how this division of innovative labor would work.”

    It might work. But it looks risky to put all our trust in that approach. It’s time for a different paradigm — more on that in a future column — to finance the innovation that will power America’s future.
