Category Archives: business

Greg Ellis (ex-REA CEO) leaving for Germany

http://www.abc.net.au/radionational/programs/saturdayextra/growing-aust-business/5364200

Growing Australian business

Saturday 5 April 2014 8:05AM

One of Australia’s most creative businessmen has joined a small but growing group of critics of our national business culture.

Greg Ellis, the outgoing Chief Executive of the REA Group – the online real estate classifieds business that has rapidly increased in value under his leadership – strongly believes that Australian business needs a lot more fresh ideas.

Flu Trends fails…

  • “automated arrogance”
  • big data hubris
  • At its best, science is an open, cooperative and cumulative effort. If companies like Google keep their big data to themselves, they’ll miss out on the chance to improve their models, and make big data worthy of the hype. “To harness the research community, they need to be more transparent,” says Lazer. “The models for collaboration around big data haven’t been built.” It’s scary enough to think that private companies are gathering endless amounts of data on us. It’d be even worse if the conclusions they reach from that data aren’t even right.

But then this:
http://www.theatlantic.com/technology/archive/2014/03/in-defense-of-google-flu-trends/359688/

 

http://time.com/23782/google-flu-trends-big-data-problems/

Google’s Flu Project Shows the Failings of Big Data


A new study shows that using big data to predict the future isn’t as easy as it looks—and that raises questions about how Internet companies gather and use information

Big data: as buzzwords go, it’s inescapable. Gigantic corporations like SAS and IBM tout their big data analytics, while experts promise that big data—our exponentially growing ability to collect and analyze information about anything at all—will transform everything from business to sports to cooking. Big data was—no surprise—one of the major themes coming out of this month’s SXSW Interactive conference. It’s inescapable.

One of the most conspicuous examples of big data in action is Google’s data-aggregating tool Google Flu Trends (GFT). The program is designed to provide real-time monitoring of flu cases around the world based on Google searches that match terms for flu-related activity. Here’s how Google explains it:

We have found a close relationship between how many people search for flu-related topics and how many people actually have flu symptoms. Of course, not every person who searches for “flu” is actually sick, but a pattern emerges when all the flu-related search queries are added together. We compared our query counts with traditional flu surveillance systems and found that many search queries tend to be popular exactly when flu season is happening. By counting how often we see these search queries, we can estimate how much flu is circulating in different countries and regions around the world.

Seems like a perfect use of the 500 million plus Google searches made each day. There’s a reason GFT became the symbol of big data in action, in books like Kenneth Cukier and Viktor Mayer-Schonberger’s Big Data: A Revolution That Will Transform How We Live, Work and Think. But there’s just one problem: as a new article in Science shows, when you compare its results to the real world, GFT doesn’t really work.

GFT overestimated the prevalence of flu in the 2012-2013 and 2011-2012 seasons by more than 50%. From August 2011 to September 2013, GFT over-predicted the prevalence of the flu in 100 out of 108 weeks. During the peak flu season last winter, GFT would have had us believe that 11% of the U.S. had influenza, nearly double the CDC figure of 6%. If you wanted to project current flu prevalence, you would have done much better basing your models on 3-week-old CDC case data than on GFT’s sophisticated big data methods. “It’s a Dewey beats Truman moment for big data,” says David Lazer, a professor of computer science and politics at Northeastern University and one of the authors of the Science article.
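To make that comparison concrete, here is a minimal Python sketch of the kind of evaluation the Science authors describe: scoring an inflated GFT-style estimate against a naive baseline that simply reuses three-week-old CDC figures. All numbers are invented for illustration; they are not the paper’s data.

```python
# Illustrative sketch only: synthetic weekly flu-prevalence figures, not real data.
cdc_actual = [2.0, 2.4, 3.1, 4.0, 5.2, 6.0, 5.5, 4.3]      # hypothetical CDC % ILI by week
gft_estimate = [3.5, 4.4, 5.9, 7.6, 9.8, 11.0, 10.1, 8.0]  # hypothetical inflated model output

def mean_abs_error(pred, actual):
    """Average absolute gap between predictions and the CDC figures."""
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(actual)

lag = 3  # naive baseline: this week's estimate is simply the CDC figure from 3 weeks ago
print("GFT-style error:     ", mean_abs_error(gft_estimate[lag:], cdc_actual[lag:]))
print("3-week-lag CDC error:", mean_abs_error(cdc_actual[:-lag], cdc_actual[lag:]))
```

With estimates inflated the way the article describes, the lagged CDC baseline wins easily.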

Just as the editors of the Chicago Tribune believed they could predict the winner of the close 1948 Presidential election—they were wrong—Google believed that its big data methods alone were capable of producing a more accurate picture of real-time flu trends than old methods of prediction from past data. That’s a form of “automated arrogance,” or big data hubris, and it can be seen in a lot of the hype around big data today. Just because companies like Google can amass an astounding amount of information about the world doesn’t mean they’re always capable of processing that information to produce an accurate picture of what’s going on—especially if it turns out they’re gathering the wrong information. Not only did the search terms picked by GFT often fail to reflect incidences of actual illness—thus repeatedly overestimating just how sick the American public was—it also completely missed unexpected events like the nonseasonal 2009 H1N1-A flu pandemic. “A number of associations in the model were really problematic,” says Lazer. “It was doomed to fail.”

Nor did it help that GFT was dependent on Google’s top-secret and always changing search algorithm. Google modifies its search algorithm to provide more accurate results, but also to increase advertising revenue. Recommended searches, based on what other users have searched, can throw off the results for flu trends. While GFT assumes that the relative search volume for different flu terms is based in reality—the more of us are sick, the more of us will search for info about flu as we sniffle above our keyboards—in fact Google itself alters search behavior through that ever-shifting algorithm. If the data isn’t reflecting the world, how can it predict what will happen?

GFT and other big data methods can be useful, but only if they’re paired with what the Science researchers call “small data”—traditional forms of information collection. Put the two together, and you can get an excellent model of the world as it actually is. Of course, if big data is really just one tool of many, not an all-purpose path to omniscience, that would puncture the hype just a bit. You won’t get a SXSW panel with that kind of modesty.

A bigger concern, though, is that much of the data being gathered in “big data”—and the formulas used to analyze it—is controlled by private companies that can be positively opaque. Google has never made the search terms used in GFT public, and there’s no way for researchers to replicate how GFT works. There’s Google Correlate, which allows anyone to find search patterns that purport to map real-life trends, but as the Science researchers wryly note: “Clicking the link titled ‘match the pattern of actual flu activity (this is how we built Google Flu Trends!)’ will not, ironically, produce a replication of the GFT search terms.” Even in the academic papers on GFT written by Google researchers, there’s no clear contact information, other than a generic Google email address. (Academic papers almost always contain direct contact information for lead authors.)

At its best, science is an open, cooperative and cumulative effort. If companies like Google keep their big data to themselves, they’ll miss out on the chance to improve their models, and make big data worthy of the hype. “To harness the research community, they need to be more transparent,” says Lazer. “The models for collaboration around big data haven’t been built.” It’s scary enough to think that private companies are gathering endless amounts of data on us. It’d be even worse if the conclusions they reach from that data aren’t even right.

On bureaucracies

The American economist William A. Niskanen considered the organisation of bureaucracies and proposed a budget maximising model now influential in public choice theory. It stated that rational bureaucrats will “always and everywhere seek to increase their budgets in order to increase their own power.”

An unfettered bureaucracy was predicted to grow to twice the size of a comparable firm that faces market discipline, incurring twice the cost.

http://theconversation.com/reform-australian-universities-by-cutting-their-bureaucracies-12781

Reform Australian universities by cutting their bureaucracies

Australian universities need to trim down their bureaucracies. (Image: Shutterstock)

Universities drive a knowledge economy, generate new ideas and teach people how to think critically. Anything other than strong investment in them will likely harm Australia.

But as Australian politicians are preparing to reform the university sector, there is an opportunity to take a closer look at the large and powerful university bureaucracy.

Adam Smith argued it would be preferable for students to directly pay academics for their tuition, rather than involve university bureaucrats. In earlier times, Oxford dons received all tuition revenue from their students and it’s been suggested that they paid between 15% and 20% for their rooms and administration. Subsequent central collection of tuition fees removed incentives for teachers to teach and led to the rise of the university bureaucracy.

Today, the bureaucracy is very large in Australian universities and only one third of university spending is allocated to academic salaries.

 

The money (in billions) spent by the top ten Australian research universities from 2003 to 2010 (taken from published financial statements). Source: Authors.

 

Across all the universities in Australia, the average proportion of full-time non-academic staff is 55%. This figure is relatively consistent over time and by university grouping (see graph below).

Australia is not alone: data for the United Kingdom show a similar staffing profile, with 48% of staff classed as academics. A recent analysis of US universities’ spending argues:

Boards of trustees and presidents need to put their collective foot down on the growth of support and administrative costs. Those costs have grown faster than the cost of instruction across most campuses. In no other industry would overhead costs be allowed to grow at this rate – executives would lose their jobs.

We know universities employ more non-academics than academics. But, of course, “non-academic” is a heterogeneous grouping. Many of those classified as “non-academic” directly produce academic outputs, but this rubs both ways with academics often required to produce bureaucratic outputs.

An explanation for this strange spending allocation is that academics desire a large bureaucracy to support their research efforts and for coping with external regulatory requirements such as the Excellence in Research for Australia (ERA) initiative, the Australian Qualifications Framework (AQF) and the Tertiary Education Quality and Standards Agency (TEQSA).

 

Staffing profile (% of total FTE classed as academic) of Australian universities, 2001-2010, overall and by university groupings/alliances. Source: Authors.

 

Another explanation is that university bureaucracies enjoy being big and engage in many non-academic transactions to perpetuate their large budget and influence.

The theory to support the latter view came from Cyril Northcote Parkinson, a naval historian who studied the workings of the British civil service. While not an economist, he had great insight into bureaucracy and suggested:

There need be little or no relationship between the work to be done and the size of the staff to which it may be assigned.

Parkinson’s Law rests on two ideas: an official wants to multiply subordinates, not rivals; and officials make work for each other. Inefficient bureaucracy is likely not restricted to universities but pervades government and non-government organisations that escape traditional market forces.

Using Admiralty statistics for the period between 1934 and 1955, Parkinson calculated a mean annual growth rate in spending on bureaucrats of 5.9%. The top ten Australian research universities reported mean annual growth in spending on non-academic salary costs of 8.8% between 2003 and 2010; after adjusting for inflation, the annual growth rate is 5.9%.
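As a back-of-envelope check of that inflation adjustment: converting the 8.8% nominal rate into a 5.9% real rate implies average inflation of roughly 2.7% a year. That inflation figure is an assumption used here for illustration, not a number from the article.

```python
# Back-of-envelope check of the growth figures quoted above.
nominal_growth = 0.088   # 8.8% mean annual growth in non-academic salary spending, 2003-2010
inflation = 0.027        # assumed average annual inflation (not stated in the article)

real_growth = (1 + nominal_growth) / (1 + inflation) - 1
print(f"Real annual growth: {real_growth:.1%}")   # ~5.9%, matching Parkinson's 1934-1955 figure
```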

The American economist William A. Niskanen considered the organisation of bureaucracies and proposed a budget maximising model now influential in public choice theory. It stated that rational bureaucrats will “always and everywhere seek to increase their budgets in order to increase their own power.”

An unfettered bureaucracy was predicted to grow to twice the size of a comparable firm that faces market discipline, incurring twice the cost. Some insight and anecdotal evidence to support this comes from a recent analysis of the paperwork required for doctoral students to progress from admission to graduation at an Australian university.

In that analysis, the two authors of this article (Clarke and Graves) found that 270 unique data items were requested on average 2.27 times across 13 different forms. This implies the bureaucracy was operating at more than twice the size it needs to be. The university we studied has since slimmed down the process.
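The duplication factor behind that “more than twice the size” claim is simple arithmetic; a small sketch using the figures quoted above:

```python
# Duplication implied by the doctoral-paperwork analysis (figures from the article).
unique_items = 270     # distinct data items requested across all forms
avg_requests = 2.27    # average number of times each item was requested
forms = 13             # number of forms between admission and graduation

total_fields_filled = unique_items * avg_requests          # ~613 fields across 13 forms
duplication_factor = total_fields_filled / unique_items    # 2.27x the minimum necessary
print(f"{total_fields_filled:.0f} fields for {unique_items} unique items "
      f"({duplication_factor:.2f}x the minimum)")
```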

Further costs from a large bureaucracy arise because academics are expected to participate in activities initiated by the bureaucracy. These tend to generate low or zero academic output. Some academics also adopt the behaviour of bureaucrats and stop or dramatically scale back their academic work.

The irony is that those in leadership positions, such as heads of departments, are most vulnerable, yet they must have been academically successful to achieve their position.

Evidence of this can be seen from the publication statistics of the professors who are heads of schools among nine of the top ten Australian research universities. Between 2006 and 2011, these senior academics published an average of 1.22 papers per year per person as first author.

This level of output would not be acceptable for an active health researcher at a professor, associate professor or even lecturer level.

The nine heads of school are likely tied up with administrative tasks, and hence their potential academic outputs are lost to signing forms, attending meetings and pushing bits of paper round their university.

If spending on the costs of employing non-academics could be reduced by 50% in line with a Niskanen level of over-supply, universities could employ additional academic staff. A further boost to productivity could be expected as old and new staff benefit from a decrease in the amount of time they must dedicate to bureaucratic transactions.

If all Australian universities adopted the staffing profile of the “Group of 8” institutions, which have the highest percentage of academics (at 51.6%), there would have been up to nearly 6,500 extra academics in 2010.
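A rough sketch of that counterfactual follows. The sector-wide staff total below is a placeholder assumption; the article quotes only the staffing percentages and the “nearly 6,500” outcome.

```python
# Rough counterfactual sketch; total_fte is a placeholder assumption, not from the article.
total_fte = 100_000             # hypothetical sector-wide FTE staff in 2010
current_academic_share = 0.45   # roughly 100% minus the 55% non-academic average quoted above
go8_academic_share = 0.516      # Group of 8 profile quoted above

extra_academics = total_fte * (go8_academic_share - current_academic_share)
print(f"Extra academic FTE under the Go8 profile: {extra_academics:,.0f}")  # ~6,600 on these assumptions
```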

While no economist would question the need for some administration, there needs to be a focus on incentives to ensure efficient operation. It’s possible to run a tight ship in academic research as shown by Alan Trounson, president of the California Institute for Regenerative Medicine (CIRM).

In 2009, Trounson pledged to spend less than 6% of revenues on administration costs, a figure that is better than most firms competing in markets. So far, this commitment has been met.

It’s clear then that finding solutions to problems in modern Australian universities calls for a better understanding of economics and a reduction in bureaucracy.

The Hammerbacher Quote

“The best minds of my generation are thinking about how to make people click ads… That sucks.”

 

http://www.businessweek.com/magazine/content/11_17/b4225060960537.htm

This Tech Bubble Is Different


As a 23-year-old math genius one year out of Harvard, Jeff Hammerbacher arrived at Facebook when the company was still in its infancy. This was in April 2006, and Mark Zuckerberg gave Hammerbacher—one of Facebook’s first 100 employees—the lofty title of research scientist and put him to work analyzing how people used the social networking service. Specifically, he was given the assignment of uncovering why Facebook took off at some universities and flopped at others. The company also wanted to track differences in behavior between high-school-age kids and older, drunker college students. “I was there to answer these high-level questions, and they really didn’t have any tools to do that yet,” he says.

Over the next two years, Hammerbacher assembled a team to build a new class of analytical technology. His crew gathered huge volumes of data, pored over it, and learned much about people’s relationships, tendencies, and desires. Facebook has since turned these insights into precision advertising, the foundation of its business. It offers companies access to a captive pool of people who have effectively volunteered to have their actions monitored like so many lab rats. The hope—as signified by Facebook’s value, now at $65 billion according to research firm Nyppex—is that more data translate into better ads and higher sales.

After a couple years at Facebook, Hammerbacher grew restless. He figured that much of the groundbreaking computer science had been done. Something else gnawed at him. Hammerbacher looked around Silicon Valley at companies like his own, Google (GOOG), and Twitter, and saw his peers wasting their talents. “The best minds of my generation are thinking about how to make people click ads,” he says. “That sucks.”

You might say Hammerbacher is a conscientious objector to the ad-based business model and marketing-driven culture that now permeates tech. Online ads have been around since the dawn of the Web, but only in recent years have they become the rapturous life dream of Silicon Valley. Arriving on the heels of Facebook have been blockbusters such as the game maker Zynga and coupon peddler Groupon. These companies have engaged in a frenetic, costly war to hire the best executives and engineers they can find. Investors have joined in, throwing money at the Web stars and sending valuations into the stratosphere. Inevitably, copycats have arrived, and investors are pushing and shoving to get in early on that action, too. Once again, 11 years after the dot-com-era peak of the Nasdaq, Silicon Valley is reaching the saturation point with business plans that hinge on crossed fingers as much as anything else. “We are certainly in another bubble,” says Matthew Cowan, co-founder of the tech investment firm Bridgescale Partners. “And it’s being driven by social media and consumer-oriented applications.”

There’s always someone out there crying bubble, it seems; the trick is figuring out when it’s easy money—and when it’s a shell game. Some bubbles actually do some good, even if they don’t end happily. In the 1980s, the rise of Microsoft (MSFT), Compaq (HPQ), and Intel (INTC) pushed personal computers into millions of businesses and homes—and the stocks of those companies soared. Tech stumbled in the late 1980s, and the Valley was left with lots of cheap microprocessors and theories on what to do with them. The dot-com boom was built on infatuation with anything Web-related. Then the correction began in early 2000, eventually vaporizing about $6 trillion in shareholder value. But that cycle, too, left behind an Internet infrastructure that has come to benefit businesses and consumers.

 

This time, the hype centers on more precise ways to sell. At Zynga, they’re mastering the art of coaxing game players to take surveys and snatch up credit-card deals. Elsewhere, engineers burn the midnight oil making sure that a shoe ad follows a consumer from Web site to Web site until the person finally cracks and buys some new kicks.

This latest craze reflects a natural evolution. A focus on what economists call general-purpose technology—steam power, the Internet router—has given way to interest in consumer products such as iPhones and streaming movies. “Any generation of smart people will be drawn to where the money is, and right now it’s the ad generation,” says Steve Perlman, a Silicon Valley entrepreneur who once sold WebTV to Microsoft for $425 million and is now running OnLive, an online video game service. “There is a goodness to it in that people are building on the underpinnings laid by other people.”

So if this tech bubble is about getting shoppers to buy, what’s left if and when it pops? Perlman grows agitated when asked that question. Hands waving and voice rising, he says that venture capitalists have become consumed with finding overnight sensations. They’ve pulled away from funding risky projects that create more of those general-purpose technologies—inventions that lay the foundation for more invention. “Facebook is not the kind of technology that will stop us from having dropped cell phone calls, and neither is Groupon or any of these advertising things,” he says. “We need them. O.K., great. But they are building on top of old technology, and at some point you exhaust the fuel of the underpinnings.”

And if that fuel of innovation is exhausted? “My fear is that Silicon Valley has become more like Hollywood,” says Glenn Kelman, chief executive officer of online real estate brokerage Redfin, who has been a software executive for 20 years. “An entertainment-oriented, hit-driven business that doesn’t fundamentally increase American competitiveness.”

Hammerbacher quit Facebook in 2008, took some time off, and then co-founded Cloudera, a data-analysis software startup. He’s 28 now and speaks with the classic Silicon Valley blend of preternatural self-assurance and save-the-worldism, especially when he gets going on tech’s hottest properties. “If instead of pointing their incredible infrastructure at making people click on ads,” he likes to ask, “they pointed it at great unsolved problems in science, how would the world be different today?” And yet, other than the fact that he bailed from a sweet, pre-IPO gig at the hottest ad-driven tech company of them all, Hammerbacher typifies the new breed of Silicon Valley advertising whiz kid. He’s not really a programmer or an engineer; he’s mostly just really, really good at math.

Hammerbacher grew up in Indiana and Michigan, the son of a General Motors (GM) assembly-line worker. As a teenager, he perfected his curve ball to the point that college scouts from the University of Michigan and Harvard fought for his services. “I was either going to be a baseball player, a poet, or a mathematician,” he says. Hammerbacher went with math and Harvard. Unlike one of his more prominent Harvard acquaintances—Facebook co-founder Mark Zuckerberg—Hammerbacher graduated. He took a job at Bear Stearns.

On Wall Street, the math geeks are known as quants. They’re the ones who create sophisticated trading algorithms that can ingest vast amounts of market data and then form buy and sell decisions in milliseconds. Hammerbacher was a quant. After about 10 months, he got back in touch with Zuckerberg, who offered him the Facebook job in California. That’s when Hammerbacher redirected his quant proclivities toward consumer technology. He became, as it were, a Want.

 

At social networking companies, Wants may sit among the computer scientists and engineers, but theirs is the central mission: to poke around in data, hunt for trends, and figure out formulas that will put the right ad in front of the right person. Wants gauge the personality types of customers, measure their desire for certain products, and discern what will motivate people to act on ads. “The most coveted employee in Silicon Valley today is not a software engineer. It is a mathematician,” says Kelman, the Redfin CEO. “The mathematicians are trying to tickle your fancy long enough to see one more ad.”

Sometimes the objective is simply to turn people on. Zynga, the maker of popular Facebook games such as CityVille and FarmVille, collects 60 billion data points per day—how long people play games, when they play them, what they’re buying, and so forth. The Wants (Zynga’s term is “data ninjas”) troll this information to figure out which people like to visit their friends’ farms and cities, the most popular items people buy, and how often people send notes to their friends. Discovery: People enjoy the games more if they receive gifts from their friends, such as the virtual wood and nails needed to build a digital barn. As for the poor folks without many friends who aren’t having as much fun, the Wants came up with a solution. “We made it easier for those players to find the parts elsewhere in the game, so they relied less on receiving the items as gifts,” says Ken Rudin, Zynga’s vice-president for analytics.

These consumer-targeting operations look a lot like what quants do on Wall Street. A Want system, for example, might watch what someone searches for on Google, what they write about in Gmail, and the websites they visit. “You get all this data and then build very rapid decision-making models based on their history and commercial intent,” says Will Price, CEO of Flite, an online ad service. “You have to make all of those calculations before the Web page loads.”
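Here is a toy illustration of that kind of pre-page-load decisioning: score each candidate ad against what is known about the user and show the highest scorer. The features, weights, and ads are invented; this is not Flite’s or anyone else’s real system.

```python
# Toy ad-selection sketch; features, weights, and ads are invented for illustration.
def score(ad, user_features, weights):
    """Linear score of how likely this user is to act on this ad."""
    return sum(weights[ad].get(feature, 0.0) * value
               for feature, value in user_features.items())

weights = {
    "running_shoes": {"searched_shoes": 2.0, "visited_sports_sites": 1.2},
    "credit_card":   {"searched_finance": 1.8, "income_bracket": 0.5},
}
user = {"searched_shoes": 1.0, "visited_sports_sites": 1.0, "income_bracket": 0.3}

best_ad = max(weights, key=lambda ad: score(ad, user, weights))
print(best_ad)   # "running_shoes" for this browsing history
```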

Ultimately, ad-tech companies are giving consumers what they desire and, in many cases, providing valuable services. Google delivers free access to much of the world’s information along with free maps, office software, and smartphone software. It also takes profits from ads and directs them toward tough engineering projects like building cars that can drive themselves and sending robots to the moon. The Era of Ads also gives the Wants something they yearn for: a ticket out of Nerdsville. “It lets people that are left-brain leaning expand their career opportunities,” says Doug Mack, CEO of One Kings Lane, a daily deal site that specializes in designer goods. “People that might have been in engineering can go into marketing, business development, and even sales. They can get on the leadership track.” And while the Wants plumb the depths of the consumer mind and advance their own careers, investors are getting something too, at least on paper: almost unimaginable valuations. Just since the fourth quarter, Zynga has risen 81 percent in value, to a cool $8 billion, according to Nyppex.

No one is suggesting that the top tier of ad-centric companies—Facebook, Google—is going down should the bubble pop. As for the next tier or two down, where a profusion of startups is piling into every possible niche involving social networking and ads—the fate of those companies is anybody’s guess. Among the many unveilings in March, one stood out: An app called Color, made by a seven-month-old startup of the same name. Color lets people take and store their pictures. More than that, it uses geolocation and ambient-noise-matching technology to figure out where a person is and then automatically shares his photos with other nearby people and vice versa. People at a concert, for example, could see photos taken by all the other people at that concert. The same goes for birthday parties, sporting events, or a night out at a bar. The app also shares photos among your friends in the Color social network, so you can see how Jane is spending her vacation or what John ate for breakfast, if he bothered to take a photo of it.

 

Whether Color ends up as a profitable app remains to be seen. The company has yet to settle on a business model, although its executives say it’ll probably incorporate some form of local advertising. Figuring out all those location-based news feeds on the fly requires serious computational power, and that part of the business is headed by Color’s math wizard and chief product officer, DJ Patil.

Patil’s Silicon Valley pedigree is impeccable. His father, Suhas Patil, emigrated from India and founded the chip company Cirrus Logic (CRUS). DJ struggled in high school, did some time at a junior college, and through force of will decided to get good at math. He made it into the University of California at San Diego, where he took every math course he could. He became a theoretical math guru and went on to research weather patterns, the collapse of sardine populations, the formation of sand dunes, and, during a stint for the Defense Dept., the detection of biological weapons in Central Asia. “All of these things were about how to use science and math to achieve these broader means,” Patil says. Eventually, Silicon Valley lured him back. He went to work for eBay (EBAY), creating an antifraud system for the retail site. “I took ideas from the bioweapons threat anticipation project,” he says. “It’s all about looking at a network and your social interactions to find out if you’re good or bad.”

Patil, 36, agonized about his jump away from the one true path of Silicon Valley righteousness, doing gritty research worthy of his father’s generation. “There is a time in life where that kind of work is easy to do and a time when it’s hard to do,” he says. “With a kid and a family, it was getting hard.”

Having gone through a similar self-inquiry, Hammerbacher doesn’t begrudge talented technologists like Patil for plying their trade in the glitzy land of networked photo sharing. The two are friends, in fact; they’ve gotten together to talk about data and the challenges in parsing vast quantities of it. At social networking companies, Hammerbacher says, “there are some people that just really buy the mission—connecting people. I don’t think there is anything wrong with those people. But it just didn’t resonate with me.”

After quitting Facebook in 2008, Hammerbacher surveyed the science and business landscape and saw that all types of organizations were running into similar problems faced by consumer Web companies. They were producing unprecedented amounts of information—DNA sequences, seismic data for energy companies, sales information—and struggling to find ways to pull insights out of the data. Hammerbacher and his fellow Cloudera founders figured they could redirect the analytical tools created by Web companies to a new pursuit, namely bringing researchers and businesses into the modern age.

Cloudera is essentially trying to build a type of operating system, à la Windows, for examining huge stockpiles of information. Where Windows manages the basic functions of a PC and its software, Cloudera’s technology helps companies break data into digestible chunks that can be spread across relatively cheap computers. Customers can then pose rapid-fire questions and receive answers. But instead of asking what a group of friends “like” the most on Facebook, the customers ask questions such as, “What gene do all these cancer patients share?”
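A minimal sketch, in a map/reduce spirit, of that “what gene do all these cancer patients share?” style of question over data split into chunks. This is not Cloudera’s API; it only illustrates computing partial answers per chunk and combining them.

```python
# Minimal sketch: intersect gene sets within each chunk, then across chunks.
from functools import reduce

# Each chunk could live on a different cheap machine; here they are just dictionaries.
chunk_1 = {"patient_a": {"TP53", "KRAS", "BRCA1"}, "patient_b": {"TP53", "EGFR"}}
chunk_2 = {"patient_c": {"TP53", "KRAS"}, "patient_d": {"TP53", "BRCA2"}}

def shared_in_chunk(chunk):
    """'Map' step: genes common to every patient in one chunk."""
    return reduce(set.intersection, chunk.values())

# 'Reduce' step: combine the partial answers from every chunk.
partials = [shared_in_chunk(c) for c in (chunk_1, chunk_2)]
shared_by_all = reduce(set.intersection, partials)
print(shared_by_all)   # {'TP53'}
```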

Eric Schadt, the chief scientific officer at Pacific Biosciences, a maker of genome sequencing machines, says new-drug discovery and cancer cures depend on analytical tools. Companies using Pacific Bio’s machines will produce mountains of information every day as they sequence more and more people. Their goal: to map the complex interactions among genes, organs, and other body systems and raise questions about how the interactions result in certain illnesses—and cures. The scientists have struggled to build the analytical tools needed to perform this work and are looking to Silicon Valley for help. “It won’t be old school biologists that drive the next leaps in pharma,” says Schadt. “It will be guys like Jeff who understand what to do with big data.”

Even if Cloudera doesn’t find a cure for cancer, rid Silicon Valley of ad-think, and persuade a generation of brainiacs to embrace the adventure that is business software, Price argues, the tech industry will have the same entrepreneurial fervor of yesteryear. “You can make a lot of jokes about Zynga and playing FarmVille, but they are generating billions of dollars,” the Flite CEO says. “The greatest thing about the Valley is that people come and work in these super-intense, high-pressure environments and see what it takes to create a business and take risk.” A parade of employees has left Google and Facebook to start their own companies, dabbling in everything from more ad systems to robotics and publishing. “It’s almost a perpetual-motion machine,” Price says.

Perpetual-motion machines sound great until you remember that they don’t exist. So far, the Wants have failed to carry the rest of the industry toward higher ground. “It’s clear that the new industry that is building around Internet advertising and these other services doesn’t create that many jobs,” says Christophe Lécuyer, a historian who has written numerous books about Silicon Valley’s economic history. “The loss of manufacturing and design knowhow is truly worrisome.”

Dial back the clock 25 years to an earlier tech boom. In 1986, Microsoft, Oracle (ORCL), and Sun Microsystems went public. Compaq went from launch to the Fortune 500 in four years—the quickest run in history. Each of those companies has waxed and waned, yet all helped build technology that begat other technologies. And now? Groupon, which e-mails coupons to people, may be the fastest-growing company of all time. Its revenue could hit $4 billion this year, up from $750 million last year, and the startup has reached a valuation of $25 billion. Its technological legacy is cute e-mail.

There have always been foundational technologies and flashier derivatives built atop them. Sometimes one cycle’s glamour company becomes the next one’s hard-core technology company; witness Amazon.com’s (AMZN) transformation over the past decade from mere e-commerce powerhouse to e-commerce powerhouse and purveyor of cloud-computing capabilities to other companies. Has the pendulum swung too far? “It’s a safe bet that sometime in the next 20 months, the capital markets will close, the music will stop, and the world will look bleak again,” says Bridgescale Partners’ Cowan. “The legitimate concern here is that we are not diversifying, so that we have roots to fall back on when we enter a different part of the cycle.”

Vance is a technology writer for Bloomberg Businessweek in Palo Alto, Calif. Follow him on Twitter @valleyhack.

McKinsey – from the front lines of implementing analytics

Key issues highlighted:

  • Hype – need to manage internally
  • Privacy – the flip side is improved health outcomes. The remedy is to provide consumers with more control of their data, and building trust is the way forward. Opt-in. Company behaviour. Reinforce benefits.
  • Talent – short supply of analytics and IT professionals. Also short on “translators” – people whose talents bridge the disciplines of IT and data, analytics, and business decision making. These translators can drive the design and execution of the overall data-analytics strategy while linking IT, analytics, and business-unit teams. Without such employees, the impact of new data strategies, tools, and methodologies, no matter how advanced, is disappointing.
  • Centre of Excellence –  To catalyze analytics efforts, nearly every company was using a center of excellence, which works with businesses to develop and deploy analytics rapidly.
  • Adoption – Automation & Training

 

  •  combine proprietary data with open data sources to boost richness, improve models and business outcomes
  • Establishing priorities wisely and with a realistic sense of the associated challenges lies at the heart of a successful data-analytics strategy.
  • Start with a portfolio/ensemble pilot effort, with clear rules for making go/no-go decisions on the shift from exploratory work to production

PDF: McKinsey_ViewsFromTheFrontlinesOfTheDataAnalyticsRevolution

http://www.mckinsey.com/insights/business_technology/views_from_the_front_lines_of_the_data_analytics_revolution

Views from the front lines of the data-analytics revolution

At a unique gathering of data-analytics leaders, new solutions began emerging to vexing privacy, talent, organizational, and frontline-adoption challenges.

March 2014 | by Brad Brown, David Court, and Tim McGuire

This past October, eight executives from companies that are leaders in data analytics got together to share perspectives on their biggest challenges. All were the most senior executives with data-analytics responsibility in their companies, which included AIG, American Express, Samsung Mobile, Siemens Healthcare, TD Bank, and Wal-Mart Stores. Their backgrounds varied, with chief information officers, a chief data officer, a chief marketing officer, a chief risk officer, and a chief science officer all represented. We had seeded the discussion by asking each of them in advance about the burning issues they were facing.

For these executives, the top five questions were:

  • Are data and analytics overhyped?
  • Do privacy issues threaten progress?
  • Is talent acquisition slowing strategy?
  • What organizational models work best?
  • What’s the best way to assure adoption?

Here is a synthesis of the discussion.

1. Data and analytics aren’t overhyped—but they’re oversimplified

Participants all agreed that the expectations of senior management are a real issue. Big-data analytics are delivering an economic impact in the organization, but too often senior leaders’ hopes for benefits are divorced from the realities of frontline application. That leaves them ill prepared for the challenges that inevitably arise and quickly breed skepticism.

The focus on applications helps companies to move away from “the helicopter view,” noted one participant, in which “it all looks the same.” The reality of where and how data analytics can improve performance varies dramatically by company and industry.

Customer-facing activities. In some industries, such as telecommunications, this is where the greatest opportunities lie. Here, companies benefit most when they focus on analytics models that optimize pricing of services across consumer life cycles, maximize marketing spending by predicting areas where product promotions will be most effective, and identify tactics for customer retention.

Internal applications. In other industries, such as transportation services, models will focus on process efficiencies—optimizing routes, for example, or scheduling crews given variations in worker availability and demand.

Hybrid applications. Other industries need a balance of both. Retailers, for example, can harness data to influence next-product-to-buy decisions and to optimize location choices for new stores or to map product flows through supply chains. Insurers, similarly, want to predict features that will help them extend product lines and assess emerging areas of portfolio risk. Establishing priorities wisely and with a realistic sense of the associated challenges lies at the heart of a successful data-analytics strategy.

Companies need to operate along two horizons: capturing quick wins to build momentum while keeping sight of longer-term, ground-breaking applications. Although, as one executive noted, “We carefully measure our near-term impact and generate internal ‘buzz’ around these results,” there was also a strong belief in the room that the journey crosses several horizons. “We are just seeing the tip of the iceberg,” said one participant. Many believed that the real prize lies in reimagining existing businesses or launching entirely new ones based on the data companies possess.

New opportunities will continue to open up. For example, there was a growing awareness, among participants, of the potential of tapping swelling reservoirs of external data—sometimes known as open data—and combining them with existing proprietary data to improve models and business outcomes. (See “What executives should know about open data.”) Hedge funds have been among the first to exploit a flood of newly accessible government data, correlating that information with stock-price movements to spot short-term investment opportunities. Corporations with longer investment time horizons will need a different playbook for open data, but few participants doubted the value of developing one.
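At its simplest, combining proprietary and open data is a join on a shared key: enriching internal records with published figures so models have more features to work with. The field names and values below are invented for illustration.

```python
# Sketch of enriching proprietary records with an open data set via a shared key.
proprietary_sales = [
    {"store_id": "S1", "postcode": "3000", "weekly_sales": 120_000},
    {"store_id": "S2", "postcode": "2000", "weekly_sales": 95_000},
]
open_demographics = {   # e.g. published census figures keyed by postcode (hypothetical values)
    "3000": {"median_income": 72_000, "population": 47_000},
    "2000": {"median_income": 81_000, "population": 39_000},
}

enriched = [
    {**row, **open_demographics.get(row["postcode"], {})}
    for row in proprietary_sales
]
print(enriched[0])   # the sales record now carries income and population features
```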

2. Privacy concerns must be addressed—and giving consumers control can help

Privacy has become the third rail in the public discussion of big data, as media accounts have rightly pointed out excesses in some data-gathering methods. Little wonder that consumer wariness has risen. (Data concerns seem smaller in the business-to-business realm.) The flip side is that data analytics increasingly provides consumers, not to mention companies and governments, with a raft of benefits, such as improved health-care outcomes, new products precisely reflecting consumer preferences, or more useful and meaningful digital experiences resulting from a greater ability to customize information. These benefits, by necessity, rest upon the collection, storage, and analysis of large, granular data sets that describe real people.

Our analytics leaders were unanimous in their view that placing more control of information in the hands of consumers, along with building their trust, is the right path forward.

Opt-in models. A first step is allowing consumers to opt in or opt out of the collection, sharing, and use of their data. As one example, data aggregator Acxiom recently launched a website (aboutthedata.com) that allows consumers to review, edit, and limit the distribution of the data the company has collected about them. Consumers, for instance, may choose to limit the sharing of their data for use in targeted Internet ads. They control the trade-off between targeted (but less private) ads and nontargeted ones (potentially offering less value).

Company behavior. Our panelists presume that in the data-collection arena, the motives of companies are good and organizations will act responsibly. But they must earn this trust continually; recovering from a single privacy breach or misjudgment could take years. Installing internal practices that reinforce good data stewardship, while also communicating the benefits of data analytics to customers, is of paramount importance. In the words of one participant: “Consumers will trust companies that are true to their value proposition. If we focus on delivering that, consumers will be delighted. If we stray, we’re in problem territory.”

3. Talent challenges are stimulating innovative approaches—but more is needed

Talent is a hot issue for everyone. It extends far beyond the notoriously short supply of IT and analytics professionals. Even companies that are starting to crack the skill problem through creative recruiting and compensation strategies are finding themselves shorthanded in another area: they need more “translators”—people whose talents bridge the disciplines of IT and data, analytics, and business decision making. These translators can drive the design and execution of the overall data-analytics strategy while linking IT, analytics, and business-unit teams. Without such employees, the impact of new data strategies, tools, and methodologies, no matter how advanced, is disappointing.

The amalgam is rare, however. In a more likely talent scenario, companies find individuals who combine two of the three needed skills. The data strategists’ combination of IT knowledge and experience making business decisions makes them well suited to define the data requirements for high-value business analytics. Data scientists combine deep analytics expertise with IT know-how to develop sophisticated models and algorithms. Analytic consultants combine practical business knowledge with analytics experience to zero in on high-impact opportunities for analytics.

A widespread observation among participants was that the usual sources of talent—elite universities and MBA programs—are falling short. Few are developing the courses needed to turn out people with these combinations of skills. To compensate, and to get more individuals grounded in business and quantitative skills, some companies are luring data scientists from leading Internet companies; others are looking offshore.

The management and retention of these special individuals requires changes in mind-set and culture. Job one: provide space and freedom to stimulate exploration of new approaches and insights. “At times, you may not know exactly what they”—data scientists— “will find,” one executive noted in describing the company’s efforts to provide more latitude for innovation. (So far, these efforts are boosting retention rates.) Another priority: create a vibrant environment so top talent feels it’s at the cutting edge of technology change and emerging best practices. Stimulating engagement with the data-analytics ecosystem (including venture capitalists, analytics start-ups, and established analytics vendors) can help.

4. You need a center of excellence—and it needs to evolve

To catalyze analytics efforts, nearly every company was using a center of excellence, which works with businesses to develop and deploy analytics rapidly. Most often, it includes data scientists, business specialists, and tool developers. Companies are establishing these centers in part because business leaders need the help. Centers of excellence also boost the organization-wide impact of the scarce translator talent described above. They can even help attract and retain talent: at their best, centers are hotbeds of learning and innovation as teams share ideas on how to construct robust data sets, build powerful models, and translate them into valuable business tools.

Our participants agreed that it’s worth creating a center of excellence only if you can locate it in a part of the company where data-analytics assets or capabilities could have a dramatic strategic impact. For some companies, this meant IT; for others, marketing and sales or large business units. At one company, for instance, the analytics agenda is focused on exploiting a massive set of core transactional data across several businesses and functions. In this case, the center of excellence resides within IT to leverage its deep knowledge of this core data set and its role as a shared capability across businesses.

The goal should be for these centers to be so successful at building data-analytics capabilities across the organization that they can tackle increasingly ambitious priorities. One executive suggests that as businesses build their analytics muscle, centers of excellence will increasingly focus on longer-term projects more akin to sophisticated R&D, with an emphasis on analytics innovation and breakthrough insights.

5. Two paths to spur adoption—and both require investment

Frontline adoption was the most important issue for many leaders. Getting managers and individual contributors to use new tools purposefully and enthusiastically is a huge challenge. As we have written elsewhere, companies simply don’t invest enough, in time or money, to develop killer applications that combine smart, intuitive design and robust functionality. However, our participants see two clear paths leading to broad adoption.

Automation. One avenue to spurring adoption works for relatively simple, repetitive analytics: creating intuitive end-user interfaces that can be rolled out rapidly and with little training. For example, a mobile application on a smartphone or tablet might give brand managers instant visibility into volume and sales trends, market share, and average prices. These tools become part of the daily flow of decision making, helping managers to figure out how intensely to promote products, when tactical shifts in pricing may be necessary to match competitors, or, over time, where to begin pushing for new products. According to one executive, “Little or no training is required” with simple tools like these. Provided they are “clear and well designed, with strong visualization qualities, end users will seek them out.”

Training. A second path requires significant investments in training to support more complex analytics. Consider a tool for underwriting small and midsize business loans. The tool combines underwriters’ knowledge and the power of models, which bring consistency across underwriting judgments, clarifying risks and minimizing biases. But underwriters need training to understand where the model fits into the underwriting process flow and how they can incorporate what the models and tools say into their own experience of customer characteristics and their business priorities.

Whichever path is chosen, it should start with pilot efforts and clear rules for making “go/no-go” decisions about the shift from exploratory analytics to a full-scale rollout. Some models don’t end up being predictive enough to deliver the desired impact; better to shelve them before they become investment sinkholes and undermine organizational confidence in analytics. Executives need to be willing to press “pause” and remind the organization that the failure of some analytics initiatives to materialize is nothing to worry about; in fact, this is the reason for pursuing a portfolio of them. The combination of success stories and hard-nosed decisions to pull the plug is all part of creating a climate where business units, functions, top management, and frontline employees embrace the transformational possibilities of data analytics.

About the authors

Brad Brown is a director in McKinsey’s New York office, David Court is a director in the Dallas office, and Tim McGuire is a director in the Toronto office.

The authors would like to acknowledge the contributions of Brian Tauke and Isaac Townsend to the development of this article.

“There is no freedom in addiction”

Michael Bloomberg was laughed at for suggesting that New York City businesses limit soda serving sizes. It was never a perfect plan, but his public shaming shows how closely we equate food with ‘freedom.’ The problem is, there is no freedom in addiction. As the Nature Neuroscience study showed above, rats and humans alike will overeat (or eat less healthy food options) even if they know better.

Hence the magic bullet at the center of McDonald’s letter: a precise combination of fat, sugar and salt that keeps us craving more. As NY Times reporter and author of Salt Sugar Fat: How the Food Giants Hooked Us Michael Moss said in an interview:

These are the pillars of processed foods, the three ingredients without which there would be no processed foods. Salt, sugar and fat drive consumption by adding flavor and allure. But surprisingly, they also mask bitter flavors that develop in the manufacturing process. They enable these foods to sit in warehouses or on the grocery shelf for months. And, most critically to the industry’s financial success, they are very inexpensive.

PN: The fallacy at the heart of this discussion is that cigarettes are not that much more harmful than a Big Mac. I’m just as likely to die from smoking a single cigarette in front of you as I am from eating a Big Mac in front of you. The problems arise when you smoke/eat these products every day of your life.

http://bigthink.com/21st-century-spirituality/should-big-food-pay-for-our-rising-obesity-costs

Should Big Food Pay For Our Rising Obesity Costs?

FEBRUARY 25, 2014, 4:29 PM

Paul McDonald didn’t expect his letter to go public. The Valorem Law Group partner had queried sixteen states, asking leaders to consider investigating Big Food’s potential role in paying for a percentage of the health system’s skyrocketing obesity costs. The Chamber of Commerce got wind of this letter and made it public, setting off a national debate over food marketing, ingredient manipulation and personal responsibility.

McDonald’s premise is simple enough: if large food companies are purposefully creating addictive foods to ensure consumer loyalty, adding to the rising obesity levels in this country, they should be responsible for covering costs associated with treatment. The backlash was immediate and biting.

Comparisons to the Big Tobacco companies came first to mind. In the 1998 Tobacco Masters Settlement Agreement, major players in the tobacco industry agreed to pay $246 billion to offset health risks and diseases associated with its product. Critics of McDonald’s idea believe there is no link between tobacco and food.


On the face of it, this would appear true: you don’t need to smoke, but eating is a necessity. Smoking is a choice, and therefore if you choose to smoke, you pay the consequences. Eating falls into an entirely different category.

Yet the neural mechanisms might be similar. A 2010 study in Nature Neuroscience found that rats consumed well past their limits when offered high-calorie foods such as bacon, sausage and cake, speculating that humans, when faced with an equivalent scenario, also choose to overeat.

Harvard University Professor of Medicine Emeritus David Blumenthal’s study, Neurobiology of Food Addiction, found a similar link between food and drug abuse. In the summary he writes:

Work presented in this review strongly supports the notion that food addiction is a real phenomenon…although food and drugs of abuse act on the same central networks, food consumption is also regulated by peripheral signaling systems, which adds to the complexity of understanding how the body regulates eating, and of treating pathological eating habits.

The argument against food addiction is a tough one, waged by industry insiders who want to keep 60,000 products on American shelves. The real question, however, is: are food companies purposefully producing addictive foods that change our neurobiology? If so, should they be held economically accountable?

American obesity costs are currently $147 billion per year. The CDC estimates that 35.7% of adults and 17% of children ages 2-19 are obese—a number that has risen dramatically over the last two decades. A joint report between Trust for America’s Health and the Robert Wood Johnson Foundation estimates that 44% of American adults will be obese by 2030. The report predicts that will add between $48-66 billion to our costs, some of which is paid for by taxpayers.

Yet food is such an emotional topic. For example, when I inform someone that I’m vegan, they immediately let me know why they could never do such a thing (I didn’t ask) or that it’s ‘wrong’ for them, and sometimes by extension, me (last week’s annual blood work shows me in perfect shape).

Michael Bloomberg was laughed at for suggesting that New York City businesses limit soda serving sizes. It was never a perfect plan, but his public shaming shows how closely we equate food with ‘freedom.’ The problem is, there is no freedom in addiction. As the Nature Neuroscience study showed above, rats and humans alike will overeat (or eat less healthy food options) even if they know better.

Hence the magic bullet at the center of McDonald’s letter: a precise combination of fat, sugar and salt that keeps us craving more. As NY Times reporter and author of Salt Sugar Fat: How the Food Giants Hooked Us Michael Moss said in an interview:

These are the pillars of processed foods, the three ingredients without which there would be no processed foods. Salt, sugar and fat drive consumption by adding flavor and allure. But surprisingly, they also mask bitter flavors that develop in the manufacturing process. They enable these foods to sit in warehouses or on the grocery shelf for months. And, most critically to the industry’s financial success, they are very inexpensive.

Inexpensive to companies, not to consumers. Paul McDonald is striking an important nerve in how we manufacture, distribute and consume food in our country. There will be a lot of resistance and debate from both industry and citizens. But if we don’t begin this conversation now, our national and mental health is only going to continue to decline.


The business case for value-based care

  •  value-based payments will come into the US in the next 5-10 years
  • payments will be based on conditions, not treatments
  • e.g. current c-section rates are highly variable, due to the way fees are paid, not their actual value

 

http://www.healthleadersmedia.com/print/COM-301451/Building-the-Business-Case-for-ValueBased-Care

Building the Business Case for Value-Based Care

John Commins, for HealthLeaders Media , February 26, 2014

 

Harold D. Miller, president and CEO of the Center for Healthcare Quality and Payment Reform, discusses a fundamental barrier to shifting payment models in healthcare: Some providers mistakenly think all they have to do is tweak existing fee-for-service billing structures without understanding what drives costs in the underlying payment system.

Harold D. Miller, President and CEO, Center for Healthcare Quality and Payment Reform

The shift away from volume-based, fee-for-service billing towards value-based reimbursements is gaining momentum and will be largely in place over the next few years. And yet a surprising number of healthcare providers really don’t grasp the details of how value-based reimbursements work.

Harold D. Miller, president and CEO of the non-profit Center for Healthcare Quality and Payment Reform, says many providers mistakenly believe that all they have to do is tweak existing fee-for-service billing structures without identifying potential savings or understanding what drives costs in the underlying payment system.

Miller, the author of a Robert Wood Johnson Foundation-funded report called Making the Business Case for Payment and Delivery Reform, spoke with me this week about what providers must do to build an effective business case for value-based care. The following is an edited transcript.

HLM: Where are we on the fee-for-service/value-based care timeline?

Miller: It could be the dominant model within the next five to 10 years, but it is a matter of how quickly physicians, and in particular physicians in hospitals, meet with the purchasers of care (the employers) to work that out. It's about how soon both sides come together and create the win-win-win that is good for patients, providers, and purchasers.

HLM: What are the stumbling blocks on the road to value-based care?

Miller: Most health plans and Medicare are trying to change the way care is delivered and reduce costs by piling on pay for performance and shared savings on top of fee-for-service. The problem is that if you don’t change the underlying payment system, you don’t change the incentives and the barriers that it creates.

For example, one of the best ways to keep people with chronic disease healthier and out of the hospital is for a physician practice to hire a nurse to educate and encourage patients to call when they have a problem. The problem is that doctors don’t get paid for nurses and they don’t get paid for answering phone calls. So practices are forced to lose money under fee-for-service to deliver better care, even though it would actually save money by keeping the patients out of the hospital.
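
Miller's nurse example is, at bottom, a small arithmetic problem: the practice bears the nurse's cost while the savings from avoided admissions accrue to the payer. A minimal sketch of that mismatch, with all dollar figures assumed for illustration rather than taken from the interview:

```python
# Illustrative only: the dollar figures below are assumptions, not data from the interview.
NURSE_SALARY = 70_000          # annual cost of the nurse to the practice (assumed)
ADMISSIONS_AVOIDED = 15        # hospitalizations avoided per year (assumed)
COST_PER_ADMISSION = 12_000    # average payer cost of one admission (assumed)
FFS_PAYMENT_FOR_NURSE_WORK = 0 # fee-for-service pays nothing for nurse calls

practice_loss = NURSE_SALARY - FFS_PAYMENT_FOR_NURSE_WORK
system_savings = ADMISSIONS_AVOIDED * COST_PER_ADMISSION - NURSE_SALARY

print(f"Practice loses under fee-for-service: ${practice_loss:,}")   # $70,000
print(f"Overall system still saves:           ${system_savings:,}")  # $110,000
```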

 

HLM: Is value-based healthcare a particularly challenging sector?

Miller: Every patient is different, but on the other hand, how do health insurance companies operate? The law of large numbers says that on average, patients are fairly similar. You don’t have to deliver the exact same treatment to everybody to estimate on average what it is going to be like.

If you get the unusually expensive case—the patient who is an outlier with unique health problems— that is what insurance is for.

On the other hand, saying ‘We shouldn’t be giving an MRI to everyone who comes in with lower back pain. Most of them should probably go to physical therapy first.’ That is something you can do across a broad number of patients. That is going to save money on average and probably be better for the patients.

HLM: Is there common ground for fee-for-service and value-based models that providers can build on?

Miller: A lot of the payment reforms that are being done actually build on fee-for-service. The idea is that you don't just leave it in place and try to pile something on top. The problem with fee-for-service now is that it pays you the exact same amount to do something whether you do it well or poorly, and whether or not complications or infections occur. And in fact you may get paid more.

But you don’t fix fee-for-service by sticking little penalties or bonuses on top. You have to change the fundamental way it is delivered.

For example, for patients who have health problems, we are looking at payments based on the patient's condition and not on exactly the procedure you used. A good example is delivering a baby. You get paid more to do a caesarean section than you get paid for a vaginal delivery. Yet the vaginal delivery takes longer and is better for the mother and the baby.

So why do we now have a 33% C-section rate in the country? Because the fees we pay are not based on the actual value.
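
One way to see Miller's point is to compare the incentive under procedure-based fees with a condition-based payment that pays one amount for delivering a baby regardless of the procedure. The fee levels below are assumptions used purely for illustration:

```python
# Illustrative only: assumed fee levels, not actual reimbursement rates.
procedure_fees = {"c_section": 6_000, "vaginal_delivery": 4_000}

# Procedure-based payment: the higher-paid procedure carries a built-in
# financial pull, independent of what is better for mother and baby.
ffs_fee_gap = procedure_fees["c_section"] - procedure_fees["vaginal_delivery"]

# Condition-based payment: one assumed blended rate per delivery, so the
# fee gap between procedures disappears.
condition_payment = 5_000
condition_fee_gap = 0

print(f"Extra fee for choosing a C-section under FFS: ${ffs_fee_gap:,}")
print(f"Extra fee under a condition-based payment:    ${condition_fee_gap:,}")
```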

 

HLM: Why does value-based care create so much unease among many providers?

Miller: A lot of the anxiety comes because people don't have the data. You have to have access to good data, and in most cases healthcare providers can't get it. Medicare has only just recently started to release data, so that someone could actually do the kind of analysis that I recommend in my report.

Most health plans treat their data as a proprietary secret, but there are a number of communities around the country that have multi-payer claims databases where people can do these kinds of analyses.

HLM: Why should providers welcome the switch to value-based care?

Miller: You could actually do better in a value-based payment model. People have the perception that somehow it is going to be worse, but the sooner you get into it the better you may be able to do because you are able to capture a lot of the value out there now that isn’t being captured.

Rather than staying in fee-for-service and hoping you may get a small increase in fees or that you don’t get a cut in fees, it’s better to ask ‘Can I redesign care in a way that would allow me to be paid significantly more?’

Medicare has run a demonstration, operational now for several years, called the Acute Care Episode Demonstration, which bundled together hospital and physician payments for orthopedic and cardiac procedures. The physicians were able to earn up to 25% more than their standard fee-for-service payments by redesigning care and reducing costs. That is a far bigger, faster increase in pay than you could ever get by simply staying in the existing fee-for-service model.

HLM: Who should be at the table when providers build the business case for value-based care?

Miller: Step No. 1 is changing the way care is delivered. It is the physicians on the front lines who have to say ‘Where do we think we are actually doing too much of something we shouldn’t do or that we are not providing good care to the patients?’

 

Then you have to get the COO or the CFO to say ‘Let’s work the numbers.’ Typically, you don’t find those two parts of organizations working together. Doing spreadsheets is not the physicians’ skill and providing care is not the CFO’s skill. But if you can get them to come together, that is where the magic happens.

Payment Reform

You say to physicians ‘Where do you think you could redesign care if somebody gave you the flexibility to be paid differently, to be paid for things that you aren’t being paid for today?’ When I talk to physicians, they all have ideas but nobody asks them.

The typical approach is that physicians say ‘Pay me for these things that you don’t pay me for today.’ The health plan, Medicare, employers or whomever says, ‘Wait a minute. That will increase costs if you are going to be paid for something new.’ If you think it is going to be better, run the numbers to see if it actually will save money. What will you do less of and what will that save?

Get everybody in the room. Get their ideas. Figure out which subset appears to be the most promising. Do the detail work and go to payers to put it in place. If you can show success, that encourages people to do more. Not every case will turn out to be a savings proposition.

Ask which of those things there really is a business case for. If there seems to be one, do a finer analysis to show it, then take it to the payers and say, 'How about a deal here?' Even if you can't get perfect data, using approximate data to see whether a business case looks plausible tells you which things to focus on.
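
Miller's "run the numbers with approximate data" step can be as simple as comparing what a redesign would be paid against what it would avoid. A rough sketch, where every figure is an assumption standing in for a practice's own data:

```python
# Back-of-the-envelope screen for candidate care redesigns (all figures assumed).
proposals = [
    # (idea, new payment requested per year, services avoided per year, payer cost per avoided service)
    ("nurse care manager for heart-failure patients", 80_000,  20, 12_000),
    ("phone triage instead of ED visits",             40_000, 150,    900),
    ("PT-first pathway for low back pain",            75_000,  60,  1_100),
]

for idea, new_payment, avoided, unit_cost in proposals:
    gross_savings = avoided * unit_cost
    net_to_payer = gross_savings - new_payment
    verdict = "worth a finer analysis" if net_to_payer > 0 else "no business case yet"
    print(f"{idea}: gross savings ${gross_savings:,}, net ${net_to_payer:,} -> {verdict}")
```

The point of the screen is only to rank ideas; the ones that survive it are the ones worth the finer analysis and the conversation with payers.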

HLM: How soon could a value-based model see a return on investment?

Miller: For many of these things, the savings can happen very quickly. A lot of what has been done in healthcare has been desirable, but has a long-term payoff. There is a lot of focus on better management of diabetes and hypertension; all very desirable but it doesn’t save a lot of money this year.

 

On the other hand, if you focus on people going unnecessarily to the emergency room and getting unnecessary tests and [you] figure out how to redesign that care, you save money immediately because you are avoiding the unnecessary care. Thirty-day readmissions are a perfect example.

HLM: Who do providers speak with on the payer side?

Miller: The focus will differ. Medicare doesn't have a whole lot of interest in maternity care, whereas for businesses and Medicaid, maternity care is in many cases their biggest expenditure. Everyone is interested in chronic disease. The distinction I make is between the purchaser and the payer. The purchaser in commercial insurance is the employer.

In fact, 60% of commercially insured employees in the country are in self-insured employer plans. The deal you are working out is actually with the employer and not the health plan. All the health plan is doing is processing claims. One of the challenges for commercial health plans is that value-based payment isn't necessarily a good business proposition for them. They may have to incur costs to change the payment system, but the savings don't go to them; they go back to their self-insured accounts.

HLM: What influence will insurance exchanges and consumer-driven healthcare have on the business case for value-based care?

Miller: It could be a potential advantage if different provider organizations get beyond this fairly narrow shared-savings model to the point where they are actually able to take accountability for populations of patients and can price that.

They could go on the exchange and allow people to sign up for this ACO, pick a primary care physician there, and work with the coordinated set of docs at a lower cost and higher quality than simply picking a generic health plan. It's kind of halfway between the traditional HMO/PPO models. You are picking who you want to lead your care. You don't necessarily have to be limited to one set of docs or have a gatekeeper for everything.


John Commins is a senior editor with HealthLeaders Media. 

 

WSJ Transparent Pricing

  •  One of the most widespread initiatives comes from insurers themselves—who say they are eager to help plan members and employers cut their health-care bills. Some 98% of health plans now offer their members some online tool that lets them calculate their out-of-pocket costs, according to a survey by Catalyst for Payment Reform. A few let users compare different providers in the same network.
  • UnitedHealth Group Inc. has one of the most extensive tools. More than 21 million members can log into myHealthcare Cost Estimator and compare the negotiated rates for more than 500 individual services at in-network providers across the country, as well as their individual out-of-pocket costs for each one. Hundreds of thousands of plan members have used the tool since it launched in 2012, the company says.
  • In one pilot project, the California Public Employees' Retirement System found prices for hip and knee replacements ranging from $15,000 to $110,000 in the San Francisco area. It agreed to pay up to $30,000, and some 40 hospitals cut their prices to match. Such initiatives have helped Calpers save nearly $3 million in the past two years, one study found.
  • A growing body of research has found that there is no clear connection between price and outcomes such as mortality rates, blood clots, bed sores and hospital readmission. "Until you break that connection in people's minds, there is a perverse incentive for hospitals and health systems to continue to raise prices," Ms. Dentzer says.

http://online.wsj.com/news/articles/SB10001424052702303650204579375242842086688

How to Bring the Price of Health Care Into the Open

There’s a Big Push to Tell Patients What They’ll Pay—Before They Decide on Treatment

It’s a simple idea, but a radical one. Let people know in advance how much health care will cost them—and whether they can find a better deal somewhere else.

With outrage growing over incomprehensible medical bills and patients facing a higher share of the costs, momentum is building for efforts to do just that. Price transparency, as it is known, is common in most industries but rare in health care, where “charges,” “prices,” “rates” and “payments” all have different meanings and bear little relation to actual costs.

Unlike other industries, prices for health care can vary dramatically depending on who’s paying. The list prices for hospital stays and doctor visits are often just opening bids that insurers negotiate down. The deals insurers and providers strike are often proprietary, making comparisons difficult. Even doctors are generally clueless about what the tests, drugs and specialists they recommend will cost patients.

Princeton economist Uwe Reinhardt likens using the U.S. health-care system to shopping in a department store blindfolded and months later being handed a statement that says, “Pay this amount.”

The price-transparency movement aims to lift that veil of secrecy and empower patients and other payers to be smarter health-care consumers. Federal and state agencies are gathering reams of price information from doctors and hospitals and posting them for the public. Health plans are offering online tools that let members calculate their out-of-pocket costs. Startup companies are ferreting out and publishing the long-secret rates that providers negotiate with insurers.

When consumers can compare prices for doctor visits, hospital stays and other services, the theory goes, market competition will help keep them down.

An Incentive to Change

This is new territory for health care. Doctors and hospitals have rarely competed on cost. Third-party payers still foot the bulk of the bills, and many players in the health-care industry benefit from keeping their costs and profit margins murky.

“The time for transparency has clearly arrived—but is everybody ready to have real pricing power brought to bear in a way that could destabilize the health-care sector?” asks Susan Dentzer, a senior policy adviser at the Robert Wood Johnson Foundation. “It means upsetting a lot of apple carts.”

The pressure to change is rising, however. Experts expect consumers to be much more price-sensitive as they shoulder a growing proportion of health costs themselves. Last year, 38% of Americans with employer-sponsored insurance had a deductible of $1,000 or more—up from 10% in 2006, according to the Kaiser Family Foundation.

Silver and bronze plans created by the Affordable Care Act carry average family deductibles of $6,000 and $10,386, respectively. More than half of bronze plans also require patients to pay 30% of doctors’ fees, according to health-information site HealthPocket.com. “Most of us still don’t have much financial incentive to shop around for cheaper care,” says Suzanne Delbanco, executive director of Catalyst for Payment Reform, a nonprofit that works on behalf of employers. “That’s changing rapidly.”
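
The arithmetic behind that growing price sensitivity is straightforward: with a high deductible and 30% coinsurance, a patient's out-of-pocket cost tracks the provider's price almost directly. A minimal sketch, with plan parameters echoing the article's bronze-plan figures and assumed procedure prices:

```python
# Illustrative only: deductible and coinsurance mirror the bronze-plan figures
# quoted above; the MRI prices are assumptions.
def out_of_pocket(price, deductible_remaining, coinsurance_rate):
    """Patient pays the full price up to the remaining deductible,
    then the coinsurance share of anything above it."""
    below = min(price, deductible_remaining)
    above = price - below
    return below + above * coinsurance_rate

deductible_remaining = 6_000   # high-deductible plan, nothing met yet
coinsurance = 0.30             # 30% of doctors' fees

prices = {"hospital outpatient MRI": 2_500, "freestanding imaging center MRI": 700}
for provider, price in prices.items():
    print(f"{provider}: ${out_of_pocket(price, deductible_remaining, coinsurance):,.0f} out of pocket")
```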

 

Efforts to raise transparency are coming from a number of corners, including the Obama administration. But some have mainly shown how confusing health-care pricing is.

Hoping to shine a light on the variations in hospital charges, the Centers for Medicare and Medicaid Services, or CMS, grabbed headlines last May when it released a list of the average prices 3,300 U.S. hospitals charged Medicare for the 100 most common inpatient services during 2011.

Huge Differences

The variations were stunning. The average charge for joint-replacement surgery, for example, ranged from $5,300 in Ada, Okla., to $223,000 in Monterey Park, Calif. Even in the same city, there were huge swings. The charge for treating an episode of heart failure was $9,000 in one hospital in Jackson, Miss., and $51,000 in another.

A month later, CMS released a second database comparing average hospital charges for 30 common outpatient procedures, and the variations were just as great. A hospital in Pennington, N.J., charged $3,036 for a diagnostic and screening ultrasound, while one in Bronx, N.Y., billed just $88.

Many hospital executives dismiss those list prices—also known as chargemaster prices—as meaningless and misleading, since few patients ever pay them. Commercial insurers often use them as a starting point for negotiating big discounts. Medicare itself pays hospitals predetermined rates based on diagnoses, regardless of what they charge.

Industry experts say list prices vary so much in part because hospitals use different accounting methods and have different patient populations. List prices also reflect all the costs of running a hospital, including keeping ERs, burn units and other costly services running 24 hours a day. What’s more, many hospital executives say they have to mark up charges for privately insured patients because Medicare and Medicaid reimbursements don’t cover those patients’ cost—a shortfall the American Hospital Association puts at $46 billion nationwide last year.

Hospitals “are absolutely in favor of price transparency,” says AHA president Rich Umbdenstock, and they support a bill in Congress that would let individual states determine price-disclosure rules. He also says hospitals would like to end the confusing chargemaster and cost-shifting practices, but they can’t do it without big changes in payment practices by both the government and the insurance industry.

“If this were in our power to solve, we would have done it a long time ago,” Mr. Umbdenstock says. “But it’s not something we can do on our own.”

Shining a Light

Jonathan Blum, deputy administrator of the CMS, counters that chargemaster prices do matter, particularly to uninsured patients who sometimes get stuck with those inflated bills. He says the administration’s goal was to spark discussion about price variations, and that “a tremendous number” of visitors had downloaded the data.

“We’ve discovered that oftentimes, even health-care providers don’t fully realize the extent of those variations,” he says. “Our hypothesis is that a lot of the variations aren’t warranted.”

The prices insurers negotiate with hospitals and doctors are more important to consumers, experts say. Traditionally, those rates have been proprietary. Neither insurers nor providers want competitors and other business partners to know what they’re willing to settle for. Some contracts include gag clauses barring disclosure.

But states are increasingly requiring payers and providers to reveal that information. A few states specifically outlaw gag clauses in health-care contracts. Sixteen states have “all-payer claims databases” designed to collect insurance claims data and use it to monitor trends and identify high- and low-price providers. And some 38 states now require hospitals to report at least some pricing information, although only two—Massachusetts and New Hampshire—rated an “A” in Catalyst for Payment Reform’s annual report card for making the information accessible and usable by patients.

Meanwhile, entrepreneurs are sleuthing out negotiated rates from claims data and making them available to consumers and employers in various forms. Healthcare Bluebook aims to do for health care what the Kelley Blue Book does for used cars: It analyzes negotiated rates paid for thousands of medical services in every ZIP Code—supplied by employers and other clients—and posts what it considers a “fair” price for each so consumers can evaluate what they’re being charged.

Bluebook’s founder and CEO, Jeffrey Rice, says the rates insurers pay for, say, an MRI or knee surgery can vary as much as chargemaster prices do, particularly if a local hospital is dominant or prestigious.

“The difference may not be much between Nashville and Chicago—the big difference may be just down the block,” he says.

Mr. Rice says the employers Healthcare Bluebook works with have saved as much as 12% on their health-care costs by making price information available to their employees, with most savings coming on imaging studies, endoscopies, cardiac testing and other outpatient procedures.

Another service, PricingHealthcare.com, asks users to anonymously supply information from their own medical bills to help it amass the list prices, cash prices and negotiated rates for common procedures. It currently shows rates for some 500 procedures in 11 states. Founder Randy Cox says some providers are furious when asked what their rates are, while others are eager to have their entire price list posted. “I get calls from hospital CEOs who know people are concerned about price and think this is an opportunity for their business,” he says.

A Hand From Insurers

One of the most widespread initiatives comes from insurers themselves—who say they are eager to help plan members and employers cut their health-care bills. Some 98% of health plans now offer their members some online tool that lets them calculate their out-of-pocket costs, according to a survey by Catalyst for Payment Reform. A few let users compare different providers in the same network.

UnitedHealth Group Inc. has one of the most extensive tools. More than 21 million members can log into myHealthcare Cost Estimator and compare the negotiated rates for more than 500 individual services at in-network providers across the country, as well as their individual out-of-pocket costs for each one. Hundreds of thousands of plan members have used the tool since it launched in 2012, the company says.

Nationwide, only about 2% of health-plan members who have access to such tools have used them, according to Catalyst for Payment Reform. But Ms. Delbanco expects that number to rise as more patients become aware of the tools and see their out-of-pocket costs growing.

Proponents say it is too early to tell how much impact transparency efforts will have on costs overall. California has required hospitals to make their chargemaster prices public since 2003, with little effect on prices.

But one approach called “reference pricing” has yielded some savings. Where local prices differ substantially for a service like a colonoscopy, an insurer publishes a list of providers’ rates and agrees to pay a set amount. If patients choose a provider that charges more, they must pay the difference themselves.

In one pilot project, the California Public Employees' Retirement System found prices for hip and knee replacements ranging from $15,000 to $110,000 in the San Francisco area. It agreed to pay up to $30,000, and some 40 hospitals cut their prices to match. Such initiatives have helped Calpers save nearly $3 million in the past two years, one study found.
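
The reference-pricing mechanism itself reduces to a one-line rule: the insurer pays up to the reference amount and the patient owes any excess. A small sketch using the $30,000 CalPERS reference and assumed hospital prices:

```python
# The $30,000 reference matches the CalPERS pilot; the hospital prices are assumed.
REFERENCE_PRICE = 30_000  # insurer pays up to this amount for a joint replacement

def patient_share(negotiated_price, reference_price=REFERENCE_PRICE):
    """Insurer covers up to the reference price; the patient owes any excess."""
    return max(0, negotiated_price - reference_price)

hospitals = {"Hospital A": 15_000, "Hospital B": 42_000, "Hospital C": 110_000}
for hospital, price in hospitals.items():
    print(f"{hospital}: negotiated ${price:,}, patient owes ${patient_share(price):,}")
```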

What Comes Next?

Experts say that as consumers increasingly compare prices, it’s critical to provide them with information about quality of care as well—otherwise, they might assume high cost equates with high quality.

A growing body of research has found that there is no clear connection between price and outcomes such as mortality rates, blood clots, bed sores and hospital readmission. "Until you break that connection in people's minds, there is a perverse incentive for hospitals and health systems to continue to raise prices," Ms. Dentzer says.

Indeed, critics fear that some price-transparency efforts could backfire and spur higher prices: If providers see that insurers are paying competitors more, they might hold out for higher rates, and insurers might be less inclined to give some providers favorable deals.

Some skeptics think that without fundamental changes in how health care is priced and paid for, transparency may confuse consumers more than it empowers them.

But there’s a growing consensus that while price transparency alone cannot transform the health-care system, it is necessary to help reveal which costs are excessive and let consumers make better-informed choices.

“At the end of the day, it’s our money,” Ms. Delbanco says. “We have a right to know what our health care is going to cost.”

Ms. Beck covers health care and writes The Wall Street Journal’s Health Journal column. She can be reached at melinda.beck@wsj.com.