Future Diets Report

  • with wealth comes an increase in animal products, fat and sugar, but globalisation is not leading to convergence on a single international norm; income is becoming a weaker determinant of diet over time, which allows scope for public policy to intervene
  • Trajectories are not pre-ordained; there is scope to influence the evolution of diets to get better outcomes for health and agriculture.
  • Deliberate shaping of diets has rarely been attempted; the rare exception is wartime rationing in Britain, an unusual natural experiment that led to better health, but one the British public were delighted to abandon once supplies had been restored after the Second World War.
  • Studies such as that of Cecchini et al. (2010) show large benefits compared to costs from measures to influence people to adopt healthier diets.
  • A final comment (and paradox): interest in diet has never been stronger in high-income countries, as we obsess about our waistlines, worry about the social impacts of the marketing strategies of (very) large food retail chains, and enthuse over the culinary art and tradition shown in countless television programmes. Scientifically, a plethora of papers published in the past 10 years ponder the rise of obesity worldwide and its implications.
    It seems, then, that it is only a matter of time before people accept and demand stronger and more effective measures to influence diets. When that time comes, we will need the evidence – provided in a very preliminary way by this review – on the main problems of emerging diets, and on which policies (and combinations of policies) will be most effective in addressing them.
  • overweight and obesity in developing countries grew from 250m adults in 1980 to almost a billion in 2008, with Indians forming a large share
  • consumption of sugar has risen by over 20% per person between 1961 and 2009
  • 1 in 8 people (852m) in poor countries do not have sufficient access to food
  • 1/3 of infants in the developing world are stunted
  • 2 billion affected by micronutrient malnutrition
  • this drives increases in the number of people developing certain types of cancers, diabetes, strokes and heart attacks
  • politicians are fearful of interfering at the dinner table; this reluctance is compounded by powerful lobbying
  • South Korea has seen an increase in fruit and vegetable consumption after publicity, social marketing and an education campaign including large-scale training of women in preparing traditional, low-fat, high-vegetable meals

Future Diets Report (PDF):  http://www.odi.org.uk/sites/odi.org.uk/files/odi-assets/publications-opinion-files/8776.pdf

Source: http://www.foodnavigator-asia.com/Markets/Obesity-is-a-weighty-issue-for-almost-1bn-in-developing-world

By RJ Whitehead, 06-Jan-2014


The number of overweight and obese adults in developing countries has ballooned from some 250m in 1980 to almost a billion today, with Indians forming a huge chunk of this number.

This figure is highlighted in a major new review by the UK’s Overseas Development Institute to expose the global scale and consequences of overweight and obesity, and what it calls governments’ failure to address this growing crisis.

The Future Diets report is an analysis of public data detailing what the world eats. It selected five middle-income countries—India, China, Egypt, Peru and Thailand—as case studies to illustrate changes in dietary trends.

One in three overweight

The results highlight that the number of adults who are obese or overweight in the developing world more than tripled between 1980 and 2008, while in richer countries the figure has risen by over 200 million. One in three of the world’s adults are now overweight or obese, it found.

According to ODI research fellow Steve Wiggins, who authored the report, the growing rate of overweight and obesity in developing countries is alarming.

“On current trends, globally, we will see a huge increase in the number of people suffering certain types of cancer, diabetes, strokes and heart attacks, putting an enormous burden on public healthcare systems,” Wiggins said, warning governments that they are not doing enough to tackle the growing crisis.

The percentage of obese and overweight in India rose from about 9% of the population in 1980 to 11% in 2008.

“India’s consumption of animal products is approaching that of China in terms of its contribution to the average plate, but here the increase is almost entirely in milk consumption, with only limited increases for meat,” the report said.

“Many Indians are vegetarian, avoiding beef or pork for cultural and religious reasons. The consumption of pulses remains relatively high in India, although it has been on the decline.”

Politicians fearful of meddling

Wiggins believes that the rise in obesity is partly due to politicians’ reluctance to interfere at the dinner table, along with the powerful influence of farming and food lobbies in the developing world and a large gap in public awareness of what constitutes a healthy diet.

Governments have focused on public awareness campaigns, but evidence shows this is not enough. The lack of action stands in stark contrast to the concerted public actions taken to limit smoking in developed countries.

“Politicians need to be less shy about trying to influence what food ends up on our plates. The challenge is to make healthy diets viable whilst reducing the appeal of foods which carry a less certain nutritional value.”

However, the report does cite some successful examples of governments changing diets for the better. In South Korea, for example, policies have led to an increase in fruit and vegetable consumption, largely thanks to a publicity, social marketing and education campaign, including large-scale training of women in preparing traditional low-fat, high-vegetable meals.

Analysis of existing data shows that, among other findings, overweight and obesity rates in China have almost doubled since 1980.

One indicator of changing diets is an increase in the consumption of sugar. Sugar and sweetener consumption has risen by over one-fifth per person globally from 1961 to 2009.
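As a rough back-of-the-envelope check (the 20%-over-48-years figure is the only number taken from the text; the helper function is purely illustrative), the annualised rate implied by that cumulative rise is small, which is part of why dietary change can go unnoticed year to year:

```python
def implied_annual_growth(total_growth: float, years: int) -> float:
    """Annualised growth rate implied by a cumulative increase over a period."""
    return (1 + total_growth) ** (1 / years) - 1

# A rise of just over one-fifth per person between 1961 and 2009 (48 years)
# works out to roughly 0.4% growth per person per year.
rate = implied_annual_growth(0.20, 2009 - 1961)
```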

Fat consumption is also an issue. Among developing countries, the highest consumption of fat is in East Asia; however, industrialised countries still have much higher levels of fat consumption—often more than double.

Worryingly, despite a 50 per cent increase in the amount of food sourced from animals and a doubling in the quantity of fruit and vegetables being harvested, the report also notes that one in eight people (852m) in poor countries still do not have enough food to satisfy their basic needs.

2014 Big Data Predictions

  • growth will be 6 times the overall IT market
  • shortage of talent → analytics-as-a-service; smaller, nimble analytics; cloud infrastructure to grow quickly at ~50% CAGR
  • VC investment moving to the top layer – from information management to analytics, discovery and applications
  • “Analytics 3.0” (IIA) and “Digitization of Everything” – companies across industries will use analytics on their accumulated data to develop new products and services — G.E. is the poster boy for this
  • automation solutions – cognitive computing, rules management, analytics, biometrics, rich media recognition – will replace knowledge worker roles
  • challenges with the over-automation of decisions will push companies towards an optimal mix of human and machine capability and judgment
  • Heightened focus on governance and privacy will improve results – governance will be a driver for ROI (Capgemini)

 

Gil Press, Contributor

I write about technology, entrepreneurs and innovation.

12/12/2013 @ 11:18AM

$16.1 Billion Big Data Market: 2014 Predictions From IDC And IIA

Both IDC and The International Institute of Analytics (IIA) discussed their big data and analytics predictions for 2014 in separate webcasts earlier this week. Here is my summary of their predictions plus a few nuggets from other sources.

IDC predicts that the market for big data will reach $16.1 billion in 2014, growing 6 times faster than the overall IT market. IDC includes in this figure Infrastructure (servers, storage, etc., the largest and fastest growing segment at 45% of the market), services (29%) and software (24%). IDC commented that the benefits of big data are not always clear today (indeed, BNY Mellon recently asked its 50,000 employees “for ideas about how to harness the power of Big Data”). IIA predicted that companies will want to see demonstrable value in 2014 and will focus on embedding big data analytics in business processes to drive process improvement.

The much-discussed shortage of analytics and data science talent led IIA to make three separate but related predictions. One prediction is that the adoption of analytics-as-a-service will accelerate with “ready-made analytics in the cloud” offering an attractive option for quickly testing big data analytics or scaling up existing programs. Similarly, Capgemini predicts (in an email to me) “smaller, nimble analytics,” as a result of the rise of machine-to-machine data, “making cloud the de facto solution.” And IDC predicts that cloud infrastructure will be the fastest-growing sub-segment of the big data market, with a 2013-2017 CAGR of close to 50%.
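For scale, a CAGR of close to 50% over 2013–2017 implies roughly a five-fold increase across the four-year span. A minimal sketch (the growth rate is the only figure taken from the text; the function name is my own):

```python
def project_cagr(base: float, rate: float, years: int) -> float:
    """Project a base value forward at a constant compound annual growth rate."""
    return base * (1 + rate) ** years

# 50% CAGR over the four years from 2013 to 2017: (1.5)**4 = 5.0625,
# i.e. slightly more than a five-fold increase over the period.
multiple = project_cagr(1.0, 0.50, 2017 - 2013)
```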

Another IIA prediction related to the dearth of talent is the increasing attention paid by companies to organizing the analysts and data scientists they currently have on board into teams, either embedded in the business units or in a center of excellence. The focus will be on making these teams more effective by establishing and sharing best practices and by “operationalizing and managing models,” with the rest of the world getting closer to the proficiency level of the financial industry in this regard (in other words, keeping up with the quants? Hopefully, also learning from the financial industry’s failures in this regard—see financial crisis, 2008 edition).

As for the prospects for alleviating the talent shortage, IIA commented that there are now well over 100 programs at universities in the US where analytics and data science “are in focus” (see my list of graduate programs here). IDC, for its part, cautioned that these programs “will bear fruit only in four to five years,” referring obviously to the newly-established data science programs. IDC agrees with IIA that companies providing big data analytics services will fill the gap in the meantime and predicts that the big data professional services market will exceed $4.5 billion in 2014. The number of vendors providing such services will triple over the next three years, according to IDC, and these firms will “aggressively acquire scarce big data talent,” making it scarcer.

A very interesting dimension to the dearth of talent raised by IDC is the shortage of IT professionals capable of dealing with the new big data requirements. 33% of respondents to an IDC and Computerworld survey earlier this year noted as one of their big data challenges the “lack of sufficiently skilled big data and analytics IT staff” (“lack of sufficient number of staff with appropriate analytics skills” was selected by 45% of respondents).

Also interesting was IDC’s expansion of the services part of the market to include “value added content providers.” These include “traditional vendors” such as Thomson, LexisNexis, and Experian; “new wave vendors” such as DataSift, Gnip, and LinkedIn; “company and personal information vendors” such as Acxiom, Equifax, and Tarsus; and “search engine/aggregators” such as Yahoo, Google, and Salesforce/Data.com. IDC believes that this market segment will be “challenged by lack of business model clarity and standards.”

A related prediction from IDC is that VC investment will shift to the top layers of the big data software stack, from information management to the “analytics & discovery” and “applications” layers. New types of applications (“use cases”), such as personalized medicine, will emerge out of what IDC predicts will be the blurring of the boundaries between high-performance computing (previously limited to scientific/engineering applications) and “enterprise big data” (i.e., mainstream applications managed by an IT department). IIA sees other new horizons for the application of big data, predicting that companies in a variety of industries will increasingly use analytics on the data they have accumulated to develop new products and services. GE has been the poster boy for this emerging trend, called “Analytics 3.0” by IIA, or “the digitization of everything” by me (you decide).

Another application, security, will become the next big front for big data, IDC predicts, as security infrastructure will increasingly take on big data-like attributes. Big data will be used to correlate log data and identify malicious activity in real time, allowing companies to react quickly, rather than after the event. Gartner begs to differ, however, predicting that “big data technology in security contexts will stay immature, expensive and difficult to manage for most organizations as targeted attacks become more stealthy and complex to identify in progress. … The noise about big data for security has grown deafening in the industry, but the reality lags far, far behind.”

In a somewhat far-out prediction, IIA talked about facial recognition and wearable device data that will be incorporated into predictive analytics. One of the examples given was “pet stores could use facial recognition to greet dogs as well as customers.” IDC was a bit closer to 2014 (or was it?) when it predicted that the “proliferation of sensor, mobile, wearable, and embedded devices (Internet of Things) will become a significant driver of the big data market,” stressing the need for investment in “Data-in-Motion” and “real-time analysis of geo-dispersed incoming data streams,” primarily in the cloud (that you don’t need wearables or geo-whatever to satisfy your obsession with quantifying your life, was recently demonstrated by the resident data scientist at MarkITx who crunched his lunches to come up with a happiness-per-gram metric).

Both IDC and IIA got a bit more into the technologies behind big data analytics, with IDC predicting the co-habitation for the foreseeable future (my words) of traditional database technology (RDBMS) with the newer Hadoop ecosystem and NoSQL databases, concluding that “in the short term,” information management will become more complex for most organizations (see shortage of qualified IT staff above); and IIA predicting that “the adoption of data visualization will accelerate in both the high and low ends of the complexity spectrum [for analytics].” Humans, however, don’t comprehend things in more than two dimensions or, at most, three dimensions, so IIA advised tempering our enthusiasm “a bit” (this came from self-described Tom “Curmudgeon” Davenport so you may want to consider how much tempering you want to do; as for me, I always opt for being “uber-curmudgeon”).

Last but certainly not least, both IDC and IIA talked about automation in the context of big data. IDC predicts that “decision and automation solutions, utilizing a mix of cognitive computing, rules management, analytics, biometrics, rich media recognition software and commercialized high-performance computing infrastructure [phew!], will proliferate.” Some of these solutions, IDC says (warns?), “will begin to replace or significantly impact knowledge worker roles.”  IIA predicts that “we will see a continued move to machine learning and automation to keep pace with speed and volume of data” and that “as they strive to operationalize analytics but encounter challenges with the over-automation of decisions, companies will focus more on the optimal mix between human and machine capability and judgment.” If you take humans too much out of the equation, their decision making will atrophy, warned IIA, asking “If you don’t have experts, who will train the next generation of [machine learning] software?” From the IIA’s lips, to the NSA’s ears, I say. (Well, we can assume these words were collected and stored by the omnipresent sleuths the second they were uttered; the question is: do they understand what the words mean?)

One prediction that didn’t make the official list of IIA’s predictions, but Davenport nevertheless managed to include in the webcast, was that “companies will need to hire lawyers to verify that they actually own the data.” Indeed, the nagging issues—that I think will be even more prominent in 2014—of privacy and governance were largely missing from the IDC and IIA discussions (Capgemini, in contrast, contributed this: “A heightened focus on governance will improve analytic results… Governance will need to be a driver in shaping the ROI story for Big Data in 2014”).  Also missing were discussions of “open data” and the increased use of big data by the public sector (outside of the NSA) to name just a few pertinent big data trends not on their list of predictions. But of course, the challenge is to select the nine or ten most important ones and we have lots to chew on with IDC’s and IIA’s lists.

Listeners to the IIA webcast were given the opportunity to vote on which predictions they thought would come true:


Participants in the IIA webcast included Sarah Gates, Tom Davenport, Bob Morison, Bill Franks, Greta Roberts, Omer Sohail and Sanjeev Kumar; IDC’s webcast was delivered by Dan Vesset and Ashish Nadkarni; Capgemini’s predictions were attributed to SVP for Business Information Management Scott Schlesinger.


How new leaders can build trust

  • Meet with as many individual contributors as possible
  • Develop a plan based on their insights
  • Report that plan back to them once it is developed

http://blogs.hbr.org/2013/12/the-best-way-for-new-leaders-to-build-trust


The Best Way for New Leaders to Build Trust

by Jim Dougherty  |   8:00 AM December 13, 2013

When I took over as CEO of Intralinks, a company that provides secure web-based electronic deal rooms, the company was hemorrhaging so much cash that its survival was at stake. The service was going down three times per week; we were in violation of the contract with our largest client; our chief administrative officer had just been demoted; and so on.

So, what did I do on my first day? I spent more than four hours listening in on client support calls at the call center. I shared headsets with many of the team, moving from desk to desk to speak to the reps. To say they were surprised is an understatement: many CEOs never visit the call center, and virtually none do so on their first afternoon on the job.

I made this my priority partly because I wanted to know what customers were saying—but also to make an internal statement. I knew there had to be some radical changes to behaviors, expectations, and attitudes.  There was no time to be subtle.  I needed to show I was different, that things were going to be different, and I needed to establish trust as quickly as possible.

In leading various companies over the years, one of the most valuable lessons I’ve learned is that establishing trust is the top priority. Whether you are taking over a small department, an entire division, a company, or even a Boy Scout troop, the first thing you must earn is the trust of the members of that entity. When asked, most leaders will agree with this notion, but few do anything to act on it.

Without trust, it is very unlikely you will learn the truth about what is really going on in the organization and in the marketplace. Without trust, employees won’t level with you—at best, you’ll hear non-truths or partial truths. I see this all too frequently. Sometimes employees will go out of their way to hoard and distort the truth.

The best way to start building trust is to take the time to meet as many individual contributors as you can, as soon as you can. In addition to meeting customers, meeting rank-and-file employees should be your top priority.

This is not a common approach. Many leaders see their role as directing and giving information, rather than gathering it. There is pressure to “come up with the answer” quickly or risk looking weak. Too many new leaders believe they’re expected to know the answer without input or guidance. Nothing could be further from the truth.

Doing this correctly takes time—but less than you might think. The meetings can be one-on-one or in small groups. The sessions can’t be rushed. In the first few weeks I’d suggest you spend up to half your time in these meetings. Take a pad and take notes. Listen intently. A simple but effective open-ended question is: “If you were put into my role tomorrow, what would be the first three things you’d do and why?” Or: “What are the three biggest barriers to our success, and what are the three biggest opportunities we have?” Really great ideas can emerge from these meetings—along with some really mediocre ones—but it’s your job to filter and prioritize them. First, gather the information.

Later on my first day at Intralinks, I began arranging meetings with individual contributors. That’s where my learning really began. Over the next few weeks I met with over 60 individual contributors. Not only did I learn a lot, but I convinced them that I cared what they thought and could be trusted with the truth.

In the middle of my first week as CEO, one of the company’s original VCs called. “So, what’s your plan?” he asked. I said I had to spend a few weeks learning. He was incredulous that I did not have a pre-baked plan. I was incredulous that he thought I should.

Over those weeks I learned how unhappy clients were with our complex bills, why service went down so often, why our pricing gave our clients headaches, that 80% of the customer calls could be eliminated with a simple fix to our service, and that clients wanted predictability of expenditures with us.

After six weeks, I had enough information to return to the management team with specific recommendations on what I thought we should do. Instead of just laying this out in an all-hands meeting, I began presenting the plan in one-on-one meetings in which I talked about how each individual’s feedback had helped guide my thinking. This created tremendous buy-in among all levels of the team.

By mid-March, after only 10 weeks on the job, we rolled out the new plan. By the end of the year we’d signed 150 new long-term contracts (up from zero), revenue was up by almost 600%, our burn rate was cut by 75%, and we’d positioned ourselves to raise a $50 million round of financing a few months later in the heart of the dot-com winter.

None of this could have happened without building the trust of the team. New leaders must remember that many of the best insights on how to fix a company lie with employees further down the org chart. Creating a trusting, honest dialogue with these key personnel should be every new leader’s top priority.


HBR Blogs – leadership

  • moving from being a firefighter to a fire marshal – getting more strategic
  • moving away from being the doer-in-chief

 


Doing Less, Leading More

by Ed Batista  |   10:00 AM December 17, 2013

Our first accomplishments as professionals are usually rooted in our skill as individual contributors. In most fields we add value in the early stages of our careers by getting things done. We’re fast, we’re efficient, and we do high-quality work. In a word, we’re doers. But when we carry this mindset into our first leadership roles, we confuse doing with leading. We believe that by working longer, harder, and smarter than our team, we’ll inspire by example. Sometimes this has the desired effect: as Daniel Goleman wrote in his HBR article “Leadership That Gets Results,” this “pacesetting” leadership style “works well when all employees are self-motivated, highly competent, and need little direction or coordination.” But the pacesetting style can also carry a high cost; Goleman notes that it “destroys climate [and] many employees feel overwhelmed by the pacesetter’s demands.”

Instead of simply doing more, sustaining our success as leaders requires us to redefine how we add value. Continuing to rely on our abilities as individual contributors greatly limits what we actually contribute and puts us at a disadvantage to peers who are better able to mobilize and motivate others. In other words, we need to do less and lead more. Sometimes this transition is obvious and dramatic, such as when we’re promoted and obtain our first direct reports or hire our first employees. Suddenly we need to expand our behavioral repertoire to incorporate new leadership styles as a means of influencing others effectively. (“Leadership That Gets Results” provides a useful roadmap here, highlighting the styles that have the greatest positive impact or are used less frequently by managers.)

Subsequent transitions may be more subtle and nuanced, such as when we go from leading front-line staff to leading managers, who themselves must navigate this same transition. A coaching client realized that he was running his company as though he were the “Doer-in-Chief,” and this model of leadership had permeated throughout the organization and was holding everyone back. He revamped his role, delegating almost all of the tasks on his to-do list to his senior managers and had them do the same to their direct reports. Rather than simply creating more work for junior employees, this emphasis on leading rather than doing resulted in greater efficiencies throughout the company. In my client’s words, “We went from being firefighters to being fire marshals,” taking a more strategic approach to the business, redesigning inefficient systems, and solving problems before they became crises.

This emphasis on leading and not merely doing has had a profound impact on management education. In 2010 Dean Garth Saloner of the Stanford Graduate School of Business (where I’m an Instructor) told McKinsey that, “The harder skills of finance and supply chain management and accounting…have become what you think of as a hygiene factor: everybody ought to know this… But the softer skill sets, the real leadership, the ability to work with others and through others, to execute, that is still in very scarce supply.” We expect our students to have solid technical and analytical skills—to be effective doers. But we also expect that within a few years of graduation our students will be managing people who are even more technically and analytically capable than they are—and this requires them to be effective leaders.

Many of my executive coaching clients and MBA students at Stanford are going through a transition that involves a step up to the next level in some way. They’re on the cusp of a big promotion, or they’ve launched a startup, or their company just hit some major milestone. Very few, if any, of these people would say that they’ve “made it”; they’re still overcoming challenges in pursuit of ambitious goals. And yet their current success has created a meaningful inflection point in their careers; things are going to be different from now on. The nature of this difference varies greatly from one person to another, but I see a set of common themes that I think of as “the problems of success.” You can read my first and second posts on “the problems of success.”


Ed Batista (@edbatista) is an executive coach and an Instructor at the Stanford Graduate School of Business. He writes regularly on issues related to coaching and professional development at edbatista.com, he contributed to the HBR Guide to Coaching Your Employees, and is currently writing a book on self-coaching for HBR Press.

Gruen: A unified economic theory of privately provided public goods and social capital

Nicholas delivers a terrific presentation (1hr 7mins) to The Australian Centre for Social Innovation about the private provision of public goods and the subsequent generation of social capital.

It is as crisply considered as it is thought provoking:

  • after all this time, I think I finally understand that economists believe themselves to be the purveyors and arbiters of well-being, and ultimately health – no wonder they’re so suspicious of, and have so much trouble relating to, medicine and health care
  • it would be interesting to apply this prism to health care – I don’t think it would present favourably

The presentation wraps with a 5 min video of the Family by Family program – a compelling sounding intervention that generates an abundance of social capital by developing and then using resources embedded in the community.

Many references to Adam Smith, Hayek and Robert Putnam.

Great to see something not confined to the sub-20min constraint.

Bravo Nicholas!!


http://clubtroppo.com.au/2013/12/20/public-goods-privately-provided-the-video/

 

 

Justin Coleman: The ethical imperative to tackle overdiagnosis and overtreatment

  • Beautifully written, wise piece by a friend and colleague of Gavin Mooney
  • Archie Cochrane humorous anecdote
  • Donald Berwick’s 30% waste JAMA link
  • Futility of spinal fusions
  • Futility of knee arthroscopies
  • Testosterone over-prescribing
  • EBM is a necessary but not sufficient condition for practising good medicine. When my friend Prof Gavin Mooney gave me his book, he explained why he’d called it EBM ‘in its place’.

    He did not want to promote a system of slavish adherence to a deontology. As a leftie health economist—a rare breed indeed—his primary concern was always one of health equity. Not health equality, which is clearly unattainable, but equity, where we strive for equal access to equal care for equal need.

    An equitable health system does not mean trying to give everyone the very best, if by that you mean the most; the most tests, the most expense, the most treatments. Not only will that aspiration require others to miss out on even the second-best treatment, but it too often also actively harms the recipient.

    Gavin was killed in tragic circumstances last Christmas, and I dedicate this article to his memory.

    His philosophy was that, sometimes, less is more. We must pare things back, strip away excesses and judiciously apply what we know works, rather than enthusiastically embrace what we wish would work.

    As a GP, I am a gatekeeper to a most powerful, expensive, superb and dangerous health system and I must never forget that sometimes my job is to shut the gate.

Source: http://blogs.crikey.com.au/croakey/2013/06/23/the-naked-doctor-an-indepth-look-at-the-pitfalls-of-cutting-edge-medicine/

The Naked Doctor: an in-depth look at the pitfalls of “cutting edge” medicine

MELISSA SWEET | JUN 23, 2013 5:15PM

The Naked Doctor is an ongoing project at Croakey that aims to encourage discussion and awareness of the opportunities to do more for health by doing less.

In this latest edition, Dr Justin Coleman suggests that an equitable health system does not mean trying to give everyone the very best, if that means “the most tests, the most expense, the most treatments”.

“Not only will that aspiration require others to miss out on even the second-best treatment, but it too often also actively harms the recipient,” he says.

Perhaps one area where more intervention is needed is in tackling overdiagnosis and overtreatment – Dr Coleman suggests that if the ‘medical market’ is left unchecked, the balance naturally tips towards overtreatment.

He concludes with a powerful call to action:

“As a GP, I am a gatekeeper to a most powerful, expensive, superb and dangerous health system and I must never forget that sometimes my job is to shut the gate.”

The article below is based upon his plenary address to the Qld RACGP Annual Clinical Update in Brisbane last month.

It is dedicated to the late Professor Gavin Mooney, whose philosophy was that we must “judiciously apply what we know works, rather than enthusiastically embrace what we wish would work”.

***

The ethical imperative to tackle overdiagnosis and overtreatment

Justin Coleman writes:

Two years ago my good friend Gavin Mooney gave me a signed copy of his latest (and, as it turned out, last) book, Evidence-Based Medicine in its Place.

Professor of Health Economics at Curtin University, Gavin was an irascible Scot, and his book detailed his work with another great Scotsman, Archie Cochrane, who of course pioneered the science of Evidence-Based Medicine.

According to Mooney, after their first meeting, Cochrane informed him that he had revised his opinion of economists.

On the basis of the evidence of an afternoon with Mooney, he now placed them second bottom, with sociologists at the bottom. This merely confirmed for Mooney that there was much on which they agreed.

Mooney told me the story, repeated in his book, of how Archie Cochrane first gained notoriety as a very junior staff member at the massive Department of Health in London.

The young Archie presented slides from an RCT on outcomes after heart attacks following rehabilitation, either while remaining a hospital inpatient or after early discharge home.

London’s ‘Who’s Who’ of learned physicians nodded sagely as Archie showed the crucial slides where the hospital outcomes—represented in red—outdid the blue columns of home-based outcomes across nearly every parameter. A couple of supportive comments, no questions.

Then the young epidemiologist pretended to look flustered. ‘I’m terribly sorry. I seem to have mixed up the red and the blue!’

He had deliberately switched the labels. All the better outcomes were in fact in the home-based, early discharge group.

Needless to say, chaos ensued as suddenly a hundred disgruntled audience members grilled him on every possible dubious aspect of the study design!

Best practice or common practice?

Until that time, there had been no reason for a London physician to doubt that an intensive, expensive, high-tech hospital stay would improve health outcomes.

It made perfect sense, and a whole bunch of highly intelligent, caring physicians had spent their careers ensuring that such a system existed. Where it wasn’t affordable, public and charity funds were sought to ensure more people could get longer hospital stays.

This was best-practice care, in the same way that bed rest for back pain, monthly breast self-examinations, and antibiotics for sore throats have been understood by clever and well-meaning people to be fairly obvious best care. More about Archie—and Gavin—later.

In the brilliant Mitchell and Webb parody of a Homeopathic Emergency Department, Webb attempts to save a trauma victim’s life by drawing on his palm in pen to extend his life line. He justifies it by asking ‘Have you got a better idea?’

Luckily, the answer is ‘yes’.

There are some things that do work better than a pen mark, or a homeopathic vial of water, even a vial where the water molecules somehow retain the memory of a herb they once knew, while conveniently forgetting they were once flushed down a toilet.

And there are some things that do work better than our mainstream medical interventions, even when tens of thousands of medical practitioners believe they are doing the right thing.

This has always been true, and will ever be so. Our mistakes from the past remind us that we are making mistakes right now. Full credit to all those anonymous doctors and researchers who unwrapped these anomalies.

The art of discovering nothing

History rightly lauds those who discovered ‘something’; Alexander Fleming and penicillin.

But I also dips me lid to those who discovered ‘nothing’. Bloodletting doesn’t work. Arsenic doesn’t work. Keeping kids with polio in hospital back straighteners for six months of their lives doesn’t work.

In many cases, our patients would be better off if we chose not to act.

There’s a minimum standard in the medical profession—not the gold standard, but let’s call it the bronze.

The bronze standard is that the patient is no worse off as a result of seeing us. The bronze standard is probably achieved by enthusiasts who light ear candles and discover people’s chakras. Let’s at least stop doing things which fall below the bronze standard.

We must balance the important and exciting work of discovering new stuff with the un-sexy hard-slog science of analysing those times where we have over-reached and over-enthused.

The best of our medical predecessors started this process and we must continue it; this is why we are a science and not merely a tradition.

Two hundred years ago, the French physician Philippe Pinel cared enough about the damage his colleagues were doing to his psychiatric patients to observe:

“It is an art of no little importance to administer medicines properly: but, it is an art of much greater and more difficult acquisition to know when to suspend or altogether to omit them.”

It took a young epidemiologist, Archie Cochrane, to highlight the flaws in obstetric practice that should ideally have already been obvious to the world’s leading obstetricians and their institutions.

And these were not minor flaws. Obstetrics units in one part of the world were teaching methods which had already been shown in another part of the world to kill women and babies, and vice versa.

Cochrane didn’t do the research himself; his genius was to inspire others—in this case, Iain Chalmers— to collect, collate and analyse all the available evidence and, importantly, reject the shoddy stuff: the anecdote and the meaningless trial, so that obstetricians and their departments could make informed decisions as to how to get the best outcomes.

Archie never delivered a baby nor managed a single maternal complication, but his legacy would probably have saved more lives than any doctor watching his slide presentation in London.

Somewhere on the spectrum

Let’s look at chronic diseases, and use diabetes as an example.

Insulin’s discovery in 1922 was a miracle, which converted the inevitable rapid death sentence of Type 1 diabetes into a chronic disease. Chronic in the best sense of the word, because insulin bought you time; years, decades.

That simple chemical justifiably sits at the high table in the pantheon of superb medical interventions.

But diabetes, like most chronic diseases, has nominal cut-off points which define its existence and degree. Diseases stretch themselves out along a spectrum, blissfully unaware of how we choose to dissect them.

Medical tests and interventions that work brilliantly at the sharp end of the spectrum do not work nearly so well when we slide towards the middle and enter the grey zone.

Any gains to be had here in the land of the long grey cloud are far foggier than anything out at the extreme edge.

Benefits diminish; every diagnostic test becomes less accurate; false positives increase exponentially; patient numbers increase—and with them, costs, pain and inconvenience; health gains are smaller in this less-sick population; and suddenly being diagnosed with a chronic disease such as diabetes or pre-diabetes doesn’t look so good any more.
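The point about false positives is just Bayes’ rule at work; a minimal sketch, where the 90% sensitivity and specificity and the two prevalence figures are illustrative assumptions, not values from any particular diabetes test:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: P(disease | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Sharp end of the spectrum: 1 in 5 of those tested really have the disease.
print(round(ppv(0.90, 0.90, 0.20), 2))  # 0.69 - most positives are real

# Grey zone: only 1 in 50 have it; the same test now mostly cries wolf.
print(round(ppv(0.90, 0.90, 0.02), 2))  # 0.16 - most positives are false
```

Nothing about the test changed between the two lines; only the population did, which is exactly why a test that works at the sharp end misleads in the middle.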

Instead of being grateful to Chronos, the Greek god of time who grants you each extra year of life, suddenly the old bugger expects you to jab your finger three times a day, jab your stomach three times a day, and to roll a boulder up Sisyphus’s mountain just to have your liver pecked out by Prometheus’s eagle.

There comes a point where ignoring your diabetes educator becomes…to continue the theme…tantalising!

If the ‘medical market’ is left unchecked, the balance naturally tips towards overtreatment.

The paradigm promoted by industry, the media and some doctors, particularly the sub-sub-specialists, is that the only important news is a new invention, new drug, robotic surgery, more MRIs.

Is the best doctor always the one at the cutting edge? Is the best endocrinologist for my grandmother the one who has just spent a year in America learning the finer points of subcutaneous insulin infusion pumps?

There exists a cut-off point on every disease spectrum, inevitably ignored by drug companies and often enough by doctors, where medications simply don’t help. At that point, they do nothing. Beyond that point, they actively harm. This is true almost by definition for every medical or surgical intervention.

There is pressure from multiple sources—patient, doctor, pharma, specialist, psychologist, media, disease-awareness campaigns, patient advocacy groups—to nudge this point towards the midline of the spectrum.

This is true for diabetes, but also for depression, ADHD, lipid levels, cardiac stents, and deficiencies of a host of replaceable substances including testosterone, oestrogen, and various vitamins, the trendiest of which is Vitamin D.

If we don’t test early and often for all these problems, we are ignoring our duty of care and if we don’t treat when the test result comes back in red, we are downright obstructive and possibly liable.

Andropause: the new epidemic

Take testosterone. In the past five years we have witnessed a concerted wave of discussion around the andropause. Feature articles have called it the hidden epidemic, hinting at reverse sexism whereby women get their daily oestrogen but our men’s hormones have rights too!

Disease-awareness campaigns, subtle in Australia compared to countries that allow direct-to-patient advertising, ask males if they ever experience tiredness, weakness or low libido. The suggested remedy is to get your levels checked by your friendly local GP. It’s not advertising: it’s just caring.

This tumescent rise in publicity tied in beautifully with the advent of ‘men’s clinics’, whose doctors were the only clinicians with enough spare time to keep up with all the clever new ways of getting the testosterone into your body; oral, patches, gels, suppositories, inhalants; no orifice was left unsullied in the competition to supply Vitamin T.

The result?

PBS expenditure on testosterone has increased 450% since 2006. Patients at the pointy end of the spectrum—men with testicular cancer and orchidectomies—have been swamped by the enormous market of men who are…wait for it…ageing. A bit like what’s happening to the percentage of cancer-sufferers in the opioid market.

Last year, the departing boss of the US Medicare system, Dr Donald Berwick, estimated that 20-30 per cent of US health spending is ‘waste’—as in, it yields no benefit to patients. That wasted quarter of the US health budget exceeds the entire GDP of most countries on the planet.

Berwick listed five reasons for this catastrophic waste, and the first of them was ‘overtreatment’. We are not talking a minor problem here.

Why do we overtest and overtreat?

Let’s look at some causes of overtesting and overtreating. Why do we do it?

Some of it is simply because the evidence doesn’t exist yet.

There was no shame in a medical graduate treating headaches with bloodletting a century ago; no-one knew any better. According to the prevailing understanding of the human body, it made sense and it no doubt appeared to work in some people.

But lack of evidence is not the only reason for our actions.

I like the list prepared by Australian surgeon Dr Skeptic (clearly his parents were prescient when naming him) of the reasons why we act even when evidence tells us ‘Don’t just do something, stand there!’

Defensive medicine: If you miss one rarity and thereby harm one person, this is more likely to land you in court than causing far more harm by routinely overtreating everyone.

It takes an epidemiologist to tell you about the latter, whereas a lawyer will be quite happy to keep you posted about the former.

The language of inaction vs. action: Overinvestigation and overtreatment are very difficult concepts to convey to patients.

If we tell the patient ‘I really don’t know precisely why you have low back pain; would you like me to run a few tests?’ then the answer will be ‘yes’.

Our choice of language suggests that after doing the tests, we will know why they have low back pain. But whether doctor, physiotherapist or chiropractor, ye may ask the gods of radiology but shall not receive an answer.

If we give a glucometer to a person with pre-diabetes, or with diabetes that doesn’t require insulin, we will indeed get an answer as to precisely what their blood sugar is at any given moment, but this knowledge will not actually improve health outcomes.

The answer does not help the patient, therefore we are asking the wrong question.

This flawed logic of Test = Answer = Cure is used by iridologists and scientologists. And doctors.

Influence of recent experience: obstetricians who attend a birth with complications are significantly more likely to recommend a Caesarean section in their next 50 cases, before they settle back into a more sensible, case-by-case evidence-based approach.

The lottery mindset: Few people have a good understanding of risk.

My chances of winning the first division prize in tattslotto this Saturday are the same whether or not I buy a tattslotto ticket. The same. Not absolutely, mathematically, precisely the same, but the same in any meaningful, ordinary sense of the word.

People don’t understand tiny chances. I have more chance of being dead next Saturday than being both alive and collecting my winnings.

Studies consistently show that both doctors and patients, just like gamblers and stockbrokers, overestimate gains and underestimate losses.

People will jump at a whole body CT scan to ‘rule out’ a tiny risk of cancer, and ignore the fact that the radiation from each such scan increases their lifetime cancer risk by about 1%.
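The arithmetic behind the lotto claim is easy to check; a rough sketch, assuming Tattslotto-style division 1 odds of 1 in C(45, 6) and an illustrative 0.2% annual mortality for a middle-aged adult (both figures are assumptions for illustration, not actuarial data):

```python
from math import comb

p_win = 1 / comb(45, 6)          # division 1: 1 in 8,145,060 (choose 6 of 45)
p_die_by_saturday = 0.002 / 52   # assumed 0.2% annual mortality, spread weekly

p_alive_and_collecting = (1 - p_die_by_saturday) * p_win

print(f"{p_win:.2e}")                # ~1.23e-07
print(f"{p_die_by_saturday:.2e}")    # ~3.85e-05
# Dying this week is a few hundred times more likely than winning:
print(p_die_by_saturday / p_alive_and_collecting > 100)  # True
```

Under these assumptions, being dead by Saturday is roughly 300 times more likely than being alive and collecting the prize, which is the sense in which the odds are “the same” with or without a ticket.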

The prevailing wisdom: Medical students come out of university knowing thousands of new words and knowing about thousands of new interventions. The consultants taught us all the pharmaceutical and surgical interventions in their own specialised area of expertise.

But it’s not really anyone’s job to teach you about how to avoid patient referrals into the system; how to stop the cascade before it starts.

A recent Australian study showed that half of all IV cannulas inserted in ED are never used. Why does every junior ED doctor put the IVs in? Because everyone else has always put them in.

When ‘more’ is harmful

If a junior doctor is trained in breast surgery outpatients and has met women whose cancer was detected by screening mammogram, it takes some active un-training not to assume that therefore all women are better off having a mammogram.

When I was a student, my consultant orthopod took the time to kindly explain the intricacies of spinal fusion and of arthroscopic debridement for osteoarthritic knees, and I think he probably mentioned that ‘some patients don’t seem to gain as much as others’.

However, this is a starkly different prevailing wisdom from the reviews that have shown that neither spinal fusions nor arthroscopies for osteoarthritic knees differ much from placebo. In the US alone, 650,000 such arthroscopies were performed each year in the late 1990s.

Ironically, sometimes the richer you are, with more access to the private system and doctors who will cut corners for you, the more intervention you get and the more harm is done.

The extreme of this is the Hollywood celebrity with their own physician on call, who would feel like a fool telling his client that for their thousand-dollar callout fee they get absolutely nothing except ‘watch and wait’.

When Michael Jackson went to his umpteenth plastic surgeon, she didn’t say ‘no’. When he complained he was getting anxious and couldn’t sleep and needed something more than light sleeping tablets, I bet Dr Conrad Murray now wishes he had opted for conservative management.

Making the system work

I believe it is our ethical responsibility to avoid overtreatment at an individual level, and also to support system-wide changes in the way we spend money on health.

I am no slave to evidence-based medicine; not one of those sceptical EBM types who eat gruel for breakfast and secretly believe deep down that nothing works. Although, on a bad day this pessimism reaches its ultimate fruition—absolutely nothing I do works!

EBM is a necessary but not sufficient condition for practising good medicine.

When my friend Prof Gavin Mooney gave me his book, he explained why he’d called it EBM ‘in its place’.

He did not want to promote a system of slavish adherence to a deontology. As a leftie health economist—a rare breed indeed—his primary concern was always one of health equity. Not health equality, which is clearly unattainable, but equity, where we strive for equal access to equal care for equal need.

An equitable health system does not mean trying to give everyone the very best, if by that you mean the most; the most tests, the most expense, the most treatments. Not only will that aspiration require others to miss out on even the second-best treatment, but it too often also actively harms the recipient.

Gavin was killed in tragic circumstances last Christmas, and I dedicate this article to his memory.

His philosophy was that, sometimes, less is more. We must pare things back, strip away excesses and judiciously apply what we know works, rather than enthusiastically embrace what we wish would work.

As a GP, I am a gatekeeper to a most powerful, expensive, superb and dangerous health system and I must never forget that sometimes my job is to shut the gate.

• Dr Justin Coleman is a GP at Inala Centre of Excellence in Aboriginal and Torres Strait Islander Health. He is senior lecturer at Griffith University and University of Queensland, and President of the Australasian Medical Writers Association (AMWA). Twitter: @drjustincoleman. Web: http://drjustincoleman.com/

• You can read more about Naked Doctor here and this Croakey page has been established as a memorial to Professor Gavin Mooney.

WHO leaks sugar report – industry will be displeased

  • WHO will recommend sugar be limited to 5 teaspoons per day or 5% of total calories (current AU consumption around 35-45 teaspoons)

Source: http://www.raisin-hell.com/2014/01/the-world-health-organisation-has-taken.html

Sunday, January 5, 2014

The World Health Organisation has taken a tough stand on sugar. It’s about time we listened.

Last week the WHO (World Health Organization) leaked a draft report about sugar. The report will tell the world’s health authorities that they should be severely limiting the amount of sugar we all eat. It will recommend that we consume no more than 5 teaspoons of sugar a day. Given the average Australian is putting away somewhere closer to 35-45 teaspoons a day, it’s a very big call indeed.
The WHO is the health policy unit of the United Nations. Its aim is to provide evidence-based leadership on health research. It is well funded, free from corporate influence and motivated entirely by a desire to ensure that UN member countries get the best possible, evidence-based health advice. The WHO doesn’t run a Tick program or receive sponsorship from the processed food industry. Indeed, it has recently taken the extraordinary step of banning one ‘research’ group sponsored by industry from participating in its decision-making processes.

Srinath Reddy, a cardiologist and member of the WHO panel of experts, told the Sunday Times the WHO is moving on sugar because “There is overwhelming evidence coming out about sugar-sweetened beverages and other sugar consumption links to obesity, diabetes and even cardiovascular disease.”

The worldwide burden of those diseases is accelerating very quickly. According to a new report out this week, the number of overweight and obese people in the developing world has quadrupled since 1980.

A billion people in the developing world are now on the chronic disease express. But don’t worry, we still win. Less than a third of the population in China and India is overweight, compared to our two-thirds or more. They are just starting to get the hang of this Western Diet Thingy, so expect very big rises in the very near future.

The WHO have looked dispassionately at the evidence and have seen the tsunami of human misery caused by sugar coming for more than a decade. They publicly warned that sugar was strongly implicated in obesity, type II diabetes, hypertension and heart disease in 2003.

They then took the extraordinary step of telling member governments that they should ensure their populations limited sugar consumption to a maximum of 10% of total calories (around 10 teaspoons of sugar a day, the same amount you would find in a Coke or a large apple juice). They did this despite an overt and vicious public campaign conducted by the food industry.

The US sugar lobby demanded that the US Congress end its $406 million funding of the WHO. This is the same WHO that co-ordinates global action against epidemics like HIV, bird flu and SARS. But the US food industry wanted it destroyed because it dared to suggest we eat less sugar.

The lobbying behind the scenes was even more ruthless. Derek Yach, the WHO Executive Director who drove the sugar-reduction policy work, told a British documentary crew in 2004 that millions were spent trying to torpedo the policy. US Senators wrote directly to the WHO threatening its very existence. They also threatened the Food and Agriculture Organisation (a sister UN agency concerned with food production) with a cut in funding.

In the end the food industry campaign paid off. The WHO removed the 10% recommendation from the final text. It was watered down to a suggestion that people ‘cut the amount of sugar in the diet’.

As one of the people involved at the time, Professor Philip James, Chairman of the International Obesity Taskforce, predicted: “we’ll end up with nice little policies telling [us] to have ‘just a bit less sugar and a little more balanced diet’, the nonsense that’s gone on since the Second World War, during which time we’ve had this vast epidemic of heart disease, diabetes and obesity.”

Even the briefest glance at the official dietary guidance on sugar in Australia or the UK will tell you Professor James wasn’t far from the mark. Our guidelines are stuffed with words like ‘moderation’ and ‘balanced diet’ when it comes to sugar.

But the thing about evidence is, it doesn’t go away. And in the 10 years since the WHO last tried to save us from sugar, the evidence has become overwhelming (to quote Dr Reddy).

The WHO got a serious kicking when it tried to suggest a 10-teaspoon upper limit on sugar consumption, so you can imagine that the evidence it has reviewed must be truly overpowering to have it step up to the plate again. But this time they want the limit to be 5% (5 teaspoons) or less. I hope they’re wearing their flak jackets, because I suspect a whole heap of blood money from the processed food industry is pouring into ‘lobbyists’ pockets as we speak.

The WHO is not running down sugar because it hates sugar farmers. It is not doing it because it likes getting mauled by the US Government (and its sponsors). It’s doing it because we will all suffer immensely if we don’t act on its advice.

I don’t know if the WHO can withstand the punishment they are about to receive. And I have no confidence that their recommended limit will make it through the firestorm of food industry sponsored ‘science’ which will suddenly surface. But I do know that when good people decide the evidence is so powerful that they should say it anyway, then the rest of us had better be bloody listening.
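The relationship between the percentage limits and the teaspoon counts quoted above is simple arithmetic; a rough sketch, assuming 4 g of sugar per teaspoon and 4 kcal per gram (both conversion factors are illustrative assumptions, and the daily energy intake is the free variable):

```python
def teaspoon_limit(daily_kcal, fraction=0.05, kcal_per_gram=4, grams_per_tsp=4):
    """Teaspoons of sugar allowed if sugar may supply `fraction` of daily energy."""
    grams = daily_kcal * fraction / kcal_per_gram
    return grams / grams_per_tsp

print(teaspoon_limit(2000))  # 6.25 - a 2,000 kcal diet allows about 6 teaspoons
print(teaspoon_limit(1600))  # 5.0  - the 5-teaspoon figure implies ~1,600 kcal
```

Either way, the limit sits at five or six teaspoons a day, an order of magnitude below the 35-45 teaspoons the average Australian is reportedly consuming.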

Handler: Big data without the entry

  • the inconvenience of data entry stops tools being used by doctors
  • computer-assisted physician documentation (CAPD) can alleviate the problem
  • natural language understanding (NLU) converts sentences into codes
  • One of the most exciting promises of Big Data combined with advanced technologies like NLU is its ability to “bootstrap” — to identify important missing data and either grab it from another source, infer it from existing data, or prompt the clinician to add it during the normal course of documentation.

Source: http://www.wired.com/insights/2013/12/big-data-insights-without-big-data-entry/

Big Data Insights Without Big Data Entry

  • BY DR. JONATHAN HANDLER, M*MODAL
  • 12.23.13
  • 2:03 PM


My brand new Xbox One has a new feature — its Kinect camera recognizes users’ faces and automatically logs them in. The new feature saves me nine clicks and 45 seconds compared to my old Xbox 360. Microsoft correctly recognized that people are too busy for their videogames to turn them into data entry clerks.

It’s the same thing for doctors. Doctors can use Big Data algorithms to help provide better patient care, but those algorithms are very data hungry. Who will enter all that data? Patients already wait weeks to be seen and then sit in the waiting room despite an appointment. The last thing we want is doctors doing even more data entry that further delays care. At the same time, we want computers to provide great decision support to doctors.

Can we realize the benefits of Big Data without suffering the pain of Big Data entry? Using modern technologies, this may be possible.

In 1998, my colleagues and I built some of the earliest online decision-support tools, and within a few years more than 50 were freely available. Although they were very useful, I used those tools only occasionally in my clinical practice, because most required me to enter lots of data before providing any help in return. For example, one score for predicting the likelihood that a patient will die requires the user to enter 17 pieces of data, two of which are formulas that must be calculated. Big Data can speed up the development of decision support tools and create more accurate algorithms.

However, those algorithms require lots of data. For example, an algorithm for diagnosing heart attacks was developed using 156 data elements, and the final model used 40 data elements. Entering all that data is a lot to ask from physicians trying to manage an already overcrowded emergency department.

Over the last decade, it has been well documented that tools demanding significant data entry were felt to make doctors less efficient and were seen as less useful. Not surprisingly, tools requiring doctors to do lots of data entry were used less often. Although automatically populating these tools with existing data from the Electronic Health Record (EHR) seems the obvious answer, it has real challenges.

Sometimes it is not clear which data to enter. For example, there may be multiple blood pressures documented and it may not be obvious to the computer which one should be used. In other cases, the needed inputs might not be recorded exactly in the right format or with the right details, or the inputs may be completely missing.

New technologies might alleviate the problem. Computer-assisted physician documentation (CAPD) uses real-time natural language understanding (NLU) to convert the clinician’s sentences into computer-readable codes.

Those codes automatically populate decision support algorithms executed by rules engines, and seamlessly display the result. Caregivers are no longer forced to re-document the same information in checkboxes and manually run the tools. The rules engines automatically identify when a required input is missing, and can immediately prompt the clinician to add it if warranted. This enables real-time decision support, so that clinicians can provide better care.
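As a minimal sketch of that flow: here the function names, the toy “NLU output” dict, and the choice of CHADS2 (a real but deliberately simple stroke-risk score) are all illustrative stand-ins for the 40-element models the article describes:

```python
# Toy rules engine: a rule declares its required inputs; anything the NLU step
# could not extract from the note becomes a real-time prompt to the clinician
# instead of a form to fill in later.

CHADS2_INPUTS = ["chf", "hypertension", "age", "diabetes", "prior_stroke_tia"]

def chads2(d):
    # 1 point each for CHF, hypertension, age >= 75 and diabetes;
    # 2 points for prior stroke/TIA.
    return (int(d["chf"]) + int(d["hypertension"]) + int(d["age"] >= 75)
            + int(d["diabetes"]) + 2 * int(d["prior_stroke_tia"]))

def evaluate(extracted):
    missing = [k for k in CHADS2_INPUTS if k not in extracted]
    if missing:
        return {"prompt_for": missing}   # surface to the clinician in-flow
    return {"chads2": chads2(extracted)}

# Pretend the NLU step coded "78yo with hypertension and diabetes" as:
note = {"age": 78, "hypertension": True, "diabetes": True}
print(evaluate(note))   # {'prompt_for': ['chf', 'prior_stroke_tia']}

note.update({"chf": False, "prior_stroke_tia": False})
print(evaluate(note))   # {'chads2': 3}
```

The point of the sketch is the shape of the loop, not the score: the engine runs on whatever the documentation already contains and asks only for what is genuinely missing.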

CAPD becomes even more interesting when the rules engines start doing even more work on behalf of the clinician. For example, in addition to auto-populating decision support rules, the system might even save time by auto-generating some of the doctor’s notes, auto-generating billing and regulatory reporting codes, and automatically building a provisional set of orders based on the doctor’s notes.

One of the most exciting promises of Big Data combined with advanced technologies like NLU is its ability to “bootstrap” — to identify important missing data and either grab it from another source, infer it from existing data, or prompt the clinician to add it during the normal course of documentation. With these technologies in place, we can reap the value of Big Data without paying the heavy cost of Big Data entry. The result will be better and faster healthcare for all.

Dr. Jonathan Handler is the Chief Medical Information Officer at M*Modal.

RAND: Impact of food environment

  • studies prove that people’s choices are heavily influenced by the setting, context, framing, and characteristics of the environment in which they make food purchasing decisions.
  • In the early 1980s, manufacturers discovered that how their products were marketed in stores was among the most important factors in influencing the buying habits of consumers. That fueled an acceleration in the practice of buying supermarket shelf space, a deal in which retailers give preferred placement to the products of wholesalers who pay for it. The ends of aisles, near the check out lines and stand-alone floor displays are choice product locations. This is how that Santa cutout ends up hawking candy canes in the middle of the produce section. People are very sensitive to such displays. As a consequence, purchases from these locations are between two and five times higher than when the same items are placed elsewhere. The products displayed in this way comprise an estimated 30 percent of all supermarket sales and provide the largest profits for manufacturers.
  • Lame remedies then follow…

‘Tis the Season to Be Wary

Deborah Cohen | December 23, 2013

During the holiday season it’s more important than ever that consumers consider the fundamental force driving the obesity epidemic in America: the tsunami of novel strategies used to market food. When shopping for holiday food, keep in mind that the treats being proffered by that smiling, life-sized Santa cutout in the aisle of your favorite supermarket may not be the healthiest gift for you and your waistline.

During the holiday season, a time when overindulgence is a tradition for many, food marketing creates especially serious challenges for people trying to limit their intake and make careful decisions about healthier eating. Walk through any supermarket or big-box store this time of year and it’s impossible not to be confronted with promotions for fatty appetizers and snacks, processed cookies and cakes, holiday-themed sugary drinks and cereals, and super-sized chocolates and candy canes.

To be sure, it is ultimately up to individuals whether to reach for that highly processed treat that is all but devoid of nutritional value. Yet the common belief that everyone has the capacity to consciously and independently control what they buy or how much and what they eat is challenged by studies that have proven that people’s choices are heavily influenced by the setting, context, framing, and characteristics of the environment in which they make these decisions. This is a problem all year, of course, but it becomes even more difficult to resist in-store temptation when it is bathed in images of holiday good cheer.

Food purchasing environments are controlled by the food industry, whose goal, like all other businesses, is to increase profits. And the food industry is free to craft a seasonal marketing environment that portrays poor nutritional choices as cherished holiday traditions without regard for the consequences on consumers’ health.

In the early 1980s, manufacturers discovered that how their products were marketed in stores was among the most important factors in influencing the buying habits of consumers. That fueled an acceleration in the practice of buying supermarket shelf space, a deal in which retailers give preferred placement to the products of wholesalers who pay for it. The ends of aisles, the areas near the checkout lines, and stand-alone floor displays are choice product locations. This is how that Santa cutout ends up hawking candy canes in the middle of the produce section.

People are very sensitive to such displays. As a consequence, purchases from these locations are between two and five times higher than when the same items are placed elsewhere. The products displayed in this way account for an estimated 30 percent of all supermarket sales and provide the largest profits for manufacturers. They also disproportionately feature highly processed, low-nutrient, “value added” products — the worst for your health. People typically do not recognize that placement figures in their selection of such products, and instead tend to blame themselves when their holiday shopping trip yields enough fat and sugar to swell even Santa’s ample waistline.

With increasing demand from manufacturers for this premium shelf space, supermarkets have grown larger and larger. The growing variety of products, especially during the holidays, can lead people to resort to a type of cognitive processing that relies on mental shortcuts instead of thoughtful decisions. This can lead to impulsive, poor choices based on superficial characteristics like appearance, pricing, and salience. Thus, the modern supermarket is an environment that increases the risk of chronic diseases all year, but especially now.

Unless we grow our own food, we humans have a limited capacity to avoid exposure to these risk factors. The burden on individuals to keep up their guard, to be wary, and to actively resist an overwhelming food environment has become more than most of us can bear. If we really want to help consumers achieve their long-term goals of controlling their weight and eating a diet that won’t lead to heart disease or diabetes, we need solutions that won’t force people to work so hard.

So how do we make it easier? We need very specific consumer research on how to place products in stores so they don’t overwhelm consumers. Maybe we should segregate all the foods known to increase the risk of chronic diseases from the foods that don’t. Then people who want to limit their exposure can do so, and those who don’t will still be able to choose what they want. Maybe we should set limits on which products can be placed in salient promotional displays. Would consumers feel that their rights had been abridged if they had to travel to the back of the store to get candy and soda, but could find skim milk right up front?

Ordinarily our society does not tolerate flawed designs or business practices that increase the risk of illness or injury. We should no longer accept food marketing practices that undermine our health. As the most important consumer season gets underway, we need to start mitigating these factors if we want better health in 2014.


Deborah A. Cohen, M.D., is a senior natural scientist at the RAND Corporation and the author of the forthcoming book, A Big Fat Crisis: The Hidden Influences Behind the Obesity Epidemic — and How We Can End It.

This commentary appeared on The RAND Blog on December 23, 2013.