Category Archives: healthcare

US Healthcare Price Transparency

An interesting observation – unintended consequence of non-universal healthcare?: Consumers are being asked to pay more, so they’re trying to become better health-care shoppers.

  • states have passed transparency laws
  • medicare has started to dump raw service cost data
  • private firms are developing their own transparency tools
  • a report recommends:
    • total estimated price
    • out-of-pocket costs
    • patient safety and clinical outcome data

“Care providers, employers and health plans have negotiated rates, which isn’t necessarily something they want out in public. They warn that making those negotiated rates public could actually discourage negotiations for lower prices — naturally, there are conflicting opinions on this point.”

 

http://www.washingtonpost.com/blogs/wonkblog/wp/2014/04/16/price-transparency-stinks-in-health-care-heres-how-the-industry-wants-to-change-that/

Price transparency stinks in health care. Here’s how the industry wants to change that.

By Jason Millman Updated: April 16

There’s been much written in the past year about just how hard it is to get a simple price for a basic health-care procedure. The industry has heard the rumblings, and now it’s responding.

About two dozen industry stakeholders, including main lobbying groups for hospitals and health insurers, this morning are issuing new recommendations for how they can provide the cost of health-care services to patients.

The focus on health-care price transparency — discussed in Steven Brill’s 26,000-word opus on medical bills for Time last year — has intensified, not surprisingly, as people are picking up more of the tab for their health care. Employers are shifting more costs onto their workers, and many new health plans under Obamacare feature high out-of-pocket costs.

The health-care industry has some serious catching up to do on the transparency front. States have passed their own health price transparency laws, Medicare has started to dump raw data on the cost of services and what doctors get paid, and private firms have developed their own transparency tools.

“We need to own this as an industry. We need to step up,” said Joseph Fifer, president and CEO of the Healthcare Financial Management Association, who coordinated the group issuing the report this morning. The stakeholder group includes hospitals, consumer advocates, doctors and health systems.

Their recommendations delineate who in the health-care system should be responsible for providing pricing information and what kind of information to provide depending on a person’s insurance status. Just getting the different stakeholders on the same page was difficult enough in the past, said Rich Umbdenstock, president and CEO of the American Hospital Association.

“We couldn’t agree on whose role was what. We were using terms differently,” he said.

The report’s major recommendations include how to provide patients with:

  • the total estimated price of the service
  • a clear indication of whether the provider is in-network or where to find an in-network provider
  • a patient’s out-of-pocket costs
  • and other relevant information, like patient safety scores and clinical outcomes.

“I think that the focus now, unlike three years ago when it was on access, the focus is about affordability,” said Karen Ignagni, president and CEO of America’s Health Insurance Plans. “What are the prices being charged? It leads consumers to want to know, ‘How do I evaluate all that?'”

To give a sense of just how murky health pricing can be, one of the group’s recommendations is for providers to offer uninsured patients their estimated cost for a standard procedure and to make clear how complications could increase the price. You would think that shouldn’t be too hard — there’s no insurer to deal with, no contracts to consult.

But previous research points out just how difficult it can be to get the price for a basic, uncomplicated procedure. In a study published this past December, researchers found that just three out of 20 hospitals could say how much an uninsured person should expect to pay for a simple test measuring heartbeat rate.

The group’s recommendations also touch on limits to transparency and the “unintended consequences” of too much data being public. Care providers, employers and health plans have negotiated rates, which isn’t necessarily something they want out in public. They warn that making those negotiated rates public could actually discourage negotiations for lower prices — naturally, there are conflicting opinions on this point.

The report nods to other ways of achieving transparency. For example, it talks about “reference pricing” in self-funded employer health plans, in which employers limit what they’ll pay for an employee’s health-care services — thus setting the reference price.

“The employer communicates to employees a list of the providers who have agreed to accept the reference price (or less) for their services. If an employee chooses a provider who has not accepted the reference price, the employee is responsible for the amount the provider charges above the reference price,” the report reads, noting that Safeway grocery stores implemented a successful pilot program that expanded a few years ago.
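(An illustrative aside, not part of the report: the reference-pricing arithmetic described in that passage can be sketched in a few lines. The function name and all dollar figures below are hypothetical.)

```python
# Hypothetical illustration of reference pricing: the employee owes whatever the
# provider charges above the reference price, plus any coinsurance on the covered part.

def employee_responsibility(provider_charge: float,
                            reference_price: float,
                            coinsurance_rate: float = 0.0) -> float:
    excess = max(provider_charge - reference_price, 0.0)
    covered = min(provider_charge, reference_price)
    return excess + covered * coinsurance_rate

# A plan sets a $1,500 reference price for a routine procedure.
print(employee_responsibility(provider_charge=2100, reference_price=1500))  # 600.0
print(employee_responsibility(provider_charge=1400, reference_price=1500))  # 0.0
```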

Perhaps what’s most significant about these recommendations is the stakeholders’ acknowledgement that the health-care market is changing. Consumers are being asked to pay more, so they’re trying to become better health-care shoppers.

AHIP’s Ignagni said most insurers already provide cost calculator tools and quality data on their Web sites. Providers, said the AHA’s Umbdenstock, need to be more accommodating to patients’ price-sensitivity.

“‘We can’t answer your question’ may have worked in the past, but it doesn’t fly any longer,” said Mark Rukavina, principal with Community Health Advisors and a report contributor. “This [report] basically lays out the principles for creating a new response to the question.”


Katz on managing severe obesity

good, balanced diatribe..

http://www.linkedin.com/today/post/article/20140408142414-23027997-severe-obesity-let-em-eat-kale

Severe Obesity? Let ‘Em Eat Kale!

The tale of aristocratic indifference on the part of Marie Antoinette, Queen of France at the time of the French Revolution, wife of Louis XVI, is, we now know, likely apocryphal. Still, like many historical distortions, this one reverberates through modern culture just the same, and harbors meaning as archetype, if not as reliably archived fact. You no doubt know the tale:

The peasants were starving and had no bread. Marie allegedly suggested: “let them eat cake!”

We find a modern day analogue in the advice dispensed by foodie elite who suggest that the masses should just eat “real” food. The definition of “real” is generally left open to interpretation- but of course, Marie never said what kind of cake, either.

The connotations of “real” are clear enough: pure, unpackaged foods; those icons of nutritional virtue about which the wholesome truth is so self-evident that ingredient lists and nutrition fact panels are superfluous. Wild salmon comes to mind. And broccoli, presumably organic. And fresh berries.

In other words, since the people have no whole-grain bread: let ‘em eat kale!

Now, frankly, I’m quite partial to kale. And, for that matter, the potentially even more nutritious fiddlehead ferns. But I have a real antipathy for fiddling around, or issuing jejune exhortations, while Rome is burning. And burning, it is.

For those inclined to celebrate the recent and radically distorted ping about childhood obesity rates ‘plummeting,’ came this week’s predictably countervailing pong: they have not plummeted after all. More importantly, the most recent paper on childhood obesity trends shows that severe obesity is rising disproportionately.

That’s worth reiterating: whatever is happening to overall obesity rates, rates of severe obesity are rising briskly in children. Prior research had already indicated that was true in both children and adults, so speaking of cake, this is really just icing on what was already well baked. But we seemed in need of a timely reminder.

Fundamentally, this means that it may no longer help us much to ask and answer: how many Americans are overweight or obese? That number, or percentage, may now be level and rather uninteresting, if only because it is pressed up against the ceiling. To gauge the severity of hyperendemic obesity in our culture, we may now need to ask: how overweight and obese are the many?

The answer, ever more often, is: severely.

That severe obesity rates are rising steadily and perhaps steeply has two flagrant implications. The first is that we are not doing nearly enough at the level of our culture to make eating well, being active, and thereby controlling weight the prevailing norm. These two behaviors and one outcome remain exception rather than rule, costing us dearly- in every currency that matters, human potential above all.

The second implication is that we need good treatments for severe obesity, since it is already well established among us.

I have first hand experience with severe obesity, in adults and kids alike. Unlike garden-variety weight gain, severe obesity generally occurs in the context of diverse hardships. Sometimes, there is the duress of a dysfunctional family dynamic. Sometimes there is an underlying mental health problem. Sometimes the propagating factors are preferentially, if not exclusively, socioeconomic: a rough neighborhood, with lack of access to “real” food and recreational opportunities, and the inevitable clustering of fast food franchises. That latter peril makes me think of wolves surrounding the most vulnerable member of a herd. Almost inevitably, there is ridicule, disparagement, and disadvantage; the literal, daily addition of insult to injury.

Bariatric surgery is effective treatment for severe obesity, and I have long advocated strenuously that it should be available, and reimbursable, for all who truly need it. But meaning no disrespect to the surgeons who provide or patients who receive it, it’s a rather poor option and should be a last resort, not a first, especially for children. The surgery is potentially major, and thus encumbered by all of the customary risks. The long-term effects are far from perfect, and substantially unknown for children. The monetary costs are apt to be unmanageable if this becomes the “go to” solution for an increasingly prevalent problem.

And most importantly: nobody learns anything under general anesthesia. The root causes of severe obesity are not addressed with scalpels. There is no way to share the benefits of a redirected gastrointestinal tract. In contrast, “skillpower” can be shared. A systematic effort to empower those most in need with the skills and resources needed to eat well, be active, lose weight, and find health- physical and mental- would allow for paying it forward, to family and friends, and the next generation. The good of surgery is contained within a body. The good of propagating skills and resources for healthy living reverberates throughout the body politic.

My friend David Freedman, the highly accomplished health journalist, and I have had a spirited and fairly public exchange on the topic of “getting there” from here. When Mr. Freedman suggested that better junk food could be part of the answer, I protested: anything that is genuinely part of the solution is, by definition, no longer junk. When I emphasized the importance of knowing what dietary pattern is best for health, Mr. Freedman parried back that I might be diverting attention from the critical need to pave a way of getting there from here, accessible in particular for those currently most forestalled.

But in the end, our private exchanges indicated that our public argument was mostly smoke and just about no fire. We both agree that we can’t have good diets supporting good health if we don’t acknowledge we know what a good diet is. And we both agree that knowing that “real” food is good does just about nothing to help modify and improve the diets and health of real people.

For that, we need an expansive cultural commitment; a movement; perhaps even a revolution. We need approaches to severe obesity that don’t just fix it after it happens. Big Surgery and Big Pharma may be beneficiaries of this, but the rest of us will be in one helluva fix. The better way is introducing innovative solutions that confront it at its origins and spread of their own accord.

We need to reorient our cultural attitude about obesity so it is not an excuse to argue the respective merits of personal responsibility and public policy. Rather, if we are to fix it at its origins, we need to acknowledge that people who are empowered are most capable, and most inclined, to exercise responsibility. So let’s build it, and see what comes.

We can, and should, empower people to trade up the food choices they are already making. Better chips may not satisfy the purists, but the evidence is in hand that improving food choices- even among the homely fare that comes in bags, boxes, bottles, jars, and cans- adds up to make a truly important difference for populations and individuals alike. This can be done without spending more money, urban legend to the contrary notwithstanding. Still, we could likely accomplish far more by combining nutrition guidance systems with financial incentives that encourage their use.

Among such approaches, too, are community and New-Age approaches to gardening that might even allow many more of us to grow our own kale- and perhaps fiddlehead ferns.

But “let ‘em eat kale” simply won’t do. It’s fatuous, unrealistic, elitist nonsense. It’s fiddling around. And all the while, Rome burns.

-fin

Dr. David L

an idea of earth shattering significance

ok.

been looking for alignment between a significant industry sector and human health. it’s a surprisingly difficult alignment to find… go figure?

but I had lunch with joran laird from nab health today, and something amazing dawned on me, on the back of the AIA Vitality launch.

Life (not health) insurance is the vehicle. The longer you pay premiums, the more money they make.

AMAZING… AN ALIGNMENT!!!

This puts the pressure on prevention advocates to put their money where their mouth is.

If they can extend healthy life by a second, how many billions of dollars does that make for life insurers?

imagine, a health intervention that doesn’t actually involve the blundering health system!!?? PERFECT!!!

And Australia’s the perfect test bed given the opt out status of life insurance and superannuation.

Joran wants to introduce me to the MLC guys.

What could possibly go wrong??????

Flu Trends fails…

  • “automated arrogance”
  • big data hubris
  • At its best, science is an open, cooperative and cumulative effort. If companies like Google keep their big data to themselves, they’ll miss out on the chance to improve their models, and make big data worthy of the hype. “To harness the research community, they need to be more transparent,” says Lazer. “The models for collaboration around big data haven’t been built.” It’s scary enough to think that private companies are gathering endless amounts of data on us. It’d be even worse if the conclusions they reach from that data aren’t even right.

But then this:
http://www.theatlantic.com/technology/archive/2014/03/in-defense-of-google-flu-trends/359688/

 

http://time.com/23782/google-flu-trends-big-data-problems/

Google’s Flu Project Shows the Failings of Big Data


A new study shows that using big data to predict the future isn’t as easy as it looks—and that raises questions about how Internet companies gather and use information

Big data: as buzzwords go, it’s inescapable. Gigantic corporations like SAS and IBM tout their big data analytics, while experts promise that big data—our exponentially growing ability to collect and analyze information about anything at all—will transform everything from business to sports to cooking. Big data was—no surprise—one of the major themes coming out of this month’s SXSW Interactive conference. It’s inescapable.

One of the most conspicuous examples of big data in action is Google’s data-aggregating tool Google Flu Trends (GFT). The program is designed to provide real-time monitoring of flu cases around the world based on Google searches that match terms for flu-related activity. Here’s how Google explains it:

We have found a close relationship between how many people search for flu-related topics and how many people actually have flu symptoms. Of course, not every person who searches for “flu” is actually sick, but a pattern emerges when all the flu-related search queries are added together. We compared our query counts with traditional flu surveillance systems and found that many search queries tend to be popular exactly when flu season is happening. By counting how often we see these search queries, we can estimate how much flu is circulating in different countries and regions around the world.

Seems like a perfect use of the 500 million plus Google searches made each day. There’s a reason GFT became the symbol of big data in action, in books like Kenneth Cukier and Viktor Mayer-Schonberger’s Big Data: A Revolution That Will Transform How We Live, Work and Think. But there’s just one problem: as a new article in Science shows, when you compare its results to the real world, GFT doesn’t really work.

GFT overestimated the prevalence of flu in the 2012-2013 and 2011-2012 seasons by more than 50%. From August 2011 to September 2013, GFT over-predicted the prevalence of the flu in 100 out of 108 weeks. During the peak flu season last winter, GFT would have had us believe that 11% of the U.S. had influenza, nearly double the CDC numbers of 6%. If you wanted to project current flu prevalence, you would have done much better basing your models off of 3-week-old data on cases from the CDC than you would have using GFT’s sophisticated big data methods. “It’s a Dewey beats Truman moment for big data,” says David Lazer, a professor of computer science and politics at Northeastern University and one of the authors of the Science article.
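(An illustrative aside: the kind of back-test described above, comparing weekly model estimates against surveillance figures, can be sketched as follows. Every number here is invented; only the 100-of-108-weeks result comes from the article.)

```python
# Hypothetical weekly flu-prevalence estimates vs. reported surveillance data;
# the real comparison in the Science paper ran over 108 weeks of CDC figures.
gft_estimates = [0.045, 0.060, 0.080, 0.110, 0.095, 0.070]  # model estimates (fraction of population)
cdc_reported  = [0.030, 0.041, 0.052, 0.060, 0.055, 0.048]  # surveillance data (fraction of population)

over_predicted = sum(g > c for g, c in zip(gft_estimates, cdc_reported))
mean_overshoot = sum(g / c - 1 for g, c in zip(gft_estimates, cdc_reported)) / len(cdc_reported)

print(f"Weeks over-predicted: {over_predicted} of {len(cdc_reported)}")
print(f"Average overshoot: {mean_overshoot:.0%}")
```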

Just as the editors of the Chicago Tribune believed they could predict the winner of the close 1948 Presidential election—they were wrong—Google believed that its big data methods alone were capable of producing a more accurate picture of real-time flu trends than old methods of prediction from past data. That’s a form of “automated arrogance,” or big data hubris, and it can be seen in a lot of the hype around big data today. Just because companies like Google can amass an astounding amount of information about the world doesn’t mean they’re always capable of processing that information to produce an accurate picture of what’s going on—especially if it turns out they’re gathering the wrong information. Not only did the search terms picked by GFT often not reflect incidences of actual illness—thus repeatedly overestimating just how sick the American public was—it also completely missed unexpected events like the nonseasonal 2009 H1N1-A flu pandemic. “A number of associations in the model were really problematic,” says Lazer. “It was doomed to fail.”

Nor did it help that GFT was dependent on Google’s top-secret and always changing search algorithm. Google modifies its search algorithm to provide more accurate results, but also to increase advertising revenue. Recommended searches, based on what other users have searched, can throw off the results for flu trends. While GFT assumes that the relative search volume for different flu terms is based in reality—the more of us are sick, the more of us will search for info about flu as we sniffle above our keyboards—in fact Google itself alters search behavior through that ever-shifting algorithm. If the data isn’t reflecting the world, how can it predict what will happen?

GFT and other big data methods can be useful, but only if they’re paired with what the Science researchers call “small data”—traditional forms of information collection. Put the two together, and you can get an excellent model of the world as it actually is. Of course, if big data is really just one tool of many, not an all-purpose path to omniscience, that would puncture the hype just a bit. You won’t get a SXSW panel with that kind of modesty.

A bigger concern, though, is that much of the data being gathered in “big data”—and the formulas used to analyze it—is controlled by private companies that can be positively opaque. Google has never made the search terms used in GFT public, and there’s no way for researchers to replicate how GFT works. There’s Google Correlate, which allows anyone to find search patterns that purport to map real-life trends, but as the Science researchers wryly note: “Clicking the link titled ‘match the pattern of actual flu activity (this is how we built Google Flu Trends!)’ will not, ironically, produce a replication of the GFT search terms.” Even in the academic papers on GFT written by Google researchers, there’s no clear contact information, other than a generic Google email address. (Academic papers almost always contain direct contact information for lead authors.)

At its best, science is an open, cooperative and cumulative effort. If companies like Google keep their big data to themselves, they’ll miss out on the chance to improve their models, and make big data worthy of the hype. “To harness the research community, they need to be more transparent,” says Lazer. “The models for collaboration around big data haven’t been built.” It’s scary enough to think that private companies are gathering endless amounts of data on us. It’d be even worse if the conclusions they reach from that data aren’t even right.

Ornish on Digital Health

The limitations of high-tech medicine are becoming clearer—e.g., angioplasty, stents, and bypass surgery don’t prolong life or prevent heart attacks in stable patients; only one out of 49 men treated for prostate cancer benefits from the treatment, and the other 48 often become impotent, incontinent or both; and drug treatments of type 2 diabetes don’t work nearly as well as lifestyle changes in preventing the horrible complications.

http://www.forbes.com/sites/johnnosta/2014/03/17/the-stat-ten-dean-ornish-on-digital-health-wisdom-and-the-value-of-meaningful-connections/


The STAT Ten: Dean Ornish On Digital Health, Wisdom And The Value Of Meaningful Connections

STAT Ten is intended to give a voice to those in digital health. From those resonant voices in the headlines to quiet innovators and thinkers behind the scenes, it’s my intent to feature those individuals who are driving innovation–in both thought and deed. And while it’s not an exhaustive interview, STAT Ten asks 10 quick questions to give this individual a chance to be heard.  

Dean Ornish, MD is a fascinating and important leader in healthcare.  His vision has dared to question convention and look at health and wellness from a comprehensive and unique perspective.  He is a Clinical Professor of Medicine at UCSF and Founder and President of the nonprofit Preventive Medicine Research Institute.

Dr. Ornish’s pioneering research was the first to prove that lifestyle changes may stop or even reverse the progression of heart disease and early-stage prostate cancer and even change gene expression, “turning on” disease-preventing genes and “turning off” genes that promote cancer, heart disease and premature aging. Recently, Medicare agreed to provide coverage for his program, the first time that Medicare has covered an integrative medicine program. He is the author of six bestselling books and was recently appointed by President Obama to the White House Advisory Group on Prevention, Health Promotion, and Integrative and Public Health. He is a member of the boards of directors of the San Francisco Food Bank and the J. Craig Venter Institute. The Ornish diet was rated #1 for heart health by U.S. News & World Report in 2011 and 2012. He was selected as one of the “TIME 100” in integrative medicine, honored as “one of the 125 most extraordinary University of Texas alumni in the past 125 years,” recognized by LIFE magazine as “one of the 50 most influential members of his generation” and by Forbes magazine as “one of the 7 most powerful teachers in the world.”

The lexicon of his career is filled with words that include innovator, teacher and game-changer.  And with this impressive career and his well-established ability to look at health and medicine in a new light, I thought it would be fun–and informative–to ask Dr. Ornish some questions about digital health.

Dean Ornish, MD


1. Digital health—many definitions and misconceptions.  How would you describe this health movement in a sentence or two?

“Digital health” usually refers to the idea that having more quantitative information about your health from various devices will improve your health by changing your behaviors.  Information is important but it’s not usually sufficient to motivate most people to make meaningful and lasting changes in healthful behaviors.  If it were, no one would smoke cigarettes.

2. You’ve spoken of building deep and authentic connection among patients as a key element of your wellness programs.  Can digital health foster that connection or drive more “techno-disconnection”?

Both.  What matters most is the quality and meaning of the interaction, not whether it’s digital or analog (in person).  Study after study have shown that people who are lonely, depressed, and isolated are three to ten times more likely to get sick and die prematurely compared to those who have a strong sense of love and community.  Intimacy is healing.  In our support groups, we create a safe environment in which people can let down their emotional defenses and communicate openly and authentically about what’s really going on in their lives without fear they’ll be rejected, abandoned, or betrayed.  The quality and meaning of this sense of community is often life-transforming.  It can be done digitally, but it’s more effective in person.  A digital hug is not quite as fulfilling, but it’s much better than being alone and feeling lonely.

3. How can we connect clinical validation to the current pop culture trends of “fitness gadgets”?

Awareness is the first step in healing.  In that context, information can raise awareness, but it’s only the first step.

 4. Can digital health help link mind and body wellness?

Yes.  Nicholas Christakis’ research found that if your friends are obese, your risk of obesity is 45% higher.  If your friends’ friends are obese, your risk of obesity is 25% higher.  If your friends’ friends’ friends are obese, your risk is 10% higher—even if you’ve never met them.  That’s how interconnected we are.  Their study also showed that social distance is more important than geographic distance.  Long distance is the next best thing to being there (and in some families, even better…).

5. Are there any particular areas of medicine and wellness that might best fit in the context of digital health (diet, exercise, compliance, etc.)?

They all do.

6. There is much talk on the empowerment of the individual and the “democratization of data”.  From your perspective are patients becoming more engaged and involved in their care?

Patients are becoming more empowered in all areas of life, not just with their health care.  Having access to one’s clinical data can be useful, but even more empowering is access to tools and programs that enable people to use the experience of suffering as a catalyst and doorway for transforming their lives for the better.  That’s what our lifestyle program provides.

 7. Is digital health “sticking” in the medical community?  Or are advances being driven more by patients?

Electronic medical records are finally being embraced, in part due to financial incentives.  Also, telemedicine is about to take off, as it allows both health care professionals and patients to leverage their time and resources more efficiently and effectively.  But most doctors are not prescribing digital health devices for their patients.  Not yet.

 8. Do you personally use any devices?  Any success (or failure) stories?

I weigh myself every day, and I work out regularly using weight machines and a treadmill desk.  I feel overloaded by information much of the day, so I haven’t found devices such as FitBit, Nike Plus, and others to be useful.  These days, I find wisdom to be a more precious commodity than information.

 9. What are some of the exciting areas of digital health that you see on the horizon?

The capacity for intimacy using digital platforms is virtually unlimited, but, so far, we’ve only scratched the surface of what’s possible.  It’s a testimony to how primal our need is for love and intimacy that even the rather superficial intimacy of Facebook (or, before that, the chat rooms in AOL, or the lounges in Starbucks) created multi-billion-dollar businesses.

My wife, Anne, is a multidimensional genius who is developing ways of creating intimate and meaningful relationships using the interface of digital technologies and real-world healing environments.  She also designed our web site (www.ornish.com) and created and appears in the guided meditations there; Anne has a unique gift of making everyone and everything around her beautiful.

 10. Medicare is now covering Dr. Dean Ornish’s Program for Reversing Heart Disease as a branded program–a landmark event–and you recently formed a partnership with Healthways to train health care professionals, hospitals, and clinics nationwide.  Why now?

We’re creating a new paradigm of health care—Lifestyle Medicine—instead of sick care, based on lifestyle changes as treatment, not just as prevention.  Lifestyle changes often work better than drugs and surgery at a fraction of the cost—and the only side-effects are good ones.  Like an electric car or an iPhone, this is a disruptive innovation.  After 37 years of doing work in this area, this is the right idea at the right time.

The limitations of high-tech medicine are becoming clearer—e.g., angioplasty, stents, and bypass surgery don’t prolong life or prevent heart attacks in stable patients; only one out of 49 men treated for prostate cancer benefits from the treatment, and the other 48 often become impotent, incontinent or both; and drug treatments of type 2 diabetes don’t work nearly as well as lifestyle changes in preventing the horrible complications.

At the same time, the power of comprehensive lifestyle changes is becoming more well-documented.  In our studies, we proved, for the first time, that intensive lifestyle changes can reverse the progression of coronary heart disease and slow, stop, or reverse the progression of early-stage prostate cancer.  Also, we found that changing your lifestyle changes your genes—turning on hundreds of good genes that protect you while downregulating hundreds of genes that promote heart disease, cancer, and other chronic diseases.  Our most recent research found that these lifestyle changes may begin to reverse aging at a cellular level by lengthening our telomeres, the ends of our chromosomes that control how long we live.

Finally, Obamacare turns economic incentives on their ear, so it becomes economically sustainable for physicians to offer training in comprehensive lifestyle changes to their patients, especially now that CMS is providing Medicare reimbursement and insurance companies such as WellPoint are also doing so.  Ben Leedle, CEO of Healthways, is a visionary leader who has the experience, resources, and infrastructure for us to quickly scale our program to those who most need it.  Recently, we trained UCLA, The Cleveland Clinic, and the Beth Israel Medical Center in New York in our program, and many more are on the way.

 

Anne Wojcicki lays out 23andMe’s vision…

 

http://www.engadget.com/2014/03/09/future-of-preventative-medicine/

Anne Wojcicki and her genetic sequencing company 23andMe are locked in a battle with the FDA. Even though it can’t report results to customers right now, Wojcicki isn’t letting herself get bogged down in the present. At SXSW 2014 she laid out her vision of the future of preventative medicine — one where affordable genome sequencing comes together with “big data.” In addition to simply harvesting your genetic code, the company is doing research into how particular genes affect your susceptibility to disease or your reaction to treatments. And 23andMe isn’t keeping this information locked down. It has been building APIs that allow it to share the results of its research as well as the results of your genetic tests, should you wish to.

The real potential comes when that data is combined with other information, say data harvested from a fitness tracker, and put in the hands of engineers and doctors. In the future she hopes that you’ll see companies putting the same effort into identifying and addressing health risks as they do for tracking your shopping habits. Target famously was able to decode that a woman was pregnant before she told her father, based purely on her purchase history. One day that same sort of predictive power could be harnessed to prevent diabetes or lessen a risk for a heart attack. Whether or not that future is five, 10 or 15 years off is unclear. But if Wojcicki has her way, you’ll be able to pull up health and lifestyle decisions recommended for you with the same ease that you pull up suggested titles on Netflix.

On bureaucracies

The American economist William A. Niskanen considered the organisation of bureaucracies and proposed a budget maximising model now influential in public choice theory. It stated that rational bureaucrats will “always and everywhere seek to increase their budgets in order to increase their own power.”

An unfettered bureaucracy was predicted to grow to twice the size of a comparable firm that faces market discipline, incurring twice the cost.

http://theconversation.com/reform-australian-universities-by-cutting-their-bureaucracies-12781

Reform Australian universities by cutting their bureaucracies

Australian universities need to trim down their bureaucracies.

Universities drive a knowledge economy, generate new ideas and teach people how to think critically. Anything other than strong investment in them will likely harm Australia.

But as Australian politicians are preparing to reform the university sector, there is an opportunity to take a closer look at the large and powerful university bureaucracy.

Adam Smith argued it would be preferable for students to directly pay academics for their tuition, rather than involve university bureaucrats. In earlier times, Oxford dons received all tuition revenue from their students and it’s been suggested that they paid between 15% and 20% for their rooms and administration. Subsequent central collection of tuition fees removed incentives for teachers to teach and led to the rise of the university bureaucracy.

Today, the bureaucracy is very large in Australian universities and only one third of university spending is allocated to academic salaries.

 

The money (in billions) spent by the top ten Australian research universities from 2003 to 2010 (taken from published financial statements).

 

Across all the universities in Australia, the average proportion of full-time non-academic staff is 55%. This figure is relatively consistent over time and by university grouping (see graph below).

Australia is not alone, as data for the United Kingdom shows a similar staffing profile, with 48% classed as academics. A recent analysis of US universities’ spending argues:

Boards of trustees and presidents need to put their collective foot down on the growth of support and administrative costs. Those costs have grown faster than the cost of instruction across most campuses. In no other industry would overhead costs be allowed to grow at this rate – executives would lose their jobs.

We know universities employ more non-academics than academics. But, of course, “non-academic” is a heterogeneous grouping. Many of those classified as “non-academic” directly produce academic outputs, but this cuts both ways, with academics often required to produce bureaucratic outputs.

An explanation for this strange spending allocation is that academics desire a large bureaucracy to support their research efforts and for coping with external regulatory requirements such as the Excellence in Research for Australia (ERA) initiative, the Australian Qualifications Framework (AQF) and the Tertiary Education Quality and Standards Agency (TEQSA).

 

Staffing profile (% of total FTE classed as academic) of Australian universities 2001-2010, overall and by university groupings/alliances.

 

Another explanation is that university bureaucracies enjoy being big and engage in many non-academic transactions to perpetuate their large budget and influence.

The theory to support the latter view came from Cyril Northcote Parkinson, a naval historian who studied the workings of the British civil service. While not an economist, he had great insight into bureaucracy and suggested:

There need be little or no relationship between the work to be done and the size of the staff to which it may be assigned.

Parkinson’s Law rests on two ideas: an official wants to multiply subordinates, not rivals; and, officials make work for each other. Inefficient bureaucracy is likely not restricted to universities but pervades government and non-government organisations who escape traditional market forces.

Using Admiralty Statistics for the period between 1934 and 1955, Parkinson calculated a mean annual growth rate of spending on bureaucrats to be 5.9%. The top ten Australian research universities between 2003 and 2010 report mean annual growth in spending on non-academic salary costs of 8.8%. After adjusting for inflation the annual growth rate is 5.9%.
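A minimal sketch of the compound-growth arithmetic behind those figures, assuming hypothetical spending levels and an assumed inflation rate chosen only to roughly reproduce the 8.8% nominal and 5.9% real rates quoted above:

```python
# Compound annual growth rate, with a simple inflation adjustment.
# All inputs below are hypothetical; only the 8.8% / 5.9% targets come from the article.

def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

nominal = cagr(start=1.00, end=1.80, years=7)   # e.g. non-academic salary spend, 2003 -> 2010
inflation = 0.027                                # assumed mean annual CPI over the period
real = (1 + nominal) / (1 + inflation) - 1       # inflation-adjusted growth

print(f"Nominal annual growth: {nominal:.1%}")   # ~8.8%
print(f"Real annual growth:    {real:.1%}")      # ~5.9%
```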

The American economist William A. Niskanen considered the organisation of bureaucracies and proposed a budget maximising model now influential in public choice theory. It stated that rational bureaucrats will “always and everywhere seek to increase their budgets in order to increase their own power.”

An unfettered bureaucracy was predicted to grow to twice the size of a comparable firm that faces market discipline, incurring twice the cost. Some insight and anecdotal evidence to support this comes from a recent analysis of the paperwork required for doctoral students to progress from admission to graduation at an Australian university.

In that analysis, the two authors of this article (Clarke and Graves) found that 270 unique data items were requested on average 2.27 times for 13 different forms. This implies the bureaucracy was operating at more than twice the size it needs to. The university we studied has since slimmed down the process.
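A quick back-of-envelope check of the “more than twice the size” claim, using only the figures quoted above:

```python
# Figures from the paragraph above: 270 unique data items, each requested
# an average of 2.27 times across 13 doctoral-progression forms.
unique_items = 270
mean_requests_per_item = 2.27

total_fields_completed = unique_items * mean_requests_per_item  # ~613 fields filled in
redundancy_factor = mean_requests_per_item                      # each item only needs asking once

print(f"Fields actually completed: {total_fields_completed:.0f}")
print(f"Roughly {redundancy_factor:.1f}x the paperwork strictly required")
```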

Further costs from a large bureaucracy arise because academics are expected to participate in activities initiated by the bureaucracy. These tend to generate low or zero academic output. Some academics also adopt the behaviour of bureaucrats and stop or dramatically scale back their academic work.

The irony is that those in leadership positions, such as heads of departments, are most vulnerable, yet they must have been academically successful to achieve their position.

Evidence of this can be seen from the publication statistics of the professors who are heads of schools among nine of the top ten Australian research universities. Between 2006 and 2011, these senior academics published an average of 1.22 papers per year per person as first author.

This level of output would not be acceptable for an active health researcher at a professor, associate professor or even lecturer level.

The nine heads of school are likely tied up with administrative tasks, and hence their potential academic outputs are lost to signing forms, attending meetings and pushing bits of paper round their university.

If spending on the costs of employing non-academics could be reduced by 50% in line with a Niskanen level of over-supply, universities could employ additional academic staff. A further boost to productivity could be expected as old and new staff benefit from a decrease in the amount of time they must dedicate to bureaucratic transactions.

If all Australian universities adopted the staffing profile of the “Group of 8” institutions, which have the highest percentage of academics (at 51.6%), there would have been up to nearly 6,500 extra academics in 2010.

While no economist would question the need for some administration, there needs to be a focus on incentives to ensure efficient operation. It’s possible to run a tight ship in academic research as shown by Alan Trounson, president of the California Institute for Regenerative Medicine (CIRM).

In 2009, Trounson pledged to spend less than 6% of revenues on administration costs, a figure that is better than most firms competing in markets. So far, this commitment has been met.

It’s clear then that finding solutions to problems in modern Australian universities calls for a better understanding of economics and a reduction in bureaucracy.

A couple of terrific safety quality presentations

 

Rene Amalberti to a Geneva Quality Conference:

b13-rene-amalberti

http://www.isqua.org/docs/geneva-presentations/b13-rene-amalberti.pdf?sfvrsn=2

 

Some random, but 80 slides, often good

Clapper_ReliabilitySlides

http://net.acpe.org/interact/highReliability/References/powerpoints/Clapper_ReliabilitySlides.pdf

Big data in healthcare

A decent sweep through the available technologies and techniques with practical examples of their applications.

Big data in healthcare

Some healthcare practitioners smirk when you tell them that you used some alternative medication such as homeopathy or naturopathy to cure some illness. However, in the longer run it sometimes really is a much better solution, even if it takes longer, because it encourages and enables the body to fight the disease naturally, and in the process build up the necessary long term defence mechanisms. Likewise, some IT practitioners question it when you don’t use the “mainstream” technologies…  So, in this post, I cover the “alternative” big data technologies. I explore the different big data datatypes and the NoSQL databases that cater for them. I illustrate the types of applications and analyses that they are suitable for using healthcare examples.

 

Big data in healthcare

Healthcare organisations have become very interested in big data, no doubt fired on by the hype around Hadoop and the ongoing promises that big data really adds big value.

However, big data really means different things to different people. For example, for a clinical researcher it is unstructured text on a prescription, for a radiologist it is the image of an x-ray, for an insurer it may be the network of geographical coordinates of the hospitals they have agreements with, and for a doctor it may refer to the fine print on the schedule of some newly released drug. For the CMO of a large hospital group, it may even constitute the commentary that patients are tweeting or posting on Facebook about their experiences in the group’s various hospitals. So, big data is a very generic term for a wide variety of data, including unstructured text, audio, images, geospatial data and other complex data formats, which previously were not analysed or even processed.

There is no doubt that big data can add value in the healthcare field. In fact, it can add a lot of value, partly because of the different types of big data that are available in healthcare. However, for big data to contribute significant value, we need to be able to apply analytics to it in order to derive new and meaningful insights. And in order to apply those analytics, the big data must be in a processable and analysable format.

Hadoop

Enter yellow elephant, stage left. Hadoop, in particular, is touted as the ultimate big data storage platform, with very efficient parallelised processing through the MapReduce distributed “divide and conquer” programming model. However, in many cases, it is very cumbersome to try and store a particular healthcare dataset in Hadoop and try and get to analytical insights using MapReduce. So even though Hadoop is an efficient storage medium for very large data sets, it is not necessarily the most useful storage structure to use when applying complex analytical algorithms to healthcare data. Quick cameo appearance. Exit yellow elephant, stage right.
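For readers who haven’t met it, the MapReduce pattern itself is simple. Below is a toy sketch in plain Python, not Hadoop: map each record to key-value pairs, shuffle by key, then reduce by summing. The three sample “clinical note” strings are invented for illustration.

```python
from collections import defaultdict

records = [
    "patient reports fever and cough",
    "follow-up visit, cough resolved",
    "fever, headache, suspected influenza",
]

# Map phase: emit (term, 1) for every word in every record
mapped = [(word, 1) for record in records for word in record.replace(",", "").split()]

# Shuffle phase: group the emitted values by key
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: sum the counts for each key
counts = {key: sum(values) for key, values in groups.items()}
print(counts["cough"], counts["fever"])  # 2 2
```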

There are other “alternative” storage technologies available for big data as well – namely the so-called NoSQL (not only SQL) databases. These specialised databases each support a specialised data structure, and are used to store and analyse data that fits that particular data structure. For specific applications, these data structures are therefore more appropriate to store, process and extract insights from data that suit that storage structure.

Unstructured text

A very large portion of big data is unstructured text, and this definitely applies to healthcare too. Even audio is eventually transformed into unstructured text. The NoSQL document databases are very good for storing, processing and analysing documents consisting of unstructured text of varying complexity, typically contained in XML, JSON or even Microsoft Word or Adobe format files. Examples of the document databases are Apache CouchDB and MongoDB. The document databases are good for storing and analysing prescriptions, drug schedules, patient records, and the contracts written up between healthcare insurers and providers.
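As a rough sketch of what a document store looks like in practice, here is a hypothetical prescription stored and queried with pymongo, assuming a local MongoDB instance; the database, collection and field names are made up.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
prescriptions = client["clinic"]["prescriptions"]

# Store one prescription as a JSON-like document (hypothetical data).
prescriptions.insert_one({
    "patient_id": "P-1042",
    "drug": "metformin",
    "dose_mg": 500,
    "directions": "one tablet twice daily with meals",
    "prescriber": "Dr A. Example",
})

# Documents of different shapes can live in the same collection,
# and free-text fields can still be queried.
for doc in prescriptions.find({"drug": "metformin"}):
    print(doc["patient_id"], doc["directions"])
```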

On textual data you can perform lexical analytics such as word frequency distributions and co-occurrence counts (the number of times particular words occur together in a sentence, paragraph or even a document), find sentences or paragraphs containing particular words within a given distance of each other, and apply other text analytics operations such as link and association analysis. The overarching goal is, essentially, to turn unstructured text into structured data by applying natural language processing (NLP) and analytical methods.

For example, if a co-occurrence analysis found that BRCA1 and breast cancer regularly occurred in the same sentence, it might assume a relationship between breast cancer and the BRCA1 gene. Nowadays co-occurrence in text is often used as a simple baseline when evaluating more sophisticated systems.
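A sentence-level co-occurrence count is only a few lines; the toy corpus below is a made-up stand-in for abstracts or clinical notes.

```python
corpus = [
    "Mutations in BRCA1 are associated with hereditary breast cancer.",
    "BRCA1 encodes a protein involved in DNA repair.",
    "Screening is recommended for patients with a family history of breast cancer.",
]

term_a, term_b = "brca1", "breast cancer"

# Count sentences in which both terms appear.
co_occurrences = sum(
    term_a in sentence.lower() and term_b in sentence.lower()
    for sentence in corpus
)
print(f"{term_a} and {term_b} co-occur in {co_occurrences} of {len(corpus)} sentences")
```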

Rule-based analyses make use of some a priori information, such as language structure, language rules, specific knowledge about how biologically relevant facts are stated in the biomedical literature, the kinds of relationships or variant forms that they can have with one another, or subsets or combinations of these. Of course the accuracy of a rule-based system depends on the quality of the rules that it operates on.

Statistical or machine-learning–based systems operate by building classifications, from labelling parts of speech to choosing syntactic parse trees to classifying full sentences or documents. These are very useful to turn unstructured text into an analysable dataset. However, these systems normally require a substantial amount of already labelled training data. This is often time-consuming to create or expensive to acquire.
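A small sketch of such a statistical classifier, assuming scikit-learn is available; the handful of labelled sentences is far too small for real use and is included only to show the shape of the pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training sentences (a real system needs thousands).
sentences = [
    "patient denies chest pain or shortness of breath",
    "no evidence of pneumonia on chest x-ray",
    "severe chest pain radiating to the left arm",
    "x-ray shows right lower lobe pneumonia",
]
labels = ["negated", "negated", "affirmed", "affirmed"]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(sentences, labels)

print(model.predict(["patient reports crushing chest pain"]))
```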

However, it’s important to keep in mind that much of the textual data requires disambiguation before you can process, make sense of, and apply analytics to it. The existence of ambiguity, such as multiple relationships between language and meanings or categories makes it very difficult to accurately interpret and analyse textual data. Acronym / slang / shorthand resolution, interpretation, standardisation, homographic resolution, taxonomy ontologies, textual proximity, cluster analysis and various other inferences and translations all form part of textual disambiguation. Establishing and capturing context is also crucial for unstructured text analytics – the same text can have radically different meanings and interpretations, depending on the context where it is used.

As an example of the ambiguities found in healthcare, “fat” is the official symbol of Entrez Gene entry 2195 and an alternate symbol for Entrez Gene entry 948. The distinction is not trivial – the first is associated with tumour suppression and with bipolar disorder, while the second is associated with insulin resistance and quite a few other unrelated phenotypes. If you get the interpretation wrong, you can miss or erroneously extract the wrong information.

Graph structures

An interesting class of big data is graph structures, where entities are related to each other in complex relationships like trees, networks or graphs. This type of data is typically neither large, nor unstructured, but graph structures of undetermined depth are very complex to store in relational or key-value pair structures, and even more complex to process using standard SQL. For this reason this type of data can be stored in a graph-oriented NoSQL database such as Neo4J, InfoGrid, InfiniteGraph, uRiKa, OrientDB or FlockDB.

Examples of graph structures include the networks of people that know each other, as you find on LinkedIn or Facebook. In healthcare a similar example is the network of providers linked to a group of practices or a hospital group. Referral patterns can be analysed to determine how specific doctors and hospitals team together to deliver improved healthcare outcomes. Graph-based analyses of referral patterns can also point out fraudulent behaviour, such as whether a particular doctor is a conservative or a liberal prescriber, and whether he refers patients to a hospital that charges more than double the one just across the street.
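As a sketch of this style of analysis, here is a tiny referral network built with networkx (used here as an in-memory stand-in for a graph database such as Neo4j); the providers and referral counts are hypothetical.

```python
import networkx as nx

g = nx.DiGraph()
g.add_edge("GP Smith", "Cardiologist Jones", referrals=42)
g.add_edge("GP Smith", "Hospital A", referrals=15)
g.add_edge("GP Brown", "Hospital B", referrals=3)
g.add_edge("Cardiologist Jones", "Hospital A", referrals=30)

# Weighted in-degree: who receives the most referrals?
in_degree = dict(g.in_degree(weight="referrals"))
print(max(in_degree, key=in_degree.get))  # Hospital A, with 45 incoming referrals
```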

Another useful graph-based analysis is the spread of a highly contagious disease through groups of people who were in contact with each other. An infectious disease clinic, for instance, should strive to have higher infection caseloads across such a network, but with lower actual infection rates.

A more deep-dive application of graph-based analytics is to study network models of genetic inheritance.

Geospatial data

Like other graph-structured data, geospatial data itself is pretty structured – locations can simply be represented as coordinate pairs. However, when analysing and optimising ambulance routes of different lengths, for example, the data is best stored and processed using a graph structure.

Geospatial analyses are also useful for hospital and practice location planning. For example, Epworth HealthCare group teamed up with geospatial group MapData Services to conduct an extensive analysis of demographic and medical services across Victoria. The analysis involved sourcing a range of data including Australian Bureau of Statistics figures around population growth and demographics, details of currently available health services, and the geographical distribution of particular types of conditions. The outcome was that the ideal location and services mix for a new $447m private teaching hospital should be in the much smaller city of Geelong, instead of in the much larger but services-rich city of Melbourne.
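A minimal geospatial sketch along those lines: great-circle (haversine) distances from a patient location to a set of candidate facilities. The coordinates are rough, hypothetical placeholders.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

facilities = {
    "Hospital A": (-37.81, 144.96),   # roughly Melbourne
    "Hospital B": (-38.15, 144.36),   # roughly Geelong
}
patient = (-38.10, 144.40)

nearest = min(facilities, key=lambda name: haversine_km(*patient, *facilities[name]))
print(nearest)
```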

Sensor data

Sensor data are also normally quite structured, with an aspect being measured, a measurement value and a unit of measure. The complexity comes in that for each patient or each blood sample test you often have a variable record structure with widely different aspects being measured and recorded. Some sources of sensor data also produce large volumes of data at high rates. Sensor data are often best stored in key-value databases, such as Riak, DynamoDB, Redis, Voldemort, and sure, Hadoop.
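As a sketch of how sensor readings land in a key-value store, here is a hypothetical biosensor reading written to Redis via the redis-py client, assuming a local Redis instance; the key-naming scheme is made up.

```python
import time
import redis

r = redis.Redis(host="localhost", port=6379)

patient_id = "P-1042"
reading = {"metric": "blood_glucose", "value": 6.2, "unit": "mmol/L"}

# One key per patient/metric/timestamp keeps writes cheap and append-only.
key = f"sensor:{patient_id}:{reading['metric']}:{int(time.time())}"
r.hset(key, mapping=reading)

print(r.hgetall(key))
```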

Biosensors are now used to enable better and more efficient patient care across a wide range of healthcare operations, including telemedicine, telehealth, and mobile health. Typical analyses compare related sets of measurements for cause and effect, reaction predictions, antagonistic interactions, dependencies and correlations.

For example, biometric data, which includes data such as diet, sleep, weight, exercise, and blood sugar levels, can be collected from mobile apps and sensors. Outcome-oriented analytics applied to this biometric data, when combined with other healthcare data, can help patients with controllable conditions improve their health by providing them with insights on their behaviours that can lead to increases or decreases in the occurrences of diseases. Data-wise healthcare organisations can similarly use analytics to understand and measure wellness, apply patient and disease segmentation, and track health setbacks and improvements. Predictive analytics can be used to inform and drive multichannel patient interaction that can help shape lifestyle choices, and so avoid poor health and costly medical care.

Concluding remarks

Although there are merits in storing and processing complex big data, we need to ensure that the type of analytical processing possible on the big data sets leads to sufficiently valuable new insights. The way in which the big data is structured often has implications for the type of analytics that can be applied to it. Often, too, if the analytics are not properly applied to big data integrated with existing structured data, the results are not as meaningful and valuable as expected.

We need to be cognisant of the fact that there are many storage and analytics technologies available. We need to apply the correct storage structure that matches the data structure and thereby ensure that the correct analytics can be efficiently and correctly applied, which in turn will deliver new and valuable insights.