Category Archives: data saving lives

Facial hair trends: 1842 – 1972

  • Terrific find on facial hair trends over the 19th and 20th centuries
  • The introduction of the safety razor and the world wars appear to have had negligible impact on trends

Source Paper: robinson1976a_facialhairtrends

Source: http://flowingdata.com/2014/01/08/facial-hair-trends-over-time/

Facial hair trends over time

JANUARY 8, 2014  |  STATISTICAL VISUALIZATION

Facial hair trends

In 1976, Dwight E. Robinson, an economist at the University of Washington, studied facial hair of the men who appeared in the Illustrated London News from 1842 to 1972 [pdf].

The remarkable regularity of our wavelike fluctuations suggests a large measure of independence from outside historical events. The innovation of the safety razor and the wars which occurred during the period studied appear to have had negligible effects on the time series. King C. Gillette’s patented safety razor began its meteoric sales rise in 1905. But by that year beardlessness had already been on the rise for more than 30 years, and its rate of expansion seems not to have augmented appreciably afterward.

Someone has to update this to the present. I’m pretty sure we’re headed towards a bearded peak, if we’re not at the top already.

 

BUT THEN THIS FROM THE ATLANTIC:

http://www.theatlantic.com/health/archive/2014/01/the-rise-and-fall-and-rise-of-facial-hair/282951/

The Rise and Fall—and Rise—of Facial Hair

There was a time when the best option was to wear both sideburns and a mustache.
Image: Library Company of Philadelphia/Flickr

In 1940, the anthropologists Jane Richardson and Alfred Kroeber examined pictures of catalogues, magazines, and drawings dating back to the 1600s in an attempt to find trends in the cuts and styles of women’s dresses. What they produced were fascinating graphs of evolving social mores, with periods of plunging necklines quickly succeeded by buttoned-up decades of modesty, and vice-versa. One particularly entertaining chart shows generally Amish-length skirts throughout history — save for a racy, rapid shortening during the libidinous 1920s.

Skirt lengths by decade, from 1600 to 1940. (Richardson and Kroeber)

In 1976, University of Washington economist Dwight E. Robinson sought to apply the same technique to fashion trends in the opposite sex—specifically, in men’s “facial barbering.”

For the study, published in the American Journal of Sociology, he examined the period between 1842 and 1972, the years of continuous weekly publication of the Illustrated London News. Since this was the “world’s most venerable pictorial news magazine,” it would serve as his sole source.

With the acknowledgement that the “gentlemen of the News” were largely limited to prominent members of society, he set about counting the frequency with which five different facial hair styles appeared: sideburns alone, sideburns and mustache, a beard (“any amount of whiskers centering on the chin,” in case you were confused), mustache alone, and clean-shaven. He excluded pictures of royalty, models, and non-Europeans, and gathered about 100 images for each year.

Here are the bristly results:

American Journal of Sociology

Beards and sideburns began losing their luster in the mid-to-late 1800s, while mustaches hit their apex in the early 20th century and have grown steadily less popular ever since. The number of brave souls who sported both sideburns and mustaches peaked in 1877, though the study did not address their later resurgence in modern-day Bushwick.

Few were clean-shaven in the late 1800s, but by the 1970s, nearly everyone was:

American Journal of Sociology

What’s more, the great “beard wave” of 1844 to 1955 corresponded, for whatever reason, to a similar heyday of extra-wide skirts in the Richardson-Kroeber study:

American Journal of Sociology 

Robinson’s theory as to why fashion—both sartorial and hirsute—seems to come in waves is this: Young people tend to eschew the tastes of their elders, but old trends seem new again after a sufficient amount of time has passed. So while long skirts may fall out of favor for one generation, their grandchildren will think they’re the cat’s pajamas.

1890s and 1950s dresses (Herbert Art Gallery & Museum, Coventry/Bess Georgette/flickr)

And most men might have been anti-beard between the 1940s and 1976, but a spin around the nearest artisanal cheese shop today will show that’s no longer the case.

2014 Big Data Predictions

  • the big data market will grow 6 times faster than the overall IT market
  • shortage of talent > analytics-as-a-service and smaller, nimble analytics; big data cloud infrastructure will grow quickly at ~50% CAGR
  • VC investment moving to the top layer – from information management to analytics, discovery and applications
  • “Analytics 3.0” (IIA) and “Digitization of Everything” – companies across industries will use analytics on their accumulated data to develop new products and services — G.E. is the poster boy for this
  • automation solutions – cognitive computing, rules management, analytics, biometrics, rich media recognition – will replace knowledge worker roles
  • challenges with the over-automation of decisions will push companies toward an optimal mix of human and machine capability and judgment
  • Heightened focus on governance and privacy will improve results – governance will be a driver for ROI (Capgemini)

 

Gil Press, Contributor

I write about technology, entrepreneurs and innovation.

12/12/2013 @ 11:18AM

$16.1 Billion Big Data Market: 2014 Predictions From IDC And IIA

Both IDC and The International Institute of Analytics (IIA) discussed their big data and analytics predictions for 2014 in separate webcasts earlier this week. Here is my summary of their predictions plus a few nuggets from other sources.

IDC predicts that the market for big data will reach $16.1 billion in 2014, growing 6 times faster than the overall IT market. IDC includes in this figure Infrastructure (servers, storage, etc., the largest and fastest growing segment at 45% of the market), services (29%) and software (24%). IDC commented that the benefits of big data are not always clear today (indeed, BNY Mellon recently asked its 50,000 employees “for ideas about how to harness the power of Big Data”). IIA predicted that companies will want to see demonstrable value in 2014 and will focus on embedding big data analytics in business processes to drive process improvement.

The much-discussed shortage of analytics and data science talent led IIA to make three separate but related predictions. One prediction is that the adoption of analytics-as-a-service will accelerate with “ready-made analytics in the cloud” offering an attractive option for quickly testing big data analytics or scaling up existing programs. Similarly, Capgemini predicts (in an email to me) “smaller, nimble analytics,” as a result of the rise of machine-to-machine data, “making cloud the de facto solution.” And IDC predicts that cloud infrastructure will be the fastest-growing sub-segment of the big data market, with a 2013-2017 CAGR of close to 50%.

Another IIA prediction related to the dearth of talent is the increasing attention paid by companies to organizing the analysts and data scientists they currently have on board into teams, either embedded in the business units or in a center of excellence. The focus will be on making these teams more effective by establishing and sharing best practices and by “operationalizing and managing models,” with the rest of the world getting closer to the proficiency level of the financial industry in this regard (in other words, keeping up with the quants? hopefully, also learning from the financial industry’s failures in this regard—see financial crisis, 2008 edition).

As for the prospects for alleviating the talent shortage, IIA commented that there are now well over 100 programs at universities in the US where analytics and data science “are in focus” (see my list of graduate programs here). IDC, for its part, cautioned that these programs “will bear fruit only in four to five years,” referring obviously to the newly-established data science programs. IDC agrees with IIA that companies providing big data analytics services will fill the gap in the meantime and predicts that the big data professional services market will exceed $4.5 billion in 2014. The number of vendors providing such services will triple over the next three years, according to IDC, and these firms will “aggressively acquire scarce big data talent,” making it scarcer.

A very interesting dimension to the dearth of talent raised by IDC is the shortage of IT professionals capable of dealing with the new big data requirements. 33% of respondents to an IDC and Computerworld survey earlier this year noted as one of their big data challenges the “lack of sufficiently skilled big data and analytics IT staff” (“lack of sufficient number of staff with appropriate analytics skills” was selected by 45% of respondents).

Also interesting was IDC’s expansion of the services part of the market to include “value added content providers.” These include “traditional vendors” such as Thompson, LexisNexis, and Experian; “new wave vendors” such as DataSift, Gnip, and LinkedIn; “company and personal information vendors” such as Acxiom, Equifax, and Tarsus; and “search engine/aggregators” such as Yahoo, Google, and Salesforce/Data.com. IDC believes that this market segment will be “challenged by lack of business model clarity and standards.”

A related prediction from IDC is that VC investment will shift to the top layers of the big data software stack, from information management to the “analytics & discovery” and “applications” layers. New types of applications (“use cases”), such as personalized medicine, will emerge out of what IDC predicts will be the blurring of the boundaries between high-performance computing (previously limited to scientific/engineering applications) and “enterprise big data” (i.e., mainstream applications managed by an IT department). IIA sees other new horizons for the application of big data, predicting that companies in a variety of industries will increasingly use analytics on the data they have accumulated to develop new products and services. GE has been the poster boy for this emerging trend, called “Analytics 3.0” by IIA, or “the digitization of everything” by me (you decide).

Another application, security, will become the next big front for big data, IDC predicts, as security infrastructure will increasingly take on big data-like attributes. Big data will be used to correlate log data and identify malicious activity in real time, allowing companies to react quickly, rather than after the event. Gartner begs to differ, however, predicting that “big data technology in security contexts will stay immature, expensive and difficult to manage for most organizations as targeted attacks become more stealthy and complex to identify in progress. … The noise about big data for security has grown deafening in the industry, but the reality lags far, far behind.”
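
As a rough illustration of the kind of real-time log correlation this prediction points at, here is a toy sketch (not from IDC's webcast): it flags a source address that racks up many failed logins inside a short sliding window. The log format, threshold, and window size are all hypothetical.

```python
# Toy sketch: flag hosts with many failed logins in a short window, the kind of
# simple real-time log correlation the prediction alludes to. All values are
# illustrative assumptions, not a real detection rule.
from collections import defaultdict, deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)   # hypothetical detection window
THRESHOLD = 20                  # hypothetical alert threshold

recent_failures = defaultdict(deque)  # source IP -> timestamps of failed logins

def ingest(event):
    """event: dict with 'ts' (datetime), 'src_ip' (str), 'outcome' (str)."""
    if event["outcome"] != "login_failure":
        return None
    q = recent_failures[event["src_ip"]]
    q.append(event["ts"])
    # Drop timestamps that have fallen out of the sliding window.
    while q and event["ts"] - q[0] > WINDOW:
        q.popleft()
    if len(q) >= THRESHOLD:
        return f"ALERT: {event['src_ip']} had {len(q)} failed logins in {WINDOW}"
    return None

# Example: feed parsed log events as they arrive from the log pipeline.
alert = ingest({"ts": datetime(2014, 1, 1, 9, 30), "src_ip": "10.0.0.7",
                "outcome": "login_failure"})
```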

In a somewhat far-out prediction, IIA talked about facial recognition and wearable device data that will be incorporated into predictive analytics. One of the examples given was “pet stores could use facial recognition to greet dogs as well as customers.” IDC was a bit closer to 2014 (or was it?) when it predicted that the “proliferation of sensor, mobile, wearable, and embedded devices (Internet of Things) will become a significant driver of the big data market,” stressing the need for investment in “Data-in-Motion” and “real-time analysis of geo-dispersed incoming data streams,” primarily in the cloud (that you don’t need wearables or geo-whatever to satisfy your obsession with quantifying your life was recently demonstrated by the resident data scientist at MarkITx, who crunched his lunches to come up with a happiness-per-gram metric).

Both IDC and IIA got a bit more into the technologies behind big data analytics, with IDC predicting the co-habitation for the foreseeable future (my words) of traditional database technology (RDBMS) with the newer Hadoop ecosystem and NoSQL databases, concluding that “in the short term,” information management will become more complex for most organizations (see shortage of qualified IT staff above); and IIA predicting that “the adoption of data visualization will accelerate in both the high and low ends of the complexity spectrum [for analytics].” Humans, however, don’t comprehend things in more than two dimensions or, at most, three dimensions, so IIA advised tempering our enthusiasm “a bit” (this came from self-described Tom “Curmudgeon” Davenport so you may want to consider how much tempering you want to do; as for me, I always opt for being “uber-curmudgeon”).

Last but certainly not least, both IDC and IIA talked about automation in the context of big data. IDC predicts that “decision and automation solutions, utilizing a mix of cognitive computing, rules management, analytics, biometrics, rich media recognition software and commercialized high-performance computing infrastructure [phew!], will proliferate.” Some of these solutions, IDC says (warns?), “will begin to replace or significantly impact knowledge worker roles.”  IIA predicts that “we will see a continued move to machine learning and automation to keep pace with speed and volume of data” and that “as they strive to operationalize analytics but encounter challenges with the over-automation of decisions, companies will focus more on the optimal mix between human and machine capability and judgment.” If you take humans too much out of the equation, their decision making will atrophy, warned IIA, asking “If you don’t have experts, who will train the next generation of [machine learning] software?” From the IIA’s lips, to the NSA’s ears, I say. (Well, we can assume these words were collected and stored by the omnipresent sleuths the second they were uttered; the question is: do they understand what the words mean?)

One prediction that didn’t make the official list of IIA’s predictions, but Davenport nevertheless managed to include in the webcast, was that “companies will need to hire lawyers to verify that they actually own the data.” Indeed, the nagging issues—that I think will be even more prominent in 2014—of privacy and governance were largely missing from the IDC and IIA discussions (Capgemini, in contrast, contributed this: “A heightened focus on governance will improve analytic results… Governance will need to be a driver in shaping the ROI story for Big Data in 2014”).  Also missing were discussions of “open data” and the increased use of big data by the public sector (outside of the NSA) to name just a few pertinent big data trends not on their list of predictions. But of course, the challenge is to select the nine or ten most important ones and we have lots to chew on with IDC’s and IIA’s lists.

Listeners to the IIA webcast were given the opportunity to vote on which predictions they thought would come true:

IIA poll results

Participants in the IIA webcast included Sarah Gates, Tom Davenport, Bob Morison, Bill Franks, Greta Roberts, Omer Sohail and Sanjeev Kumar; IDC’s webcast was delivered by Dan Vesset and Ashish Nadkarni; Capgemini’s predictions were attributed to SVP for Business Information Management Scott Schlesinger.


Justin Coleman: The ethical imperative to tackle overdiagnosis and overtreatment

  • Beautifully written, wise piece by a friend and colleague of Gavin Mooney
  • Archie Cochrane humorous anecdote
  • Donald Berwick’s 30% waste JAMA link
  • Futility of spinal fusions
  • Futility of knee arthroscopies
  • Testosterone over-prescribing
  • EBM is a necessary but not sufficient condition for practising good medicine. When my friend Prof Gavin Mooney gave me his book, he explained why he’d called it EBM ‘in its place’.

    He did not want to promote a system of slavish adherence to a deontology. As a leftie health economist—a rare breed indeed—his primary concern was always one of health equity. Not health equality, which is clearly unattainable, but equity, where we strive for equal access to equal care for equal need.

    An equitable health system does not mean trying to give everyone the very best, if by that you mean the most; the most tests, the most expense, the most treatments. Not only will that aspiration require others to miss out on even the second-best treatment, but it too often also actively harms the recipient.

    Gavin was killed in tragic circumstances last Christmas, and I dedicate this article to his memory.

    His philosophy was that, sometimes, less is more. We must pare things back, strip away excesses and judiciously apply what we know works, rather than enthusiastically embrace what we wish would work.

    As a GP, I am a gatekeeper to a most powerful, expensive, superb and dangerous health system and I must never forget that sometimes my job is to shut the gate.

Source: http://blogs.crikey.com.au/croakey/2013/06/23/the-naked-doctor-an-indepth-look-at-the-pitfalls-of-cutting-edge-medicine/

The Naked Doctor: an in-depth look at the pitfalls of “cutting edge” medicine

MELISSA SWEET | JUN 23, 2013 5:15PM

The Naked Doctor is an ongoing project at Croakey that aims to encourage discussion and awareness of the opportunities to do more for health by doing less.

In this latest edition, Dr Justin Coleman suggests that an equitable health system does not mean trying to give everyone the very best, if that means “the most tests, the most expense, the most treatments”.

“Not only will that aspiration require others to miss out on even the second-best treatment, but it too often also actively harms the recipient,” he says.

Perhaps one area where more intervention is needed is in tackling overdiagnosis and overtreatment – Dr Coleman suggests that if the ‘medical market’ is left unchecked, the balance naturally tips towards overtreatment.

He concludes with a powerful call to action:

“As a GP, I am a gatekeeper to a most powerful, expensive, superb and dangerous health system and I must never forget that sometimes my job is to shut the gate.”

The article below is based upon his plenary address to the Qld RACGP Annual Clinical Update in Brisbane last month.

It is dedicated to the late Professor Gavin Mooney, whose philosophy was that we must “judiciously apply what we know works, rather than enthusiastically embrace what we wish would work”.

***

The ethical imperative to tackle overdiagnosis and overtreatment

Justin Coleman writes:

Two years ago my good friend Gavin Mooney gave me a signed copy of his latest—and, as it turned out—last book, Evidence-Based Medicine in its Place. 

Professor of Health Economics at Curtin University, Gavin was an irascible Scot, and his book detailed his work with another great Scotsman, Archie Cochrane, who of course pioneered the science of Evidence-Based Medicine.

According to Mooney, after their first meeting, Cochrane informed him that he had revised his opinion of economists.

On the basis of the evidence of an afternoon with Mooney, he now placed them second bottom, with sociologists at the bottom. This merely confirmed for Mooney that there was much on which they agreed.

Mooney told me the story, repeated in his book, of how Archie Cochrane first gained notoriety as a very junior staff member at the massive Department of Health in London.

The young Archie presented slides from an RCT on outcomes after heart attack, comparing rehabilitation while remaining a hospital inpatient with early discharge home.

London’s ‘Who’s Who’ of learned physicians nodded sagely as Archie showed the crucial slides where the hospital outcomes—represented in red—outdid the blue columns of home-based outcomes across nearly every parameter. A couple of supportive comments, no questions.

Then the young epidemiologist pretended to look flustered. ‘I’m terribly sorry. I seem to have mixed up the red and the blue!’

He had deliberately switched the labels. All the better outcomes were in fact in the home-based, early discharge group.

Needless to say, chaos ensued as suddenly a hundred disgruntled audience members grilled him on every possible dubious aspect of the study design!

Best practice or common practice?

Until that time, there had been no reason for a London physician to doubt that an intensive, expensive, high-tech hospital stay would improve health outcomes.

It made perfect sense, and a whole bunch of highly intelligent, caring physicians had spent their careers ensuring that such a system existed. Where it wasn’t affordable, public and charity funds were sought to ensure more people could get longer hospital stays.

This was best-practice care, in the same way that bed rest for back pain, monthly breast self-examinations, and antibiotics for sore throats have been understood by clever and well-meaning people to be fairly obvious best care. More about Archie—and Gavin—later.

In the brilliant Mitchell and Webb parody of a Homeopathic Emergency Department, Webb attempts to save a trauma victim’s life by drawing on his palm in pen to extend his life line. He justifies it by asking ‘Have you got a better idea?’

Luckily, the answer is ‘yes’.

There are some things that do work better than a pen mark, or a homeopathic vial of water, even a vial where the water molecules somehow retain the memory of a herb they once knew, while conveniently forgetting they were once flushed down a toilet.

And there are some things that do work better than our mainstream medical interventions, even when tens of thousands of medical practitioners believe they are doing the right thing.

This has always been true, and will ever be so. Our mistakes from the past remind us that we are making mistakes right now. Full credit to all those anonymous doctors and researchers who unwrapped these anomalies.

The art of discovering nothing

History rightly lauds those who discovered ‘something’: Alexander Fleming and penicillin.

But I also dips me lid to those who discovered ‘nothing’. Bloodletting doesn’t work. Arsenic doesn’t work. Keeping kids with polio in hospital back straighteners for six months of their lives doesn’t work.

In many cases, our patients would be better off if we chose not to act.

There’s a minimum standard in the medical profession—not the gold standard, but let’s call it the bronze.

The bronze standard is that the patient is no worse off as a result of seeing us. The bronze standard is probably achieved by enthusiasts who light ear candles and discover people’s chakras. Let’s at least stop doing things which fall below the bronze standard.

We must balance the important and exciting work of discovering new stuff with the un-sexy hard-slog science of analysing those times where we have over-reached and over-enthused.

The best of our medical predecessors started this process and we must continue it; this is why we are a science and not merely a tradition.

Two hundred years ago, the French physician Philippe Pinel cared enough about the damage his colleagues were doing to his psychiatric patients to observe:

“It is an art of no little importance to administer medicines properly: but, it is an art of much greater and more difficult acquisition to know when to suspend or altogether to omit them.”

It took a young epidemiologist, Archie Cochrane, to highlight the flaws in obstetric practice that should ideally have already been obvious to the world’s leading obstetricians and their institutions.

And these were not minor flaws. Obstetrics units in one part of the world were teaching methods which had already been shown in another part of the world to kill women and babies, and vice versa.

Cochrane didn’t do the research himself; his genius was to inspire others—in this case, Iain Chalmers— to collect, collate and analyse all the available evidence and, importantly, reject the shoddy stuff: the anecdote and the meaningless trial, so that obstetricians and their departments could make informed decisions as to how to get the best outcomes.

Archie never delivered a baby nor managed a single maternal complication, but his legacy would probably have saved more lives than any doctor watching his slide presentation in London.

Somewhere on the spectrum

Let’s look at chronic diseases, and use diabetes as an example.

Insulin’s invention in 1922 was a miracle, which converted the inevitable rapid death sentence of Type 1 diabetes into a chronic disease. Chronic in the best sense of the word, because insulin bought you time; years, decades.

That simple chemical justifiably sits at the high table in the pantheon of superb medical interventions.

But diabetes, like most chronic diseases, has nominal cut-off points which define its existence and degree. Diseases stretch themselves out along a spectrum, blissfully unaware of how we choose to dissect them.

Medical tests and interventions that work brilliantly at the sharp end of the spectrum do not work nearly so well when we slide towards the middle and enter the grey zone.

Any gains to be had here in the land of the long grey cloud are far foggier than anything out at the extreme edge.

Benefits diminish; every diagnostic test becomes less accurate; false positives increase exponentially; patient numbers increase—and with them, costs, pain and inconvenience; health gains are smaller in this less-sick population; and suddenly being diagnosed with a chronic disease such as diabetes or pre-diabetes doesn’t look so good any more.

Instead of being grateful to Chronos, the Greek god of time who grants you each extra year of life, suddenly the old bugger expects you to jab your finger three times a day, jab your stomach three times a day, and to roll a boulder up Sisyphus’s mountain just to have your liver pecked out by Prometheus’s eagle.

There comes a point where ignoring your diabetes educator becomes…to continue the theme…tantalising!

If the ‘medical market’ is left unchecked, the balance naturally tips towards overtreatment.

The paradigm promoted by industry, the media and some doctors, particularly the sub-sub-specialists, is that the only important news is a new invention, new drug, robotic surgery, more MRIs.

Is the best doctor always the one at the cutting edge? Is the best endocrinologist for my grandmother the one who has just spent a year in America learning the finer points of subcutaneous insulin infusion pumps?

There exists a cut-off point on every disease spectrum, inevitably ignored by drug companies and often enough by doctors, where medications simply don’t help. At that point, they do nothing. Beyond that point, they actively harm. This is true almost by definition for every medical or surgical intervention.

There is pressure from multiple sources—patient, doctor, pharma, specialist, psychologist, media, disease-awareness campaigns, patient advocacy groups—to nudge this point towards the midline of the spectrum.

This is true for diabetes, but also for depression, ADHD, lipid levels, cardiac stents, and deficiencies of a host of replaceable substances including testosterone, oestrogen, and various vitamins, the trendiest of which is Vitamin D.

If we don’t test early and often for all these problems, we are ignoring our duty of care; and if we don’t treat when the test result comes back in red, we are downright obstructive and possibly liable.

Andropause: the new epidemic

Take testosterone. In the past five years we have witnessed a concerted wave of discussion around the andropause. Feature articles have called it the hidden epidemic, hinting at reverse sexism whereby women get their daily oestrogen but our men’s hormones have rights too!

Disease-awareness campaigns, subtle in Australia compared to countries that allow direct-to-patient advertising, ask males if they ever experience tiredness, weakness or low libido. The suggested remedy is to get your levels checked by your friendly local GP. It’s not advertising: it’s just caring.

This tumescent rise in publicity tied in beautifully with the advent of ‘men’s clinics’, whose doctors were the only clinicians with enough spare time to keep up with all the clever new ways of getting the testosterone into your body; oral, patches, gels, suppositories, inhalants; no orifice was left unsullied in the competition to supply Vitamin T.

The result?

PBS expenditure on testosterone has increased 450% since 2006. Patients at the pointy end of the spectrum—men with testicular cancer and orchidectomies—have been swamped by the enormous market of men who are…wait for it…ageing. A bit like what’s happening to the percentage of cancer-sufferers in the opioid market.

Last year, the departing boss of the US Medicare system, Dr Donald Berwick, estimated that 20-30 per cent of US health spending is ‘waste’—as in, it yields no benefit to patients. That wasted quarter of the US health budget exceeds the entire GDP of most countries on the planet.

Berwick listed five reasons for this catastrophic waste, and the first of them was ‘overtreatment’. We are not talking a minor problem here.

Why do we overtest and overtreat?

Let’s look at some causes of overtesting and overtreating. Why do we do it?

Some of it is simply because the evidence doesn’t exist yet.

There was no shame in a medical graduate treating headaches with bloodletting a century ago; no-one knew any better. According to the prevailing understanding of the human body, it made sense and it no doubt appeared to work in some people.

But lack of evidence is not the only reason for our actions.

I like the list prepared by Australian surgeon Dr Skeptic (clearly his parents were prescient when naming him) of the reasons why we act even when evidence tells us ‘Don’t just do something, stand there!’

Defensive medicine: If you miss one rarity and thereby harm one person, this is more likely to end you up in court than causing far more harm by routinely overtreating everyone.

It takes an epidemiologist to tell you about the latter, whereas a lawyer will be quite happy to keep you posted about the former.

The language of inaction vs. action: Overinvestigation and overtreatment are very difficult concepts to convey to patients.

If we tell the patient ‘I really don’t know precisely why you have low back pain; would you like me to run a few tests?’ then the answer will be ‘yes’.

Our choice of language suggests that after doing the tests, we will know why they have low back pain. But whether doctor, physiotherapist or chiropractor, ye may ask the gods of radiology but shall not receive an answer.

If we give a glucometer to a person with pre-diabetes, or with diabetes that doesn’t require insulin, we will indeed get an answer as to precisely what their blood sugar is at any given moment, but this knowledge will not actually improve health outcomes.

The answer does not help the patient, therefore we are asking the wrong question.

This flawed logic of Test = Answer = Cure is used by iridologists and scientologists. And doctors.

Influence of recent experience: obstetricians who attend a birth with complications are significantly more likely to recommend a Caesarean section in their next 50 cases, before they settle back into a more sensible, case-by-case evidence-based approach.

The lottery mindset: Few people have a good understanding of risk.

My chances of winning the first division prize in tattslotto this Saturday are the same whether or not I buy a tattslotto ticket. The same. Not absolutely, mathematically, precisely the same, but the same in any meaningful, ordinary sense of the word.

People don’t understand tiny chances. I have more chance of being dead next Saturday than being both alive and collecting my winnings.

Studies consistently show that both doctors and patients, just like gamblers and stockbrokers, overestimate gains and underestimate losses.

People will jump at a whole body CT scan to ‘rule out’ a tiny risk of cancer, and ignore the fact that the radiation from each such scan increases their lifetime cancer risk by about 1%.

The prevailing wisdom: Medical students come out of university knowing thousands of new words and knowing about thousands of new interventions. The consultants taught us all the pharmaceutical and surgical interventions in their own specialised area of expertise.

But it’s not really anyone’s job to teach you about how to avoid patient referrals into the system; how to stop the cascade before it starts.

A recent Australian study showed that half of all IV cannulas inserted in ED are never used. Why does every junior ED doctor put the IVs in? Because everyone else has always put them in.

When ‘more’ is harmful

If a junior doctor is trained in breast surgery outpatients and has met women whose cancer was detected by screening mammogram, it takes some active un-training not to assume that therefore all women are better off having a mammogram.

When I was a student, my consultant orthopod took the time to kindly explain the intricacies of spinal fusion and of arthroscopic debridement for osteoarthritic knees, and I think he probably mentioned that ‘some patients don’t seem to gain as much as others’.

However, this is a starkly different prevailing wisdom from the reviews that have shown that neither spinal fusions nor arthroscopies for osteoarthritic knees differ much from placebo. In the US alone, 650,000 such arthroscopies were performed each year in the late 1990s.

Ironically, sometimes the richer you are, with more access to the private system and doctors who will cut corners for you, the more intervention you get and the more harm is done.

The extreme of this is the Hollywood celebrity with their own physician on call, who would feel like a fool telling his client that for their thousand-dollar callout fee they get absolutely nothing except ‘watch and wait’.

When Michael Jackson went to his umpteenth plastic surgeon, she didn’t say ‘no’. When he complained he was getting anxious and couldn’t sleep and needed something more than light sleeping tablets, I bet Dr Conrad Murray now wishes he had opted for conservative management.

Making the system work

I believe it is our ethical responsibility to avoid overtreatment at an individual level, and also to support system-wide changes in the way we spend money on health.

I am no slave to evidence-based medicine; not one of those sceptical EBM types who eat gruel for breakfast and secretly believe deep down that nothing works. Although, on a bad day this pessimism reaches its ultimate fruition—absolutely nothing I do works!

EBM is a necessary but not sufficient condition for practising good medicine.

When my friend Prof Gavin Mooney gave me his book, he explained why he’d called it EBM ‘in its place’.

He did not want to promote a system of slavish adherence to a deontology. As a leftie health economist—a rare breed indeed—his primary concern was always one of health equity. Not health equality, which is clearly unattainable, but equity, where we strive for equal access to equal care for equal need.

An equitable health system does not mean trying to give everyone the very best, if by that you mean the most; the most tests, the most expense, the most treatments. Not only will that aspiration require others to miss out on even the second-best treatment, but it too often also actively harms the recipient.

Gavin was killed in tragic circumstances last Christmas, and I dedicate this article to his memory.

His philosophy was that, sometimes, less is more. We must pare things back, strip away excesses and judiciously apply what we know works, rather than enthusiastically embrace what we wish would work.

As a GP, I am a gatekeeper to a most powerful, expensive, superb and dangerous health system and I must never forget that sometimes my job is to shut the gate.

• Dr Justin Coleman is a GP at Inala Centre of Excellence in Aboriginal and Torres Strait Islander Health. He is senior lecturer at Griffith University and University of Queensland, and President of the Australasian Medical Writers Association (AMWA). Twitter: @drjustincoleman. Web: http://drjustincoleman.com/

• You can read more about Naked Doctor here and this Croakey page has been established as a memorial to Professor Gavin Mooney.

Handler: Big data without the entry

  • the inconvenience of data entry stops tools being used by doctors
  • computer-assisted physician documentation (CAPD) can alleviate the problem
  • natural language understanding (NLU) converts sentences into codes
  • One of the most exciting promises of Big Data combined with advanced technologies like NLU is its ability to “bootstrap” — to identify important missing data and either grab it from another source, infer it from existing data, or prompt the clinician to add it during the normal course of documentation.

Source: http://www.wired.com/insights/2013/12/big-data-insights-without-big-data-entry/

Big Data Insights Without Big Data Entry

  • BY DR. JONATHAN HANDLER, M*MODAL
  • 12.23.13
  • 2:03 PM

Image: openexhibits/Flickr


My brand new Xbox One has a new feature — its Kinect camera recognizes users’ faces and automatically logs them in. The new feature saves me nine clicks and 45 seconds compared to my old Xbox 360. Microsoft correctly recognized that people are too busy for their videogames to turn them into data entry clerks.

It’s the same thing for doctors. Doctors can use Big Data algorithms to help provide better patient care, but those algorithms are very data hungry. Who will enter all that data? Patients already wait weeks to be seen and then sit in the waiting room despite an appointment. The last thing we want is doctors doing even more data entry that further delays care. At the same time, we want computers to provide great decision support to doctors.

Can we realize the benefits of Big Data without suffering the pain of Big Data entry? Using modern technologies, this may be possible.

In 1998, my colleagues and I built some of the earliest online decision support tools, and within a few years more than 50 were freely available. Although they were very useful, I used those tools only occasionally in my clinical practice because most required me to enter lots of data before providing any help in return. For example, one score for predicting the likelihood that a patient will die requires the user to enter 17 pieces of data, two of which are formulas that must be calculated. Big Data can speed up the development of decision support tools and create more accurate algorithms.

However, those algorithms require lots of data. For example, an algorithm for diagnosing heart attacks was developed using 156 data elements, and the final model used 40 data elements. Entering all that data is a lot to ask from physicians trying to manage an already overcrowded emergency department.

Over the last decade, it has been well documented that tools demanding significant data entry were felt to make doctors less efficient and were seen as less useful. Not surprisingly, tools requiring doctors to do lots of data entry were used less often. Although automatically populating these tools with existing data from the Electronic Health Record (EHR) seems the obvious answer, it has real challenges.

Sometimes it is not clear which data to enter. For example, there may be multiple blood pressures documented and it may not be obvious to the computer which one should be used. In other cases, the needed inputs might not be recorded exactly in the right format or with the right details, or the inputs may be completely missing.

New technologies might alleviate the problem. Computer-assisted physician documentation (CAPD) uses real-time natural language understanding (NLU) to convert the clinician’s sentences into computer-readable codes.

Those codes automatically populate decision support algorithms executed by rules engines, and seamlessly display the result. Caregivers are no longer forced to re-document the same information in checkboxes and manually run the tools. The rules engines automatically identify when a required input is missing, and can immediately prompt the clinician to add it if warranted. This enables real-time decision support, so that clinicians can provide better care.
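
To make the flow described above more concrete, here is a minimal sketch assuming a hypothetical set of NLU-extracted codes and a toy rule; it is not M*Modal's actual API, and the "risk" calculation is not a validated clinical score. The stand-in rules engine simply checks for required inputs and returns prompts for anything missing, mirroring the prompting behaviour the paragraph describes.

```python
# Minimal sketch (hypothetical names, not M*Modal's system): codes extracted by
# an NLU step auto-populate a decision rule; the "rules engine" prompts for
# any required inputs that the dictated note did not cover.

# Hypothetical codes an NLU pass might emit from a dictated note.
extracted = {
    "age_years": 67,
    "chest_pain": True,
    "systolic_bp": 150,
    # "troponin_elevated" was not mentioned in the note, so it is missing.
}

REQUIRED_INPUTS = ["age_years", "chest_pain", "systolic_bp", "troponin_elevated"]

def run_rule(codes):
    """Return (result, prompts): a toy risk flag plus any missing-input prompts."""
    missing = [k for k in REQUIRED_INPUTS if k not in codes]
    if missing:
        # In a CAPD workflow the clinician would be prompted in real time.
        return None, [f"Please document: {m}" for m in missing]
    # Toy rule only: count simple risk markers.
    risk = sum([codes["age_years"] > 65, codes["chest_pain"],
                codes["systolic_bp"] > 140, codes["troponin_elevated"]])
    return ("high risk" if risk >= 3 else "lower risk"), []

result, prompts = run_rule(extracted)
print(result, prompts)   # -> None ['Please document: troponin_elevated']
```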

CAPD becomes even more interesting when the rules engines start doing even more work on behalf of the clinician. For example, in addition to auto-populating decision support rules, the system might even save time by auto-generating some of the doctor’s notes, auto-generating billing and regulatory reporting codes, and automatically building a provisional set of orders based on the doctor’s notes.

One of the most exciting promises of Big Data combined with advanced technologies like NLU is its ability to “bootstrap” — to identify important missing data and either grab it from another source, infer it from existing data, or prompt the clinician to add it during the normal course of documentation. With these technologies in place, we can reap the value of Big Data without paying the heavy cost of Big Data entry. The result will be better and faster healthcare for all.

Dr. Jonathan Handler is the Chief Medical Information Officer at M*Modal.

FDA rearguard frame…

It’s all happening anyway. Eventually, the tide will surge and the wall will burst.

Already, an explosion of monitoring, testing, and sensing devices are coming on the market, providing consumers with instant analysis of their fitness, blood chemistry, sleep patterns and food intake. It’s only a matter of time before regulators feel compelled by consumer demand to find a way to accommodate better and cheaper innovations, and for slowly changing industries to dramatically restructure themselves in the face of overwhelming new opportunities. The long-term potential of vast databases of genomic data to improve health outcomes, reduce costs, and reorient the debate on medical priorities is too valuable to be held back for long — and arguably the biggest transformation for the healthcare industry since the discovery of antibiotics in the early 20th century.

http://www.wired.com/opinion/2014/01/the-fda-may-win-the-battle-this-holiday-season-but-23andme-will-win-the-war/

Regulating 23andMe to Death Won’t Stop the New Age of Genetic Testing

  • BY LARRY DOWNES AND PAUL NUNES
  • 01.01.14
  • 6:30 AM

 

Image: ynse/Flickr

 

Market disruptions often occur — or not — as the direct result of unintended collisions between breakthrough technologies and their more incremental regulators. In the latest dust-up, the U.S. Food and Drug Administration (FDA) last month ordered startup 23andMe to stop marketing its $99 genetic analysis kit, just before the Christmas shopping season kicked into high gear.

To date, over half a million customers have taken the swab in return for detailed ancestry data and personalized information on 248 genetic traits and health conditions. The company, which launched in 2007 with substantial backing from Google, has been working closely — albeit more slowly than the FDA would have liked — with the FDA to ensure it complies with federal health and safety regulations. But the agency concluded in its recent warning letter that 23andMe was marketing a “device” that was “intended for use in the diagnosis of diseases or other conditions,” and as such, its marketing materials required pre-approval from the FDA, a process that includes extensive research studies.

23andMe is an example of what we call a “Big Bang Disruption” — a product or service innovation that undermines existing markets and industries seemingly overnight by being simultaneously better and cheaper than the competition. What’s happening in genomic testing (and healthcare in general) is consistent with our research in over 30 different industry segments, from manufacturing to financial services to consumer products.

When technologies improve exponentially, many industry incumbents — and the regulators who oversee them — are kept constantly off-balance. That’s because incumbents have been indoctrinated by a generation of academic literature and MBA training to ignore disruptive products until they had a chance to mature in the market, assuming they would first appear as cheaper but inferior substitutes that would only appeal to niche market segments.

Doctors — who are also incumbents in this situation — are struggling to respond to disruptive medical technologies that change the power dynamic in the patient relationship. Several 23andMe users have reported taking the FDA’s advice of reviewing their genetic results with their physicians, only to find the doctors unprepared, unwilling, or downright hostile to helping interpret the data.

Often, incumbents’ only competitive response — or the only one they can think of — is to run to the regulators. That’s what’s been happening to car-sharing services such as Uber, Lyft, and Sidecar; to private drone makers; and to casual accommodation services such as Airbnb, to name just a few examples. And now it’s happening to 23andMe, one of hundreds of new startups aimed at giving healthcare consumers more and better information about their own bodies — information that has long been under the exclusive and increasingly expensive control of medical professionals.

Absent any real law on the subject, the agency has strained credulity to categorize 23andMe’s product as a diagnostic “device” — making it subject to its most stringent oversight. The FDA’s letter focuses intently on the potential that consumers will both under- and over-react to the genetic information revealed. The agency fears that users will pressure their doctors for potentially unnecessary surgery or medication to treat conditions for which they are genetically pre-disposed, for example. And it assumes that the costs of such information abuse outweigh any benefits — none of which are mentioned in the agency’s analysis.

The company, of course, has agreed to comply with the FDA’s stern warning, and has ceased providing its customers with anything other than hereditary data. For now. Perhaps it will reach some accommodation with the agency, or perhaps the FDA’s ire will prove untamable, spelling an end to the innovative startup and whatever value its technology might have delivered.

But as with every Big Bang Disruptor in our study, winning the battle and winning the war are two very different things.

The FDA is applying a least common denominator standard to 23andMe, and applying it arbitrarily. Already, an explosion of monitoring, testing, and sensing devices are coming on the market, providing consumers with instant analysis of their fitness, blood chemistry, sleep patterns and food intake. It’s only a matter of time before regulators feel compelled by consumer demand to find a way to accommodate better and cheaper innovations, and for slowly changing industries to dramatically restructure themselves in the face of overwhelming new opportunities. The long-term potential of vast databases of genomic data to improve health outcomes, reduce costs, and reorient the debate on medical priorities is too valuable to be held back for long — and arguably the biggest transformation for the healthcare industry since the discovery of antibiotics in the early 20th century.

The information flood is coming. If not this Christmas season, then one in the near future. Before long, $100 will get you sequencing of not just the million genetic markers 23andMe currently examines, but all of them. Regulators and medical practitioners must focus their attention not on raising temporary obstacles, but on figuring out how they can make the best use of this inevitable tidal wave of information.

Whatever the outcome for 23andMe, this is a losing battle for industry incumbents who believe they can hold back the future forever.

 

Larry Downes & Paul Nunes

Larry Downes and Paul Nunes are co-authors of Big Bang Disruption: Strategy in the Age of Devastating Innovation (Penguin Portfolio 2014). Downes is Research Fellow with the Accenture Institute for High Performance, where Nunes serves as its Global Managing Director of Research. Their book has been selected as a 2014 book of the year by the Consumer Electronics Association.

A behavioural vaccine

  • the marshmallow experiment gone wild >> a behavioural vaccine
  • paying tobacconists for cigarettes they refuse to sell to kids
  • paying smoking mothers to quit

 


‘Good Behavior’ More Than A Game To Health Care Plan

by KRISTIAN FODEN-VENCIL

Danebo Elementary in Eugene, Ore., is one of 50 schools receiving money to teach classes while integrating something called the “Good Behavior Game.” Teacher Cami Railey sits at a small table, surrounded by four kids. She’s about to teach them the “s” sound and the “a” sound. But first, as she does every day, she goes over the rules.

“You’re going to earn your stars today by sitting in the learning position,” she says. “That means your bottom is on your seat, backs on the back of your seat. Excellent job, just like that.”

For good learning behavior, like sitting quietly, keeping their eyes on the teacher and working hard, kids get a star and some stickers.

Railey says the game keeps the kids plugged in and therefore learning more. That in turn makes them better educated teens and adults who’re less likely to pick up a dangerous habit, like smoking.

The Washington, D.C., nonprofit Coalition for Evidence-Based Policy says it works. It did a study that found that by age 13, the game had reduced the number of kids who had started to smoke by 26 percent — and reduced the number of kids who had started to take hard drugs by more than half.

The fact that a teacher is playing the Good Behavior Game isn’t unusual. What is unusual is that Trillium, the health plan that covers the area’s Medicaid recipients, is paying for it. Part of the Affordable Care Act involves the federal government giving money to states to figure out new ways to prevent people from getting sick in the first place.

So Trillium is setting aside nearly $900,000 a year for disease prevention strategies, like this one. Jennifer Webster is the disease prevention coordinator for Trillium Community Health, and she thinks it’s a good investment.

“The Good Behavior Game is more than just a game that you play in the classroom. It’s actually been called a behavioral vaccine,” she says. “This is really what needs to be done. What we really need to focus on is prevention.”

Trillium is paying the poorer schools of Eugene’s Bethel School District to adopt the strategy in 50 classrooms.

Trillium CEO Terry Coplin says changes to Oregon and federal law mean that instead of paying for each Medicaid recipient to get treatment, Trillium gets a fixed amount of money for each of its 56,000 Medicaid recipients. That way Trillium can pay for disease prevention efforts that benefit the whole Medicaid population, not just person by person as they need it.

“I think the return on investment for the Good Behavior Game is going to be somewhere in the neighborhood of 10 to one,” Coplin says.

So, for each dollar spent on playing the game, the health agency expects to save $10 by not having to pay to treat these kids later in life for lung cancer because they took up smoking.

Coplin concedes that some of Trillium’s Medicaid recipients will leave the system each year. But he says prevention still makes medical and financial sense.

“All the incentives are really aligned in the right direction. The healthier that we can make the population, the bigger the financial reward,” he says.

The Oregon Health Authority estimates that each pack of cigarettes smoked costs Oregonians about $13 in medical expenses and productivity losses.

Not all the money Trillium is spending goes for the Good Behavior Game. Some of it is earmarked to pay pregnant smokers cold, hard cash to give up the habit. There’s also a plan to have kids try to buy cigarettes at local stores, then give money to store owners who refuse to sell.

This story is part of a reporting partnership with NPR, Oregon Public Broadcasting and Kaiser Health News.

WIRED: Analytics in 2014

  • Recently, Bain & Co surveyed executives at more than 400 companies around the world (most with revenues of more than one billion dollars). It found that only four percent of companies are really good at analytics, an elite group that puts the right people, tools, data and willpower into their analytic initiatives. This elite group is already using insights to change the way they improve their products and services. And the difference is already quite stark:
    • Twice as likely to be in the top quartile of financial performance within their industries
    • Three times more likely to execute decisions as intended
    • Five times more likely to make decisions faster
    • As the healthcare industry’s payer/provider model undergoes systemic reformatting, the smart players are already making game-changing moves. Kaiser Permanente is blending various data sources to improve enrollment and primary care. Colorado Hospital Association is modeling future impacts of Obamacare as it is (slowly) rolling out. Cardinal Health is optimizing the efficiency of their product distribution to hospital networks.
  • software providers need to answer the call for providing better tools to support the current generation of analytic minds that are destined to change the world.
  • The new guard software firms – Alteryx, Cloudera, Tableau, and others – are growing 30-80 percent annually mainly by disrupting their comparable mega-vendors (3-8% growth).

 

Source: http://www.wired.com/insights/2013/12/analytics-eats-world-2014/

Analytics Eats the World in 2014

  • BY GEORGE MATHEW, ALTERYX
  • 12.23.13
  • 3:02 PM

Old-school Big Data: A huge disk from the c1967 Atlas Disc file. Image: dullhunk/Flickr


 

As 2013 is quickly coming to a close, I return to Marc Andreessen’s seminal thesis that Software is Eating the World. It amazes me to see how much analytics is the metabolic agent driving this shift. Whether analytics is explicitly emphasized in your company’s DNA or invisibly embedded into your business processes, it is the defining value driver of our generation. For industries and firms that embrace this reality, the rewards will be disproportionate. To those who don’t make the shift to a data-driven culture: you will be left behind.

Recently, Bain & Co surveyed executives at more than 400 companies around the world (most with revenues of more than one billion dollars). It found that only four percent of companies are really good at analytics, an elite group that puts the right people, tools, data and willpower into their analytic initiatives. This elite group is already using insights to change the way they improve their products and services. And the difference is already quite stark:

  • Twice as likely to be in the top quartile of financial performance within their industries
  • Three times more likely to execute decisions as intended
  • Five times more likely to make decisions faster

Industrial Reinvention

Cutting-edge analytics has been integral to the consumer Internet (e.g. mobile gaming). Going into 2014, the reinvention of mainstream industries is where the substantial breakthroughs are occurring:

  • In the automotive sector, we see both the emphasis on a culture of analytics at Ford (the only one of the big three that didn’t go through bankruptcy) and the state-of-the-art embedded analytics in the Tesla Model S.
  • While Blockbuster shuts down their last retail location in 2013, Redbox is witnessing hyper-growth in their brick-and-mortar DVD rental model through predictive modeling of consumer behavior.
  • As the healthcare industry’s payer/provider model undergoes systemic reformatting, the smart players are already making game-changing moves. Kaiser Permanente is blending various data sources to improve enrollment and primary care. Colorado Hospital Association is modeling future impacts of Obamacare as it is (slowly) rolling out. Cardinal Health is optimizing the efficiency of their product distribution to hospital networks.

2014 Predictions

The industry examples mentioned above are just a thin sliver of this enormous watershed. My fundamental belief is that if you are not already ‘moneyballing’ your respective industry, someone else is already doing it. Some of the drivers that will come to bear in 2014 include:

  • Analysts will matter more than data scientists. There are more than 2.5 million data analysts in line-of-business functions serving the analytic needs of firms. As much as we wish data science would solve all the world’s analytic problems, there simply aren’t enough data scientists to go around. At the same time, software providers need to answer the call for providing better tools to support the current generation of analytic minds that are destined to change the world.
  • Hadoop moves from curiosity to critical. Hadoop is quickly becoming the general-purpose compute infrastructure for storing, well … everything. You can already see this in all the new engines (e.g. OLTP, real-time, graph, and search) already being supported by the Hadoop community.
  • Big Data brings its A-game in marketing. Analytics will have another big year in the Marketing Department influencing advertising, promotions, and consumer behavior. Specifically, sports marketing will put enormous advertising budgets in play as the World Cup in Brazil and the Winter Olympics in Russia take place.

The New Guard in Analytics

As the 2014 predictions play out, the need for a purpose-built analytics experience has never been more real. It is also clear that yesterday’s software was not built for today’s analytics needs:

  • We should never have to worry about the source or shape of the data in the hands of data analysts. We should be able to blend data across structured, semi-structured, and unstructured sources seamlessly (a minimal sketch of this kind of blending follows this list).
  • We should not be dealing with clumsy, 40-year-old programming languages (you know who you are), but using the sleek, modern algorithms available in R and Julia.
  • We should not have to deal with unwieldy reporting and dashboard platforms, but treat every data interaction with the ease and visual grace of Tableau.
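
To make the blending point concrete, here is a minimal, hypothetical Python sketch of combining a structured, a semi-structured, and an unstructured source into one analyst-ready table. The file names, fields, and the keyword-flag feature are illustrative assumptions, not any vendor’s actual workflow:

import json
import pandas as pd

# Structured source: a warehouse export (hypothetical columns: customer_id, revenue)
sales = pd.read_csv("sales.csv")

# Semi-structured source: nested JSON from a web API, flattened into columns
with open("web_events.json") as f:
    events = pd.json_normalize(json.load(f))          # e.g. customer_id, event.type, ...

# Unstructured source: free-text notes, reduced to a simple per-customer flag
notes = pd.read_csv("support_notes.csv")              # columns: customer_id, note_text
notes["mentions_cancel"] = notes["note_text"].str.contains("cancel", case=False, na=False)
note_flags = notes.groupby("customer_id")["mentions_cancel"].any().reset_index()

# Blend everything on the shared key into one analyst-ready table
blended = (
    sales.merge(events, on="customer_id", how="left")
         .merge(note_flags, on="customer_id", how="left")
)
print(blended.head())

The design choice is simply to reduce each awkward source to a keyed, tabular form before joining, which is the essence of “blending” regardless of the tool used.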

This is now playing out in the financial outcomes of the analytics software market. Yesterday’s mega-vendors are seeing their growth slow to three to eight percent annually. The new guard – Alteryx, Cloudera, Tableau, and others – is growing 30-80 percent annually, mainly by disrupting those same mega-vendors. Clearly, 2013 is ending with a groundswell of analytic users voting for change with their wallets. This will only accelerate as Marc’s perspective on software becomes prophecy. As we enter 2014, I’m just thrilled to be working on mainstreaming analytics (the main course) as it drives the generational shift of our times.

George Mathew is president and COO of Alteryx. He is on Twitter @gkm1.

Big Data supporting NZ diabetes policy

  • NZ is using big data to drive improvements in diabetes policy and planning
  • The Virtual Diabetes Register (VDR) aggregates data from six sources:
  1. hospital admissions coded for diabetes
  2. outpatient attendees for diabetes
  3. diabetes retinal screening
  4. prescriptions of specific antidiabetic therapies
  5. laboratory orders for measuring diabetes management
  6. primary health (general practitioner) enrollments
  • The analytics showed that Indian and Pacific people have the highest diabetes prevalence rates

http://www.futuregov.asia/articles/2013/dec/13/new-zealand-health-improves-diabetes-policy-big-da/

NEW ZEALAND HEALTH IMPROVES DIABETES POLICY WITH BIG DATA ANALYTICS

By Kelly Ng | 13 December 2013

The Ministry of Health New Zealand uses big data analytics to accurately determine the current diabetic population and predict its future size, improving diabetes policy planning.

In collaboration with experts from the New Zealand Society for the Study of Diabetes (NZSSD), the ministry created a Virtual Diabetes Register (VDR) that pulls and filters health data from six major databases.

The six data sources were: hospital admissions coded for diabetes, outpatient attendances for diabetes, diabetes retinal screening, prescriptions of specific antidiabetic therapies, laboratory orders for measuring diabetes management, and primary health (general practitioner) enrollments.

According to Emmanuel Jo, Principal Technical Specialist at Health Workforce New Zealand, Ministry of Health, the previous way of measuring diabetes using national surveys was inefficient, expensive and had a high error rate.

The new analytical model, using SAS software, significantly improved the accuracy and robustness of the system, combining several data sources to generate greater insights.
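
The core idea of a virtual register – include anyone who appears in at least one of several indicator feeds, and count how many feeds corroborate them – can be sketched in a few lines. The snippet below is a hypothetical Python illustration only; the ministry’s actual model is built in SAS with different inclusion rules, and the file and column names here are assumptions:

import pandas as pd

# Hypothetical extracts, one per indicator source described above
SOURCES = [
    "hospital_admissions_diabetes.csv",   # admissions coded for diabetes
    "outpatient_diabetes.csv",            # outpatient attendances for diabetes
    "retinal_screening.csv",              # diabetes retinal screening
    "antidiabetic_dispensings.csv",       # prescriptions of specific therapies
    "diabetes_lab_orders.csv",            # laboratory orders for diabetes management
    "gp_enrolments.csv",                  # primary health (GP) enrolments
]

frames = []
for path in SOURCES:
    df = pd.read_csv(path, usecols=["patient_id"]).drop_duplicates()
    df["source"] = path
    frames.append(df)

indicators = pd.concat(frames, ignore_index=True)

# Simple inclusion rule: any patient seen in at least one source,
# keeping a count of corroborating sources per patient.
register = (
    indicators.groupby("patient_id")["source"]
              .nunique()
              .rename("n_sources")
              .reset_index()
)
print(f"Estimated register size: {len(register)}")
print(register["n_sources"].value_counts().sort_index())

A production register would typically apply stricter filtering and corroboration before counting a patient; this sketch only shows the union-and-count skeleton.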

Interestingly, analytics showed that Indian and Pacific people have the highest diabetes prevalence rate, said Dr. Paul Drury, Clinical Director of the Diabetes Auckland Centre and Medical Director of NZSSD. Health policies can therefore be focused on this group.

“We have 20 different District Health Boards, and the data can show them how many diabetic people are in their area,” Drury said.

“GPs should know already how many they have, but the VDR is also able to help them predict who may be at risk so they can be prepared. By knowing the populations where diabetes is more prevalent, more resources can be directed at them to provide clinical quality improvements,” he added.

Patient privacy is protected by regulating access to data in the VDR.

Forcing the prevention industry – a 10-year journey

Vision

  • The Future of Human API www.thehumanapi.com
  • Forcing the prevention industry into existence
  • Stage Zero disease detection and treatment

Critical trends:

  • lab-in-a-box diagnostics
  • quantified self
  • medical printing

When these trends converge, there’ll be an inflection point where a market is established.

Health data moves from system of record >> system of engagement.

Promoting the evolution from a Product mentality to a Market mentality

As treatment starts to focus on Stage Zero/pre-clinical disease, it turns into prevention.

 

Video: http://www.youtube.com/watch?feature=player_embedded&v=gJHaoqeucX8

http://www.forbes.com/sites/johnnosta/2013/12/12/the-asymptotic-shift-from-disease-to-prevention-thoughts-for-digital-health

The Asymptotic Shift From Disease To Prevention–Thoughts For Digital Health

It’s been said that good artists borrow and great artists steal. And I believe that Picasso was right. So, I guess I’m somewhere between a thief and an artist, and that suits me just fine.

I’ve stolen from two great thinkers, so let’s get that out of the way. The first is Daniel Kraft, MD. Daniel Kraft is a Stanford- and Harvard-trained physician-scientist, inventor, entrepreneur, and innovator. He’s the founder and Executive Director of FutureMed, a program that explores convergent, rapidly developing technologies and their potential in biomedicine and healthcare. He’s also a go-to source on digital health. I’m stealing “zero stage disease” from Dr. Kraft. Simply put, it’s the concept of disease at its earliest, sub-clinical stage. It’s a point where interventions can halt or change a process and potentially eliminate any significant manifestation of disease.

The second source of inspiration is Richie Etwaru. A brilliant and compelling speaker and a champion for global innovation, Mr. Etwaru is responsible for defining and delivering the global next-generation enterprise product suite for health and life sciences at Cegedim Relationship Management. His inspiring video, The Future of Human API, really got me thinking.

At the heart of Mr. Etwaru’s discussion is the emergence of prevention–not treatment–as the “next big thing”.

[Slide from Richie Etwaru’s The Future of Human API presentation]

Ok, nothing new so far. But the important changes seen in the digital health movement have given us a profound opportunity to move away from the conventional clinical identification of that golf-ball-sized tumor in your chest to a much more sophisticated and subtle observation. We are beginning to find a new disease stage–different from the numbers and letters seen in cancer staging. The disease stage is getting closer and closer to zero. It’s taking an asymptotic path that connects disease with prevention. The point here is that the holy grail of prevention isn’t born of health and wellness. Prevention is born out of disease and our new-found ability to find it by looking closer and earlier. Think quantified self and Google Calico.

And here lies the magic.

We all live in the era of disease. And the vast majority of healthcare costs are incurred after something happens. The simple reality is that prevention is difficult to fund, and the health-economic model is so skewed toward sickness and the end of life that it’s almost impossible to change. But if we can treat illness earlier and earlier–the concept of an asymptote–we build a model where prevention and disease share the very same border. They become, in essence, the same. And it’s here that early, early, early disease-stage recognition (Stage Zero) becomes prevention. The combination of passive (sensor-mediated) observation and proactive lifestyle strategies for disease suppression can define a new era of health and wellness.
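
What might that passive, sensor-mediated observation look like in practice? A minimal, hypothetical Python sketch: watch a routinely collected signal (resting heart rate here) against the person’s own rolling baseline and flag sustained drift. The metric, window, and threshold below are illustrative assumptions, not any product’s algorithm:

import numpy as np
import pandas as pd

def flag_drift(signal: pd.Series, baseline_days: int = 60, z_threshold: float = 2.5) -> pd.Series:
    """Mark days where the signal sits more than z_threshold standard
    deviations above the person's trailing baseline."""
    baseline_mean = signal.rolling(baseline_days, min_periods=30).mean()
    baseline_std = signal.rolling(baseline_days, min_periods=30).std()
    z = (signal - baseline_mean) / baseline_std
    return z > z_threshold

# Synthetic daily resting heart rate with a sustained upward drift at the end
days = pd.date_range("2014-01-01", periods=120, freq="D")
rng = np.random.default_rng(0)
resting_hr = pd.Series(62 + rng.normal(0, 1.5, size=len(days)), index=days)
resting_hr.iloc[-14:] += 8          # simulate a sub-clinical shift

alerts = flag_drift(resting_hr)
print("First flagged day:", alerts[alerts].index.min())

Flagging drift against a personal baseline, rather than a population-wide cut-off, is what makes continuous passive data useful for this kind of sub-clinical detection.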

Keep Critical! Follow me on Twitter and stay healthy!

 

Prevention Economics

Right. So I’m now comfortable with the idea that the greatest failing of modern healthcare is that it has extended lifespan without extending healthy life years. The challenge, then, is to extend fully productive life to something far closer to our life expectancy. This can be done with a plant-based diet, fasting and moderate exercise. No pills. No fads. Just a new norm.

But how do we pay for it? Determining the economic value of extending a life’s productivity by a year seems like a reasonable first step. Then take a piece of that?
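
As a rough back-of-the-envelope sketch of that idea – with entirely hypothetical numbers, just to make “take a piece of that” concrete:

# Hypothetical figures only: value one extra productive year per person,
# then ask what share of that value could fund a prevention programme.
annual_output_per_person = 60_000   # assumed economic output of one productive year ($)
extra_productive_years = 1.0        # productivity gained per person
people_covered = 100_000            # assumed programme population
capture_rate = 0.10                 # the "piece of that" shared back to funders

value_created = annual_output_per_person * extra_productive_years * people_covered
programme_budget = value_created * capture_rate

print(f"Value of the extra productive years: ${value_created:,.0f}")
print(f"Fundable prevention budget at a 10% capture rate: ${programme_budget:,.0f}")

On those assumed numbers, even a 10 percent share of the value created is a substantial pool – which is exactly why the direct beneficiaries below look like plausible partners.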

Bring in the direct beneficiaries of such a change – the life insurers, super funds and broccoli farmers.

What a great bunch of business partners they’d be.

Giddy up….