Cochrane study misses the mark on efficacy data for digital diabetes interventions

By: Jonah Comstock | Apr 1, 2013        


A recent meta-analysis from The Cochrane Library found that computer-based interventions for Type 2 diabetes provide, at best, a small improvement over traditional care methods. The analysis, based on 16 studies, found that the interventions reduced A1C (the standard measure of blood glucose control) by 0.2 percentage points, although that rose to 0.5 percentage points for mobile-based interventions. Effects on other outcomes, such as weight and quality of life, weren’t statistically significant.

Researchers looked at 16 studies with a total of 3,758 participants, culled from nine databases of published papers or theses, going back as far as 1997. The analysis was limited to randomized controlled trials (RCTs) of adult-focused, computer-based interventions for Type 2 diabetes. Studies were also excluded if their intervention didn’t focus on self-management, such as if they primarily used computers to connect patients with human caregivers.

The review seems, on the surface, somewhat disheartening for mobile health, a field in which diabetes management is considered one of the most promising avenues. Companies like WellDoc and Omada Health stress the efficacy research that underlies their products.

However, the differences between the various interventions in the study are considerable, to the point where it’s hard to say how valuable the data really is. Even the study authors acknowledge this point in the paper.

“Given the heterogeneity in design, reporting and effect of computer-based interventions it is also important to find the most effective components or behaviour change techniques to achieve the desired impact,” they wrote.

Some of the interventions included in the meta-analysis only used the computers to present information (which the control group received on printouts) with no interactive component. In others, the computer intervention was a social network of some kind, with the study designed to look at the effect of online social support on diabetes management. Some were tracking programs or text-messaging interventions; one even used pagers. The study included WellDoc’s two efficacy studies, alongside a 1997 study that used a touchscreen kiosk patients interacted with only twice.

The studies included were unified by form, not content, much as a meta-analysis of all drugs for a given condition would be. Among the most interesting findings of the analysis is, in fact, the 0.3 percentage point advantage in A1C reduction for mobile versus nonmobile interventions. However, only four of the 16 interventions the study looked at were mobile, and two of those were the WellDoc studies.

“It is not clear why interventions delivered over mobile phones appear to be more effective,” wrote the authors. “It could be due to convenience (and therefore adherence), intensity of the interventions (mobile phone interventions were more likely to have multiple daily contacts) or the behavior change techniques used by the interventions (mobile phone interventions were more likely to use cues to prompt behavior and provide rapid feedback afterwards).”

While meta-analyses can be a good benchmark for trends and efficacy data, this one may be too broad in scope to support conclusions about the general effectiveness of digital diabetes tools.
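For readers curious how a meta-analysis arrives at a single pooled number like the 0.2 or 0.5 percentage point reductions cited above, here is a minimal sketch of fixed-effect, inverse-variance pooling, one standard approach. All study effects and standard errors below are hypothetical illustrations, not data from the Cochrane paper.

```python
# Illustrative fixed-effect meta-analysis: each study's mean difference
# is weighted by the inverse of its variance, so precise studies count more.

def pooled_effect(effects, std_errors):
    """Return the inverse-variance weighted mean difference and its SE."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Hypothetical per-study A1C changes (percentage points) and standard errors
effects = [-0.1, -0.3, -0.5, -0.6]
std_errors = [0.10, 0.15, 0.20, 0.25]

est, se = pooled_effect(effects, std_errors)
print(f"pooled difference: {est:.2f} (SE {se:.2f})")
```

The pooled estimate lands between the individual study effects, pulled toward the more precise (smaller standard error) studies, which is why a few small, heterogeneous mobile trials can shift the headline number less than their raw effects might suggest.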

  • Chris Bergstrom

Apropos headline. It’s nice to see a growing baseline of evidence available for meta review. However, as well-intentioned as the research may have been, it seems to have missed the trees for the forest. Akin to reading a study that concludes that the average fuel efficiency of SUVs is 17mpg (e.g. Computers -0.2A1C) and sedans 22mpg (e.g. Mobile -0.5A1C), all the while overlooking that vehicles powered by an innovative electric engine on an optimized chassis – like Tesla, Leaf, or Volt – achieve 50-100+mpg (e.g. a clinical and behavioral expert system deployed across multiple technology platforms can achieve a -2.0A1C).

    Still, the paper provides a catalyst for people to ask the right questions…
    “What products work? Which ones don’t work? How do we know which is which? And ultimately, Why do some work and others not?”

  • MobiHealthNews

Chris Bergstrom: Agreed. Think you nailed it with that analogy.

  • Jack J Florio

I fully agree with Chris as well. We need to be cutting the data to see what worked – and with the rapid changes in our field, combined with the ability to learn what is working and what is not in almost real time, I am guessing that the studies done in the last 5 years will tell a different story when the data is analyzed.