Update: CMS released the proposed structure of the program on July 15, 2016, as part of the annual Physician Fee Schedule rulemaking; see here.
I am strongly in favor of the program, and glad this is happening. Broadly, Diabetes Prevention Counseling is supported by multiple trials and endorsed by the USPSTF. For favorable NYT articles on DPP in general, here and here; for a favorable NEJM op-ed, here.
That said, in this essay, I am concocting a critic's viewpoint of what is happening. Why? Because things that payor policymakers do not like are often subjected to withering analysis - pecked to death, so to speak.
What would happen if the more typical, even hypercritical, coverage posture were applied to the YMCA Diabetes Prevention Program findings?
Documents referred to in this blog will include:
HHS announcement, March 23, here.
RTI commissioned cost assessment, here.
CMS Office of the Actuary assessment, here.
New York Times article (March 23), here.
NPR coverage, here.
American Journal of Managed Care coverage, here.
CDC article in AJMC, 2015, here.
Article on Digital DPP in AJMC, 2015, here.
What Is CMMI?
The Center for Medicare & Medicaid Innovation was established by the Affordable Care Act of 2010, creating section 1115A of the Social Security Act (here; CMMI website here.) CMMI has an open-ended mandate to test "innovative payment and service delivery models to reduce program expenditures under the applicable titles while preserving or enhancing the quality of care furnished to individuals." CMMI shall test models that are likely to meet these goals, and if the models are validated as meeting the goals, the demonstration project can be made a permanent CMS benefit. CMS shall "conduct an evaluation of each model tested" against the cost and quality criteria, and may conduct "expansion of models" through "rulemaking." While implementing the model, CMS may "waive" any section of Medicare law and "there shall be no administrative or judicial review" of the projects, scope, or "elements" of the projects.
CMMI has announced several high-profile demonstrations in the past year, including a novel payment system in an Oncology Care Model, a bundled physician/hospital/aftercare system for joint replacements, and in March 2016, a wholesale revamping of the Part B physician and hospital outpatient drug payment system. CMMI also created a Pioneer ACO program that differs from the ACO program created by the Affordable Care Act itself. Until the past year (with the exception of the Pioneer ACO program), most CMMI projects were small in scope, involving limited numbers of patients, like the YMCA Diabetes Prevention Program.
What is the YMCA Diabetes Prevention Program Demonstration Project?
Diabetes Prevention Programs are endorsed under criteria managed by the CDC (here) and there are currently over 600 endorsed programs, including individual, group, and even online programs. The YMCA received a $12M grant to test a DPP in Medicare-age patients. The grant was received in 2012 (here) and the program launched in early 2013 (here). There were 5,696 enrollees, at a net cost of about $2,000 per enrollee. CMS is contemplating a permanent model that would have costs (payments) of $450 in Year 1 and $180 in Year 2, so the permanent program would run at about one-third the per-person cost of the research project.
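As a back-of-envelope check on those figures, here is the arithmetic laid out; it simply assumes the full $12M grant is attributed to the 5,696 enrollees, which is a simplification.

```python
# Back-of-envelope arithmetic on per-enrollee costs (figures from the documents cited above).
grant = 12_000_000          # YMCA demonstration grant, dollars
enrollees = 5_696           # total enrollees reported by RTI

cost_per_enrollee = grant / enrollees
print(f"Demo cost per enrollee: ${cost_per_enrollee:,.0f}")    # ~$2,107, i.e., "about $2,000"

# Proposed permanent benefit (Actuary document): $450 in Year 1, $180 in Year 2
permanent_two_year = 450 + 180
print(f"Permanent benefit, two years: ${permanent_two_year}")  # $630
print(f"Ratio to demo cost: {permanent_two_year / cost_per_enrollee:.0%}")  # ~30%, about one-third
```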
What Did RTI Find (Bottom Line)?
RTI, a research institute which frequently contracts to CMS, reviewed annual reports submitted by the YMCA to CMS; data was provided to RTI in May 2015. CMS has released RTI's 38-page summary of the YMCA program. There were 5,696 enrollees, 15% over age 85 and 181 below age 65; 61% were female. During the program, YMCA began enrolling Medicare Advantage beneficiaries, but the proportion of Medicare Advantage or dual eligibles was not available to RTI. Patients did not need to complete the 16-session core program: 17% had 1-3 sessions, 26% had 1-8 sessions, and only 40% had more than 8 sessions.

Financial analysis was based on those patients enrolled prior to December 31, 2014, at which time 4,345 patients had been recruited, and "we present Medicare claims data through December 31, 2014." This suggests that some patients had their first session in December 2014 even though their claims data ran only through December 2014.

The comparator group, aka the "placebo" group, was constructed by screening the Medicare Chronic Conditions Data Warehouse for patients with pre-diabetic-like ICD-9 codes, such as 277.1 (metabolic syndrome) or the diagnosis code for "impaired fasting glucose," in the absence of a diagnosis code for diabetes. Using these criteria alone, the comparison group had more healthcare needs ($1,913 versus $1,302 in the quarter before the test period). If I read the propensity score table correctly, cost may have been a propensity score factor rather than an independent variable. The most notable oddity is that the comparison group excluded patients who had ever had a diabetes diagnosis, while 34% of the treatment group had a prior diabetes diagnosis. When this group was excluded from the treatment group, thus providing a better match to the control group, savings were halved.

If the YMCA group was typical, and if the control group was well matched, about 50 patients per 1,000 in the YMCA group would have converted to diabetes in one year, versus about 100 per 1,000 in the control group (here). But this type of medical data was not provided.
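For readers less familiar with how such administrative comparison groups are built, here is a minimal sketch of a propensity-score match of the general kind RTI describes. The column names and the use of scikit-learn are my own illustration, not RTI's actual method or code; whether prior-quarter cost entered as a matching variable or as an outcome is exactly the ambiguity noted above.

```python
# Illustrative propensity-score matching sketch (not RTI's actual method or code).
# Assumes a pandas DataFrame `df` with one row per beneficiary, a 0/1 "treated" flag,
# and candidate matching covariates such as age, sex, and prior-quarter cost.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

covariates = ["age", "female", "prior_quarter_cost"]  # hypothetical column names

def match_controls(df: pd.DataFrame) -> pd.DataFrame:
    """Return treated beneficiaries plus one nearest-neighbor control per treated patient."""
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df["treated"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    controls = df[df["treated"] == 0]

    nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_controls = controls.iloc[idx.ravel()]
    return pd.concat([treated, matched_controls])

# Note: if prior-quarter cost is used as a matching covariate (as the propensity table
# suggests), the $1,302 vs $1,913 baseline gap should largely disappear after matching;
# if it does not, the comparison group is not well balanced on cost.
```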
OK, here we go.
- Data is not published
- Noisy numbers
- Parametric statistics and non-parametric data
- Use of P=.10
- Not an RCT
- Not a pre-specified analysis
- Lack of Medicare Advantage info
- Actuary ignores costs of survival
- CMS doesn't know lifetime marginal costs
- Cryptic Statement, "Model estimated small program cost"
- Failure to cite prior cost/QALY cost studies of DPP
- No discussion of external validity
- Sniff test
Data is Not Published
Typically, CMS and MACs only consider published, peer-reviewed data. The data here appear to be internal YMCA data collated and reported to CMS and then assessed by a contracted consultancy, which is paid by CMS. A dyed-in-the-wool COI skeptic could ask whether it is a conflict of interest that the commissioning body wants its own demo to be assessed favorably and is paying for the assessment itself.
Parametric Statistics, Non-Parametric Data, and Use of P=.10
RTI does not provide any details of its statistical analysis, but the simplest concern is that it applied parametric statistics to non-parametric data. For example, the costs in the prior quarter were $1,302 with a standard deviation (SD) of $3,192 in the treatment group and $1,913 (SD $5,825) in the comparison group. When the SD is far larger than the mean for a quantity that cannot fall below zero, the distribution is strongly right-skewed rather than Gaussian or normal, so normal-theory summary statistics are a poor fit. Where the simple math behind the SD fails to describe the data, the simple math behind parametric p-values is also suspect.
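To make the skewness point concrete, here is a small simulation of cost-like data with roughly the reported mean and SD. The lognormal parameters are my own choice to mimic those two numbers; nothing here is derived from RTI's actual data.

```python
# Illustrative simulation: cost-like data where the SD exceeds the mean.
# Parameters are chosen only to roughly mimic the reported $1,302 mean / $3,192 SD.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
costs = rng.lognormal(mean=6.2, sigma=1.4, size=4_000)  # heavy right tail, many small values

print(f"mean = {costs.mean():,.0f}, SD = {costs.std():,.0f}")  # SD well above the mean
print(f"share of total cost in the top 5% of patients: {np.sort(costs)[-200:].sum() / costs.sum():.0%}")

# A normality test on skewed cost data like this rejects Gaussian behavior decisively,
# which is why nonparametric tests (e.g., Mann-Whitney) or transformed/robust models
# are the usual choice for claims-cost comparisons.
print(stats.normaltest(costs))
```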
Final results were assessed as "significant at the 10 percent level," quite a bombshell that only the word-by-word reader will pick up (at line 5 of page 31). The conventional threshold is 5 percent; relaxing it to 10 percent doubles the tolerated false-positive rate.
Noisy Numbers
The savings are extremely noisy when presented by quarter. For example, the program "saved" between $1,087 and $1,301 per quarter in the year before the intervention even began (sic!), then lost $108 in the fourth post-intervention quarter, saved $2,861 in the seventh post-intervention quarter, and swung back to a loss of $316 in the eighth post-intervention quarter.
Is this "just noise?" Yes, OK. But RTI carefully presents and discusses data by quarter. And our hypothetical overcaffeinated skeptic at CMS might say the numbers are so noisy, varying almost by a factor of ten from quarter to quarter, that something could be unreliable here. "More studies are needed."
Not a Randomized Controlled Trial
DPP was not evaluated in a randomized controlled trial. For its main endpoint of interest to the agency, cost savings, it was evaluated only by a post hoc comparison to a control group selected from administrative claims data after the use of matching protocols.
In other circumstances, CMS coverage or MAC coverage staff would be highly skeptical of such data. When CMS evaluates technologies for the Inpatient DRG New Technology payment, for example, it is usually highly dismissive of single arm studies, whether or not they report good results in comparison to historical controls. Stakeholders also see MACs reject data based on historical or administrative controls when submitted as part of an LCD reconsideration request. The MAC would just shoot back a curt one- or two-sentence reply within the week that coverage is denied because the data was inadequate.
CMS does, however, use matched historical administrative control data when it chooses to. Examples include the DPP project I am discussing and a large in-progress Alzheimer PET scan study run under CMS "Coverage with Evidence Development" (IDEAS; here, here). This is good; observational studies often do give results similar to RCTs, just with less assurance (for an article urging context-specific judgment, see here).
Not a Pre-Specified Analysis
There's no clear statement that the matching cohort was set and locked by a pre-specified analysis plan. Hence, a relentless skeptic might accuse the study of post hoc data fishing and model-tweaking until a happy outcome is found (e.g. here; here; here; here and also here, here).
JAMA authors recently warned that when causality in observational studies is not clear, the conclusions should be viewed with more caution (here; similarly here). RTI may be flagging this concept when it admits that "the source of the short term savings is unclear" (page 31; only short-term savings were assessed). Since DPP reduces the conversion rate to diabetes only incrementally (from roughly 10 to 5 per 100 person-years, here), and since the costs come from diabetes complications that accrue over years, it is not obvious why this DPP would cut costs substantially within a couple of quarters. The research also did not consider Part D spending, which might have shown higher spending on insulin or other drugs in the control group.
For more on the weakness of data windows that use an arbitrary interventional entry point as "T-zero" and compare against past spending, see Al Lewis's book, "Why Nobody Believes the Numbers" (Amazon here, where it has 42 five-star reviews; another review here).

Lack of Dual Bene or Medicare Advantage Info
YMCA apparently did not tabulate whether patients were dual beneficiaries (Medicare/Medicaid) or in Medicare Advantage, although it was aware of the distinction, since it began enrolling Medicare Advantage patients mid-study and, per RTI, this improved enrollment considerably. But RTI says it had no access to such information, and it places data rows in its tables for Medicaid, Dual Bene, and Medicare Advantage status but leaves them filled with zeros. However, CMS gave RTI access to patient-level claims data, and the agency obviously knew, right in the same computer files, what type of patient enrollment was involved. The balance of often-healthier Medicare Advantage patients in the Y cohort versus the administrative controls is unknown, and it could matter, since both patient characteristics and coding characteristics differ in Medicare Advantage populations. It is possible the claims data included only fee-for-service patients. Generally, it can't be assumed that CPT codes and ICD-9 codes are reported identically in Medicare Advantage populations.
CMS Actuary Ignores Costs of Survival
Once again, I am not complaining about this, but only making a point while wearing the hypothetical critic's hat. CMMI has a clear-cut legislative mandate: to carry forward only those projects that will save the Medicare program costs while not adversely impacting quality. Under the law, CMS must determine that the program, if expanded, "is expected to reduce spending" or "improve quality...without increasing spending."
The CMS actuary notes that diabetes prevention programs are expected to increase lifespan and therefore are expected to increase Medicare spending. He then states that "mortality improvement shall not be considered for the purpose of [program] certification." There's no footnote for this. It is a fiat. This rule may exist somewhere else in the statute, but to my reading it does not exist in the CMMI legislation. The accountant's maneuver seems to contradict the plain language of the CMMI legislation (here).
CMS Doesn't Know Lifetime Marginal Costs of Diabetes
Hardly a week goes by without banner headlines about the costs of diabetes and the value of diabetes prevention, but the nearly trillion-dollar CMS agency writes that "Expected lifetime marginal costs of diabetes were difficult to determine" (Actuary, page 8). It is impossible to project the costs over time of patients with and without diabetes if you don't know the marginal cost of diabetes in the population.
"We Estimate a Small Program Cost"
Buried in the same section, we find the cryptic sentence, "Taking all of these assumptions together, the model estimated a small program cost" (Actuary, page 8). Since the overall headline at CMS is program savings, I am stumped by what that means.
Failure to Cite Prior Studies of Costs & Cost Effectiveness
Three prior studies on the cost effectiveness of pre-diabetic lifestyle intervention were easily found on PubMed. I am not an economist and am quoting from my reading of the conclusions. They seem to find that DPP is cost effective (a reasonable cost per unit of benefit) but not cost saving. A 2012 ten-year study in Diabetes Care found very slight improvements in accrued QALYs (6.74 for placebo versus 6.89 for lifestyle-intervention graduates over ten years), with a cost per QALY of $12,878 (not cost saving; here). Another analysis, without discounting, found the lifestyle intervention cost below $5,000/QALY, but again, this is not "cost saving" (here). A large multi-intervention meta-analysis published in Diabetes Care in 2010 placed pre-diabetes lifestyle intervention into the "highly cost effective" set of interventions, but did not place DPP into the cost-saving group (here). I believe conclusions are similar in a 2016 review (here). An authoritative review article in NEJM in April 2016 cites the cost effectiveness of DPP as $14,000/QALY - good, but far from "cost savings" (here). In July 2016, ICER/CTAF published a 145-page review of the field, and the economic reviews show variable cost/QALY but rarely if ever cost-saving results (here; circa p. ES10-12).[*] Finally, while HHS announces cost savings of "$2,650 per enrollee" as a single point estimate, most pharmacoeconomic studies carry very wide error bars and confidence ranges, depending on diverse uncertainties and assumptions, and a good journal's peer review would demand extensive sensitivity analyses because of those many uncertainties.
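The distinction the critic leans on is the difference between "cost-effective" and "cost-saving." Here is the generic textbook arithmetic, not any particular study's model; the dollar inputs below are hypothetical, while the QALY figures echo the 2012 study cited above.

```python
# Generic incremental cost-effectiveness ratio (ICER) arithmetic, to illustrate why a
# favorable cost-per-QALY figure is not the same thing as cost savings.

def icer(cost_intervention: float, cost_comparator: float,
         qaly_intervention: float, qaly_comparator: float) -> float:
    """ICER = incremental cost / incremental QALYs."""
    return (cost_intervention - cost_comparator) / (qaly_intervention - qaly_comparator)

# "Cost-effective": the intervention costs MORE, but buys QALYs at an acceptable price.
print(icer(3_000, 1_000, 6.89, 6.74))   # ~ $13,333 per QALY (hypothetical dollar inputs)

# "Cost-saving": the intervention costs LESS and still improves outcomes,
# i.e., the incremental cost in the numerator is negative.
print(icer(800, 1_000, 6.89, 6.74))     # negative numerator => dominant / cost-saving
```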
A systematic review, not available to CMS policymakers in 2015, appeared at the beginning of 2017 (Barry et al., BMJ, here). Again, results were much less favorable than CMS's point estimate of population benefit.
There was no recognition of the glaring difference between the fantastical economic results CMS extrapolated from the short, informal, and uncontrolled "Y" data and the far more modest results of years of controlled studies, as covered in trade press such as HFMA's (here). Imagine if Toyota said it was using a slightly new type of tire on a Corolla and that mileage increased from 28 mpg to 100 mpg - wouldn't experienced Road & Track journalists apply a "sniff test" before believing the claim? For an example of a more tempered model of the scale of CMS savings for diabetes prevention, see a report by Avalere on proposed legislation HR 962/S 452, which projected about $8B in spending per decade for $9B in savings, a small net margin of around 10% of spending. This is clearly a "margin" that would decay to zero or go negative if any of many assumptions in the modeling were overoptimistic. Avalere model here.
No Discussion of External Validity
Both the CMS coverage group and MAC policymakers, when reviewing your data, routinely raise the concern that not enough is known about external validity, and they often reject a service for coverage on this basis. There is little discussion of external validity in either the RTI report or the Actuary report (not none, but little). Admittedly, RTI was assessing YMCA data, not hired to opine on external validity, and for his part the Actuary may view external validity as undefinable and therefore not really germane within the bounds of an accountant's assessment. So the project escapes much of an external validity assessment.
Sniff Test
Let's say there were 1,000 patients in the YMCA group and 1,000 patients in the administrative control group, the matched historical cohort. In the YMCA group, the average patient is 5-10 pounds lighter. In the YMCA group, over a year, 50 patients convert to diabetes, and in the control group, 100 do (inferred, here). Across all the patients (including the 950 who do not convert in the treatment group and the 900 who do not convert in the control group), for each pair of patients, the treated patient has $2,500 less in medical costs. The treatment group thus saves $2.5M per 1,000 patients, even though the typical treated patient had 8 or fewer group health behavior sessions. If all of the cost difference were concentrated in the control group's predicted surplus of 50 diabetic patients, each of those patients would have to run $50,000 in excess costs in their first diagnosed diabetic year ($2.5M/50). That is as much as the lifetime surplus diabetic costs estimated by the CMS Actuary (page 8, line 6). This suggests the cost savings (if real) are not due to the "prevention of diabetes" alone, but to other kinds of differences between the treatment and control groups.
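Here is the same back-of-envelope arithmetic laid out explicitly; all inputs are the stylized figures used in the paragraph above, not data from the RTI report.

```python
# Stylized sniff-test arithmetic from the paragraph above (not data from the RTI report).
n_per_group = 1_000
savings_per_patient = 2_500                 # assumed average savings per treated patient
total_savings = n_per_group * savings_per_patient
print(f"Total savings per 1,000 treated: ${total_savings:,}")          # $2,500,000

converts_treated = 50                       # inferred one-year conversions to diabetes
converts_control = 100
excess_conversions_prevented = converts_control - converts_treated     # 50 patients

# If ALL of the savings had to come from those 50 prevented conversions:
implied_cost_per_prevented_case = total_savings / excess_conversions_prevented
print(f"Implied first-year excess cost per prevented case: ${implied_cost_per_prevented_case:,.0f}")
# ~ $50,000 -- on the order of the Actuary's estimate of LIFETIME excess diabetes costs,
# which is why the savings are unlikely to be explained by diabetes prevention alone.
```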
Another sniff test: myriad groups are constantly combing through every nook of Medicare and climbing all over the Hill looking for "pay-fors" to fund something new. If the data were really solid that the DPP saves $2,500 per Medicare patient per year, and if there are 1M Medicare patients who qualify, then the benefit would save $2.5B per year, or $25B every ten years. No one has ever suggested this as a pay-for that would withstand CBO scrutiny.
What Happens Next?
HHS writes,
"The Diabetes Prevention Program can prevent disease and help people live healthier lives,” said Dr. Patrick Conway, CMS Deputy Administrator and Chief Medical Officer... We are now working to determine the best strategies for incorporating the Diabetes Prevention Program into Medicare."
....The Administration supports expansion of the Diabetes Prevention Program. This certification is a critical step in expanding the Diabetes Prevention Program for Medicare beneficiaries with pre-diabetes.
CMS is considering how it would expand this model broadly throughout the Medicare program. More information about how CMS could expand the Diabetes Prevention Program will be included in the CY 2017 Medicare Physician Fee Schedule proposed rule. (press release here). This packs in a lot of endorsements. The benefit is endorsed not only by the CMS Chief Medical Officer and by the Secretary of Health but also by the Administration as a whole. Here, Dr. Conway is the "Deputy Administrator" of CMS endorsing the CMMI program - in fact, he is the Deputy Administrator of CMMI itself.
Does All This Parallel an Available, But Boring, NCD/USPSTF Route?
When the USPSTF endorses a preventive service - and that is a very high barrier - CMS can create a matching benefit for beneficiaries via the NCD process. The USPSTF endorses screening for impaired glucose metabolism (here, here), which is interwoven with the value of subsequent intervention programs, whether early low-dose antihyperglycemic drugs or lifestyle interventions. The USPSTF recommendation is a paired one: glucose screening at ages 40-70 in overweight persons, coupled with referral to behavioral interventions. CMS wellness visits include diabetes screening, which by definition will pick up either diabetes or pre-diabetes (here). However, Medicare does not offer a pre-diabetes behavioral intervention benefit (so patients who get the glucose testing and fall short of a diabetes diagnosis are left in the lurch right now).
Seemingly, CMS has been able to create an NCD-based benefit at least since October 2015, under the USPSTF recommendation. As CMS rolls out far bigger CMMI projects and needs credibility for CMMI, it may get more bang for the buck by creating a new DPP benefit under the CMMI umbrella instead of the old-fangled USPSTF/NCD route. For good measure, there is no reason CMS can't cite the USPSTF position as strongly supporting CMMI benefit expansion under the YMCA demo. But the USPSTF position doesn't consider cost and won't support the cost analysis at CMMI; RTI is needed for that.
DPP Through Summer PFS Rulemaking
Implementation. CMS can implement the DPP program through rulemaking. The simplest rulemaking would piggyback on the existing wellness-visit glucose testing benefit and state that if glucose and BMI fall within a certain risk range, then CDC-endorsed preventive counseling is covered. As is the case now, if the test results are worse and fall into the diabetes range, the patient gets the existing diabetes education benefit.
Fee and Modality. CMS provides a proposed fee schedule for this in the Actuary document, page 2 (vide infra). CMS needed to propose program costs in advance to allow its Actuary to do a cost neutrality assessment. The benefit could include individual, group, and online delivery, since all of these modalities are currently endorsed by the CDC. However, the level of payment offered by CMS might make individual programs unfeasible.
Coding Today. There are several CPT coding choices today, including 98969 (online medical evaluation by a nonphysician) and 99412 (group prevention counseling). For 2016, both are status "N," not payable by Medicare (here).
There is also a specific Category III code new for 2016 (here), which states:
0403T: Preventive behavior change, intensive program of prevention of diabetes using a standardized diabetes prevention program curriculum, provided to individuals in a group setting, minimum 60 minutes, per day

Interestingly for Medicare policy wonks, code 0403T is "carrier priced" rather than "status N" at this time.
The Simplest Thing CMS Could Do:
State Entry Criteria and Cover a CDC-Endorsed Program
The simplest thing for CMS to do is to give coverage under a set of beneficiary criteria (glucose, BMI) and then cover the intervention under any CDC-endorsed program (e.g., one qualified under the National Diabetes Prevention Recognition Program).
This would mirror how CMS covers diabetes education programs for diabetic patients; it simply deems the ADA or AADE to be endorsing/approving bodies for credentialing (see 42 CFR 410.140ff, here) and is hands-off regarding further details.
Defining the CMS Proposed Benefit (from Actuary's Document)
For the purpose of giving a mandate to its actuary, CMS proposed this benefit (the payments are tallied in a short sketch after the Requirements list):
Benefit:
16 weekly sessions [if all used, $360]
6 additional monthly sessions [if all used, $90]
Maintenance sessions after the first year ["minimum of 3 per quarter with no maximum", $180]
Requirements:
BMI 25 or greater;
A1C of 5.7-6.4 (neither higher nor lower), OR, impaired fasting glucose 110-125, OR, impaired glucose tolerance 140-199.
No previous diagnosis of diabetes.
No life-threatening conditions, no mobility issues prohibiting participation, "etc." (sic).
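Tallying the proposed payments above shows where the Year 1 and Year 2 figures cited earlier come from; the per-session rates are my own derivation, simply dividing the bracketed totals by the session counts, and are not stated by CMS.

```python
# Tallying the proposed payment schedule above (per-session rates are my derivation,
# dividing the bracketed totals by the session counts; CMS did not state them).
weekly_sessions, weekly_total = 16, 360
monthly_sessions, monthly_total = 6, 90
maintenance_total_year2 = 180              # "minimum of 3 per quarter with no maximum"

year1_total = weekly_total + monthly_total
print(f"Year 1 maximum payment: ${year1_total}")        # $450, matching the figure cited earlier
print(f"Year 2 payment: ${maintenance_total_year2}")    # $180

print(f"Implied rate, core sessions: ${weekly_total / weekly_sessions:.2f} per session")      # $22.50
print(f"Implied rate, monthly sessions: ${monthly_total / monthly_sessions:.2f} per session") # $15.00
# At roughly $15-$22.50 per session, group or online delivery is plausible; one-on-one
# counseling at these rates would be hard to sustain, per the concern noted earlier.
```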
And Here The Wonk Sees:
Wearing the highly skeptical policy wonk hat, as throughout this blog:
It's unclear what would happen if the patient had an impaired glucose in range but an A1C out of range, or vice versa, or one glucose value in range and one out of range. If a borderline patient were tested a few times, he or she would eventually, by chance, post a value high enough to qualify.
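On the repeated-testing point, here is a quick illustration of how retesting a borderline patient inflates the chance of eventually qualifying. The 30% per-test probability is made up for illustration, and the calculation assumes test-to-test variation is independent.

```python
# Illustration of the repeated-testing point above. The 30% per-test probability is
# hypothetical, not from any CMS or RTI document; tests are assumed independent.
p_single_test_qualifies = 0.30   # chance a borderline patient's single test lands in the qualifying range

for n_tests in (1, 2, 3, 5):
    p_ever_qualifies = 1 - (1 - p_single_test_qualifies) ** n_tests
    print(f"{n_tests} test(s): {p_ever_qualifies:.0%} chance of at least one qualifying value")
# 1 test: 30%, 2 tests: 51%, 3 tests: 66%, 5 tests: 83% --
# so with no limit on retesting, many borderline patients eventually meet the criteria.
```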
Next, it's unclear how the doctor or behavioral program (as opposed to a CMS computer) could know if the patient had ever been given a diagnosis code for diabetes by any provider.
Finally, the "etc" tossed into the definition of excluding conditions would need to be improved during summer rulemaking. Without knowing the meaning of "etc" in the population definition it's hard to say the rest of the actuarial modeling could be ironclad. Can we imagine a peer reviewed article in JAMA citing the exclusion criteria as A, B, and "etc."?
Some preventive benefits have an age cap. No age cap is mentioned by CMS/RTI. (15% of the YMCA patients were over 85, but there isn't any subgroup analysis, and given the data noise we already see, a subgroup analysis would likely be impossible.) The most rigorous reviewers we have in this area, the USPSTF, did consider age and, as a result, endorsed the glucose screening and behavioral intervention benefit only up to age 70. That cap covers only the first few years of the Medicare population, which is perhaps one of the reasons CMS has not acted on the USPSTF recommendation on the regular coverage side of the house.
___
Short link for this page: http://tinyurl.com/dppwonk
[*] The CMS CMMI data discussed in this report is quoted briefly at face value by ICER.