This month, two significant CMS positions on software and algorithms appear in proposed rules, both currently open for comment. One focuses on software pricing in the hospital outpatient (APC) setting. The other focuses on the use of parameters like age, sex, and race in "computer algorithms" and decision support, to avoid "discrimination."
___
The nondiscrimination (and computer algorithm) rulemaking is here and will be officially published in the Federal Register on August 4, with comments open to October 3.
SEE THE FINAL PAGINATED VERSION: FED. REG., AUG. 4, 2022, 87 FR 47824-47920; comments to October 3. Here.
Press release (July 25) here. See commentary at Health Affairs here and at Foley Hoag here. The rule, and the commentaries, also touch on the back-and-forth history of some of this regulation, which stems from ACA Section 1557; I provided a little backstory in my July 26 blog here. Among other things, the rule amplifies Section 1557 by stating that sex discrimination is defined to include gender identity, so that sex discrimination extends to conditions like gender surgery.
The annual hospital outpatient rule is here (87 FR 44502, 7/26/22), with comments open to September 13. The focus in this blog is on its software policy.
___
AGE, SEX, ALGORITHMS AND AI
Regarding nondiscrimination and algorithms, CMS proposes that, under Section 1557 of the Affordable Care Act, clinical algorithms that use race, sex, age, or other protected characteristics in a discriminatory way are not allowed.
It's not entirely clear to me (either from the text of the rule or the early commentaries cited above) how this impacts AI or other clinical algorithms that depend on sex and age as major drivers of their accuracy.
The discussion focuses on situations where there have been discriminatory algorithms, such as race-weighted algorithms for kidney disease. However, it is standard Bayesian statistics to weight predictions by baseline rates, such as the rise in cancer incidence with age; see the sketch below.
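To make the Bayesian point concrete, here is a minimal sketch in Python. All the prevalence, sensitivity, and specificity numbers are purely hypothetical, chosen only to show the mechanics: the same positive test implies very different post-test probabilities at different baseline (age-driven) prevalences.

```python
# A minimal sketch (hypothetical numbers) of why clinical predictions
# legitimately weight baseline rates: the same test result implies very
# different post-test probabilities at different pre-test prevalences (Bayes).

def post_test_probability(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Probability of disease given a positive test, via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical cancer prevalences by age band (illustrative only).
prevalence_by_age = {"40-49": 0.005, "60-69": 0.02, "80+": 0.05}

for age_band, prev in prevalence_by_age.items():
    p = post_test_probability(prevalence=prev, sensitivity=0.90, specificity=0.90)
    print(f"Age {age_band}: pre-test {prev:.1%} -> post-test {p:.1%}")
```

On these toy numbers, the post-test probability runs from about 4% in the youngest band to over 30% in the oldest. Weighting by age here is not discrimination; it is what makes the estimate accurate.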
The discussion is at page 176ff of the draft rule PDF, and the regulation itself is very concise, at § 92.210 (page 305): "A covered entity must not discriminate on the basis of race, color, national origin, sex, age, or disability in its health programs and activities through the use of clinical algorithms in its decision-making."
For example, they call out the ICU critical care SOFA score as potentially racially biased because Black ICU patients have higher average scores. The causes of the higher scores are important, but it's not clear to me that the "index" SOFA score itself is racially biased. Black patients also have an almost 3X higher rate of triple-negative breast cancer (here), but does that mean HER2/neu immunohistochemistry is racially biased?
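A toy simulation (invented data, not the actual SOFA score) illustrates the distinction I'm drawing: a score can average higher in one group simply because underlying risk is higher in that group, while remaining equally well calibrated in both.

```python
# A minimal sketch (simulated, hypothetical data): a score can have a higher
# *average* in one group because underlying risk is higher there, while still
# being equally well-calibrated (predicted risk matches observed outcomes)
# in both groups.

import random

random.seed(0)

def simulate_group(mean_risk: float, n: int = 100_000):
    """Draw true risks around mean_risk; the 'score' is the true risk itself,
    i.e., a perfectly calibrated score."""
    scores, outcomes = [], []
    for _ in range(n):
        risk = min(max(random.gauss(mean_risk, 0.05), 0.0), 1.0)
        scores.append(risk)
        outcomes.append(1 if random.random() < risk else 0)
    return scores, outcomes

for group, mean_risk in [("Group A", 0.10), ("Group B", 0.20)]:
    scores, outcomes = simulate_group(mean_risk)
    mean_score = sum(scores) / len(scores)
    observed = sum(outcomes) / len(outcomes)
    print(f"{group}: mean score {mean_score:.3f}, observed outcome rate {observed:.3f}")

# The group means differ, yet within each group the score tracks the
# observed rate -- a higher average alone does not establish bias.
```

A proper bias analysis would ask whether the score over- or under-predicts observed outcomes within each group, not just whether the group averages differ.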
See an LA Times article by Noah Goldberg, here, on a California state investigation into the use of race, and the causes of racial bias, in hospital protocols.
# # # # #
HOSPITAL OUTPATIENT PAYMENT: MORE ON SOFTWARE
We were talking about the nondiscrimination and equity rule (algorithm section). Shifting gears.
The regular annual hospital outpatient proposed rule is published and paginated in the Federal Register of 7/26/2022, at 87 FR 44502 forward. Here. Pricing specific to the Heartflow product is again discussed, at page 44569ff. However, in this blog, I would point to the later, and lengthy, general discussion of hospital outpatient software pricing, at page 44684ff, in a section titled "OPPS Payment for Software as a Service."
Note there are some potential differences between physician office (PFS RVU) pricing and hospital outpatient (APC) pricing. Office services are paid by piecemeal CPT codes, although CMS has an option to bundle software costs into overhead rather than recognizing them more specifically. (CPT codes may also contain specified components, e.g., "injection procedure, includes ultrasound if performed.")
Hospital services are paid in broader payment categories (e.g., Radiology Level 1, Radiology Level 2, etc.), where different services, and often their components, are bundled into one APC. To give a simple example (sketched in code below), a PET scan and its radiotracer are paid separately in the physician office setting, but the tracer is packaged into (not paid separately from) the PET scan in the hospital outpatient setting.
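For readers who like code, here is a minimal sketch of that packaging difference. All the code names and dollar figures are hypothetical placeholders, not actual fee schedule amounts.

```python
# A minimal sketch (all figures hypothetical) of the packaging difference:
# in the office (PFS) setting each code is paid piecemeal, while in the
# hospital outpatient (OPPS/APC) setting the tracer is packaged into the
# PET scan's APC and draws no separate payment.

PFS_FEES = {"PET_SCAN": 1200.00, "RADIOTRACER": 300.00}   # piecemeal office fees
APC_FEES = {"PET_SCAN": 1400.00}                          # tracer packaged in
PACKAGED_IN_APC = {"RADIOTRACER": "PET_SCAN"}             # packaged-code map

def total_payment(codes: list, setting: str) -> float:
    if setting == "office":               # each billed code paid separately
        return sum(PFS_FEES[c] for c in codes)
    if setting == "hospital_outpatient":  # packaged codes draw no payment
        return sum(APC_FEES[c] for c in codes if c not in PACKAGED_IN_APC)
    raise ValueError(setting)

claim = ["PET_SCAN", "RADIOTRACER"]
print(total_payment(claim, "office"))               # 1500.0 (two lines paid)
print(total_payment(claim, "hospital_outpatient"))  # 1400.0 (tracer packaged)
```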
Turning to software as a service, the CMS discussion extends from page 44684 to 44689, in pretty fine print, and I'm not sure I've absorbed it fully. CMS describes what it sees as problems for software payment, focusing in particular on imaging software that is run later than the original CT scan or other imaging, and that involves an interpretation technology different from the one paid for in the original clinical interpretation (e.g., very sophisticated next-level software ordered for coronary blood flow, a la the Heartflow company). CMS is focused on seeking comment on its alternatives for this very important topic.
CMS is dubious of licensing and per-click software models: "empirical research has shown that pay per use may lead to overuse of AI technology." I suspect that, somewhere down the line, CMS does not want a $1M MRI machine with software converted overnight into a $1M MRI machine plus a $1M software package required just to turn it on and use it.
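The incentive concern is simple arithmetic. Here is a hypothetical sketch (all fees invented for illustration) of why per-click pricing ties vendor revenue to utilization while a flat license does not.

```python
# A minimal sketch (hypothetical fees) of the incentive CMS worries about:
# under pay-per-use, each additional scan generates software revenue, while
# under a flat license the marginal software revenue per scan is zero.

FLAT_LICENSE = 100_000.00   # hypothetical annual license fee
PER_CLICK_FEE = 500.00      # hypothetical fee per analysis run

def vendor_revenue(n_uses: int, model: str) -> float:
    if model == "flat_license":
        return FLAT_LICENSE            # fixed, regardless of volume
    if model == "per_click":
        return PER_CLICK_FEE * n_uses  # grows with every additional use
    raise ValueError(model)

for n in (100, 500, 1000):
    print(n, vendor_revenue(n, "flat_license"), vendor_revenue(n, "per_click"))

# Under per-click pricing, volume drives revenue -- the alignment that the
# research CMS cites suggests can encourage overuse.
```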
In July 2022, AMA announced codes for software used in pathology for whole slide imaging, as Category III codes that could be "added" to existing surgical pathology codes. It's unclear whether, in December, CMS will view these as separately payable in the APC setting or as bundled into the primary surgical pathology code. (And in fact, I understand that primary surgical pathology codes for hospital outpatients are now already usually bundled into the APC fee paid to the hospital for the surgery.) Much will be written on these topics in the coming months.
___
Though not related to software, some readers may enjoy the discussion of payment for Category B investigational trial devices, at page 44683ff of the outpatient rule. The outpatient rule's software discussion also has a quick call-out to AI and racial or ethnic nondiscrimination, which is covered in more detail in the separate nondiscrimination rulemaking.