When we hear about AI regulation in healthcare, it's usually a discussion of the FDA. However, Section 1557 of the Affordable Care Act effectively regulates the use of clinical algorithms of any type, to the extent they might contribute to discrimination. (And the burden of the regulation is to demonstrate that they don't.)
The area is complex, but it is covered in detail in a November 14, 2024, article in Health Affairs Forefront by Han, Tsai, and Khazanchi.
Who has to be aware? The rule covers "hospitals, clinics, private or public payors, academic medical centers." What has to be checked? "Diagnostic, screening, prognostic, or risk prediction algorithms," abbreviated PCDST, for patient care decision support tools. What kinds of discrimination? The rule spans "race, color, national origin, sex, age, disability." (See the long series of details in Han et al.)
The compliance date is May 2025 (the same month as Phase 1 of the FDA LDT rule!).
The article also discusses specific examples, such as race-based renal function corrections.
##
Some parts of the rule are under federal court litigation. The authors write,
In some states, implementation of Section 1557 may be delayed due to ongoing litigation. Several states challenged specific provisions of the Rule that expanded “discrimination on the basis of sex” to include gender identity. In Texas v. Becerra, the court ordered that the effective date of the entirety of the Rule be stayed within Texas and Montana. In Florida v. HHS, the court ordered that the effective date of certain portions of the Rule be stayed within Florida. In Tennessee v. Becerra, the effective date was stayed nationwide specifically where the Rule extends to discrimination by gender identity.
(Some of the debate about "sex discrimination" and its relation to "gender" or "transgender" status spans the Obama, Trump 1, Biden, and Trump 2 administrations, with flip-flops as each election occurred.)
##
I covered the proposed rule in July 2022 here.
I discussed and footnoted some of the flip-flops in July 2024 here.