Letter Text (PDF)

Washington (December 19, 2023) – Senator Edward J. Markey (D-Mass.), chair of the Senate Health, Education, Labor, and Pensions (HELP) Subcommittee on Primary Health and Retirement Security, led his colleagues in a letter to Food and Drug Administration (FDA) Commissioner Robert Califf urging the agency to ensure that its new Digital Health Advisory Committee prioritizes the health and safety of all Americans by including members with civil rights, medical ethics, and disability rights backgrounds on the Committee. The Committee will advise on the development, regulation, and implementation of digital health technologies (DHTs), such as artificial intelligence (AI), telehealth, and wearable devices. The letter also urged the Committee to consult with health care workers and their unions when developing its recommendations. The letter is co-signed by Senators Bernie Sanders (I-Vt.), Bob Casey (D-Pa.), Amy Klobuchar (D-Minn.), Tammy Duckworth (D-Ill.), and Alex Padilla (D-Calif.).

Digital health technologies have proliferated throughout the health care system with the promise of reducing costs, bolstering staff retention, and improving patient outcomes. However, studies have shown that when marginalized groups are not considered during the development of these products, DHTs fail to provide robust and equitable care to all patients. Poorly designed models have over-recommended risky medical procedures for Black and Latino pregnant patients, incorrectly assumed that Black patients need less care than white patients, and routinely underperformed when used on patients from marginalized groups, such as the elderly and people with disabilities. Without including the voices of underrepresented and marginalized groups, DHTs will continue to perpetuate biases, worsen patient outcomes, and erode the trust of both health care providers and patients.

In their letter to Commissioner Califf, the Senators wrote, “The development, regulation, and implementation of these technologies must be guided by principles that prioritize the health and safety of all individuals. The civil rights and medical ethics implications of DHTs are manifest. It is essential that a voice from a civil rights organization and one with expertise in medical ethics are part of the Digital Health Advisory Committee.”

The Senators continued, “The risks of DHTs to patients, health care providers, and the health system as a whole are significant, but as the Biden administration noted in the Blueprint for an AI Bill of Rights, ‘they are not inevitable.’ Only through robust and clear regulation of these products — including an active and intentional consideration of their ethical implications and the disparities they can propagate — can we hope to ensure public safety.”

Senator Markey has called on the federal government to investigate AI and stop algorithmic injustice. In December, Senator Markey introduced the Eliminating Bias in Algorithmic Systems (BIAS) Act, legislation to ensure that every federal agency using, funding, or otherwise involved with AI maintains an office of civil rights focused on combating AI bias and discrimination that reports directly to Congress. In November, he chaired the HELP Subcommittee on Primary Health and Retirement Security’s hearing, “Avoiding a Cautionary Tale: Policy Considerations for Artificial Intelligence in Health Care,” which examined concerns about the potential detrimental effects of AI in health care and the need for proactive policy solutions. In October, he and Representative Jayapal applauded the Biden administration for heeding their call to incorporate the White House Blueprint for an AI Bill of Rights into its AI Executive Order. In July, Senator Markey and Representative Matsui introduced their Algorithmic Justice and Online Platform Transparency Act, legislation to ban harmful algorithms, bolster transparency by holding websites accountable for their content amplification and moderation practices, and commission a cross-government investigation into discriminatory algorithmic processes throughout the economy.

###