Last month, the Center for Democracy & Technology (CDT) released a new report called Challenging the Use of Algorithm-driven Decision-making in Benefits Determinations Affecting People with Disabilities. The report focuses on algorithm-driven tools that reduce or terminate public benefits. It analyzes how people with disabilities and their lawyers have challenged these tools in court.
In the report, CDT cites several important court decisions when describing states’ constitutional obligations and their duties under the Administrative Procedure Act and the Americans with Disabilities Act. Some cited cases require states to give recipients notice before algorithm-driven cuts to their benefits take effect, and to provide enough information for people to know how to contest the algorithms’ results. Other cases require states to inform the public that they plan to use these tools and to allow people to submit comments before implementation.
The report also describes institutionalization as a form of discrimination on the basis of disability, because it isolates disabled people from the community. Plaintiffs have shared that their care hours were cut almost in half. When algorithm-driven tools cause such deep cuts to supports and services, people with disabilities may have to go to institutions to receive necessary care that they should be able to get at home.
CDT’s recommendations to state governments, attorneys, and disabled self-advocates flow from a few key takeaways. First, when states implement an algorithm-driven tool to make benefits determinations, they are making a policy decision that affects people’s lives and raises new legal and constitutional questions. Second, disabled people, along with other experts on algorithms, understand the impact of algorithm-driven benefits determinations best, so they should drive attorneys’ litigation and advocacy strategies. Finally, beyond litigation, self-advocates have several avenues for calling attention to unjust algorithmic tools, including social media, public government meetings, and the press.