When technology companies and governments collaborate on the production of technologies that affect citizens' lives, how should they account for the well-being and needs of those citizens? What if those citizens are already vulnerable and subject to heightened surveillance? In such cases it is paramount that design decisions are made carefully and in consultation with stakeholders. In this piece for Points, the Data & Society Medium channel, I discuss a recently mandated change to Medicaid policy, electronic visit verification (EVV), and the hurried design process it set in motion, which poses significant privacy and security risks for Medicaid clients. The technologies resulting from this process have embedded in them insulting assumptions about how disabled people live their lives and what role technology should play in supporting their dignity and freedom. This piece has already contributed to policy victories for disability rights activists, who won many of the demands listed at the end.
“When caregivers log into a GPS-enabled EVV device, they also provide a precise and analyzable location history of the clients, and by implication a record of private and Constitutionally protected behavior. This is one of the tricks of data analytics, especially when enhanced by machine learning techniques: data is never just about the thing you originally think it is; it is always also about what can be mechanically inferred. EVV adds an extra layer of intrusion because it unilaterally alters the labor-management relationship between PCA and client while offloading audit responsibility onto the client. EVV establishes distrust as a baseline in a relationship that is fundamentally about trust.”