This series of blog posts addresses existing and emerging concerns about ethics in the education technology market. The goal is to help technology entrepreneurs think through and respond to the ethical risks their companies face. Failure to develop a clear and effective ethics policy has destroyed multi-million-dollar companies, and will do so again. Too often, entrepreneurs assume they can wait to respond to ethical concerns until after their product has become successful. We have observed over and over again that the most successful products are designed with the ethical use of that product in mind. These posts encourage readers to consider how successes and failures in the technology market can be understood in terms of ethical approach.
Data Ethics in Education Technology
How data is used in education technology can determine the success or failure of a company. By looking at the history of the ed tech market we can learn how good ethics forms the foundation of successful ed tech companies.
The debate about the role of big data in education has sometimes been cast as a dispute between reformers and traditionalists. Reformers are described as those who see technology as the solution to the problems of the education system, whereas traditionalists favor teacher pay increases, reduced class sizes, and similar measures. This framing has often conflated multiple ethical questions within education technology under a single heading: concerns about big data. Disentangling the concerns surrounding education data is essential if companies are to formulate and communicate sound ethical data use policies to their current and potential users. When it comes to ed tech, good intentions and high hopes for digital tools are not enough: parents, teachers, and students need concrete evidence that providers are not merely interested in the bottom line.
Fully understanding what’s at stake requires distinguishing between issues of data privacy, data commercialization, and predictive data modeling. Separating these threads lets us examine each ethical question on its own terms and avoid the politicized binary of the reformer-versus-traditionalist debate. I will address each of these issues in a separate post, beginning here with a discussion of some notable successes and failures around data privacy. This separation will allow us to respond to the technological needs of stakeholders with greater care and efficiency.
The ethical concerns around big data privacy became increasingly pronounced in 2014, and the discussion has become especially visible in education technology. The very public collapse of InBloom, which failed to properly communicate its policies on data handling, helped bring these questions to the forefront. As researchers Jules Polonetsky and Omer Tene have described it, InBloom’s rapid expansion “brought to the fore weighty policy choices, which required sophisticated technology leadership and policy articulation.” That leadership and articulation never materialized, and the public outcry against InBloom ultimately forced many districts to end their relationships with the company. The lesson here is not that big data analytics are a failed solution to the problems of our education system, but that big data practices need to be more carefully articulated. Education data collection should be focused on generating demonstrable learning outcomes, rather than on collection for its own sake. One notable feature of InBloom’s failed policy was the drive to collect as many data points about students as possible, rather than targeting data collection around specific hypotheses. This “collect first, measure later” approach to education data is a mistake. Ed tech companies need to be able to say why they are collecting each data set and exactly what that data is going to achieve. Data practices must also include robust yet simple privacy policies, explained in ways the public can easily understand, if they are to succeed.
Mere legal compliance does little to assuage the concerns of school administrators, teachers, students, and parents. Adequate privacy legislation has yet to emerge in response to the new technological realities we now live with, especially with regard to education. Researchers recognize that “as many businesses have learned firsthand, in an area as fraught with social sensitivities and frequent misperceptions as privacy, legal compliance does not make up for a lack of coherent policy vision.” This suggests that there is no one-size-fits-all policy vision in ed tech: companies will need to customize and iterate their business plans around a core set of values.
Some companies, such as Vivoinspire, have sought to avoid storing student data associated with a particular student name, thereby sidestepping the privacy issue almost entirely. Other companies may need to keep student data linked to student names for their software to function, and these organizations have an especially pressing need to clearly articulate their use of student data. Companies of this type will need to devote extra resources to security and encryption, and make a point of clearly expressing their commitment to privacy to administrators, teachers, and parents.
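One common technique for avoiding the storage of student names, in the spirit of the approach described above, is pseudonymization: each name is replaced by a stable identifier derived with a keyed hash, so records can still be linked over time without the name itself ever being stored. The sketch below is a minimal illustration in Python; the function name, the key handling, and the record layout are my own assumptions for the example, not any particular vendor's implementation.

```python
import hmac
import hashlib

def pseudonymize(student_name: str, secret_key: bytes) -> str:
    """Derive a stable pseudonymous ID from a student name via HMAC-SHA256.

    The same name always maps to the same ID, so learning records can be
    linked, but the name cannot be recovered without the secret key.
    """
    digest = hmac.new(secret_key, student_name.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()

# Hypothetical usage: store only the pseudonym alongside learning data.
key = b"vendor-held-secret-key"  # in practice, kept in a secrets manager
record = {
    "student_id": pseudonymize("Jane Doe", key),
    "quiz_score": 87,
}
```

Because the hash is keyed, an attacker who obtains the stored records cannot simply hash a list of known names to re-identify students; the protection, of course, is only as strong as the handling of the key itself.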
In the next post I will discuss the questions around commercialization and monetization in ed tech with a view to designing products that meet rigorous ethical standards while also supporting a profitable organization.