In 2017, 43 Minnesota school districts were identified by the Minnesota Department of Human Rights for discriminating against students of color in out-of-school suspensions. The next year, Ramsey County, the City of St. Paul, and St. Paul Public Schools entered into a Joint Powers Agreement (JPA) around integrated data sharing, which would assign a risk score to flag children considered at risk of future involvement with the juvenile justice system.
My background is in education justice. For years I have focused on the school-to-prison pipeline, the disproportionate number of out-of-school suspensions that Black and brown children receive, and the data that tells us why. When community member Laura LaBlanc learned of the JPA, she called members of the education and juvenile justice advocacy community.
I didn’t know much about the JPA, but I knew that if they were proposing to use suspension data as an indicator of anything, the information would be grounded in racial bias. I joined Jaylani Hussein at the Council on American-Islamic Relations, Muneer Karcher Ramos at Saint Paul Promise Neighborhood, Khulia Pringle from Stand Up MN, along with others, to ask questions of the decision makers regarding what they knew about Big Data, predictive analytics, algorithms, and the potential for harm in proceeding.
As we started unpacking the JPA, we became very concerned. Data had been collected from child protection agencies, the foster care system, and county prosecutors, then merged with records about free and reduced lunch recipients, school attendance, and suspensions. We asked the JPA partners to pause the process they had approved until we could answer questions together. It became clear that there were many red flags:
Advocates were looking for restorative practices, but the JPA was putting money into another tool instead of into solutions.
The idea behind the database was that predictive analysis would be used to identify students most likely to struggle. We asked what difference it would make to have an algorithm make these predictions, rather than having people in relationship with students identify what solutions were needed.
A high school student attended one of our community meetings where we built awareness of the JPA. She learned that the algorithm would be a kind of early warning system, ranking students from 1 to 5. When she went back into her classrooms, she mentally did her own ranking: the kids at the front of the class might be a 1, and the kids who were barely showing up, or who were on their phones, would be closer to a 5.
She realized that 18 months earlier, when her family was experiencing housing instability, she would have been pegged as higher risk. “It would not give a clear picture of who I was and what I needed,” she told us. “My past should not predict my future. This is just a different way of labeling.”
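The student's intuition can be made concrete. As a purely hypothetical sketch (the actual JPA model was never made public; every weight and field below is invented for illustration), a 1-to-5 risk ranking built on suspension and attendance records might look like this. It shows how a biased disciplinary record flows straight into the score even though race is never an input:

```python
# Hypothetical illustration of a naive 1-5 "risk score" of the kind the JPA
# proposed. None of these weights or fields come from the actual agreement;
# they are assumptions made up to show how bias in the inputs carries through.

def risk_score(suspensions, absences, free_lunch):
    """Map raw indicators to a 1-5 ranking (higher = flagged as higher risk)."""
    raw = 2 * suspensions + 0.1 * absences + (1 if free_lunch else 0)
    return min(5, 1 + int(raw))

# Two students with identical behavior. If discipline is applied unevenly,
# as the 2017 MDHR findings showed for suspensions, the student who was
# suspended for the same conduct gets a far higher score. The algorithm
# never sees race, but it faithfully reproduces the biased record.
student_a = risk_score(suspensions=0, absences=5, free_lunch=True)
student_b = risk_score(suspensions=2, absences=5, free_lunch=True)
print(student_a, student_b)  # prints "2 5"
```

The point of the sketch is the one the student made: the score labels the record, not the child, and a record shaped by housing instability or biased discipline gives no clear picture of who a student is or what they need.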
Together, community members engaged in conversations that led to the JPA’s cancellation in February 2019.
Nationwide, data is emerging as a new economy without much discussion or regulation. In response, we created the Data for Public Good campaign. We want to engage parents and students to work with data creators and funders to build a better system, from design to delivery. We want stakeholders equipped as an oversight body, ensuring that data gathering adheres to ethical principles.
Data and surveillance can do good things. But we need far more safeguards, and a deeper understanding of the bias inherent in how these data systems are developed.
Next steps for our network include talking to the Minneapolis City Council about this concept, which they have been exploring, and connecting with philanthropists who are supporting data collection systems. We are partnering with national groups involved in this work to develop principles.
Marika Pfefferkorn (she/her) has a background in education, and is director of the Twin Cities Innovation Alliance (TCIA), which focuses on the role of data in education, governance, infrastructure, and transportation.