When Data Is Misused

Marika Pfefferkorn. Photo by Sarah Whiting

In 2017, 43 Minnesota school districts were identified by the Minnesota Department of Human Rights for discriminating against students of color in out-of-school suspensions. The next year, Ramsey County, the City of St. Paul, and St. Paul Public Schools entered into a Joint Powers Agreement (JPA) around integrated data sharing, which would assign each child a risk score flagging those considered at risk of future involvement with the juvenile justice system.

My background is in education justice. I have focused for years on the school-to-prison pipeline, on the disproportionate number of out-of-school suspensions Black and brown children receive, and on the data that tells us why. When community member Laura LaBlanc learned of the JPA, she called members of the education and juvenile justice advocacy community.

I didn’t know much about the JPA, but I knew that if they were proposing to use suspension data as an indicator of anything, the information would be grounded in racial bias. I joined Jaylani Hussein at the Council on American-Islamic Relations, Muneer Karcher Ramos at Saint Paul Promise Neighborhood, Khulia Pringle from Stand Up MN, along with others, to ask questions of the decision makers regarding what they knew about Big Data, predictive analytics, algorithms, and the potential for harm in proceeding.

As we started unpacking the JPA, we became very concerned. Data had been collected from child protection agencies, the foster care system, and county prosecutors, then merged with records about free and reduced lunch recipients, school attendance, and suspensions. We asked the JPA partners to pause the process they had approved until we could answer questions together. It became clear that there were many red flags:

  • The partners had not been transparent: they collected data without indicating what it would be used for.
  • The JPA was presented to elected officials as if everyone was on board, when in reality community members did not know what the data was being used for.
  • Data privacy rules were cited to preclude community-based governance. Yet the St. Paul police, as a member of the JPA coalition, had access to the database.
  • There were insufficient safeguards on older technology that had already been compromised.

Advocates are looking for restorative practices. The JPA was putting money into another tool, instead of into solutions.

The idea behind the database was that predictive analytics would be used to identify students most likely to struggle. We asked what difference it would make to have an algorithm make these predictions, rather than having people in relationship with students identify what solutions were needed.

If the intent was to send a case manager to the home to connect families with resources, we asked, what prevents that from happening now with what we already know? Why invest in a mythical silver bullet rather than directly funding resources?

A high school student attended one of our community meetings where we built awareness of the JPA. She learned that the algorithm would be a kind of early warning system, ranking students from 1 to 5. When she went back to her classrooms, she mentally did her own ranking: the kids at the front of the class might be a 1, and the kids who were barely showing up and were on their phones would be closer to a 5.

She realized that 18 months earlier, when her family was experiencing housing instability, she would have been pegged as higher risk. “It would not give a clear picture of who I was and what I needed,” she told us. “My past should not predict my future. This is just a different way of labeling.”

Together, community members engaged in conversations that led to the JPA’s cancellation in February 2019.

Nationwide, data is emerging as a new economy with little discussion or regulation. So we have created the Data for Public Good campaign. We want to engage parents and students to work with data creators and funders to build a better system from design to delivery. We want stakeholders to be equipped as an oversight body, ensuring that data gathering adheres to ethical principles.

Data and surveillance can do good things. But we need far more safeguards, and a deeper understanding of the bias inherent in how these data systems are developed.

Next steps for our network include talking to the Minneapolis City Council about this concept, which they have been exploring, and connecting with philanthropists who are supporting data collection systems. We are partnering with national groups involved in this work to develop principles.

Action = Change

  • Join the No Data Without Us movement, which includes leadership fellowships to help improve policy protections around data in Minnesota. TCIAMN.org
  • Engage at the local level, with school boards and city councils, to find out how data is being used.
  • Learn about bias in facial recognition software and the movements to ban it in Minnesota.
  • Visit the race exhibit at the Science Museum in St. Paul, which has been updated with a focus on criminal justice and bias.

Marika Pfefferkorn (she/her) has a background in education, and is director of the Twin Cities Innovation Alliance (TCIA), which focuses on the role of data in education, governance, infrastructure, and transportation.