“I Am a Victim of a Harmful Algorithm”

Photo by Sarah Whiting

Last October, I got a postcard in the mail saying I may be affected by a settlement between the health insurance company UnitedHealth Group and the U.S. Department of Labor and the New York State Attorney General. The federal government and New York State allege that UnitedHealth Group, based in Minnetonka, violated New York state law and the Mental Health Parity and Addiction Equity Act, popularly known as the Wellstone Act after the late Minnesota Senator Paul Wellstone.

UnitedHealth Group will be paying $13.6 million to affected participants and beneficiaries — which apparently include me. How did UnitedHealth Group, the second-largest health-care company in the world by revenue, allegedly violate the Wellstone Act? The answer from the postcard:

“[The federal government and New York State] allege that United … improperly applied a medical management program for outpatient behavioral health services called Algorithms for Effective Reporting and Treatment (referred to as ‘ALERT’), which resulted in denials of coverage for services. Plaintiffs allege that ALERT violated the requirements of mental health parity laws.”

And why am I affected by this settlement?

“… Because United has identified you as a person covered by a health plan where both medical/surgical and behavioral health/substance use disorder benefits were administered by United … and who had one or more benefit claims that was denied as a result of United’s ALERT Policy.”

I received behavioral health services when I was on MinnesotaCare. I remember calling United Behavioral Health to get approved for these services, giving an anonymous person on the phone personal details about my behavioral health conditions and why I needed the services I wanted the insurer to cover.

According to the Mental Health Association in New York State, “United [Behavioral Health’s ALERT algorithm] employed arbitrary thresholds to trigger utilization review of psychotherapy, which often led to denials of coverage when providers could not justify continued treatment after 20 sessions … These denials violated parity laws because United subjected all outpatient behavioral health psychotherapy to outlier management, but it employed this treatment limitation only to a handful of medical/surgical services.”

In other words, the ALERT algorithm was used to limit or deny behavioral health services, but it was rarely applied to physical health services.

Along with the payout to those impacted, United’s settlement requires it to discontinue the use of ALERT.

When I was covered by United, my behavioral health-care service provider had to continuously submit information about my conditions and treatment. I had to contact the insurer many times and request approval for ongoing services.

It was a headache for my service provider, and it made me feel like a child begging for ongoing treatment. This was bad for my behavioral health, ironically. Why couldn’t I just get the services I needed, as long as I needed them, without going through this automated, bureaucratic process?

Behavioral health issues, also known as mental health or psychological health, carry with them a burden of stigma, bullying, and discrimination. I have suffered from all of those traumas for both having behavioral health issues and using behavioral health services. However, this time I was the victim of behavioral health-based discrimination and didn’t even know it.

That is the horror that comes with discrimination via algorithms, the automated computations created by tech companies and sold to corporations who use them to maximize their profits, resulting in the minimizing of human beings.

Algorithms Are Everywhere

The documentary “Coded Bias,” which depicts the systemic biases encoded in facial recognition software and the algorithms that drive them, featured a Houston teacher who — one week after he was named teacher of the month at his school — was labeled as ineffective by an algorithm his school district paid a private company to use. The teacher and his union were able to fight the school district and stop them from using the algorithm.

Many people still do not know what algorithms are, or whether algorithms are being used to deny them coverage for health services of any kind, or deny them credit, or deny them a job. However, algorithms are being created with no accountability to ethics or the law before they get used on, or against, flesh-and-blood people.

Activist Cathy O’Neil is the author of “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.” In a video panel discussion regarding “Coded Bias,” she says: “Just as drugs have to be [proven as] safe and effective before the FDA approves them, I claim that algorithms should have to prove that they are safe and effective.”

With no one to ensure that algorithms are safe, effective, ethical, and legal, more people are going to find out long after the fact that they have been victimized by corporations that use technology to perpetuate bias.

Stephani Maari Booker (she/her) is the author of “Judie Junkie Blues,” a science fiction story serialized in the Minnesota Women’s Press. For more information about her work, go to athenapersephoni.com.