
The Scarlet Number: Is Pittsburgh’s Ethically Risky System of Big Data for Foster Care Coming to California?

Written by Richard Wexler

One massive leak of middle-class Americans’ data seems to have the whole world in an uproar.

A firm known as Cambridge Analytica allegedly improperly obtained personal information given to Facebook by 87 million people. Then, according to The New York Times, “The firm, which was tied to President Trump’s 2016 campaign, used the data to target messages to voters.”

Americans gave this information to Facebook voluntarily. Now that Facebook apparently failed to keep it secure, some Americans are exercising their right to delete their Facebook accounts. Others are exercising their right to share less information on Facebook.

But if this one leak causes so much outrage – outrage that is entirely justified – can you imagine what would happen if, say, people were forced to surrender vast amounts of personal data and a big government agency could use those data to investigate them and even take away their children?

Why, if that ever happened, the outrage would be – well, actually, it would be almost nonexistent.

Because it’s already happening in Allegheny County (Pittsburgh), Pa. And California is already exploring a similar system with the intention of offering it to the state’s 58 counties.

Instead of outrage, the forced surrender of data and its use as a tool to investigate alleged child maltreatment in Pittsburgh is being celebrated – in the local newspaper and in Reason magazine, which is supposedly dedicated to exposing the harm of big government. Even The New York Times, which has reported extensively on the Cambridge Analytica story, sounds enthusiastic about the way Pittsburgh uses predictive analytics in child welfare.

None of this is really surprising. The cause of this double standard is the same as the cause of every other double standard in child welfare: behavior that could get a poor parent reported to child protective services becomes, for affluent parents, an interesting but harmless anecdote to tell friends. What Allegheny County is doing targets only poor people.

Marc Cherna, Director of the Allegheny County Department of Human Services

The celebration of what’s happening in Pittsburgh comes despite the fact that similar experiments failed spectacularly in Illinois and in Los Angeles.

The Los Angeles model never got past the testing stage. Among the problems: Early tests found a 95 percent rate of “false positives.” That is, 95 percent of the time, when the model predicted that something terrible would happen to a child – it didn’t.
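
To make that arithmetic concrete, here is a back-of-the-envelope sketch in Python. The counts are hypothetical, chosen only to show what a 95 percent false-positive rate means in practice; they are not the actual figures from the Los Angeles test.

```python
# Hypothetical counts, chosen only to illustrate what a 95 percent
# false-positive rate means; these are not the Los Angeles test's figures.
flagged = 1000                    # children the model flagged as high risk
correct_predictions = 50          # flagged cases where the predicted harm occurred
false_positives = flagged - correct_predictions

false_positive_rate = false_positives / flagged
print(f"{false_positive_rate:.0%} of flagged cases were wrongly flagged")  # 95%

# Put another way: for every child correctly flagged, 19 families faced
# the consequences of a prediction that never came true.
print(false_positives / correct_predictions)  # 19.0
```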

But instead of taking the hint, the California Department of Social Services has contracted with a team at the University of Southern California to develop a new model. And the leader of the team, Prof. Emily Putnam-Hornstein, is one of the designers of the model now in use in Pittsburgh.

So California, in particular, should be looking behind the hype surrounding what’s happening in Pittsburgh.


Pittsburgh & the dicey art of prediction

In Pittsburgh, poor families are targeted by something called the Allegheny Family Screening Tool (AFST), a semi-secret “predictive analytics” algorithm that processes vast amounts of data about a family and boils it down to a “risk score” – a single number between 1 and 20 that designates the extent to which a particular child is supposedly at risk: the higher the number, the greater the risk to the child. The score is kept secret, like an invisible “scarlet number” etched on a child’s forehead.
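
For readers unfamiliar with how such tools work, here is a minimal sketch of the general approach: a statistical model converts a family’s administrative records into a probability, which is then binned into a score from 1 to 20. The variables, weights, and formula below are invented for illustration; they are not the county’s actual model, whose weights have not been made public.

```python
import math

# A minimal, hypothetical sketch of how a screening score like AFST's is
# typically produced: a model maps a family's administrative records to a
# probability, which is then binned into a 1-20 "risk score." The variables
# and weights below are invented for illustration only.

def risk_score(features: dict, weights: dict) -> int:
    """Combine weighted features into a probability, then bin it into 1-20."""
    z = sum(weight * features.get(name, 0.0) for name, weight in weights.items())
    probability = 1.0 / (1.0 + math.exp(-z))              # logistic link
    return min(20, max(1, math.ceil(probability * 20)))   # ventile bins

# A hypothetical family record drawn from administrative data
family = {"prior_reports": 2, "years_on_public_benefits": 4, "prior_jail_stays": 0}
weights = {"prior_reports": 0.8, "years_on_public_benefits": 0.3, "prior_jail_stays": 0.5}

print(risk_score(family, weights))  # 19 on the 1-20 scale, under these made-up weights
```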

Right now, a child is branded with this scarlet number if someone alleges the child is abused or neglected. The county uses the number to decide which cases need to be investigated. But the county is considering something more Orwellian: stamping that scarlet number on every child in the county at birth. Such a number could haunt not only the child, but his or her children and grandchildren.

Predictive analytics was brought to the county by someone I admire and respect, Marc Cherna, director of Allegheny’s Department of Human Services (DHS). Unlike most people who run child welfare systems, Cherna has an excellent record of reducing needless foster care. But this time, he’s on the wrong track.

Virginia Eubanks, author of “Automating Inequality”

Rather than eliminating the racial and class biases that chronically permeate child welfare, AFST simply digitizes those biases, a process described in detail in a devastating critique by Prof. Virginia Eubanks in her important new book, Automating Inequality, released in January of this year, and excerpted in Wired magazine.

The problems with AFST are legion:

● AFST doesn’t actually predict child abuse. Rather, claims of accuracy are based on its ability to predict whether, once reported as allegedly abusing or neglecting a child, a family will be reported again or whether the child is likely to be removed from her or his parents. But reporting itself is a highly subjective process. DHS itself acknowledges that decisions to call in a report alleging child abuse are rife with racial and class bias.

● Most reports don’t fit the stereotype of parents brutally beating and torturing children. Far more common are cases in which poverty itself is confused with “neglect.” So if a family is reported because of poverty and, a year later, the family is still poor, there’s a good chance the poverty will be confused with neglect again and the family will be reported again.

● The designers of AFST consider the existence of previous reports among the strongest supposed predictors of future danger. But the vast majority of reports are false. Nationwide, more than 80 percent of reports of neglect or abuse are false. Moreover, the one study that attempted to second-guess caseworker decisions found that the workers were two to six times more likely to wrongly decide a case was true than to wrongly label a case false – so the 80 percent figure is an underestimate. But under AFST even false reports count against a family.

● Allegheny County repeatedly has bragged that its algorithm is transparent. Unlike the first Los Angeles experiment, which involved testing an entirely secret algorithm developed by a private company, Allegheny County makes public what goes into AFST. But there is less to the county’s much-praised “transparency” than meets the eye. Yes, the various factors that go into the score are public, but not the weight given to each element. Imagine trying to bake a cake with a list of ingredients, but no idea how much of each to use.
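
A toy illustration of that last point, the ingredients-without-amounts problem: the same publicly listed factors, combined with two different sets of weights (both invented here), give the same family very different scores. Without the weights, no outsider can reproduce or audit the number.

```python
# Two hypothetical weightings of the same publicly listed factors. Both use
# identical inputs for the same family, yet they rank its risk very
# differently; publishing the factor list without the weights reveals little
# about how the score is actually reached.
family = {"prior_reports": 1, "receives_public_benefits": 1, "prior_juvenile_contact": 0}

weights_a = {"prior_reports": 2.0, "receives_public_benefits": 0.2, "prior_juvenile_contact": 1.0}
weights_b = {"prior_reports": 0.2, "receives_public_benefits": 0.5, "prior_juvenile_contact": 2.0}

def raw_score(weights):
    return sum(weights[name] * value for name, value in family.items())

print(raw_score(weights_a), raw_score(weights_b))  # 2.2 vs. 0.7
```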


“Poverty profiling”

The various data points that the algorithm uses to create a risk score amount to what Eubanks calls “poverty profiling.” By the county’s own admission, in many cases (probably a majority), the mere fact that a family has sought help through public benefits raises its risk score.

As Eubanks explains: “Because the model confuses parenting while poor with poor parenting, the AFST views parents who reach out to public programs as risks to their children.” And, of course, because these are public benefits – programs such as SNAP (formerly known as food stamps), TANF assistance for needy families, and Medicaid – the data are collected automatically by the county. Unlike when we “like” something on Facebook, the poor have no choice.

Eubanks reports that 25 percent of the variables in AFST are direct measures of poverty. Another 25 percent measure interaction with the child welfare system itself, along with any brushes with the juvenile justice system.

This turns AFST into less prediction than self-fulfilling prophecy: Both poverty and seeking help to cope with poverty raise the AFST score, so more “helping” professionals – who also are mandated reporters of alleged child abuse – are in the family’s life. Both the score and the professionals put the family under a microscope that rarely focuses on more affluent families. If the microscope turns up something that normally would be seen as innocent, it may now suddenly be suspect—simply because the numbers say it is. And so the snowball gathers size and speed as it travels downhill, and the family is reported again, “proving” the algorithm was right.
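
That snowball dynamic can be made concrete with a toy simulation. Every number in it is invented for illustration: assume each prior report raises the score, a higher score puts more mandated reporters in the family’s life, and more scrutiny raises the chance of yet another report, independent of anything the family actually does.

```python
import random

# Toy simulation of the feedback loop described above. All numbers are
# invented for illustration: each prior report raises the score, a higher
# score means more scrutiny, and more scrutiny makes a new report more
# likely regardless of the family's actual behavior.
random.seed(1)

prior_reports = 1                                   # the family was reported once
for year in range(1, 11):
    score = min(20, 2 + 3 * prior_reports)          # hypothetical scoring rule
    p_new_report = min(0.9, 0.05 * score)           # more scrutiny, more reports
    if random.random() < p_new_report:
        prior_reports += 1
    print(f"year {year}: score {score}, cumulative reports {prior_reports}")

# The score ratchets upward because earlier reports generate later ones,
# "confirming" the original prediction.
```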

Another collateral effect: once it becomes known in the community that reaching out for help increases the odds that child protective services will be in your life, parents may become less likely to seek help – and the pressures that can lead to actual child abuse could worsen.

The algorithm visits the sins of the parents, real or imagined, upon the children. Eubanks cites a family that was victimized by repeated false reports. When the child the county supposedly was “protecting” grows up and becomes a parent herself, if she becomes the target of a child abuse report she will bear a higher scarlet number – because her own parents were tagged as possibly abusive or neglectful, even though those reports did not pan out. This means that, completely apart from any event, condition or action within the family, her children will be at greater risk of enduring the enormous harm of needless foster care.


About that ethics review…

Stories praising AFST repeatedly cite a so-called “ethics review” commissioned by Allegheny County that gave AFST a seal of approval. But one of the two academics chosen by Allegheny County for the review is not only a faculty colleague of another AFST designer, Prof. Rhema Vaithianathan, but has actually co-authored papers with her. The nine-page review itself is startlingly superficial; the only references cited are academic papers authored either by the designers of AFST or by the authors of the review itself.

Even with this odd choice of ethics reviewers, the review based its endorsement in part on the fact that AFST would not target children at birth, but only after a child abuse report had been made.

But that may not last. Eubanks reports that the county is, at a minimum, considering introducing “a second predictive model … [that] would be run on a daily or weekly basis on all babies born in Allegheny County the prior day or week,” according to a September 2017 email from Cherna’s deputy, Erin Dalton.

Indeed, this “at birth” model is one of the models the designers of AFST proposed to the county in the first place, although Allegheny County initially rejected it as “too much too fast,” according to a PBS story on the matter.

It also was proposed in New Zealand. But New Zealand roundly rejected the Minority Report-esque at-birth model. “Not on my watch, these are children not lab rats,” New Zealand’s then-Social Development Minister Anne Tolley reportedly wrote in handwritten notes, according to The Press newspaper in July 2015.

(As of December 2017, New Zealand is still studying the ethical risks of the matter.)

So if Allegheny County does what Erin Dalton says is under consideration and starts branding every infant with a scarlet number at birth, a number that will even affect the scores assigned to that infant’s children and grandchildren, is that model inherently unethical?

Cherna promises that the scarlet numbers under any such system will be used only to find the families most in need of help. He also promises that the current AFST will only be used as it is now, to decide which families to investigate, not whether to take away children. But this is a distinction without a difference. Though the investigators don’t know the exact AFST score, they know that AFST told them to investigate in the first place.


If AFST comes to California

I believe Cherna. His track record shows he will be far more careful than most not to abuse his algorithm. But human history suggests it is unwise to depend for success – and safety – on the benevolence of a single leader.

That brings us back to California. According to the Daily News, the new predictive risk modeling tool from the state’s Department of Social Services (CDSS) may be ready to go as soon as this summer.

So what happens when, a year or two later, there’s a high-profile child abuse fatality in some California county and, after a week of horrific local news stories, those in positions of power claim that the pendulum has swung too far toward keeping families together? Will the head of that county’s child welfare system decree that a high score on the California version of AFST should now lead to automatic removal? Will this, in turn, flood the system with children who don’t need to be there, doing those children enormous harm and, inevitably, so overloading the system as a whole that there is less time to do the hard work of finding children in real danger?

I would say the answer to all of those questions is yes. Because that’s so easy to predict even a human being can do it.


Richard Wexler is executive director of the National Coalition for Child Protection Reform, www.nccpr.org. NCCPR’s full discussion of predictive analytics in child welfare is available here.

An earlier version of this column originally appeared in Youth Today.
