Child Welfare Agencies Use Algorithms to Take Kids From Their Parents…Screw Humans…Technology Is More Accurate


An algorithm is defined as a procedure for solving a mathematical problem (such as finding the greatest common divisor) in a finite number of steps that frequently involve the repetition of an operation. Search engines often use them to determine a person’s preferences for advertising purposes.
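To make that concrete, here is a minimal sketch of the textbook example the definition itself cites: Euclid’s algorithm for the greatest common divisor, written in Python.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: one operation, repeated until it terminates."""
    while b != 0:
        # Replace (a, b) with (b, a mod b); the remainder shrinks each pass,
        # so the loop finishes in a finite number of steps.
        a, b = b, a % b
    return a

print(gcd(48, 18))  # -> 6
```

A finite procedure, one repeated operation, a guaranteed answer; that is all “algorithm” means before anyone attaches it to an ad feed or a child welfare case.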

We’ve all looked at a product online only to be offered a variety of similar products in our social media feeds. It happens instantly. Companies and agencies can also use algorithms internally when searching for employees or customers with similar characteristics. In short, they’re a perfectly legal invasion of privacy, and they don’t always get things right.

Attorney Robin Frank of Pittsburgh represents parents who are in jeopardy of losing their children to the state. She sees them at the lowest point in their lives. It’s a tough job, but she has always known how child protective services would present its case in court and has been able to prepare accordingly.

Now, she’s fighting the invisible. Frank has no idea what information the state has gathered via the algorithms aimed at the parents in question, or what she might be blindsided by.

“A lot of people don’t know that it’s even being used,” she said. “Families should have the right to have all of the information in their file.” The algorithms are being used to scan a family’s online activity without their knowledge and determine whether social workers need to launch an investigation. Humans no longer get to make that choice.

Child welfare agencies throughout the U.S. are using, or at least thinking about implementing, algorithms in their systems despite ongoing concerns. Reliability is the main one. The tools have been accused of selectively singling out low-income families, which in turn has led to racial disparity.

Agencies in Illinois have already ditched algorithms once it became apparent what the statistics were being calculated from. According to the data, only lower-income and primarily Black families living in specific areas of specific towns and cities neglected their children. Everyone else was cool. Why? They were the only ones being targeted. The algorithms were income-biased, and this is where that bias led.

Frank said that the algorithms being used in Allegheny County, where she practices, are singling out a highly disproportionate number of Black families for child neglect. A county social worker admitted that at least one-third of what the algorithms report is faulty information based on the wrong criteria.

Erin Dalton, director of Allegheny County’s Department of Human Services, pioneered the county’s adoption of algorithms and says they work just fine. “Workers, whoever they are, shouldn’t be asked to make, in a given year, 14, 15, 16,000 of these kinds of decisions with incredibly imperfect information,” she said.

The calculations are based on factors ranging from inadequate housing and incomes below certain levels to poor hygiene practices. Dalton believes the tools help more children than they hurt, allowing the county to rescue a greater number of them from despair and desperation.
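For readers wondering what a calculation like that even looks like, here is a minimal, purely illustrative sketch of a weighted risk score. Every factor name, weight, and threshold below is a hypothetical assumption for the sake of illustration; none of it reflects Allegheny County’s actual model.

```python
# Illustrative sketch of a weighted risk score. All factor names, weights,
# and the threshold are hypothetical assumptions, not any county's real model.
FACTOR_WEIGHTS = {
    "inadequate_housing": 0.30,
    "income_below_threshold": 0.40,
    "poor_hygiene_reports": 0.30,
}

SCREEN_IN_THRESHOLD = 0.5  # hypothetical cutoff for flagging a case for review

def risk_score(family: dict) -> float:
    """Sum the weights of whichever factors are present for this family."""
    return sum(w for factor, w in FACTOR_WEIGHTS.items() if family.get(factor))

family = {"inadequate_housing": True, "income_below_threshold": True}
score = risk_score(family)
print(f"score={score:.2f}, flagged={score >= SCREEN_IN_THRESHOLD}")
# score=0.70, flagged=True -- the score never observes neglect itself,
# only proxies that track income.
```

Notice that a score like this never sees neglect directly, only proxies correlated with poverty, which is precisely how critics say the bias creeps in.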

Who’s right and who’s wrong? Are child welfare workers investigating the wrong families? Are the algorithms not flagging wealthier families because they never neglect their children anyway? Or are they identifying more kids in need of help who wouldn’t have been rescued without them?

You’re the judge on this one. What do ya think?