UConn Researchers Developing Algorithm to Combat Bias in Identifying Child Maltreatment

The tool will help providers overcome barriers to better identify potential cases of child abuse

This work will help combat the impact of provider bias in diagnosing child maltreatment. (Pixabay)

Each year, more than 600,000 children are identified as victims of maltreatment. Experiencing violence early in life is linked to long-term negative physical and mental health outcomes, making this an urgent public health issue.

Amy A. Hunter, assistant professor of public health sciences at UConn School of Medicine, is developing a tool to help overcome barriers in identifying cases of child maltreatment.

Often, children interact with the medical system multiple times before their abuse is identified as such.

“We know from the literature that children encounter healthcare providers multiple times before their abuse is recognized. It is challenging to recognize abusive symptoms in this fast-paced environment when there are a lot of competing demands to provide screenings and anticipatory guidance,” Hunter says.

Another challenge is that only about 11% of children are seen in a pediatric emergency department, where providers are trained to identify potential abuse cases. And even when children are admitted to pediatric emergency departments, many symptoms of abuse, like vomiting or loss of consciousness, are non-specific, and a provider could easily attribute them to other conditions.

Children experiencing maltreatment are often very young and cannot talk yet. This means providers need to rely on information from caregivers who, in most cases, are the perpetrators of abuse.

These factors make it difficult for providers to correctly identify when a child may be suffering maltreatment and connect them with the help they need.

Studies have also found that providers’ implicit bias can impact how they identify maltreatment.

“Provider bias is the implicit or unconscious bias of providers which can affect their decision-making and become a barrier to care,” Hunter says. “In the context of child maltreatment, this bias can be attributed to the characteristics of a child or the characteristics of their family.”

For example, when assessing symptoms in a child, a provider may be more or less likely to suspect maltreatment based on the child’s age, gender, race/ethnicity, or the appearance of their caregivers. Hunter recently conducted a study which found that non-Hispanic white children were more often documented with suspected maltreatment, while explicit or confirmed maltreatment was documented more often in non-Hispanic black children.

In this project, Hunter will develop an algorithm to support providers in diagnosing possible signs of maltreatment. By developing a data-driven tool, Hunter hopes to mitigate the impact of bias in the diagnosis of maltreatment.

The project is supported by a $200,000 mentored research grant from the Robert E. Leet and Clara Guthrie Patterson Trust.

Hunter’s mentor on the grant is Rob Aseltine, professor and chair of behavioral science and community health at UConn Health. Hunter is also working with Dr. Nina Livingston, a pediatrician at Connecticut Children’s and division head for child abuse pediatrics; and Kun Chen, UConn associate professor of statistics.

“I hope that by working with a multidisciplinary team we can develop a tool that supports clinical decision making and improves long-term health outcomes in children,” Hunter says.

The team will use data from Connecticut Children’s and seven Hartford HealthCare locations that serve pediatric populations. They will examine data at the patient, provider, and visit levels to identify risk factors in maltreatment cases.

The algorithm will eventually become a tool providers can use in-office. Providers will be able to input information about the patient and their symptoms. If the algorithm recognizes that the information matches symptoms consistent with maltreatment, it will prompt the provider to follow up on this case as potential maltreatment.
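The in-office workflow described above — a provider enters patient information, and the tool prompts follow-up if the inputs match patterns consistent with maltreatment — might be sketched roughly as follows. This is purely illustrative: every symptom name, rule, and threshold below is an invented placeholder, not the criteria or model Hunter’s team is developing.

```python
# Illustrative sketch only. The real algorithm will be derived from
# patient-, provider-, and visit-level data; none of the symptoms,
# rules, or thresholds here reflect the team's actual work.

# Non-specific symptoms the article notes can accompany abuse,
# plus a hypothetical third entry for illustration.
SENTINEL_SYMPTOMS = {"vomiting", "loss of consciousness", "unexplained bruising"}

def should_prompt_followup(reported_symptoms, prior_visit_count):
    """Return True if the entered information matches a (placeholder)
    pattern consistent with possible maltreatment.

    reported_symptoms: set of lowercase symptom strings entered by the provider
    prior_visit_count: number of earlier healthcare encounters on record
    """
    matches = reported_symptoms & SENTINEL_SYMPTOMS
    # Placeholder rule: two or more sentinel symptoms, or one sentinel
    # symptom in a child with repeated prior visits, triggers a prompt.
    return len(matches) >= 2 or (len(matches) >= 1 and prior_visit_count >= 2)

if __name__ == "__main__":
    print(should_prompt_followup({"vomiting", "loss of consciousness"}, 0))  # True
    print(should_prompt_followup({"fever"}, 3))  # False
```

A deployed tool would replace this hand-written rule with the data-driven algorithm the team fits to Connecticut Children’s and Hartford HealthCare records, but the provider-facing behavior — enter information, receive a prompt when a pattern matches — would be the same.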

The research team will work with clinicians to gather feedback on their experience using the tool.

Hunter says she hopes they can re-tool the algorithm to help identify other forms of violence against children, like sexual abuse, in the future.

“There is no replacement for the expertise and skill our providers acquire during their years of training and experience,” Hunter says. “So, the hope is that this algorithm can support clinical decision-making, especially in those more challenging diagnostic circumstances. We’re hoping the tool can help prompt more timely investigation when there is the potential of abuse.”