
Tuesday, January 13, 2015

Data Collection of Personal Information of Children OK, But Not for Adults

California has jumped into the social policy experimentation game by proposing the construction of an algorithm to miraculously predict future human behavior.

[Image caption: "I see CPS in your future"]

Of course, the algorithm will not be mutational (that is, one that takes into account life experiences and unexpected situations), but that does not matter.

Child welfare is forging ahead with data collection of personal information on children.

Despite federal initiatives to protect the privacy of personal information, child welfare is exempt from the restrictions on databases of personal information.

Collecting and storing personal information on the behavior of U.S. citizens is nothing new.

It was done during the McCarthy era and has been seen in NSA data collection of phone records.

But unlike the aforementioned activities, which were heavily scrutinized and terminated, child welfare will now be collecting "future" data based on what is, more than likely, a poorly constructed algorithm of mystical surveillance.

No attorney will be able to challenge the speculated, predicted results in a court of law, because in child welfare cases one is determined guilty until proven innocent. The bar for due process challenges has been set very, very high, as the system will probably be super secret.

Why they cannot construct an algorithm to detect Medicaid fraud, I will never know, because I did.

Can an algorithm predict child abuse? LA county child welfare officials are trying to find out

The Los Angeles County Department of Children and Family Services has long been under fire for its handling of child abuse cases. At times it was criticized for leaving children in unsafe homes; at other times for tearing too many families apart.

To help with the tricky decision of determining when a child is at risk, officials at DCFS have been quietly testing an algorithm they said could one day soon rank children according to their risk of abuse, and give social workers more access to information that would help them make smarter decisions.

“I think one of the biggest challenges is not having enough time - having limited time to look into all the safety needs of every kid and every family,” said Amy Reitsma, a DCFS social worker who spent more than five years investigating abuse allegations before being promoted to work on more specialized cases.

She and others at DCFS said social workers at times struggle to complete investigations into abuse and neglect within the 30 days required by state law. In part, that is because of large caseloads, which can top 30 children per social worker in any given month.

Another barrier is uncooperative families - and just plain complicated cases.

DCFS officials said when some parents realize they are being investigated, they do not answer the door or telephone, and will not tell social workers about any criminal, substance abuse, or mental health issues in the family. This forces social workers into a long, murky process of piecing together a family’s history, delaying them from the important work of helping the family overcome the chronic issues that lead to abuse.

Francesca LeRue, a division chief at DCFS, said a new project called AURA would jump start the investigative process. AURA stands for Approach to Understanding Risk Assessment.

Along with the algorithm, AURA also calls for providing social workers access to some county records they don't automatically see now - such as electronic documents showing a family's past interactions with the county's departments of mental health, public health and probation. That, she said, would give social workers a fuller picture of a family's life instantly, on their computer screens the moment they receive a new investigation.

“L.A. is being very innovative in thinking about whether we can make better decisions around safety, better decisions around services, by making use of data that we have not historically used in a certain way,” said Dr. Emily Putnam-Hornstein, a researcher who heads up the Children’s Data Network at USC’s School of Social Work.

Under Putnam-Hornstein's leadership, the Children's Data Network has published groundbreaking research on rates of child abuse in California and risk factors among young children.

She said data risk-modeling is a relatively new concept in the public sector, but many public agencies around the country are experimenting with it.

“Our child protection system is a resource-constrained body. We don’t have enough service slots for everyone, and so – this provides a tool for triaging families into the services that will be most effective, and for those families that most need them,” she said. “We’re still figuring out what is possible with the models, and what are the ethics of doing this kind of work.”

She pointed to child protection officials in Florida and Pennsylvania, who are also using data mining to discover ways to identify risk factors more quickly. Beyond the realm of child protection, other public agencies are testing data models to improve outcomes. Los Angeles county's Social Services Department uses analytics to identify welfare fraud in the CalWORKs program. On a national scale, the U.S. Army is testing a methodology to reduce suicide rates among vets and service members.

To develop the AURA algorithm, DCFS hired statisticians at SAS, a private contractor that provides data risk modeling to the public and private sector in the U.S. and abroad. In creating the methodology, SAS used key indicators that have long been associated with higher rates of child abuse.

For example, a wide body of academic research suggests that children under the age of 5 are at high risk for abuse, with babies at the highest risk. Boys are abused in higher rates than girls. Children with parents who struggle with mental illness or substance abuse are also considered to be at greater risk of abuse. Children with young parents, in their teens and early 20s, are also abused in higher rates.

All of these risk factors are part of SAS's methodology, which L.A. county officials said was proprietary. Details of the algorithm are known only to SAS personnel.
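Since the model itself is proprietary, the sketch below is purely illustrative: it scores a case against the published risk factors named above, but the field names, weights, and point values are assumptions for illustration, not SAS's.

```python
# Illustrative only: SAS's actual AURA model is proprietary, so the
# features, weights, and point values below are hypothetical.
from dataclasses import dataclass

@dataclass
class CaseRecord:
    child_age: int               # child's age in years
    child_is_male: bool
    parent_age: int              # age of the youngest parent, in years
    parent_mental_illness: bool
    parent_substance_abuse: bool

def risk_score(case: CaseRecord) -> int:
    """Return a hypothetical risk score on the reported 0-1000 scale."""
    score = 0
    # Children under 5 are at higher risk, with babies at the highest risk.
    if case.child_age < 1:
        score += 350
    elif case.child_age < 5:
        score += 250
    # Boys are abused at higher rates than girls.
    if case.child_is_male:
        score += 100
    # Parental mental illness or substance abuse raises risk.
    if case.parent_mental_illness:
        score += 200
    if case.parent_substance_abuse:
        score += 200
    # Teen parents and parents in their early 20s are also a risk factor.
    if case.parent_age < 23:
        score += 150
    return min(score, 1000)
```

A real model would estimate weights from historical data rather than hard-code them; the point here is only to show how the published risk factors could combine into a single 0-1000 score.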

To test the algorithm, DCFS gave SAS 2013 county records for families who were investigated for serious abuse, along with demographic information, such as the ages of the family members. These records were "de-identified," meaning names were removed and replaced with numbers.

The algorithm gave each child a score ranging from zero to 1,000, reflecting each child's risk of abuse. Children who score 800 or above would be considered at highest risk of serious abuse, giving social workers a clear indicator to act quickly.
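The article describes two mechanical steps worth making concrete: replacing names with numbers, and flagging scores at or above 800. The sketch below assumes a simple record layout (a "family_name" field and a per-child "score"), which is an invention for illustration, not DCFS's actual data format.

```python
from itertools import count

def deidentify(records):
    """Replace each family's name with a sequential numeric ID."""
    ids = {}
    counter = count(1)
    anonymized = []
    for rec in records:
        name = rec["family_name"]
        if name not in ids:
            ids[name] = next(counter)
        # Copy the record without the name, adding the numeric ID instead.
        anon = {k: v for k, v in rec.items() if k != "family_name"}
        anon["family_id"] = ids[name]
        anonymized.append(anon)
    return anonymized

HIGH_RISK_THRESHOLD = 800  # per the article, 800+ marks the highest-risk children

def flag_high_risk(scored_children):
    """Return the children whose scores meet or exceed the 800 threshold."""
    return [c for c in scored_children if c["score"] >= HIGH_RISK_THRESHOLD]
```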

LeRue said SAS's algorithm accurately identified children in the 800 or higher range - the most serious cases of abuse - 76 percent of the time.
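The article does not say exactly how that 76 percent was computed. One plausible reading is recall: of the children in the 2013 records with confirmed serious abuse, 76 percent received scores of 800 or above. A sketch of that calculation, under that assumption:

```python
def recall_at_threshold(scores, confirmed_serious, threshold=800):
    """Fraction of confirmed serious-abuse cases scoring at or above threshold.

    `scores` maps child IDs to model scores; `confirmed_serious` is the set
    of child IDs with confirmed serious abuse. This is one plausible reading
    of the reported 76% figure, not the article's stated definition.
    """
    if not confirmed_serious:
        return 0.0
    hits = sum(1 for child in confirmed_serious
               if scores.get(child, 0) >= threshold)
    return hits / len(confirmed_serious)
```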
Next, SAS will test the algorithm on DCFS's 2014 child abuse data, and DCFS will validate SAS's results. If they check out, AURA will move into a pilot phase in Los Angeles County, which could launch within six months.
LeRue acknowledged the ethical concerns - and some legal questions - surrounding whether DCFS has the authority to give social workers access to some county records, like a parent's mental health or hospitalization history.
The county's lawyers are trying to figure out what's possible, she said.  
If the legal challenges become too great, LeRue said DCFS may pare down the AURA system, giving social workers access to AURA scores only, but not providing social workers with the records that inform those scores.
LeRue said the score and the records are just two pieces of the puzzle, stressing that social workers must still do the very important work of communicating with families to fill in the rest of the puzzle and give them the help they need.

Voting is beautiful, be beautiful ~ vote.©
