“No vision” on use of AI by family justice system



There is currently “not a vision” for how people in the family justice system, both families and professionals, can “safely harness” the benefits of artificial intelligence (AI), the Nuffield Family Justice Observatory has found.

Researcher Aliya Saied-Tessier also warned that people “with insufficient means” to pay for quality advice from humans may turn to lower-quality software tools, some of which use AI to predict financial settlements.

In a briefing, she went on: “We know that AI technologies are available, accessible and already used by many. However, we do not know precisely how AI is being used by users of the family justice system, nor do we know the efficacy of the tools for dealing with requests made about family justice.

“At the moment there is not a vision for how those in the family justice system, both families and professionals, can safely harness the benefits of AI while being protected from the risks.”

Ms Saied-Tessier said research had shown that people were concerned about risks posed by AI, with the majority welcoming regulation of AI to mitigate them.

She warned that “people with insufficient means to fund quality human alternatives may turn to lower quality and/or unregulated software tools in their place”.

One example was people using unregulated AI tools that predicted likely financial settlements following family breakdown as an affordable alternative to legal advice.

“Another example is potentially unreliable AI translation in place of a human interpreter. This tilts the level playing field away from those with fewer financial resources.”

However, Ms Saied-Tessier said there were “numerous potential benefits” of using AI tools in the family justice system.

They could be used for document review, to “scan and classify large volumes of documents for relevance to a case”, reducing the time and cost associated with manual document review.

Case management systems could use AI tools to “classify and route cases to appropriate teams” and they could also be used for drafting.
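Both of these uses rest on the same underlying technique, text classification. The briefing does not name any particular tooling; purely as an illustration, a minimal sketch using the scikit-learn library might look like this, with all document text and labels invented for the example:

```python
# Illustrative sketch only: a relevance classifier of the kind the
# briefing describes. The documents and labels below are invented;
# in practice the model would learn from documents already labelled
# by legal professionals.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Application for a child arrangements order...",
    "Utility bill for the former family home...",
    "Witness statement about contact arrangements...",
    "Supermarket receipt...",
]
train_labels = [1, 0, 1, 0]  # 1 = relevant to the case, 0 = not relevant

# TF-IDF features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# Rank unseen documents by predicted probability of relevance, so
# reviewers can read the most likely relevant material first.
new_docs = ["Letter from the school about the children's welfare..."]
relevance = model.predict_proba(new_docs)[:, 1]
```

The same pipeline, trained on team names rather than relevance labels, could in principle classify and route cases as described above.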

Ms Saied-Tessier said some had been “explicitly trained to create language and could be prompted to use specific writing styles”, whether formal or child-friendly.

Lawyers could save time by asking large language models like ChatGPT to “redraft text to be understood by a child of a specific age”.
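Again as illustration only, such a redrafting request might be made programmatically; the model name and prompt below are assumptions, not anything specified in the briefing, and any output would still need checking by a professional:

```python
# Hedged sketch of redrafting text for a child of a specific age via
# an LLM API (here the OpenAI Python client). Model choice and prompt
# wording are assumptions for the example.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

original = "The court has made a child arrangements order specifying..."
response = client.chat.completions.create(
    model="gpt-4o",  # hypothetical model choice
    messages=[
        {"role": "system",
         "content": "Redraft the user's text so it can be understood "
                    "by a ten-year-old, keeping the meaning unchanged."},
        {"role": "user", "content": original},
    ],
)
print(response.choices[0].message.content)
```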

The researcher said some courts and children’s social care departments around the world used predictive analytics to assess the likelihood of various events, such as a child requiring social care or a young person being vulnerable to gang exploitation.

However, although there was anecdotal evidence of local authority staff finding predictive analytics useful, researchers who built machine learning models to identify children at risk, using local authority social care data, found that the models did not perform well.

More successful was a machine learning approach developed by academics for domestic abuse risk assessment, which performed better than the standard police protocol.
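The briefing does not describe the academics’ model or its variables; in generic terms, comparing a machine learning risk model against a fixed checklist-style protocol might look like the sketch below, where the data, features and baseline rule are all hypothetical:

```python
# Generic sketch: comparing a learned risk model against a fixed
# checklist baseline on held-out cases. All data here is synthetic
# and illustrative; it does not reproduce the study mentioned above.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 8))  # hypothetical case features
y = (X[:, 0] + rng.normal(0, 0.3, 500) > 0.7).astype(int)  # outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
model_auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

# Checklist-style baseline: flag high risk when a single feature
# crosses a threshold, mimicking a fixed protocol score.
baseline_auc = roc_auc_score(y_te, (X_te[:, 0] > 0.6).astype(int))

print(f"model AUC {model_auc:.2f} vs baseline AUC {baseline_auc:.2f}")
```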

She added: “Being transparent about when AI has been used is likely to be critical across a number of domains in order to secure public trust and allow for effective regulation, for example when drafting reports or using algorithms to support with decision-making.”

Researchers could only examine how cases varied when litigants in person used AI if the family justice system actually collected that information.

The briefing said: “Ensuring that the public and legal professionals have trust in AI systems used in the family justice system is paramount.”

Ms Saied-Tessier said she had used ChatGPT “for some initial ideas generation and to refine the drafting of some sections of this report”.



