Author: Oliver Conway
Created: 2024-10-11
Last updated: 2024-10-15
Reflections on family law and more … This column was written by a human.
Artificial intelligence may have its place, but for social justice lawyers, words are our superpower. Let’s not hand that over to machines, says Oliver Conway.
Another day and another flurry of articles in my inbox about artificial intelligence ‘disrupting and revolutionising’ the legal sector:1
We will never have to review documents again because AI can find patterns in words and numbers in seconds!
Letters will be completed in moments using simple client-specific details and clever prompts!
No more receptionists, secretaries, cashiers or office managers. All admin will be carried out by AI in the cloud!
Paralegals will be replaced by AI; solicitors, barristers and judges will soon be replaced as well!
I have no doubt that many will see the adoption of AI as revolutionary, but the legal sector has always used prompts. Most solicitors’ practices rely on precedents. What is AI but a clever digital set of precedents?
For me, there is something of the emperor’s new clothes about AI. We have the ability with this technology to reduce the number of banal administrative tasks so that we can concentrate on our very human relationships with clients, but instead we are applying it to communicate with our clients, to give them information and advice. We are undermining the very relationships our work relies on.
The relationship between lawyer and client is one of sacred trust: when you are instructed, you take on a responsibility for your client’s life in a fundamental way. To outsource this relationship to AI is not only irresponsible, it negates why, for centuries, humans have sought out lawyers: not just to give them answers and guide them along a path, but in order to have another human looking out for their interests.
Can AI ever explain to a mother whose learning disability and culture prevent her from understanding what ‘adoption’ means why she will never see her son again until he is an adult? Can AI explain why the judge has decided that my client’s son is no longer hers in a way that does not lead her to harm herself or others around her?
Maybe AI will be better at communicating the anguish. Maybe it will not be choking back tears after the fourth phone call with the mother in as many days. Maybe, on hearing the client’s hurt confusion, AI won’t feel the need to apologise on behalf of a justice system that made the permanent removal of her son a statistical inevitability.
Will AI check in with my client and her family in the weeks, months and years after the case, to ensure she knows where to send the Christmas and birthday cards? Cards that are her last vestige of contact with a child she carried to full term, nearly died giving birth to, and fought tirelessly for 18 months to care for?
On some days, when everything feels rather hopeless, I welcome AI.
Maybe it is better suited to navigating systems, which is what the job of being a legal aid lawyer often seems to be: a life coach guiding impoverished, broken people through the impossible systems of late-stage capitalism.
In Denmark, a recent pilot of AI in child protection practice showed huge bias in the ‘decision support’ algorithm presented to social workers. Most worryingly, the real families included in the ‘dataset’ weren’t informed about it or given the choice to opt out.2
Meanwhile, UK local authorities are pouring data and money into analytics companies to predict trends in future interventions needed by residents.3 As a human, I predict that within a decade, social services’ legal planning meetings will rely heavily on an algorithm to decide whether to issue proceedings. This may already be happening.
Gatekeeping at the family court will likely be handed over to machine learning. I also predict that ‘judgment software’ will mean a judge just needs to state the case category and allow AI to consider the case transcript, and an unappealable judgment will be sent to a party’s inbox after the hearing.
I am already seeing letters and legal documents drafted or redrafted by AI in that unusual tenor of LinkedIn posts. As lawyers, writing is our superpower. We use words like a surgeon uses a scalpel. Our words have a magic to them that we should not be so willing to hand over to a machine. Some lawyers are trained to be emotionless, logical and free of bias. I have always seen my emotion as a gift, and my commitment to social justice means I am never neutral.
Perhaps AI will replace the machine-like lawyers and us all-too-human ones will be left writing the prompts.
1  See, for example, Bob Ambrogi, ‘AI adoption by legal professionals jumps from 19% to 79% in one year, Clio study finds’, LawSites, 7 October 2024.
2  Therese Moreau Hansen, Roberta Sinatra and Vedran Sekara, ‘Failing our youngest: on the biases, pitfalls, and risks in a decision support algorithm used for child protection’, FAccT ’24: Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, 5 June 2024, page 290.
3  Robert Booth, ‘Social workers in England begin using AI system to assist their work’, Guardian, 28 September 2024.