Using artificial intelligence in court cases is “unconstitutional” say Malaysian lawyers
Malaysian lawyers say the use of an AI system in the country’s justice system is “unconstitutional” and claim that no one really understands how it works. That’s after courts in two Malaysian states launched a test program to use AI to assist judges in delivering sentences for convicted drug dealers and rapists.
Sarawak Information Systems, a state government company, developed the AI software. It was first introduced in courts in the states of Sabah and Sarawak on the island of Borneo. The pilot program was meant to test the effectiveness of artificial intelligence in recommending sentences, and was scheduled to conclude in April 2022.
In 2020, two drug-related convictions were handed down by Sabah’s court – the first time AI had been used in a Malaysian courtroom. However, Hadid Ismail – a lawyer with 20 years of experience who represented the defendants – took issue with the sentences, claiming that the system was being used before judges, lawyers and the public had a chance to fully understand how it worked.
“Our Criminal Procedure Code does not provide for use of AI in the courts… I think it’s unconstitutional,” Ismail told Reuters. “In sentencing, judges don’t just look at the facts of the case – they also consider mitigating factors, and use their discretion. But AI cannot use discretion,” he said, adding that the sentence the AI recommended for one of his clients for minor drug possession was too harsh – 12 months in jail for possession of 0.01 gram of methamphetamine.
Malaysia’s Bar Council, which represents lawyers, has also voiced its frustration with the AI pilot program. After courts in Kuala Lumpur, Malaysia’s capital, started testing the system in mid-2021 to suggest sentences for 20 types of crimes, the council said it was “not given guidelines at all, and we had no opportunity to get feedback from criminal law practitioners.”
A Malaysian think tank, Khazanah Research Institute, published a report on the system in 2020, arguing that the mitigating measures built into the AI software, such as removing race as a variable, did not make the system flawless. The report also described the training data as “somewhat limited in comparison with the extensive databases used in global efforts”: the algorithm was trained on only five years of cases, from 2014 through 2019.
A spokesperson for the Federal Court Chief Justice stated that the use of artificial intelligence in courts was “still in the experimental stage” but declined to comment any further on the operation of the system.
Meanwhile, the use of AI in the criminal justice system has been growing rapidly throughout the world, from the popular DoNotPay – a chatbot lawyer mobile app – to AI judges adjudicating on small claims in Estonia, robot mediators in Canada, and even AI judges in Chinese courts.
These AI systems are argued to make sentencing consistent, reduce case backlogs and save money.
Simon Chesterman, a professor of law at the National University of Singapore and senior director at AI Singapore – a government-run program – insists that technology has the potential to improve efficiency in the criminal justice system, but has acknowledged that the legitimacy of such systems depends not only on the accuracy of the decisions made, but also on the manner in which they are made.
“Many decisions might properly be handed over to the machines. [But] a judge should not outsource discretion to an opaque algorithm,” Chesterman said.
Back in Sabah, Ismail appealed his client’s AI-recommended sentence, and the judge presiding over the case eventually granted the appeal.
However, Ismail has warned that many other lawyers, particularly young ones, may decide not to mount a challenge to the AI system – potentially condemning their clients to unduly harsh sentences.
“The AI acts like a senior judge,” Ismail said. “Young magistrates may think it’s the best decision, and accept it without question.”