A senior lawyer in Australia has apologized to a judge for filing submissions in a murder case that included fake quotes and nonexistent case judgments generated by artificial intelligence.
The blunder in the Supreme Court of Victoria is the latest in a string of incidents caused by artificial intelligence in justice systems around the world.
Defense lawyer Rishi Nathwani, who holds the prestigious legal title of King’s Counsel, took “full responsibility” for filing incorrect information in submissions in the case of a teenager charged with murder, according to court documents seen by The Associated Press on Friday.
“We are deeply sorry and embarrassed for what occurred,” Nathwani told Justice James Elliott on Wednesday, on behalf of the defense team.
The AI-generated errors caused a 24-hour delay in resolving a case that Elliott had hoped to conclude on Wednesday. Elliott ruled on Thursday that Nathwani’s client, who cannot be identified because he is a minor, was not guilty of murder because of mental impairment.
“At the risk of understatement, the manner in which these events have unfolded is unsatisfactory,” Elliott told the lawyers on Thursday.
“The ability of the court to rely upon the accuracy of submissions made by counsel is fundamental to the due administration of justice,” Elliott added.
The fake submissions included fabricated quotes from a speech to the state legislature and citations of nonexistent Supreme Court cases.
The errors were discovered by Elliott’s associates, who could not find the cases that had been cited.
The lawyers admitted that the citations “do not exist” and that the submission contained “fictitious quotes,” according to the court documents.
The lawyers explained that they had checked that the initial citations were accurate and assumed the others would also be correct.
The submissions were also sent to prosecutor Daniel Porceddu, who did not check their accuracy.
The judge noted that the Supreme Court issued guidelines last year on how lawyers may use artificial intelligence.
“It is not acceptable for artificial intelligence to be used unless the product of that use is independently and thoroughly verified,” Elliott said.
The court documents do not identify the generative artificial intelligence system used by the lawyers.
In a comparable case in the United States in 2023, a federal judge imposed $5,000 fines on two lawyers and a law firm after ChatGPT was blamed for their submission of fictitious legal research in an aviation injury claim.
Judge P. Kevin Castel said they had acted in bad faith, but he credited their apologies and the remedial steps they took in explaining why harsher sanctions were not necessary to ensure that they or others would not again let artificial intelligence tools prompt them to produce fake legal history in their arguments.
Later that year, more fictitious court rulings invented by artificial intelligence were cited in legal papers filed by lawyers for Michael Cohen, a former personal lawyer for U.S. President Donald Trump. Cohen took the blame, saying he did not realize that the Google tool he was using for legal research was also capable of so-called AI hallucinations.
Victoria Sharp, a British High Court justice, warned in June that presenting false material as if it were genuine could be considered contempt of court or, in the “most egregious cases,” perverting the course of justice, which carries a maximum sentence of life in prison.
The use of artificial intelligence is making its way into U.S. courtrooms in other ways. In April, a man named Jerome Dewald appeared before a New York court and presented a video featuring an AI-generated avatar to deliver an argument on his behalf.
In May, a man killed in a road rage incident in Arizona “spoke” during his killer’s sentencing hearing after his family used artificial intelligence to create a video of him reading a victim impact statement.