03 June 2024

Why shouldn't we use AI to replace our legal reasoning?

By Elena Irazabal

When we use generative AI tools, we are tempted to hand over all control. Let it do everything for us. It returns well-written text, and we may think we can use it as is in our work. So why is this a bad idea?

Daniel Kahneman, winner of the Nobel Prize in Economics, has significantly influenced psychology and economics, as well as artificial intelligence, with his studies on Systems 1 and 2. We could say that, ultimately, AI is a sophisticated System 1. This means that artificial intelligence, especially in its current deep learning form, operates predominantly in a fast and automatic way, recognizing patterns and making predictions based on large amounts of data. Like Kahneman’s “System 1,” AI can process information quickly and respond immediately, but it does so without deep understanding or deliberate analysis of the circumstances. “System 2,” on the other hand, is characterized by slow, deliberate, and logical thinking, involving a detailed and deep analysis of situations.

Despite AI’s sophistication, the analogy highlights a crucial limitation: AI lacks the deep reasoning and contextualization capabilities that characterize “System 2.” In the legal field, where detailed analysis, deliberation, and an understanding of causality are essential, this deficiency is particularly significant. A lawyer must evaluate the context, interpret the law, and apply legal principles to complex situations, tasks that require a kind of reasoning that AI, as a System 1, cannot adequately provide.

Legal reasoning involves more than just processing large amounts of data; it requires understanding the causal relationships and context behind the information. AI systems, while adept at handling vast data sets and identifying patterns, do not possess the ability to understand causality or context in the same way humans do. For example, a lawyer must understand the implications of a law, consider precedents, and predict the possible outcomes of a legal argument. These tasks require a level of contextual awareness and causal reasoning that current AI systems do not have. Relying on AI to perform these functions could lead to misinterpretations and flawed legal judgments.

Of course, we can use AI to help us reason better. In the words of my friend Javier Ideami, one of the greatest AI experts in our country, interacting with AIs like ChatGPT can augment and magnify the potential of our human intuition, that is, of our System 1. ChatGPT, for example, draws on a vast well of knowledge that can enhance our own pattern matching.

Ideami offered me this example: when we do bodybuilding to improve our physique and strength, we use machines that help us reach the desired goal. It is the same with AI: we can use it in a way that enhances our capabilities.

For example, a lawyer can use an AI to quickly obtain a summary of case law relevant to a specific case, saving time on the initial search for information. Although AI cannot replace the critical analysis and legal interpretation that “System 2” requires, it can provide an extensive and accessible database that the lawyer can examine and contextualize. In this way, AI acts as a tool that amplifies our capabilities, allowing us to focus on deep analysis and the creative application of legal knowledge to complex situations.

Furthermore, generative AI tools make us work in a way we were not used to. We must instruct the AI properly to get the result we need, which forces us to think differently. Just as weight machines cannot do the exercise for us but make the training process easier, AI cannot reason for us, but it can make the process of obtaining and organizing information more efficient and effective.
