Florence G'sell, "Personalization of the Law: A French Perspective"

Presented at the Legal Challenges of the Data Economy conference, March 22, 2019.

Transcript

FLORENCE G'SELL: Thank you very much. In a data-driven environment, empirical analysis overtakes the judgment of experts. And this is, for example, the case with personalized medicine.

Personalized medicine abandons the paradigm of a diagnosis based on the average person and places the patient's particular condition and needs at the center of the treatment protocol. Applying probability analysis techniques to vast pools of data yields conclusions regarding the probability that a particular treatment would be effective for a particular individual given his or her unique circumstances. And it also gives the risks that the treatment may pose.

So as Laure said, in the legal field, the expression "personalization of the law" appears to be a contradiction, especially to French jurists. Law is generally seen as a system of general rules that apply in the same way to all citizens. And in France, the application of legal rules and principles to individual cases is usually described as a [INAUDIBLE].

It is deductive reasoning that leads to the application of a general rule to a specific case. And in this perspective, it is the facts of the case, not the law, that are considered with their peculiarities. The ruling of the courts is individualized or personalized because it simply applies the rule to the specific facts of the case.

But the rule-- the principle itself-- is general, which guarantees that all citizens are treated equally. In other words, it is the general nature of legal rules, and the fact that these rules are applied in a logical and deductive way, which ensure equality under and before the law and legitimize the courts' decisions.

Of course, this civil law approach must be differentiated from the common law principle, according to which like cases should be treated alike. Common law systems are judgment systems, accustomed to casuistic reasoning. And equality before the law is achieved by following precedents.

Therefore, one can easily understand that in the Anglo-American legal culture, where the peculiarities of every legal case are carefully studied and considered, the law may be personalized. On the contrary, the immediate reaction of a civil lawyer facing the possibility of personalized law would probably be to say that it is not possible, or even that it is not acceptable, since justice, fairness, and legitimacy are based on general principles.

However, big data is here. And big data offers unprecedented opportunities. We know that machine learning algorithms can examine gigantic databases of information. They can identify patterns. They are able to predict individuals' future behaviors. And today, they shape important decisions in education, medicine, insurance, banking, and even, as we said, in the criminal justice system.

So today, I could say that big data is associated with a movement towards empiricism in the law-- a new legal empiricism that promotes an evidence-based law, one that uses, for example, scoring systems and algorithms in the criminal justice system.

And since big data promises objectivity and scientificity, it allows us to correct the unrealistic assumptions about human behavior that sometimes affect economic theories, especially when such theories try to predict the effects of a particular legal rule.

And this is precisely what behavioral law and economics seeks to do. The objective is to improve traditional law and economics by incorporating the growing body of empirical evidence on human behavior.

As far as I'm concerned, I do not think you must necessarily be a legal economist, or a behavioral legal economist, to consider the tremendous capacities of machine learning techniques, and the possible impact on the law. Even though I am undoubtedly a traditional jurist, and a traditional French jurist, I can't help thinking that today in the US, some people are detained or released on the basis of algorithmic risk assessment. And in my country, in France, predictive algorithms are currently being used on a daily basis to anticipate courts' decisions.

Therefore, I do not think that a civil lawyer can afford to miss what we could call the algorithmic turn. And since such an algorithmic turn might result in the personalization of the law, we definitely have to face it.

So I will say a few words, of course, about the personalization of the law. Ariel presented it excellently, but I will try to adapt this analysis to the French context and to elaborate a few critical remarks.

So what is the personalization of the law? Ariel explained it, actually. I just wanted to refer to this very interesting paper, Regulation by Machine, that gives the example of medical malpractice. Nowadays, machine learning technology can predict most of the consequences of the doctor's actions. And of course, the algorithms may merely be used to provide doctors with information superior to what they currently possess, like the likelihood of adverse outcomes.

But the machine can also recommend how best to proceed. Of course, doctors have the discretion to ignore these recommendations. But as algorithms become increasingly accurate, doctors who ignore or neglect the advice of the algorithm will sooner or later be found liable for medical malpractice. And this will be a kind of regulation by machine.

Casey and Niblett also gave us the example of what they call micro-directives. The result of all this is a calibrated law. And as you said, it may be safe for some people to drive at high speed, but the general prescribed limit must be lower for drivers who are unskilled or easily distracted.

And of course, we could have other examples. In companies, we could imagine software applications that would provide simple directives for complying with the law, without having to search for the content of specific laws or to weigh the reasonableness of one's actions.

So you can, of course, give various examples. Ariel gave the example of mandatory disclosure-- information that would be specifically adapted to the online profile of every consumer by taking into account demographics, previous purchasing patterns, or credit scores. And my European colleagues Busch and Franceschi said such personalized disclosures based on customer data would resemble the personal advice given by a trader who knows his customer personally.

So what about French law? I have to say that the French legal system already includes a certain form of personalization. In criminal law, punishment is individualized. The French penal code provides that criminal sanctions and the modalities of their execution shall be personalized by taking into account the defendant's age, personality, criminal record, work, family, medical status, et cetera.

But here, it is not a personalization of the legal command itself. And this is what we are currently discussing, because as far as I understand-- but we can discuss that-- the question of the personalization of the law implies that the commands imposed by the legal system would be highly tailored to individuals, whether they are obligations imposed on individuals, like duties of disclosure, or standards of conduct, as in the case of negligence in medical malpractice.

So could such a personalization of the legal rule itself be [INAUDIBLE] within the French context? I wanted to take two examples that you actually have already given-- the personalization of negligence law and default rules.

So maybe just as a reminder, the personalization of negligence law was suggested by Omri Ben-Shahar and Ariel Porat a few years ago. And the main proposition is that the reasonable person standard be abandoned and replaced by the "reasonable you" standard, a very personalized level of diligence.

So what about the French law of negligence? Well, we have a very objective standard in France. A fault, for us, is an inappropriate course of action, one that a wise person would not have adopted. We have had a discussion about whether this standard should be interpreted in an objective or a subjective manner. But in principle, it is an in abstracto interpretation that prevails.

So we refer to the average man, to common or ordinary diligence, the prudent and wise man, the reasonable man, whatever. It is, in principle, an objective criterion, even though sometimes the courts take into account some peculiarities. When a professional is at stake, the court will look at the conduct of a professional agent. The court might take into account the age of the wrongdoer, or the fact that he is physically impaired.

But apart from this, there is no real personalization. So would this be a very important change? Well, I think so. And are we ready for it? Well, I think that this will raise some issues of principle. And I will come back to that, of course.

But broadly speaking, I would say that committing a fault is transgressing social norms. In any society, there are standards of behavior that apply to all. And I could say that the fact that everyone is subject to the same rules is a condition of life in society. Some French authors said that, as opposed to the moral fault, the legal fault is a social fault.

So what about the personalization of default rules? Well, again, we have default rules in France-- provisions in the civil code that apply only if the parties have not expressed any other will. But those provisions express a certain normality. They express the presumed will of the parties. But they also express the conception that the legislature has of certain legal institutions, like wills, family matters, or contracts. And in those default rules, there are values.

So again, I'm wondering if, from our French perspective, we would be very open to a personalization of default rules. So a few critical remarks, more broadly speaking, about this idea of the personalization of the law.

The first issue is probably equality. As my colleagues Busch and Franceschi highlighted, Kelsen said that to legislate means to generalize. But I am not sure that I agree with them when they say that the frequent use of legal typifications can be conceptualized as the answer to an information problem.

Generalization is necessary to make sure that similar cases are all treated alike, in any legal system. And because I am a traditional legal scholar, I do think that typifications and legal categories guarantee equality before the law.

So of course, you might argue that the principle that similar cases must be treated alike also implies that different cases should be treated differently. And yes, I agree with you. But we could probably think in terms of general categories.

For example, we know the difference between professionals and consumers. But should we make differences between consumers or differences between professionals? I am not that sure.

Another issue, to me, is fairness. I wrote that profiling is tyranny. In principle, the algorithm looks at the individual's past choices. And it also looks at the personal choices of other individuals who have the same characteristics and have made the same past choices. So there is a tyranny of the past, perhaps, and a tyranny of others' choices.

Should we be treated differently because of the kind of person we are? This is, to me, an issue. There is also an issue with legal certainty, because a system of tailored laws might increase uncertainty. How will I know in advance what is expected of me? I will need to invest effort to learn the particular rule that applies to me. And with this comes, of course, an issue of accessibility of the law.

And last but not least, privacy issues. What should we do to get the data? Should we assume that regulators would be part of the surveillance capitalism described in Zuboff's book? I think it is an issue. And you mentioned it: it is both an issue of feasibility and of privacy.

So in the end, I would just like to mention this very famous quote from Judge Holmes, who said, "For the rational study of the law, the black-letter man may be the man of the present, but the man of the future is the man of statistics and the master of economics." And he wrote this in 1897.

Well, his argument is about the opposition between logical reasoning, deductive reasoning, and statistics. What I would say is that in both cases, there is uncertainty. And we probably should figure out which uncertainty we want. Thank you.