I recently saw a LinkedIn post about legal outsourcing and offshoring. Considering the ethical issues with those arrangements led me to think about AI-based legal tech.

Not a digression: for some of us, corporal punishment is just the faded memory of a threat. Some of us had the actual experience. For most of us, now, it was never even a thing. But our continuing cultural fascination with phrases like “punch a Nazi” or “curb stomp” reflects our instinct to injure the wrongdoer.

In fact, our entire legal system is premised on government sanction for that instinct. In criminal cases, judges award injuries in the nature of deprivation (of liberty or property). In personal injury cases, trial lawyers win their clients compensation that also, gratifyingly, injures the wealth of the tortfeasor. And, particularly relevant to my topic of AI legal tech, licensed professionals are driven to do good work not only by pride and joy and financial reward but also by the grim specter of malpractice litigation.

AI, and AI legal tech vendors, do not carry that anxiety. AI itself cannot be rendered penniless or unemployable by a lawsuit. It has (in)organic impunity. Its vendors, by the efforts of their lawyers, have contractual impunity.

It is abysmal, in terms of professional responsibility, to hand off a substantive task to a person or thing you can’t hold to account.

That is the essential problem of AI-enabled legal tech.