AI Contract Review Liability: DoNotPay $193,000 FTC Settlement and the Limitation of Liability Gap

The FTC's $193,000 settlement with DoNotPay in 2024 established that AI legal tools making unverified claims about their capabilities are subject to consumer protection enforcement — even when those tools include disclaimers stating that they are not practicing law. For attorneys using AI in contract drafting and review, the liability question is different but equally serious: when AI-generated contract language creates a contractual gap that harms a client, the attorney bears professional responsibility for the error. Understanding the specific failure modes of AI contract tools, the UCC Article 2 and common law contract principles that govern those errors, and the limitation of liability gaps in AI vendor contracts is essential for any attorney deploying AI in transactional practice.

⚖ DoNotPay FTC Settlement — $193,000 (2024)

Matter: In the Matter of DoNotPay, Inc., FTC File No. 232-3057
Settlement Amount: $193,000 in monetary relief
Settlement Date: September 25, 2024 (proposed consent order announced as part of the FTC's Operation AI Comply sweep)
FTC Theory: DoNotPay made false or misleading representations about the capability of its AI to produce legal documents, defend consumers in court, and provide legal advice equivalent to an attorney
Key Finding: AI legal tools that claim human-equivalent legal expertise without substantiation violate FTC Act Section 5(a) (unfair or deceptive acts or practices)
Remedial Requirements: DoNotPay prohibited from making claims about AI legal capabilities without competent and reliable evidence; required to notify affected customers of limitations
Source: FTC DoNotPay case file
$193K — ordered against DoNotPay for AI legal capability misrepresentation
The DoNotPay settlement is the first FTC enforcement action specifically targeting AI legal tool capability claims. The FTC found that DoNotPay's marketing claims — including that its AI could "sue anyone" and was "the world's first robot lawyer" — were unsubstantiated and misleading. The penalty was modest relative to the FTC's enforcement capacity, but the consent order's substantive requirements (substantiation for capability claims, customer notification) have influenced how other AI legal tools describe their capabilities.

What the DoNotPay Settlement Means for AI Contract Tools

The DoNotPay enforcement action addresses the consumer-facing end of AI legal tools — tools sold directly to individuals who use them without attorney assistance. The FTC's theory was straightforward: if you tell consumers that your AI can produce legally binding documents and those documents fail to achieve the promised legal effect, you have committed an unfair or deceptive practice.

For attorneys using AI contract tools in professional practice, the FTC enforcement action has three specific implications that go beyond the consumer context:

Implication 1: AI Tool Marketing Claims Are Not Professional Standards

AI contract review tools marketed to law firms frequently claim capabilities — "identifies all material risk provisions," "drafts market-standard indemnification language," "detects missing essential terms" — that may not be substantiated by independent testing. An attorney who deploys an AI contract tool based on the vendor's marketing claims, without independent evaluation of the tool's accuracy for the specific document types and jurisdictions in their practice, has not satisfied ABA Model Rule 1.1's technology competence standard.

Implication 2: Vendor Limitation of Liability Clauses Protect the Vendor

Every AI contract tool vendor includes limitation of liability language in its terms of service. The DoNotPay consent order required customer notification — but it did not require DoNotPay to compensate customers whose legal matters were harmed by relying on its AI-generated documents. The $193,000 went to the FTC, not to affected customers. For attorneys, this means that when an AI contract tool produces incorrect language that harms a client, the vendor's liability limitation clause will generally bar meaningful recovery from the vendor — leaving the attorney as the only practically reachable defendant.

Implication 3: The Professional Responsibility Standard Is Independent of the Vendor's Defense

The attorney's professional responsibility standard for AI contract work is not determined by what the AI vendor's terms of service permit or disclaim. An attorney who uses an AI-generated contract draft is professionally responsible for the accuracy and completeness of that draft, regardless of what the AI vendor's marketing claims were or what its limitation of liability clause says. The client's malpractice claim against the attorney does not require proving the vendor made misrepresentations — it requires proving the attorney's conduct fell below the standard of care.

UCC Article 2 and Common Law Contract AI Failure Modes

AI contract drafting tools fail in specific patterns that are predictable from their architecture. Understanding these failure modes — and how they map to UCC Article 2 and common law contract principles — enables attorneys to design verification protocols that catch the most consequential errors.

UCC Article 2 Failure Modes

UCC Article 2 governs contracts for the sale of goods. AI contract tools trained on general commercial contract corpora frequently commit the following errors in Article 2 agreements: warranty disclaimers that fail the conspicuousness requirement of § 2-316; battle-of-the-forms additional terms left unresolved under § 2-207; cure and rejection rights (§ 2-508, § 2-601) omitted or misallocated; and limitation-of-remedy clauses vulnerable to attack under § 2-719.

Common Law Contract AI Failure Modes

In non-UCC contexts (services contracts, intellectual property agreements, real property, employment), AI contract tools exhibit a different but equally specific set of failure modes: indemnification language that defaults to the training-data-typical position rather than the client's; choice of law and forum selection clauses that conflict with each other or with mandatory local statutes; limitation of liability clauses unenforceable under the applicable state's unconscionability doctrine; and essential terms — scope, acceptance criteria, IP ownership, termination — that are simply missing.

The Indemnification Gap That DoNotPay Exposed

The DoNotPay settlement documents included examples of consumer contracts generated by the AI that contained indemnification provisions that were unenforceable, warranties that were invalid, and limitation of liability clauses that conflicted with mandatory consumer protection statutes in the consumer's jurisdiction. The contracts looked like valid legal documents. They were not. The FTC's enforcement action addressed the vendor's misrepresentation — but for an attorney who produced similar documents for a client, the remedy would be malpractice, not FTC action against the vendor.

Vendor Limitation of Liability Clauses: What They Cover and What They Don't

Every AI contract tool vendor limits its liability through terms of service provisions. Understanding the scope of these limitations — and the gaps they create — is essential for attorneys assessing whether to deploy AI tools in contract drafting and review.

What Vendor Limitation of Liability Clauses Typically Cover

Vendor terms of service typically combine three protections: a blanket exclusion of indirect, incidental, special, consequential, and punitive damages; an exclusion of any claim brought by a third party, including the attorney's clients; and an aggregate liability cap set at the fees paid in the prior twelve months or a nominal floor. Most vendors also disclaim all express and implied warranties.

The Indemnification Gap

The critical limitation in AI contract tool vendor terms is the indemnification structure. Vendors typically provide narrow indemnification limited to third-party IP infringement claims — not to errors in the AI's contract analysis or drafting. This means:

  1. If an AI contract review tool misses a critical provision in a contract, and the client suffers harm from that omission, the attorney bears the full liability for the missed provision
  2. The attorney cannot recover from the AI vendor for the client's harm because the vendor's indemnification does not cover accuracy errors
  3. The attorney cannot recover from the AI vendor for the malpractice settlement or judgment because the vendor's liability is capped at prior subscription fees
  4. The attorney's only recourse against the vendor is a contract claim — for breach of warranty (most vendors disclaim all warranties) or for negligent misrepresentation (the DoNotPay theory) — which is expensive and uncertain
// Typical AI Contract Tool Limitation of Liability (Illustrative)

Section 12. Limitation of Liability:
IN NO EVENT SHALL [VENDOR] BE LIABLE FOR ANY INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL, OR PUNITIVE DAMAGES, INCLUDING WITHOUT LIMITATION:
  - Loss of profits, loss of data, or loss of goodwill
  - Cost of procurement of substitute goods or services
  - ANY CLAIM BY A THIRD PARTY (including your clients)
[VENDOR]'S TOTAL LIABILITY SHALL NOT EXCEED THE GREATER OF:
  - $100.00 USD; OR
  - AMOUNTS PAID BY YOU IN THE PRIOR 12 MONTHS

// What this means for attorney liability:
  Client loses $2.3M from a missed indemnification provision in an AI-drafted contract
  Attorney is liable for $2.3M in malpractice
  Attorney recovery from vendor: $0 in consequential damages (excluded); at most $12,000 (annual subscription cap)
  Attorney's recourse: professional liability insurance (if AI use was documented)
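The gap arithmetic in the illustration above can be sketched in a few lines. This is a hypothetical calculation using the article's illustrative figures ($2.3M client harm, $12,000 in annual subscription fees, a $100 floor); the function names are ours, not any vendor's terms.

```python
# Hypothetical sketch of the indemnification gap arithmetic.
# Figures are the article's illustrative numbers, not real vendor data.

def vendor_recovery_cap(annual_fees: float, floor: float = 100.0) -> float:
    """Vendor liability cap: the greater of a fixed floor or 12 months of fees."""
    return max(floor, annual_fees)

def uninsured_gap(client_harm: float, annual_fees: float) -> float:
    """Exposure left with the attorney after maximum vendor recovery."""
    return client_harm - vendor_recovery_cap(annual_fees)

client_harm = 2_300_000.0   # missed indemnification provision
annual_fees = 12_000.0      # 12 months of subscription payments

print(vendor_recovery_cap(annual_fees))          # 12000.0
print(uninsured_gap(client_harm, annual_fees))   # 2288000.0
```

The point of the sketch: the cap scales with subscription fees, while the exposure scales with the transaction — so the gap grows with every larger deal the tool touches.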

AI Contract Review Quality Assurance Checklist

AI Contract Drafting and Review: Error Prevention Checklist

01
AI Tool Capability Assessment by Document Type

Assess the AI tool's accuracy for each specific document type and jurisdiction before using it for client matters. "AI contract tool" encompasses hundreds of document types with different error rates. A tool that performs well on simple NDAs may perform poorly on complex multi-party services agreements with international arbitration provisions.

02
UCC Article 2 Checklist for Goods Contracts

For goods contracts, apply a UCC-specific checklist to every AI-generated or AI-reviewed document covering: warranty disclaimer conspicuousness (§ 2-316), battle of the forms terms (§ 2-207), cure and rejection rights (§ 2-508, § 2-601), and limitation of remedy enforceability (§ 2-719). These are the four Article 2 provisions AI tools most frequently mishandle.
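The four-item checklist above is simple enough to encode as data, which makes the sign-off auditable. A minimal sketch, assuming a workflow where the attorney marks each UCC section verified (the dictionary keys and helper name are illustrative, not any real tool's API):

```python
# Hypothetical encoding of the UCC Article 2 checklist from the text:
# the four provisions AI tools most frequently mishandle, keyed by section.

UCC_ARTICLE_2_CHECKLIST = {
    "2-316": "Warranty disclaimer is conspicuous",
    "2-207": "Battle-of-the-forms terms resolved",
    "2-508/2-601": "Cure and rejection rights addressed",
    "2-719": "Limitation of remedy enforceable",
}

def unverified_checks(verified: set[str]) -> list[str]:
    """Return the checklist items the attorney has not yet signed off on."""
    return [f"Sec. {sec}: {desc}"
            for sec, desc in UCC_ARTICLE_2_CHECKLIST.items()
            if sec not in verified]

# A document is complete only when nothing remains unverified:
remaining = unverified_checks({"2-316", "2-207"})
print(remaining)  # the Sec. 2-508/2-601 and Sec. 2-719 items remain
```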

03
Indemnification Review Against Client Instructions

Compare every AI-generated indemnification provision against the client's specific indemnification instructions. AI tools default to training-data-typical indemnification — which may be adverse to the client's position in transactions where the client is not in the typical role for that document type. Confirm indemnification scope, trigger conditions, duty to defend vs. duty to indemnify, and consequential damage inclusion.

04
Choice of Law and Forum Consistency Check

Verify that choice of law and forum selection clauses are consistent with each other and with mandatory consumer protection statutes in the relevant jurisdictions. AI tools frequently select law from one state and forum in another without analyzing the conflict. In consumer contracts, mandatory state consumer protection law may override choice of law provisions regardless.
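The consistency check described above is mechanical enough to automate as a first pass before attorney review. A minimal sketch, assuming state abbreviations as inputs (function and flag wording are hypothetical):

```python
# Hypothetical first-pass consistency check for choice of law vs. forum,
# with the consumer-contract caveat from the text. Not legal analysis --
# it only flags what an attorney must then analyze.

def choice_of_law_flags(governing_law: str, forum: str,
                        is_consumer: bool) -> list[str]:
    flags = []
    if governing_law != forum:
        flags.append(f"Governing law ({governing_law}) and forum ({forum}) "
                     "differ; analyze the conflict")
    if is_consumer:
        flags.append("Consumer contract: mandatory state consumer protection "
                     "law may override choice of law")
    return flags

print(choice_of_law_flags("CA", "NY", is_consumer=True))   # two flags
print(choice_of_law_flags("DE", "DE", is_consumer=False))  # []
```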

05
Limitation of Liability Clause Enforceability Analysis

Analyze every AI-generated limitation of liability clause for enforceability under the applicable state law. Many states apply unconscionability doctrine to restrict liability limitations in specific contexts. California, New York, and Massachusetts courts have developed interpretations of "failure of essential purpose" (UCC § 2-719(2)) that differ from the drafters' intent — and from each other.

06
Vendor Limitation of Liability Review Before AI Tool Deployment

Before deploying any AI contract tool for client matters, review the vendor's limitation of liability clause and assess the indemnification gap: if the AI produces an error that harms a client, what recovery is available from the vendor? The gap between the vendor's liability cap and the client's potential harm is the firm's uninsured exposure.

07
Missing Term Detection Protocol

Implement a "missing term" checklist for each contract type that documents the essential provisions the attorney must verify are present in every AI-generated document. For services agreements: scope of work, acceptance criteria, IP ownership, confidentiality, termination rights, limitation of liability, indemnification, dispute resolution, and governing law. AI tools miss these provisions less often than they mishandle them — but missing is worse than wrong.

08
AI-Assisted vs. AI-Generated Documentation in Matter File

Document in the matter file whether each contract was AI-assisted (attorney drafted, AI suggested revisions) or AI-generated (AI drafted, attorney reviewed). This distinction matters for malpractice defense — the standard of care for AI-assisted drafting requires verification of AI suggestions; for AI-generated drafts it requires comprehensive review of the complete document.
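The AI-assisted vs. AI-generated distinction above is worth recording as structured data rather than prose in the matter file. A minimal sketch of such a record, with hypothetical field names (no real matter-management schema is implied):

```python
# Hypothetical matter-file record distinguishing AI-assisted from
# AI-generated drafting, since the review duty differs between the two.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class ContractAIRecord:
    matter_id: str
    document: str
    mode: str                 # "ai_assisted" or "ai_generated"
    reviewing_attorney: str
    review_date: date
    provisions_verified: list[str] = field(default_factory=list)

    def review_duty(self) -> str:
        """Standard of care implied by the drafting mode (per the text)."""
        if self.mode == "ai_generated":
            return "comprehensive review of the complete document"
        return "verification of each AI suggestion"

rec = ContractAIRecord("M-1042", "MSA v3", "ai_generated",
                       "J. Doe", date(2025, 3, 1))
print(rec.review_duty())  # comprehensive review of the complete document
```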

09
Client Disclosure of AI Contract Drafting Under ABA Op. 512

Disclose AI use in contract drafting to clients per ABA Formal Op. 512 when the AI's contribution affects the substance of the final document. Include disclosure language in the engagement letter identifying the AI tool used and the attorney's review and verification process. This disclosure protects the attorney in malpractice proceedings.

10
Jurisdiction-Specific Consumer Protection Overlay

For consumer-facing contracts, apply a mandatory overlay analysis for consumer protection statutes in each applicable jurisdiction. State consumer protection laws in California (Civ. Code § 1770), New York (GBL § 349), and Massachusetts (c. 93A) impose requirements that override contract terms and that AI tools trained on B2B contracts often fail to reflect.

How Claire Handles AI Contract Drafting Risk

Claire's Contract Drafting Architecture: Verification, Documentation, and Privilege

Claire's contract drafting and review capabilities are designed with the specific failure modes of AI contract tools in mind — and with the attorney supervision documentation that malpractice defense requires.

Document-Type Specific Verification Prompts

Claire's contract workflows include document-type specific verification prompts that flag the provisions most commonly misconfigured by AI tools in each document category. For UCC goods contracts, Claire prompts the attorney to verify warranty disclaimer conspicuousness, battle of the forms terms, and limitation of remedy enforceability before the document is marked complete. The DoNotPay failure modes are built into the verification architecture.

Jurisdiction-Specific Mandatory Provision Overlays

Claire applies jurisdiction-specific mandatory provision overlays to consumer-facing contracts, flagging provisions that conflict with state consumer protection statutes in California, New York, Massachusetts, and Texas — the four states with the most consequential B2C contract requirements. Attorneys receive specific warnings when AI-generated language conflicts with mandatory local requirements.

Attorney Review Documentation in Matter File

Every Claire contract workflow produces a review documentation record in the firm's matter management system, recording: which provisions were AI-generated, which were attorney-modified, which were verified against statutory standards, and who signed off on final review. This documentation is the attorney's primary defense in contract malpractice proceedings.

Zero-Training Architecture for Privileged Contract Content

Client contract terms processed through Claire are never used to train AI models — not Claire's models, not any third-party model. The training data contamination risk that the DoNotPay case highlighted (AI trained on contracts from similar transactions potentially influencing outputs for adverse parties) is architecturally eliminated in Claire's isolated deployment model.

The DoNotPay settlement established that AI legal tool vendors can be held accountable for capability misrepresentations. But the settlement's $193,000 penalty and its focus on consumer protection enforcement should not obscure the larger liability picture for attorneys: when AI contract errors harm clients, the attorney bears professional responsibility that vendor limitation of liability clauses will not satisfy. The solution is not avoiding AI in contract work — it is deploying AI with the verification architecture and documentation protocols that satisfy the standard of care.

For the malpractice insurance implications of AI contract errors, see AI malpractice liability. For the governance policies that document AI tool use for underwriting purposes, see AI governance for law firms. For the bar ethics framework governing attorney supervision of AI-generated documents, see bar ethics AI guidelines.

Ask Claire about AI contract review safety: verification workflows for UCC and common law contracts.