AI Contract Review Liability: DoNotPay $193,000 FTC Settlement and the Limitation of Liability Gap
The FTC's $193,000 settlement with DoNotPay, announced in September 2024, established that AI legal tools making unverified claims about their capabilities are subject to consumer protection enforcement — even when those tools carry disclaimers stating they are not practicing law. For attorneys using AI in contract drafting and review, the liability question is different but equally serious: when AI-generated contract language creates a contractual gap that harms a client, the attorney bears professional responsibility for the error. Any attorney deploying AI in transactional practice therefore needs to understand the specific failure modes of AI contract tools, the UCC Article 2 and common law contract principles that govern those errors, and the limitation of liability gaps in AI vendor contracts.
⚖ DoNotPay FTC Settlement — $193,000 (2024)
| Matter | In the Matter of DoNotPay, Inc., FTC File No. 232 3057 |
| Settlement Amount | $193,000 civil penalty |
| Settlement Date | September 25, 2024 (proposed consent order announced) |
| FTC Theory | DoNotPay made false or misleading representations about the capability of its AI to produce legal documents, defend consumers in court, and provide legal advice equivalent to an attorney |
| Key Finding | AI legal tools that claim human-equivalent legal expertise without substantiation violate FTC Act Section 5(a) (unfair or deceptive acts or practices) |
| Remedial Requirements | DoNotPay prohibited from making claims about AI legal capabilities without competent and reliable scientific evidence; required to notify affected customers of limitations |
| Source | FTC: DoNotPay Case File → |
What the DoNotPay Settlement Means for AI Contract Tools
The DoNotPay enforcement action addresses the consumer-facing end of AI legal tools — tools sold directly to individuals who use them without attorney assistance. The FTC's theory was straightforward: if you tell consumers that your AI can produce legally binding documents and those documents fail to achieve the promised legal effect, you have committed an unfair or deceptive practice.
For attorneys using AI contract tools in professional practice, the FTC enforcement action has three specific implications that go beyond the consumer context:
Implication 1: AI Tool Marketing Claims Are Not Professional Standards
AI contract review tools marketed to law firms frequently claim capabilities — "identifies all material risk provisions," "drafts market-standard indemnification language," "detects missing essential terms" — that may not be substantiated by independent testing. An attorney who deploys an AI contract tool based on the vendor's marketing claims, without independent evaluation of the tool's accuracy for the specific document types and jurisdictions in their practice, has not satisfied ABA Model Rule 1.1's technology competence standard.
Implication 2: Vendor Limitation of Liability Clauses Protect the Vendor
Every AI contract tool vendor includes limitation of liability language in its terms of service. The DoNotPay consent order required customer notification — but it did not require DoNotPay to compensate customers whose legal matters were harmed by relying on DoNotPay's AI-generated documents. The $193,000 penalty went to the FTC, not to affected customers. For attorneys, this means that when an AI contract tool produces incorrect language that harms a client, the vendor's liability limitation clause will prevent recovery from the vendor — leaving the attorney as the only solvent defendant.
Implication 3: The Professional Responsibility Standard Is Independent of the Vendor's Defense
The attorney's professional responsibility standard for AI contract work is not determined by what the AI vendor's terms of service permit or disclaim. An attorney who uses an AI-generated contract draft is professionally responsible for the accuracy and completeness of that draft, regardless of what the AI vendor's marketing claims were or what its limitation of liability clause says. The client's malpractice claim against the attorney does not require proving the vendor made misrepresentations — it requires proving the attorney's conduct fell below the standard of care.
UCC Article 2 and Common Law Contract AI Failure Modes
AI contract drafting tools fail in specific patterns that are predictable from their architecture. Understanding these failure modes — and how they map to UCC Article 2 and common law contract principles — enables attorneys to design verification protocols that catch the most consequential errors.
UCC Article 2 Failure Modes
UCC Article 2 governs contracts for the sale of goods. AI contract tools trained on general commercial contract corpora frequently commit the following errors in Article 2 agreements:
- Warranty Disclaimer Defects: Under UCC § 2-316, to effectively disclaim the implied warranty of merchantability, the disclaimer must be conspicuous and must specifically mention "merchantability." AI tools frequently draft warranty disclaimers in language that satisfies the substantive standard but fails the conspicuousness requirement under § 2-316(2) — often because the conspicuousness analysis is context-dependent (type size, location in document, formatting) and AI tools generating plain text lack the capacity to assess visual formatting.
- Battle of the Forms Errors: UCC § 2-207 governs the "battle of the forms" — the situation where buyer and seller exchange forms with different terms. AI tools frequently draft terms intended to serve as a "master form" that takes precedence in a § 2-207 battle, but draft these terms in ways that are not enforceable as additional terms under § 2-207(2) or that fail to clearly and timely object to materially different terms in the counter-party's form.
- Perfect Tender and Cure Provisions: UCC § 2-508 gives sellers a right to cure non-conforming tender in specific circumstances. AI tools drafting supply agreements frequently omit or misconfigure cure provisions, so the statutory defaults control: buyers end up bound by seller cure rights they never intended to grant, and rejection rights they assumed they held cannot be exercised as expected.
- Limitation of Remedies: UCC § 2-719 permits parties to limit remedies, including limiting recovery to repair or replacement. AI tools frequently draft limitation of remedy provisions that fail under § 2-719(2) and (3) because the limitation causes the remedy to fail of its essential purpose, or because the limitation of consequential damages is unconscionable in the context of injury to person in consumer goods contracts.
Common Law Contract AI Failure Modes
In non-UCC contexts (services contracts, intellectual property agreements, real property, employment), AI contract tools exhibit a different but equally specific set of failure modes:
- Integration Clause Defects: AI tools frequently draft integration clauses that do not effectively exclude prior oral representations that the client specifically wants to exclude, or that inadvertently exclude representations that the client's prior course of dealing includes in the contract.
- Indemnification Asymmetry: AI tools trained on standard form contracts produce indemnification provisions skewed toward whichever party's forms dominate the training data. For law firms representing the less common party in a particular transaction type, the AI's default indemnification language may be materially adverse to the client's position.
- Choice of Law and Forum Errors: AI tools frequently fail to coordinate choice of law and forum selection clauses — selecting law from one jurisdiction and forum in another in ways that create conflicts. This error pattern is particularly common in cross-border contracts where the AI training data over-represents domestic agreements.
The DoNotPay settlement documents included examples of consumer contracts generated by the AI that contained indemnification provisions that were unenforceable, warranties that were invalid, and limitation of liability clauses that conflicted with mandatory consumer protection statutes in the consumer's jurisdiction. The contracts looked like valid legal documents. They were not. The FTC's enforcement action addressed the vendor's misrepresentation — but for an attorney who produced similar documents for a client, the remedy would be malpractice, not FTC action against the vendor.
Vendor Limitation of Liability Clauses: What They Cover and What They Don't
Every AI contract tool vendor limits its liability through terms of service provisions. Understanding the scope of these limitations — and the gaps they create — is essential for attorneys assessing whether to deploy AI tools in contract drafting and review.
What Vendor Limitation of Liability Clauses Typically Cover
- Direct contract damages: The vendor's liability for the attorney's subscription fees and direct costs of the AI service
- Consequential damages exclusion: Most vendor limitations exclude liability for consequential, incidental, or punitive damages — meaning lost profits, harm to client relationships, or reputational damage resulting from AI errors
- Aggregate liability cap: Most vendor limitations cap aggregate liability at the fees paid in the prior 12 months — typically $5,000-$50,000 for law firm subscriptions, compared to the millions at stake in the underlying client matter
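The scale of the gap is easy to make concrete. A minimal sketch, using hypothetical figures and assuming the common vendor term capping aggregate liability at prior-12-month fees:

```python
def uninsured_exposure(matter_value: float, annual_fees_paid: float) -> float:
    """Gap between potential client harm and a fees-paid liability cap.

    Assumes the common vendor term capping aggregate liability at fees paid
    in the prior 12 months; all figures here are hypothetical.
    """
    vendor_cap = annual_fees_paid  # aggregate cap = prior-12-month fees
    return max(0.0, matter_value - vendor_cap)

# A $12,000/year subscription deployed on a $4M transaction:
gap = uninsured_exposure(matter_value=4_000_000, annual_fees_paid=12_000)
# gap == 3_988_000.0 -- exposure the firm, not the vendor, carries
```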
The Indemnification Gap
The critical limitation in AI contract tool vendor terms is the indemnification structure. Vendors typically provide narrow indemnification limited to third-party IP infringement claims — not to errors in the AI's contract analysis or drafting. This means:
- If an AI contract review tool misses a critical provision in a contract, and the client suffers harm from that omission, the attorney bears the full liability for the missed provision
- The attorney cannot recover from the AI vendor for the client's harm because the vendor's indemnification does not cover accuracy errors
- The attorney cannot recover from the AI vendor for the malpractice settlement or judgment because the vendor's liability is capped at prior subscription fees
- The attorney's only recourse against the vendor is a contract claim — for breach of warranty (most vendors disclaim all warranties) or for misrepresentation of the tool's capabilities (the theory the FTC pursued against DoNotPay) — which is expensive and uncertain
AI Contract Review Quality Assurance Checklist
AI Contract Drafting and Review: Error Prevention Checklist
Assess the AI tool's accuracy for each specific document type and jurisdiction before using it for client matters. "AI contract tool" encompasses hundreds of document types with different error rates. A tool that performs well on simple NDAs may perform poorly on complex multi-party services agreements with international arbitration provisions.
For goods contracts, apply a UCC-specific checklist to every AI-generated or AI-reviewed document covering: warranty disclaimer conspicuousness (§ 2-316), battle of the forms terms (§ 2-207), cure and rejection rights (§ 2-508, § 2-601), and limitation of remedy enforceability (§ 2-719). These are the four Article 2 provisions AI tools most frequently mishandle.
Compare every AI-generated indemnification provision against the client's specific indemnification instructions. AI tools default to training-data-typical indemnification — which may be adverse to the client's position in transactions where the client is not in the typical role for that document type. Confirm indemnification scope, trigger conditions, duty to defend vs. duty to indemnify, and consequential damage inclusion.
Verify that choice of law and forum selection clauses are consistent with each other and with mandatory consumer protection statutes in the relevant jurisdictions. AI tools frequently select law from one state and forum in another without analyzing the conflict. In consumer contracts, mandatory state consumer protection law may override choice of law provisions regardless.
Analyze every AI-generated limitation of liability clause for enforceability under the applicable state law. Many states apply unconscionability doctrine to liability limitations in specific contexts, and California, New York, and Massachusetts courts have each developed their own interpretations of "failure of essential purpose" under UCC § 2-719(2) that must be checked against the chosen governing law.
Before deploying any AI contract tool for client matters, review the vendor's limitation of liability clause and assess the indemnification gap: if the AI produces an error that harms a client, what recovery is available from the vendor? The gap between the vendor's liability cap and the client's potential harm is the firm's uninsured exposure.
Implement a "missing term" checklist for each contract type that documents the essential provisions the attorney must verify are present in every AI-generated document. For services agreements: scope of work, acceptance criteria, IP ownership, confidentiality, termination rights, limitation of liability, indemnification, dispute resolution, and governing law. AI tools miss these provisions less often than they mishandle them — but missing is worse than wrong.
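A missing-term checklist of this kind reduces to a set difference. A minimal sketch, assuming the firm maintains one term list per contract type (the set below simply mirrors the services-agreement terms named above):

```python
# Hypothetical essential-term checklist for services agreements; a firm
# would maintain one such set per contract type it handles.
SERVICES_ESSENTIAL_TERMS = {
    "scope of work", "acceptance criteria", "ip ownership", "confidentiality",
    "termination rights", "limitation of liability", "indemnification",
    "dispute resolution", "governing law",
}

def missing_terms(provisions_found: set[str],
                  required: set[str] = SERVICES_ESSENTIAL_TERMS) -> set[str]:
    """Return required terms absent from the reviewed draft (case-insensitive)."""
    return required - {p.strip().lower() for p in provisions_found}
```

Identifying which provisions a draft actually contains is the attorney's job; the set difference only guarantees that nothing on the firm's list is skipped silently.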
Document in the matter file whether each contract was AI-assisted (attorney drafted, AI suggested revisions) or AI-generated (AI drafted, attorney reviewed). This distinction matters for malpractice defense — the standard of care for AI-assisted drafting requires verification of AI suggestions; for AI-generated drafts it requires comprehensive review of the complete document.
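The AI-assisted versus AI-generated distinction can be enforced in the matter file itself. A minimal sketch of such a record, with hypothetical field names:

```python
from dataclasses import dataclass, field
from datetime import date

VALID_MODES = ("AI-assisted", "AI-generated")  # attorney-drafted vs AI-drafted

@dataclass
class ContractReviewRecord:
    """Illustrative matter-file entry; field names are hypothetical."""
    matter_id: str
    document_name: str
    mode: str                     # one of VALID_MODES
    ai_tool: str                  # which tool produced or suggested the text
    reviewing_attorney: str
    review_date: date
    provisions_verified: list[str] = field(default_factory=list)

    def __post_init__(self) -> None:
        # Refuse ambiguous records: the malpractice-defense value of the
        # file depends on the mode being recorded explicitly.
        if self.mode not in VALID_MODES:
            raise ValueError(f"mode must be one of {VALID_MODES}")
```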
Disclose AI use in contract drafting to clients per ABA Formal Op. 512 when the AI's contribution affects the substance of the final document. Include disclosure language in the engagement letter identifying the AI tool used and the attorney's review and verification process. Documented disclosure strengthens the attorney's position in any subsequent malpractice proceeding.
For consumer-facing contracts, apply a mandatory overlay analysis for consumer protection statutes in each applicable jurisdiction. State consumer protection laws in California (Civ. Code § 1770), New York (GBL § 349), and Massachusetts (c. 93A) impose requirements that override contract terms and that AI tools trained on B2B contracts often fail to reflect.
How Claire Handles AI Contract Drafting Risk
Claire's Contract Drafting Architecture: Verification, Documentation, and Privilege
Claire's contract drafting and review capabilities are designed with the specific failure modes of AI contract tools in mind — and with the attorney supervision documentation that malpractice defense requires.
Document-Type Specific Verification Prompts
Claire's contract workflows include document-type specific verification prompts that flag the provisions most commonly misconfigured by AI tools in each document category. For UCC goods contracts, Claire prompts the attorney to verify warranty disclaimer conspicuousness, battle of the forms terms, and limitation of remedy enforceability before the document is marked complete. The DoNotPay failure modes are built into the verification architecture.
Jurisdiction-Specific Mandatory Provision Overlays
Claire applies jurisdiction-specific mandatory provision overlays to consumer-facing contracts, flagging provisions that conflict with state consumer protection statutes in California, New York, Massachusetts, and Texas — the four states with the most consequential B2C contract requirements. Attorneys receive specific warnings when AI-generated language conflicts with mandatory local requirements.
Attorney Review Documentation in Matter File
Every Claire contract workflow produces a review documentation record in the firm's matter management system, recording: which provisions were AI-generated, which were attorney-modified, which were verified against statutory standards, and who signed off on final review. This documentation is the attorney's primary defense in contract malpractice proceedings.
Zero-Training Architecture for Privileged Contract Content
Client contract terms processed through Claire are never used to train AI models — not Claire's models, not any third-party model. The risk of training data contamination (an AI trained on one client's contracts influencing outputs for adverse parties in similar transactions) is architecturally eliminated in Claire's isolated deployment model.
The DoNotPay settlement established that AI legal tool vendors can be held accountable for capability misrepresentations. But the settlement's $193,000 penalty and its focus on consumer protection enforcement should not obscure the larger liability picture for attorneys: when AI contract errors harm clients, the attorney bears professional responsibility that vendor limitation of liability clauses will not satisfy. The solution is not avoiding AI in contract work — it is deploying AI with the verification architecture and documentation protocols that satisfy the standard of care.
For the malpractice insurance implications of AI contract errors, see AI malpractice liability. For the governance policies that document AI tool use for underwriting purposes, see AI governance for law firms. For the bar ethics framework governing attorney supervision of AI-generated documents, see bar ethics AI guidelines.