The use of artificial intelligence tools such as ChatGPT, Claude, and Perplexity has infiltrated most workplaces, and the legal profession is no exception. Addressing a “question of first impression nationwide,” Judge Rakoff of the Southern District of New York issued a decision in United States v. Heppner that has wide-ranging implications for both legal counsel and their clients.
The pertinent facts of Heppner for this discussion are as follows:
- Without direction from his legal counsel, the defendant used an AI tool to generate reports outlining a potential defense strategy.
- In doing so, the defendant provided the AI tool with sensitive information regarding his case, information that would typically be protected by either attorney-client privilege or work-product doctrine.
- When federal agents searched the defendant’s home, they seized devices containing the AI-generated documents and sought to introduce them as evidence.
The defendant attempted to keep these documents out of court by asserting privilege; however, the court ruled that the AI-generated documents lacked the essential elements required for privilege to attach.
First, the documents were not communications between a client and his or her attorney, because the AI tool is not an attorney. Second, the documents were not intended to remain confidential, because the defendant had communicated with a third-party AI platform to create them. Many AI tools collect data on user inputs and the tool’s outputs and reserve the right to disclose such data to third parties, including government agencies. Third, the defendant did not prepare the documents for the purpose of obtaining legal advice. While the defendant did share the documents with his counsel, the AI tool, by its own terms and conditions, is incapable of providing legal advice.
What This Means
Although this ruling is not binding nationwide, clients should be aware that information provided to AI tools is generally not protected by attorney-client privilege or other confidentiality obligations. Before disclosing any sensitive information to an AI tool, clients should speak with their counsel.
The privacy policies, terms of service, and end user agreements of many AI tools allow the providers to share data with third parties, including government agencies. Disclosing confidential information to an AI tool may be functionally equivalent to disclosing it directly to a federal agent.
For materials to be protected under the work-product doctrine, the client’s legal counsel must have been involved in their creation. Independent research by a client, even if later shared with counsel, does not qualify for work-product protection.