Know the Law: Navigating Pitfalls Related to Using AI for NIL

Seth M. Corwin
Associate, Corporate Department
Published: Union Leader
April 8, 2025

Q: I am a student-athlete (SA), and I want to use artificial intelligence (AI) with my name, image, and likeness (NIL) activities. Can I safely use AI to do so?

A: Recently, AI’s usability and accessibility have exploded, with AI applications like ChatGPT, Claude, and Gemini gaining publicity and users. As a result, SAs may be tempted to use AI tools like these to assist with their NIL activities, mainly because of their accessibility and low cost. However, SAs should evaluate the potential issues involved in using AI for NIL activities. Below is a partial list of issues to consider.

First, SAs should be aware of any school or athletic department policies regarding the use of AI. Depending on how the policies are worded, an SA could inadvertently violate an AI policy by using AI for NIL activities. Consequently, SAs should speak with school personnel and review the relevant documents (e.g., the student handbook) to determine whether any policies governing AI exist before proceeding.

Next, SAs might be tempted to use AI to understand or draft contractual provisions for NIL contracts. While AI can be useful for researching a subject, SAs should be wary of relying on AI to draft and review legal documents because it may be unreliable. AI applications can hallucinate, meaning they can provide confident and seemingly correct answers that are actually wrong. This can occur when an AI application answers questions relying on its training data (“Can I avoid paying student loans by declaring bankruptcy?”) or questions about a user-provided document (“Does this contract prohibit me from signing contracts with other companies?”). There is currently no established strategy to detect or avoid AI hallucinations (besides doing your own research to confirm the answer), so it can be difficult to know when they occur. Similarly, when users ask an AI application to draft legal language, the results can vary. The tool may draft language that is appropriate for one state but not the user’s. It may misunderstand the context and provide language that looks correct but is intended to govern a different situation (e.g., providing company-friendly language rather than SA-friendly language). Consequently, SAs should proceed cautiously when using AI for their contracts without proper legal representation.

SAs could also be inclined to use AI to help satisfy their various NIL obligations, such as using it to create videos or voiceover recordings of themselves for their company partners. However, before doing so, SAs should thoroughly read their NIL contract’s terms to confirm that this is permitted. Moreover, NIL contracts frequently require SAs to represent that they own and can freely assign any content created for the company. However, users who create media with AI applications may not own any intellectual property rights in that media. The U.S. Copyright Office has been reluctant to grant copyright protection for AI-generated content, although the Office recently clarified that it will do so where the creativity of a human creator is perceptible. Thus, SAs should meticulously review their NIL contracts before using AI to fulfill them.

There is no doubt that AI can be useful for SAs. However, when it comes to AI and all things NIL-related, SAs must do proper due diligence to protect their interests. By doing so, SAs can wield a powerful new tool while also protecting themselves.