With the emergence of ChatGPT, Copilot, Bard, and other large language model generative artificial intelligence tools, hereinafter collectively referred to as “AI Tools”, many members of our community are eager to explore their use in the university context. This advisory, jointly produced by the Office of Legal Affairs, University Compliance Services, the Information Security Office, and the Business Contracts Office, provides guidance on how to use these AI Tools safely, without putting institutional, personal, or proprietary information at risk. Additional guidance may be forthcoming as circumstances evolve.

Allowable Use:

  • Data that is publicly available or defined as Published university information (UT Data Classification Standard) can be used freely in AI Tools. 

  • Data that is defined as Controlled or Confidential university information (UT Data Classification Standard) can be used with UT-managed instances of AI tools that are covered by university contracts that specifically protect university data from being used to train models, or that otherwise isolate university data in a separate instance that is not accessible by parties external to the university.

  • AI tools and the types of university data authorized for use with each tool can be viewed in the Local and Cloud Services Decision Matrix.

  • In all cases, use should be consistent with the Acceptable Use Policy. 

Prohibited Use:

  • AI tools lacking a university contract and data-sharing controls must not be used with Controlled or Confidential university information. This includes free or non-UT-managed versions of ChatGPT, Copilot, and other AI tools. In general, student records subject to FERPA, health information, proprietary information, and any other information classified as Confidential or Controlled university data must not be used with non-UT-dedicated instances of AI Tools.

  • Similarly, ChatGPT and similar AI Tools must not be used to generate output that would be considered non-public. Examples include, but are not limited to: generating proprietary or unpublished research; legal analysis or advice; recruitment, personnel, or disciplinary decision making; completion of academic work in a manner not allowed by the instructor; creation of non-public instructional materials; and grading.

  • Please also note that OpenAI, the company that owns ChatGPT, explicitly forbids the use of ChatGPT and its other products for certain categories of activity, including fraud and illegal activities. The full list can be found in OpenAI's usage policy. AI Tools of any sort may not be used for any activity that would be illegal, fraudulent, or in violation of any state or federal law, or of UT Austin or UT System policies.

Additional Guidance:

  • For more information on how UT is using AI tools to explore new ideas and to solve problems in novel ways, please reference UT’s Year of AI site or learn more about how UT is embracing innovation.  

  • For further guidance on the use of ChatGPT or other AI Tools for teaching and learning, please see the following guidance from the Center for Teaching and Learning. 

Rationale for the Above Guidance:

  • No UT Agreement, No Privacy and Security Terms: All content entered into, or generated by, ChatGPT is available to ChatGPT, its parent company, OpenAI, and their employees. There is currently no agreement between UT Austin and OpenAI, Microsoft, or other AI Tool providers that would provide the data security and privacy protections required by UT policy with regard to ChatGPT's, OpenAI's, or other AI Tools' programming interfaces. Consequently, the use of ChatGPT or other AI Tools at this time could expose individual users and UT to the potential loss and/or abuse of sensitive data and information.
    • As of May 2023, the UT Austin Business Contracts Office is working on this issue. We hope to see this addressed in the near future and will update this guidance when additional information is available.
  • Personal Liability: ChatGPT and other AI Tools use click-through agreements. Click-through agreements, including the terms of use for OpenAI, ChatGPT, and other AI Tools, are contracts. Individuals who accept click-through agreements without delegated signature authority may face personal consequences, including responsibility for compliance with terms and conditions [1].

Guidance on Appropriate Use:

For questions regarding the appropriate use of ChatGPT and other AI Tools, please contact the UT Information Security Office at security@utexas.edu.

References

Revision History

Date: 09/04/2024

New:
  • "With the emergence of ChatGPT, Copilot, Bard"
  • "Data that is defined as Controlled or Confidential university information (UT Data Classification Standard) can be used with UT-managed instances of AI tools that are covered by university contracts that specifically protect university data from being used by training models or otherwise isolates university data into a separate instance that is not accessible by parties external to the university."
  • "AI tools and the types of university data authorized for use with each tool can be viewed in the Local and Cloud Services Decision Matrix."
  • "AI tools lacking a university contract and data sharing controls are not covered for use with Controlled or Confidential university information. This includes free or non-UT-managed versions of ChatGPT, Copilot, or other AI tools. In general, student records subject to FERPA, health information, proprietary information, and other information classified as Confidential or Controlled university data must not be used with non-UT dedicated instances of AI."
  • "For more information on how UT is using AI tools to explore new ideas and to solve problems in novel ways, please reference UT's Year of AI site or learn more about how UT is embracing innovation."

Original:
  • "With the emergence of ChatGPT, Bard"
  • "At present, any use of ChatGPT or similar AI Tools cannot use any personal, confidential, proprietary, or otherwise sensitive information unless a university contract is in place that specifically protects such university data from being used by training models or otherwise isolates university data into a separate instance that is not accessible by parties external to the university. In general, student records subject to FERPA, health information, proprietary information, and any other information classified as Confidential or Controlled university data must not be used with AI Tools."

Date: 07/20/2023

First published