Generative AI Tools at CSU
What AI tools are approved for use with CSU sensitive data?
Only three tools are currently approved for handling sensitive CSU data (such as research data, student information, or protected university records):
- CSU-GPT
  - CSU-GPT gives every student, faculty, and staff member a safe way to explore generative AI inside CSU's secure Microsoft Azure environment.
- Microsoft Copilot Chat (when logged in with your CSU NetID)
- Microsoft Teams Premium
  - Teams Premium provides AI-generated meeting notes and recaps, similar to services such as read.ai and otter.ai.
  - All conversations are kept within the CSU Microsoft tenant, ensuring privacy.
  - Departments can request Teams Premium licenses now at $29/user/year.
These are enterprise tools provided through CSU’s Microsoft 365 agreement. They meet university security and compliance standards. Please keep checking the Data Governance site for updates.
Can I use other AI tools (like ChatGPT, Gemini, or Anthropic Claude)?
You may use other tools for non-sensitive, public information only. Many commercial AI tools have privacy statements that allow them to collect and store your data, and in many cases, they may use your input to train future models. This means if you paste in CSU research data, student records, or confidential work, you may be exposing it outside the university’s control.
How do Copilot Chat and Teams Premium protect my data?
Because they are covered by CSU’s enterprise agreement with Microsoft, these tools follow CSU’s data protection standards. Inputs and outputs are not used to train public AI models, and Microsoft enforces strong access controls, encryption, and compliance requirements.
Can I use other tools to transcribe meetings?
Please view this Division of IT Knowledge Base article on the Usage of Third-Party AI/Transcription tools.
What are some examples of “sensitive data”?
Sensitive data includes (but is not limited to):
- Student data: grades, IDs, personal records (FERPA-protected).
- Research data: unpublished findings, participant information, grant-related data.
- Employee data: HR files, evaluations, health or financial information.
- Institutional data: confidential planning documents, internal communications not meant for public release.
If you’re unsure whether something qualifies, err on the side of caution and use an approved tool.
What about drafting general content, brainstorming, or summarizing public information?
For these tasks, you may choose from a variety of tools, but remember: do not input sensitive CSU data unless you are using one of the approved tools (CSU-GPT, Copilot Chat, or Teams Premium). For example, asking ChatGPT to generate a draft of a public announcement is fine, but uploading a confidential budget spreadsheet is not.
Will CSU approve more tools in the future?
As the AI landscape evolves, CSU continues to evaluate tools for accessibility, security, and compliance. Any newly approved tools will be announced on this site.
What should I consider before using an AI tool?
Ask yourself:
- What data am I providing? Is it public, sensitive, or confidential?
- Where does this data go? Does the vendor's privacy policy allow them to reuse it?
- Is this tool covered by CSU's agreements? If not, use it only with public information.
Who can I contact if I have questions?
If you're unsure whether a tool is safe to use, contact us.