This page offers resources to support USF faculty and staff as they navigate AI, whether in student work, in existing software, or elsewhere on the job. It also clarifies how AI might support specific parts of our work while anticipating current challenges, risks, and limitations.
Do not enter personally identifiable information (PII), student records, or other sensitive data into any AI tool that is not a licensed, USF-approved application.
Understand what data you can and cannot enter into AI tools. Not all tools are approved for sensitive or institutional data. Be especially mindful of FERPA- and HIPAA-protected data. Learn more about data classification →
Follow USF's policies and the university's Guidelines for Responsible Use of AI when integrating these tools into your work.
AI use in coursework must align with instructor guidelines and USF's academic honesty policies. Instructors are encouraged to model transparency in their use of AI when developing instructional materials.
AI tools can produce inaccurate, biased, or misleading outputs. Always verify AI-generated content before relying on it.
Looking for past events? Recordings from the ETS/CTE GenAI Symposium (2024), Student AI Week (2024), and the Generative AI Speaker Series (2023) are available on the ETS Events Archive →
Artificial intelligence (AI) refers to technology that can perform tasks that typically require human judgment, including understanding language, recognizing patterns, and making decisions. AI is embedded in many tools you may already use, including search engines, email filters, and productivity software.
Generative AI (GenAI) is a subset of AI that creates new content (text, images, code, audio, and video) in response to prompts. Tools like ChatGPT, Claude, and Google Gemini are generative AI, as are the AI features built into software such as Zoom, Grammarly, and Adobe Acrobat. GenAI works by learning patterns from large datasets, then producing outputs that reflect what it has learned.