
AI Information for Faculty

Generative AI for teaching at Cal State East Bay

The Online Campus recognizes that AI is a new technology that is rapidly changing the higher education landscape. We will continue to update this website with helpful information about AI tools in teaching and learning, along with further resources.



Generative AI Overview

Generative AI is artificial intelligence that can create new content such as text, images, code, audio, and video. It learns patterns from large sets of data and uses them to generate new output. “These models are trained on large text datasets to learn to predict the next word in a sentence and, from that, generate coherent and compelling human-like output in response to a question or statement” (UNESCO, 2023).



Listen to CSUEB faculty discuss using AI in their classrooms, course development, and accessibility work, and share their beliefs about AI.


 

Check out webinars, publications, and other systemwide AI resources from the CSU. The CSU's AI initiative was presented at the Board of Trustees meeting on 01/28/2025.

OpenAI's ChatGPT Edu agreement is tailored specifically for the California State University system, providing advanced AI capabilities for all employees and students. It includes the ability to build custom GPTs for sharing within campus workspaces, along with enterprise-grade privacy, data protection, and security features such as single sign-on (SSO) and SCIM integration.

  • (from the CSU Chancellor’s Office)

What do we have? 

Microsoft Copilot is a generative AI assistant designed to enhance productivity and creativity and to streamline tasks. It provides access to advanced generative AI with text, voice, and image capabilities and is available to CSUEB faculty, staff, and students.

To get started with Copilot, log in with your CSUEB official or Horizon email. Then review the step-by-step setup article.

Resources:

  • Learn about
  • Learn about
  • LinkedIn Learning:

AI in your course: policies and discipline guidance

  • For courses where authentic, AI-free student work is essential to learning outcomes.

    Sample syllabus language:

    In this course, the use of generative AI tools (including but not limited to ChatGPT, Microsoft Copilot, Claude, and Gemini) on graded work is prohibited unless I explicitly indicate otherwise for a specific assignment. Submitting AI-generated work as your own is a violation of the CSUEB Academic Dishonesty Policy and will be treated as plagiarism.

    Talk to your students about why. Many students don't yet have a clear sense of when AI use is acceptable in higher education. Frame the policy as part of how you've designed the learning experience, not as suspicion.

  • For courses where AI use is appropriate in some contexts but not others.

    Sample syllabus language:

    Generative AI tools may be used in this course only for assignments where I explicitly permit it. When AI is permitted, I will indicate which tools are allowed, what kind of use is acceptable (brainstorming, drafting, editing, code completion, etc.), and how to cite the AI's contribution. For all other assignments, AI-generated work submitted as your own will be treated as plagiarism per the CSUEB Academic Dishonesty Policy.

    This option is the most common in 2025-26. It gives you flexibility to teach AI as a tool in some contexts while preserving traditional assessment in others.

  • For courses where AI is integrated as a learning tool throughout.

    Sample syllabus language:

    You may use generative AI tools (ChatGPT, Microsoft Copilot, Claude, etc.) in this course as a learning aid. You are responsible for the accuracy, originality, and integrity of all work you submit. When you use AI in a substantive way, cite the tool, the prompt you used, and the date. AI use does not exempt you from the CSUEB Academic Dishonesty Policy - submitting AI work as your own original analysis or refusing to disclose AI use is plagiarism.

    Pair this option with explicit instruction on AI literacy: how to prompt effectively, how to verify outputs, how to cite, and where AI tools fail. Students taking your course will be better prepared for AI-using workplaces.

  • Generative image and design tools (Midjourney, DALL-E, Adobe Firefly, Stable Diffusion) raise distinct questions in art and design courses. Consider:

    • Process vs. product. If learning outcomes are about technique, AI-generated work undermines them. If outcomes are about concept and iteration, AI can be a brainstorming partner.
    • Attribution and ethics. Many image models were trained on artists' work without permission. Class discussion about consent, attribution, and the labor economics of generative tools is appropriate.
    • Tool literacy. Students entering design careers will use these tools. Teaching them how (and when not) to use AI is a curricular responsibility.

    Suggested AI tools: Adobe Firefly (CSUEB has Adobe Creative Cloud licensing for many faculty), Microsoft Copilot for image generation.

  • Business and economics students will use AI extensively in their careers. Consider:

    • Case analysis. AI can summarize, draft, and produce financial models quickly. Outcomes around critical thinking and judgment require careful assessment design.
    • Quantitative work. AI tools handle calculation and code generation well, often better than spreadsheets. Teach students to verify, not just trust.
    • Ethics and accountability. Use of AI in business decisions raises privacy, bias, and disclosure questions worth class time.

    Suggested AI tools: Microsoft 365 Copilot (built into Excel and PowerPoint for CSUEB faculty), ChatGPT Edu.

  • Humanities courses often emphasize the kind of close reading, argumentation, and voice that current AI handles poorly. Consider:

    • Authentic voice. If your assessments require students to develop their own analytical voice, AI shortcuts undermine the learning. Process-based assessment (drafts, revisions, in-class writing) helps.
    • Source criticism. AI hallucinates citations, summaries, and historical "facts." Teaching students to verify AI claims against primary sources is itself a humanities skill.
    • AI as research assistant. AI can help with brainstorming, outlining, and copy editing in ways that supplement (rather than replace) student thinking.

    Suggested AI tools: ChatGPT Edu, Claude (excellent for long-form writing assistance).

  • Social science courses often combine quantitative analysis, qualitative research, and theoretical argument. Consider:

    • Methods. AI can speed up coding qualitative data, drafting survey instruments, and producing literature summaries. Teach students to use these tools as starting points, not final answers.
    • Theoretical synthesis. AI struggles with nuanced theoretical argument and tends toward conventional summary. This is a teachable boundary.
    • Bias awareness. Models reproduce the biases in their training data. Social science courses are well-positioned to teach AI bias as part of methodological literacy.

    Suggested AI tools: ChatGPT Edu for drafting and synthesis, Microsoft Copilot for spreadsheets and code.

  • STEM disciplines have the most established norms around AI: it's a tool, not an answer. Consider:

    • Code generation. AI is genuinely useful for writing code and explaining algorithms. Pair AI use with assessments that require students to explain, debug, and modify code they've generated.
    • Math and proofs. AI can produce step-by-step solutions but often makes subtle errors. Teach students to verify each step.
    • Lab and field work. Process-based assessment is naturally AI-resistant. Use lab notebooks, in-class problem solving, and oral defense where appropriate.
    • Reproducibility and citation. If a student used AI to write code or analysis, they should disclose it the same way they would cite a library or collaborator.

    Suggested AI tools: GitHub Copilot (for code; CSUEB faculty can request access), Microsoft 365 Copilot, ChatGPT Edu.
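The "verify, not just trust" advice above can be made concrete in a coding course. Below is a minimal Python sketch (all function names are hypothetical, not from any CSUEB material) showing one way a student might check an AI-generated function against a slow but obviously correct reference implementation before relying on it:

```python
# Hypothetical example: checking an AI-suggested function against a
# slow-but-transparent reference before trusting it.

def ai_suggested_is_prime(n: int) -> bool:
    """Fast primality test, as an AI assistant might draft it."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    i = 3
    while i * i <= n:          # only test odd divisors up to sqrt(n)
        if n % i == 0:
            return False
        i += 2
    return True

def reference_is_prime(n: int) -> bool:
    """Brute-force reference: slow, but easy to reason about."""
    return n >= 2 and all(n % d for d in range(2, n))

# Compare the two on a small range; any disagreement is a red flag.
mismatches = [n for n in range(200)
              if ai_suggested_is_prime(n) != reference_is_prime(n)]
print(mismatches)  # prints [] -- the fast version agrees on this range
```

The point for students is the workflow, not the primality test: write (or keep) a version of the problem you fully understand, then use it to audit the AI's faster or fancier version before submitting or building on it.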