Artificial Intelligence at City College of San Francisco
The AI Workgroup at CCSF promotes the thoughtful, ethical, accessible, and privacy-conscious use of Generative Artificial Intelligence (GenAI) tools in teaching, learning, and college operations.
Recommendations for CCSF Employees Using Generative AI
The recommendations below were developed by the AI Workgroup to guide employees in the responsible and effective use of GenAI tools. This is a living webpage and will undergo changes as GenAI evolves.
Recommendations Last updated: November 18, 2025
Endorsed by: Technology Committee on November 18, 2025
Complete CCSF’s GenAI Essentials Course
Before using any GenAI tool for college-related work, employees are encouraged to complete the self-paced Generative AI Essentials course. This course introduces ethical, accessible, and privacy-conscious AI use for education and operations.
Protect Privacy and Confidential Information
In alignment with the CCSF Privacy Policy, which prohibits the disclosure of student, employee, or institutional data to non-approved systems, employees should:
- Never enter personally identifiable information (PII), student data, or confidential employee or institutional information into any GenAI tool (such as ChatGPT, Claude, Gemini, or Copilot).
- Use college-approved AI tools that comply with FERPA and the CCSF Privacy Policy.
- Microsoft Copilot, when used with your CCSF account, is approved for CCSF use and is available to all employees.
- Avoid using personal or non-CCSF accounts (such as private ChatGPT or Gemini subscriptions) for college-related work unless no secure alternative exists.
- Limited use of personal accounts is acceptable for learning how to use GenAI tools or for piloting potential tools prior to purchase or formal recommendation, provided no personally identifiable or confidential information is entered.
- Treat GenAI as a potentially public workspace. Anything entered may be stored, reviewed, shared, or reused by the provider or its partners.
Do Not Use Third-Party AI Meeting Assistants
Meeting assistant tools (such as Read.AI and Otter.AI) record or summarize meeting data.
- CCSF employees should discontinue the use of third-party AI meeting assistants in all CCSF-hosted or affiliated meetings, because these tools store data outside CCSF-approved systems and may join meetings automatically or access calendar data without explicit consent.
- Employees who need transcription, translation, or summary features should use Zoom AI Companion, which is included in CCSF’s enterprise Zoom license.
- More information: AI Assistants in Zoom Meetings (CCSF Login required to view)
Be Cautious with Agentic or Autonomous AI Tools
Some new systems are agentic (or autonomous) AI, meaning they can act independently to complete multi-step tasks (for example, sending emails, scheduling, or summarizing meetings without further instruction).
- Use these tools only in supervised pilot settings with clear human oversight.
- Do not connect them to CCSF systems or accounts without prior review and written approval from the administrators who oversee those specific systems (for example, OLET oversees Canvas).
- Ensure their actions are transparent, logged, and reversible.
You can learn more about agentic AI in the CCSF GenAI Essentials course.
Apply Critical Thinking to AI Outputs
GenAI tools can sound confident even when they are wrong. Always check AI outputs for:
- Accuracy: Verify facts and data with trusted sources.
- Bias: Be alert to stereotypes, cultural bias, or one-sided framing.
- Accessibility: Ensure AI-generated materials meet WCAG 2.1 AA standards.
- Citation: If you quote GenAI output directly, follow MLA or APA guidelines for citing AI-generated content.
You can learn more about bias and spotting hallucinations in the CCSF GenAI Essentials course.
Be Transparent When Using GenAI
When using GenAI to produce college-related work (for example: reports, emails, graphics, policies, or instructional materials), you remain accountable for all materials you create, use, or share, whether human- or AI-generated.
- Clearly disclose that AI was used to assist in creation.
- Review all content for accuracy, tone, and compliance before sharing.
- Remember: AI can support your role but should not replace human judgment, relationships, or creativity.
Example attribution statement:
This draft was created with assistance from Microsoft Copilot and reviewed by the AI Workgroup Team Members.
You can learn more about attribution statements in the CCSF GenAI Essentials course.
Follow Accessibility and Equity Principles
When creating or recommending AI-based tools for student or employee use:
- Verify that the tool meets WCAG 2.1 AA accessibility standards by examining a current filled-in VPAT (Voluntary Product Accessibility Template) or ACR (Accessibility Conformance Report).
- Consider whether it promotes equitable access across language, disability, and digital literacy differences.
- Avoid recommending tools that require paid accounts for basic access or essential academic functions.
Stay Informed
GenAI technologies and policies are evolving rapidly. CCSF employees are encouraged to:
- Follow announcements provided by the college concerning GenAI.
- Review official resources from the CCCCO Digital Center for Innovation, Transformation, and Equity.
- Revisit the GenAI Essentials course annually as updates are released.
Ask Questions and Report Concerns
If you are unsure whether a GenAI use case aligns with college policy or raises privacy or accessibility issues, contact:
- OLET / Educational Technology Center – instructional and operational use questions
- ITS Help Desk – data or privacy concerns
Communication
The AI Workgroup has presented these recommendations as an informational item to the following constituency groups:
- Participatory Governance Committee
- Distance Learning Advisory Committee
- Teaching and Learning with Technology Roundtable
- Classified Senate