Last updated: May 21, 2014
- Ensure college-wide assessment data inform college-wide planning.
- Engage in college-wide dialogue about SLO data for GEOs and ILOs.
- Ensure that all Fall Program Reviews report robustly on how dialogue and discussion of SLO data have resulted in completed and/or planned program improvements.
- Complete and continue ongoing ILO and GEO assessments.
- Ensure SLOs exist for all college services (including administrative services).
- Purchase new software to allow us to make progress on the above tasks.
The ACCJC implemented SLO-related standards in 2002. In 2007, it circulated a Rubric for Evaluating Institutional Effectiveness to help institutions assess their current stage and progress toward complete fulfillment of these still-new standards. In 2009, the ACCJC notified all colleges that they would be required to meet the Proficiency Rubric Level by Fall 2012. All colleges were directed to submit a self-evaluation report on their SLO progress: half were due November 15, 2012, and the other half March 15, 2013. City College submitted its ACCJC-mandated SLO Progress Report on March 15, 2013.
In June 2013, the ACCJC produced a report on overall SLO Assessment Implementation across all its member colleges.
In September 2013, CCSF requested a specific feedback report on our performance relative to the rubric and other colleges. We wanted to see where the ACCJC saw us as deficient, and where we should focus our future energy. The official ACCJC feedback arrived on October 30, 2013.
On Wednesday, November 6, 2013, the two SLO Coordinators and the Accreditation Liaison Officer spoke with an ACCJC representative to get further clarification on one element of our particular feedback. We took notes on that phone call.
Based on all of the above, we created an SLO Scorecard to inform the campus community of our progress.
Some readers may find the following translation of the grading scheme useful:
- 5: exceeds norm of effective practice
- 4: solidly meets expectation of effective practice
- 3: barely meets expectation of effective practice (might be just starting or at a basic level)
- 2: doesn't fully meet expectation of effective practice (some aspects aren't present)
- 1: doesn't meet expectation of effective practice (performance in this area is deficient)
Proficiency -- Required by ACCJC since Fall 2012
SLOs and authentic assessments are in place for ALL courses, programs, support services, certificates, and degrees.
3/15/2013 Report -- ACCJC Score:
- Courses: 4 (ave: 3.66)
- Programs: 3 (ave: 3.49)
- Student Learning and Support Activities: 5 (ave: 4.14)
- Institutional Learning Outcomes: 1 (ave: 4.07)
- Narrative: 4 (ave: 3.51) (addresses authentic assessment that leads to understanding about student learning and gaps to be addressed)
5/21/2014 Self-assessed current score:
- 5: 90+% of all courses, programs (certificates and degrees), and student services have undergone SLO assessment. Reports filed. Quality verified.
- 4: ILOs/GEOs. We have mapped all our courses to GELOs, and 40% of our GE Areas have undergone intensive direct assessments, which we call "deep reads." Campus-wide discussions are occurring based on GE and ILO assessments, including how to make recommendations from the assessments actionable.
- Ensure that all ILOs and GEOs are mapped to courses and programs in the soon-to-be-implemented CurricUNET Meta system.
- Continue with our ongoing ILO and GEO "deep reads" and college-wide conversations regarding these results. (ILOs every Fall | GEOs every Spring). New software will allow us to "pull up" data from the course level assessments and acquire data without extra reporting.
There is widespread institutional dialogue about the results of assessment and identification of gaps.
3/15/2013 Report -- ACCJC Score: 4 (Ave 3.38) (Institutional messages value assessment and improvement.)
5/21/2014 Self-assessed current score: 5. Campus-wide dialogue occurred at FLEX Days in both September 2013 and February 2014 that engaged deeply with student learning and student service outcomes. Campus discussions on GE Area assessments are ongoing and robust. Academic Senate work groups have formed to address areas of concern in assessment data.
- Assessment data need to interact strongly with the Education Master Plan Strategic Goals, especially Goal #1: Advance student achievement in meeting educational goals.
- Fall 2014 Program Review documents need to be informed by the SLO Impact Report and demonstrate that 100% of departments are using SLO data to identify and address challenges within their department (due December 2014).
- Participatory governance and overall college planning need to continue discussing the analysis of student learning college-wide and identifying gaps and issues to resolve.
Decision making includes dialogue on the results of assessment and is purposefully directed toward aligning institution-wide practices to support and improve student learning.
3/15/2013 Report -- ACCJC Score: 3 (Ave: 3.29)
5/21/2014 Self-assessed current score: 4. New program review documents link resource requests directly to assessment data. Priority lists for funding have circulated that quote directly from program review data. Ongoing productive discussions about improving the program review-planning process continue in the Planning Committee.
Next steps: (See next steps above; the same apply here.)
Appropriate resources continue to be allocated and fine-tuned.
3/15/2013 Report -- ACCJC Score: 4 (Ave: 3.22) (Institutional resource allocation/fine-tuning is oriented toward student learning)
5/21/2014 Self-assessed current score: 4
Next steps: (See next steps above; the same apply here.) Also:
Implementation of CurricUNET Meta is underway; it will greatly facilitate the connections among curriculum, assessment, and program review.
Comprehensive assessment reports exist and are completed and updated on a regular basis.
3/15/2013 Report -- ACCJC Score: 2 (Ave: 3.15)
5/21/2014 Self-assessed current score: 4. We have a defined cycle and format for assessment reports. All campus units are required to report assessment activities every 6 months, which exceeds the norm at most colleges. Participation in completing and updating these reports is ongoing and widespread. Summaries and evaluations of the reporting process exist, and validation reports are providing training for quality improvements in both assessment activities and reports. Reporting now takes place widely at the institutional level. GE Area reports and ILO assessment data are being used to guide institutional practice.
- Reporting needs to continue to be a "habit of mind" for disciplines and units.
- In Fall 2014 we will upgrade to a central reporting system, and mapping data will be entered into the new software system.
- Continue to find ways to increase the participation of faculty leadership, deans, and division vice chancellors in reviewing and analyzing these report data, looking for college-wide trends, discussing those with the entire college, and integrating student learning into strategic plans and missions.
- Continue providing SLO Impacts Reports based on Program Review data, submitted for Fall 2014. Continue to have participatory governance groups discuss these results.
Course student learning outcomes are aligned with degree student learning outcomes.
3/15/2013 Report -- ACCJC Score: 3 (ave: 3.54)
5/21/2014 Self-assessed current score: 4. The Curriculum Committee requires mapping of courses to program SLOs for all revisions and for new degrees or certificates. Well-publicized examples of degree and certificate alignment based on advisory boards, industry and community partnerships, and tuning projects are available.
- Conduct second round of ILO mappings for Fall 2014 ILO assessment of Communication ILO. Map/Remap all PSLOs to ILOs, SLOs to GELOs, and SLOs to PSLOs during the CurricUNET Implementation process.
Students demonstrate awareness of goals and purposes of courses and programs in which they are enrolled.
3/15/2013 Report -- ACCJC Score: 4 (Ave: 2.63)
5/21/2014 Self-assessed current score: 4. The campus built on the Student Survey conducted in 2012 and added questions about SLO awareness to the CSSE survey conducted in Spring 2014. Current outcomes are required on all syllabi. Campus communities publicize learning outcomes in classrooms, on bulletin boards, and with posters, which are visible throughout campus.
- Continue to ensure SLOs are listed in class syllabi and SLOs are shared with students in meaningful ways.
- Continue to ensure SLOs are publicly available for all courses and programs (through catalog and website), and improve the interface for students to access that information.
SELF-ASSESSMENT NARRATIVE THAT ACCOMPANIED REPORT: 4 (Ave: 3.25)
(Planned improvement efforts include improving the value of SLO assessment rather than just processes.)
OVERALL AVERAGE SCORE: 3.42 (Ave: 3.44)
Sustainable Continuous Quality Improvement -- Required to fully meet standards
*Data from August 31, 2013 assessment reporting
Student learning outcomes and assessment are ongoing, systematic, and used for continuous quality improvement.
19% of instructional programs, 51% of courses, 74% of counseling programs, 55% of student service programs, and ~48% of administrative services are at closed-loop CQI. (Numbers have been steadily rising.)
Annual Assessment Plan: Current benchmark for all units: all outcomes assessed at least every 3 years.
Dialogue about student learning is ongoing, pervasive, and robust.
Dialogue about student learning is happening within departments and within weekly professional development activities that took place in Spring 2013 and Fall 2013 and will continue in Spring 2014. Proof of the robustness of these conversations should appear in Program Review documents (due Dec. 2013) and will be part of the Program Review SLO Impacts report produced in Spring 2014.
Evaluation of student learning outcomes processes.
Evaluation reports were provided, and follow-up improvements implemented, for the Fall 2012 reports and again for the Spring 2013 reports. Similar reporting and evaluation will be completed after each semester's reporting deadline and made available through the website. These evaluations are discussed by the SLO Committee and the Planning Committee.
Evaluation and fine-tuning of organizational structures to support student learning is ongoing.
The college is currently undergoing a major reorganization. The new structures are intended to better support student learning across the college. Evaluation of their effectiveness and impact on student learning is a future task.
Student learning improvement is a visible priority in all practices and structures across the college.
Student learning improvement has been and continues to be a visible priority across the college. However, it can certainly improve, especially by reaching further and bringing more voices into the discussions. At the end of Spring 2013, it was a regular informational item on the Board of Trustees agenda. It was a major part of all college-wide communication. It is part of the 2013-2014 Board Planning Priorities and is embedded in our ongoing program review and planning processes. Highlights are part of the Chancellor's regular updates. But overall, we need more analysis of learning across the college and more integration with planning and priorities in practice.
Learning outcomes are specifically linked to program reviews.
SLO reporting is embedded into Program Reviews and has been for the last few years. SLO impacts are also part of the resource allocation rubric.
A Program Review SLO Impacts Report was completed in Spring 2013 evaluating the quality and major themes presented across the college. This report was reviewed in various participatory governance meetings.
High-quality examples of the SLO portions of Program Review are used in the guidelines to help departments that can benefit from improved reporting.