Update Archives

March 16, 2014

Please join Cynthia Dewar and Diana Markham for a repeat of the wildly popular workshop on the Insight Assessment Module. The workshop is this Tuesday, March 18th, at 2:00 PM in MUB 388.

Reminder: Spring 2014 Assessment Progress Reporting forms are open and ready for use. If you have completed any assessment activities this semester for your service, course, or program (such as reviewing data from Fall 2013), jump in now and record your work. Avoid the end-of-semester rush. 

For those participating in GE Area A and E coordinated course SLO assessments, please be sure to report this semester on the methods you used to assess your students. But feel free to take the Fall semester to conduct your data review and analysis.

Slow and steady. We don't have to do everything in one semester. Just keep moving forward! Goal: assess each SLO for every course and program and service at least once every 3 years. How's your progress?

Kristina Whalen
SLO Coordinator

March 5, 2014

Are you “Tuning” your outcomes? The concept of “Tuning” is gaining traction in U.S. educational circles. At CCSF, we see identifiable aspects of this process when departments or units use Advisory Boards. In the March issue of SLO Highlights, read how the Broadcast Electronic Media Arts Department, after a successful advisory board session, recently honed its learning outcomes and crafted an exciting new curriculum program.

According to Brad C. Philips at the Institute for Evidence Based Change, Tuning is "a process [. . .] by which Tuning work groups identify what students earning a given degree in a given discipline know and are able to do. The process grows from the assumption that faculty, those who know a discipline best, should be the party responsible for determining the discipline’s core. Because other stakeholders have an interest in the discipline, the work groups solicit feedback from them. The work groups document their thinking and revise it based on the feedback."

The five components are to:

  • define the discipline core;
  • map career pathways;
  • consult stakeholders;
  • hone core competencies and learning outcomes;
  • draft degree specifications.

Visit www.tuningusa.org for more information.

Also, read how Honors Coordinator Sami Kudsi brings stakeholders together at Roundtables and Consortiums to both identify and actualize core competencies in CCSF’s Honors Program.

Need “tuning” or other help? Visit the Drop-In Help Lab each Tuesday from 1-2 PM in Cloud 208B, or contact Kristina Whalen for outcome assessment related queries or workshops.

Feb 26, 2014

It’s not too late to RSVP for tomorrow’s SLO workshop, All Alone? Assessment Strategies for the Single Section Course. The workshop starts at 3:00 PM on Feb 27 in MUB 271. I expect the workshop to last an hour.

How will the new ACCJC standards impact SLO assessment? Come to the faculty forum on March 14, 11 AM-1 PM (location TBA), to hear answers from faculty members who attended the Accreditation Institute in La Jolla on Feb 7-8. The focus of this joint venture between the ACCJC and the ASCCC was, in part, to kick off the public comment period for the revised standards. Look for details about the sessions in President Teti's emails.

A repeat performance: Insight Assessment Modules. Use Insight to make your assessment data collection far less labor intensive. March 18th, 2:00 PM in MUB 388. For those who couldn’t make the engaging workshop hosted by Cynthia Dewar and Diana Markham last semester, put it on your calendar now!

As always, the Drop-in Help Lab is open each Tuesday from 1-2 PM in Cloud 208B. Workshops I host may be repeated at any Center. Contact me for details.

Kristina Whalen
SLO Coordinator

Feb 19, 2014

Due to the high volume of SLO reporting at CCSF, we knew our current “home grown” system was not sustainable for our increasingly vital SLO reporting. In fact, during the last round of reporting, we began to receive file-size errors! For the last year, one of the primary goals of our SLO efforts has been to research the best SLO software package for CCSF in a competitive market. Most colleges half our size have already implemented TracDat, eLumen, or a similar assessment-activity-reporting system. Our goal was to find new software that would add time-saving functionality (requiring us to input less repetitive information each semester), integrate as seamlessly as possible with our existing processes (so it doesn't feel like everyone has to learn something totally different), affordably replace our current system and scale up as we improve our processes, and integrate with our curriculum and course outlines (so we don't have to enter SLO information in a multitude of different places).

After a rigorous vetting process, and taking into consideration other needs across the college, including curriculum development and program review, we chose CurricuNET Meta together with the Office of Instruction and the Research and Planning Office. With the help of generous donors, we were able to purchase the software and begin implementation in January. Progress is happening! Our new software is being built, and our implementation team is ensuring regular, steady progress. (Our team is headed by Katryn Wiese and combines experts from Instruction, Student Services, Research & Planning, Curriculum, and IT, with faculty, department chairs, SLO Coordinators, and articulation and curriculum experts.)

This software package, when completely installed, will not only help our college community report on assessment activity, but also help create, store, and manage course outlines effectively, track SLO assessment progress, synthesize data across programs, AND integrate with annual Program Review reporting.

On February 5th, the implementation team had its first review of the beginning module for the new CCSF CurricuNET software. CurricuNET developers started the build of our new software with a module for creating a new credit course outline. During our review we were able to discuss and refine the functionality before moving forward with noncredit courses and programs. After the curriculum functionality is completed, we'll add modules for reporting on student learning outcomes assessment and program review. All of the CCSF staff on the implementation team are VERY excited to see what has to date been a tedious, exhausting, manual process leverage existing technology (and best practices from some of the other 77 California Community Colleges using this software) to streamline and automate much of the work. We'll be saving a lot of time and stress/frustration for a lot of faculty and staff across the college.

Of course we'll also have some growing pains as we implement new technology and deal with the hiccups that always accompany such a task. But we are hopeful everyone will be able to experience the improvements as we move forward.

We are hopeful that the new curriculum module will be live by April, with the SLO module ready by May/June, and the Program Review ready by September of this year.

Fingers crossed and patience in abundance,

Kristina Whalen
SLO Coordinator

Feb 12, 2014

ILO Data

Last call for ILO data: The SLO Committee will shortly begin to analyze data from the ILO Critical Thinking Coordinated Assessment. In an earlier SLO Update, we asked that all data be submitted by Feb 17th. If you cannot meet this deadline but have data for the assessment, please alert Kristina Whalen ASAP. We appreciate everyone meeting this important deadline. Data may be submitted in the Spring Progress Report.

Upcoming Forums/Workshops

SLO Professional Development Session: Feb 27th, 3:00 PM in MUB 271. All Alone? Assessment Strategies for the Single Section Course.

Faculty members Wendy Kaufmyn, Rosario Villasana, Karen Saginor, Lillian Marrujo-Duck, Simon Hanson, Andrea Noisi, Cynthia Dewar, and Kristina Whalen, attended the Accreditation Institute Feb 7-8 in La Jolla, CA. The yearly Institute, sponsored by the Statewide Academic Senate and the ACCJC, was primarily focused on educating the college community about the new (forthcoming) accreditation standards. Chancellor Tyler has asked the attendees to organize forums to share what we learned. We are eager to do so--especially since the public feedback period for the standards is in full swing. Look for more information on these upcoming forums--probably early March.

Accreditation Committee Self-Evaluation

Finally, I'd like to put in one last plea for FACULTY to join the Self-Evaluation being coordinated by the Participatory Governance Accreditation Committee. Administrative and Classified employees have been appointed, but several evaluation teams (organized around standards) show missing or scant faculty voices. Here's why this exercise is important:

  • We learned from people with Visiting Team experience and from colleagues at the Accreditation Institute that this exercise is a common and helpful mechanism for evaluation.
  • New perspectives guard against groupthink.
  • Your critical thinking skills will help answer important questions like, "Do the action plans adequately address all aspects of the standards?"
  • You can and should have a voice in building the next set of action plans.

If you would like to help build a path forward and prepare us for a potential site visit, please contact Gohar Momjian.

Kristina Whalen

SLO Coordinator

Feb 5, 2014

New issue of SLO HIGHLIGHTS

The February issue of SLO HIGHLIGHTS features departmental SLO webpages and assessment reflections from webmasters. Read how the English Department's SLO practices have progressed and how the department's SLO website is a source of accountability and pride. Also, read how the Biology webmaster uses the department's assessment site to guide faculty to innovative teaching practices designed to improve student outcomes.

Fall 2013 SLO Activities Summary Report now available

Fall 2013 SLO/SSO/AUO progress reporting ended in January. Progress reporting declined, modestly in some places but sharply in others. Nonetheless, assessment activities on campus are active, rigorous, and steadily closing the loop! Read the full Fall 2013 Outcomes Assessment Activities SUMMARY Report. Some key information:

85% of instructional courses reported (1354 reports)

  • 93% are at stage 2 or higher (undergoing assessment – PROFICIENCY minimum)
  • 62% are at stage 5 (closed-loop ongoing assessment – SUSTAINABLE CQI minimum)

65% of instructional programs reported (218 reports)

  • 88% are at stage 2 or higher (undergoing assessment – PROFICIENCY minimum)
  • 33% are at stage 5 (closed-loop ongoing assessment – SUSTAINABLE CQI minimum)

73% of counseling programs reported (21 reports from 8 programs)

  • 100% are at stage 2 or higher (undergoing assessment – PROFICIENCY minimum)
  • 81% are at stage 5 (closed-loop ongoing assessment – SUSTAINABLE CQI minimum)

40% of student service programs reported (17 reports)

  • 94% are at stage 2 or higher (undergoing assessment – PROFICIENCY minimum)
  • 88% are at stage 5 (closed-loop ongoing assessment – SUSTAINABLE CQI minimum)

8% of administrative service programs reported (4 reports)

SUMMARY OF SUCCESSES:

  • 68% of existing outcomes have been assessed, with the rest planned. (Ensuring that all outcomes are assessed at least once every three years is our college-wide goal. We are on pace to have all outcomes assessed soon even though our processes developed late.)
  • Our process continues to move forward and achieve two goals: regular progress report submissions, thereby maintaining momentum, and continued training on benchmarks, definitions, and overall good practices for assessing outcomes.
  • The Spring 2014 progress reporting forms were made available on the first day of the semester, making continued adherence to semester reporting easier.
  • We are transitioning to new software (expected launch fall 2014) that is more robust, better able to support the volume of reporting, and maintains its own archive and longitudinal review of each course/program/service.

SUMMARY OF NEEDED IMPROVEMENTS:

  • Keep moving forward and progressing towards closed-loop CQI (Continuous Quality Improvement) in all courses, programs, and services with high-quality activities and reporting.
  • As we transition to a 3-year assessment cycle minimum for all SLOs/AUOs/Service Outcomes, alerts will be sent to coordinators who have failed to log in and update regularly.
  • Completion rates must rebound. Changes in deadlines and oversight responsibility resulted in an anticipated drop, but with continued training and a transition to a new software system, momentum must continue. Missing Fall 2013 reports should immediately be incorporated into the Spring 2014 reporting form (available now).

Thanks for all your focus on student learning,

Kristina Whalen

SLO Coordinator

Jan 29, 2014

June 2nd is the reporting deadline for Spring 2014 Assessment Progress Reports. But the form is open now! This is important for a few reasons.

1. If you didn’t meet the Jan 2nd deadline, you can add fall 2013 information in the spring form, save the editing link, and return to it towards the end of the semester if you want to add additional data.

2. You may not have indicated in your Jan. 2 reports that you planned to submit your program-level data for the Critical Thinking ILO assessment in the Spring reports after you’d spent time processing and discussing the data (maybe during the Feb. 6 FLEX). As soon as you are ready, the forms are open for you. The SLO Committee plans to move forward on collating these data on February 17th, so anything entered by then will be part of the college-wide assessment.

3. If you have a course, program, service, or unit that is not scheduled for assessment this semester, go ahead and knock out the report early and cross it off the crowded end-of-the-semester task list.

Let's keep the momentum going, but also pause to reflect on the quality of our reporting. Last October, the SLO Committee released its Summary Report for Spring 2013 reporting and assessed a sample of the SLO reports for their ability to convey detailed, evidence-based summaries of assessment activities and provide detailed, concrete explanations of improvements made to courses and programs. The results were good. As you consider the quality of your reporting, use the following checklist:

  • I’ve carefully read the descriptions of the stages and have correctly staged my course, program, service, or unit. I understand that once I move up a stage I never move back!
  • I know where to find the archived reports for the course/program/service/unit I am reporting on (they are on my department’s SLO/SSO/AUO website; everyone has one), and my work builds on past reports.
  • I can articulate how often assessment of SLOs/SSOs/AUOs happens for my course/program/service/unit because timelines and schedules have been developed.
  • I effectively but briefly describe my assessment methods, and I only describe methods for the data I plan to report.
  • My assessment summary provides details on 3 things: 1. the specific data collected; 2. the criteria used to determine success; 3. an analysis of the data.
  • I understand that the most important part of an assessment report is the concrete and detailed plans made for improvement.
  • I am the coordinator, which is why I am submitting the report. I know that only one report is submitted for each course/program/service/unit, and I have talked with my colleagues and coordinated a single report.
  • My report’s conclusions resulted from rich dialogue with my colleagues.
  • I have sent myself a copy of the form in case I want to edit the report.
  • I’ve stored my password and can locate it!

If parts of this checklist leave you scratching your head, contact your SLO Coordinator or stop by the drop-in help lab on Tuesdays from 1-2PM in Cloud 208B.

Kristina Whalen

SLO Coordinator

Jan 22, 2014

SLO professional development opportunities abound. First, the upcoming Feb 6th FLEX Day has time built into the schedule for departmental assessment work.

9:00 AM - 12:00 PM: Keynote address and workshops centered on the FLEX Day theme:

Culture Shift: Innovation, Engagement, Achievement.

Keynote with Darrick Smith, Assistant Professor of Educational Leadership, University of San Francisco

12:00PM - 12:45PM: Lunch

12:45PM - 1:45PM: School/Division Meetings

1:45PM - 2:45PM: Department-Based SLO Work

Following school/division meetings, the SLO Committee hopes that departments will use this time to work on any of the following:

  • discussion of data and results collected by assessment measurements
  • implementation of course or program improvements
  • discussion of any Critical Thinking ILO data collected at program level
  • assessment work related to GE Area A or E
  • scheduling assessments so SLOs are measured at least once every 3 years
  • preparing now to meet the June 2nd deadline for SLO reporting
  • any discipline specific SLO work

Additionally, the following professional development workshops will be held throughout the semester:

Feb 27th, 3:00 PM in MUB 271: Going it Alone: Assessment strategies for the single section course.

March 18th, 2:00 PM in MUB 398: Insight Assessment Modules. Use Insight to make your assessment data collection far less labor intensive.

March 20th, 4:00 PM in MUB 271: Map it! Why does mapping matter? How do you make mapping determinations? And how do you map ILOs/GELOs to PSLOs and PSLOs to SLOs, and map out assessment strategies for each course SLO as part of revising curriculum?

April 24th, 4:00 PM in MUB 271: Non-Credit Assessment Techniques Explored.

April 29th, 2:00 PM in MUB 398: Insight Assessment Modules, Repeat performance.

May 15 at 4:00 PM in MUB 271: ILO Critical Thinking Assessment Tentative Results Explored.

The above workshops are on the Ocean campus, but any session may be repeated elsewhere if center deans, site supervisors, department chairs, or faculty are willing to schedule a time/place and have 7 participant RSVPs in hand.

Drop-in Help lab is going strong! Every Tuesday 1-2PM in Cloud 208B. 

Onward!

Kristina Whalen

SLO Coordinator

Jan 11, 2014

Impact of GEO C Assessment

The assessment of General Education (GE) Area C outcomes was the first institutional-level assessment to take place since the College made a commitment to the outcomes and assessment model for course and program evaluation and improvement. The implications of the findings go beyond the GE Area C Natural Sciences courses.

The assessment process followed by the GE Area C workgroup served as a model for future institutional GEO and ILO assessments. The process involved developing a common rubric that the many disciplines involved in the institutional-level assessment could adapt to the course level. The rubric has three levels of achievement: proficiency, developing, and no evidence. Each participating instructor developed criteria for how students meet each level of achievement for their course assessment(s). Instructors at the course level conducted SLO assessments and then converted their findings to the common rubric. The common rubric was an overall success and will be applied to other institutional-level assessments in the future. The GE outcome assessment of Area C resulted in the following overall results: 64.6% proficiency, 25.6% developing, and 10.8% showing no evidence. Assessment results, analysis, and dialogue led to outcome revision.

Additionally, the workgroup reviewed the pass and withdrawal rates for GE Area C classes. The overall pass rate for GE Area C students is 60.8%, and the overall withdrawal rate is 17.4%. Not surprisingly, the assessment data and the pass/withdrawal rates were similar and lent validity to the assessment practices being employed at CCSF.

The workgroup decided to investigate why the proficiency and pass rates were low. They reviewed Math and English placement tests and correlated them with the pass rate for GE Area C. The number of students passing a class (a grade of C or higher) correlated with higher Math and English placement levels. For example, "students enrolled or placed into lower level math (which correlates to pre-algebra and earlier) had only a 24.8% pass rate, compared to a 48% pass rate for upper level (algebra 1 or 2 or geometry), and a 68.7% pass rate for collegiate level (beyond algebra 2)." Lower math placement also shows a strong correlation with higher withdrawals. The workgroup findings suggested that English and Math placement tests are an important indicator of student success and that more emphasis and dialogue between teachers, counselors, and students need to take place so that students have the right prerequisites to achieve at proficiency levels.

At the end of the report the committee provided additional recommendations for the College community, natural science departments, and course instructors with the ultimate goal of increasing student proficiency levels. 

Next steps

At the Jan 9th FLEX day program entitled College-Wide Dialogue Continued, faculty, department chairs, and counselors discussed how to move these recommendations forward.  The workgroup and SLO coordinators are asking more members of the college community to read the full report, identify decision makers that may implement the recommendations, and talk with decision makers and colleagues about the path forward.

Please direct any feedback on the report to SLO Coordinator and SLO Committee Chair Kristina Whalen (kwhalen@ccsf.edu).

Jan 8, 2014

Welcome back!

The first SLO Update of spring 2014 covers spring 2014 reporting, GE Assessment, assessment professional development opportunities, and spring drop-in help lab sessions.

Spring 2014 Progress Reports

The fall 2013 Assessment Progress Report deadline has passed and our attention turns to spring 2014. Next week, the spring 2014 progress reporting form will be available at the now familiar location on the SLO webpage (upper right hand corner of both SLO homepage and department/unit SLO webpages). The form's early availability means you may complete it at any time throughout the semester and well before the June 2nd deadline.

General Education Assessment

The GE Area C workgroup released a final report at the conclusion of last semester. The workgroup seeks input on the report’s important findings and recommendations. Attend the College-wide Dialogue Continued FLEX session from 4-5 PM on January 9th in MUB 250 and join the conversation! As the GE Area C workgroup moves into the final phase(s) of the assessment process, GE Area workgroups E and A are beginning their work. If you are the coordinator for a GE Area A or E course, look for communication from the workgroup explaining how your assessment activities may be integrated into college-wide assessments. These processes will refine our GE outcomes and reveal important learning trends. GE Area A workgroup information and courses are here. GE Area E workgroup and course information is here.

Assessment Specific FLEX workshops

Many FLEX Day workshops provide important professional development on assessing student learning. From 1-2 PM in SCI 136, participants are introduced to Classroom Assessment Techniques (CATs). From 2-4 PM in Rosenberg 304, learn more about program-level assessments. During the 4-5 PM hour, choose between the Institutional Level Assessment workshop in MUB 250 and the Mapping Assessments to Course SLOs session in Science 136. Many other sessions make strong connections to student learning data. The Spring 2014 FLEX program is here.

This semester the Drop-in help lab will move both its time and location. Drop-in help lab is now on TUESDAYs from 1-2 PM in Cloud 208B.

Happy New Year!

Kristina Whalen

SLO Coordinator