2013 Feb Highlights

Once a month, we highlight departments and programs from across the college where great things are happening, motivated by a desire to improve student learning.

February 2013

-- Kyle Thornton – Department Chair – Radiologic Sciences – “Déjà vu! What CCSF is facing now, in terms of accreditation, we faced in 2002 as a program.”

We started our journey to becoming the #1 Diagnostic Medical Imaging program in the country through our accreditation process. We credit our current positive reputation in the community in large part to how we handled that crisis and became an exemplary model of program-level assessment.

Parallel to the current CCSF SLO issue, we had, at some level, always been doing student learning outcomes assessment, review, and refinement. But in 2002, our accrediting organization (JRCERT) required us to follow a new set of standards it had published. In 2003, JRCERT informed our program that an interim accreditation report was due, specifically for student learning outcomes. We weren’t worried. We’d always done it! We submitted our report and went on with our business.

What happened? JRCERT responded that we were not addressing SLOs sufficiently by their standards. They reduced our accreditation cycle from 8 years to 5 years so they could monitor us more closely. And they required us to reapply for accreditation right away, specifically so we could demonstrate our conformance to the SLO assessment standards.

Our response? “Uh-oh! We need to do something new!”
What did we do? We got educated quickly. I became an SLO workshop groupie with a mission: to learn as much as possible about SLOs so we could get through the upcoming JRCERT reaccreditation process. Sound familiar? Based on what we learned, we developed a complete assessment plan. Fast forward to the result: in 2004 we successfully submitted our new accreditation application. JRCERT touted our assessment plan as exemplary and reinstated our accreditation as well as our 8-year cycle.
What did we learn?

  • Build upon what you already know and are already doing; don’t try to form SLOs in a vacuum.
  • Link program SLOs to program goals.
  • Assessment is not an event, it’s a process; assessment is never truly finished.
  • Don’t rely on a single assessment instrument: triangulate!

What did we gain?

  • A better understanding of our students’ abilities and limitations
  • Data for grant opportunities
  • Evidence to share with communities of interest and stakeholders
  • Respect from our communities of interest and stakeholders

How did we do it?

  • An articulated mission statement that communicates intention (www.ccsf.edu/dmi)
  • Goals stemming from that mission statement
  • Programmatic outcomes that branch from those goals
  • Periodic assessment of all three

Example goals for our programs

  • Clinical performance and competence
  • Problem solving and critical thinking skills
  • Communication skills
  • Professional development and growth

Example Program SLOs: Outcomes are learner-oriented, event-specific, measurable statements that stem from a goal. Students will:

  • Evaluate an x-ray image for diagnostic quality
  • Demonstrate writing proficiency
  • Deliver an organized presentation


-- Paula Cahill – Department Chair – Student Health Services – “We thought we knew what direction we were going, but the data took us somewhere totally different!”

[Photos: Hepatitis B screening blood test – Hepatitis B vaccination – a student taking the SLO survey at the end]

Many CCSF students in medical-related programs come to Student Health Services (SHS) to be assessed for Hepatitis B infection and immunity as part of their program requirements. Other CCSF students who come to SHS for general care are encouraged to be tested because they are in high-risk groups for the virus. Students who are determined to be uninfected but not currently immune are given a low-cost or free series of three vaccinations over a six-month period to prevent Hepatitis B infection.

People often confuse Hepatitis B with other types of viral hepatitis, such as A and C. Hepatitis B is a virus transmitted by contact with the blood or body fluids of an infected individual, and from an infected mother to her child during pregnancy.

We want students to understand which type of hepatitis they are being protected against when we vaccinate them for Hepatitis B. We also want them to understand Hepatitis B transmission and prevention and to be motivated to complete the vaccination series when indicated.

In Fall 2008, Student Health Services decided to assess student learning about Hepatitis B. We developed a survey that we administered at completion of the third vaccination. The survey asked students to list three ways Hepatitis B can be transmitted and three ways Hepatitis B can be prevented.

Of the 10 surveys completed by the end of the Fall 2008 semester, only 20% of students were able to list three ways Hepatitis B is transmitted, and only 20% were able to list three ways it can be prevented. 40% were able to list at least two of the three transmission routes, and 70% were able to list at least two of the three ways of prevention. While administering the survey, we realized we were likely skewing the prevention data by discussing the transmission answers before asking the prevention question, so the 70% figure was likely invalid. The survey data indicated that while we were successful in vaccine completion, we were not providing adequate health education during patient encounters for screening and vaccination. The data also indicated that students may place great trust in their programs and health care providers without a good understanding of what they are being immunized against.
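For readers who want to see the arithmetic behind these percentages, here is a minimal sketch of the tally in Python. The per-student counts below are invented for illustration (only the aggregate percentages were reported) and are chosen solely to reproduce the figures above.

```python
# Hypothetical per-student tallies for the 10 Fall 2008 surveys.
# Each tuple is (correct transmission routes listed, correct prevention
# methods listed); these counts are invented to match the reported
# aggregates, not taken from the actual survey forms.
surveys = [
    (3, 3), (3, 3),
    (2, 2), (2, 2),
    (1, 2), (1, 2), (0, 2),
    (1, 1), (0, 1), (0, 0),
]

def pct_listing_at_least(k, column):
    """Percent of surveys listing at least k correct items.
    column 0 = transmission, column 1 = prevention."""
    hits = sum(1 for s in surveys if s[column] >= k)
    return 100 * hits / len(surveys)

print(pct_listing_at_least(3, 0))  # 20.0 -> all three transmission routes
print(pct_listing_at_least(2, 0))  # 40.0 -> at least two transmission routes
print(pct_listing_at_least(3, 1))  # 20.0 -> all three prevention methods
print(pct_listing_at_least(2, 1))  # 70.0 -> at least two prevention methods
```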

SLO findings were presented at a nurse practitioner staff meeting in February 2009, and a robust discussion of how to improve education about Hepatitis B followed. Suggestions included using written educational materials, making better use of vaccine information sheets, using a mnemonic (blood, body fluids, and to baby), and using provider time to educate more thoroughly. We also changed how the survey was administered so that students answer both the transmission and prevention questions before any answers are reviewed.

Results: Student learning about Hepatitis B showed significant improvement over our initial 2008 assessment. Our benchmark goal, listed on our 2012 Matrix, was that 90% of students surveyed could list at least two of the three modes of Hepatitis B transmission and prevention. We met this goal in Fall 2009 and in 2012. Our original SLO efforts in 2008 had included looking at timely compliance with and completion of the Hepatitis B series. However, our finding of a general lack of understanding led us down a different path: focusing on patient learning about Hepatitis B transmission and prevention. Now that we have consistently met our benchmark, we will return to assessing completion rates for patients advised to vaccinate against Hepatitis B.


-- Andrew Chandler – Department Chair – Architecture – “Sharing the load: Pairing full-timers with part-timers to assist SLO assessment.”

[Photo: Architecture students]

As department chair, I wanted to help all members of the department conduct authentic end-of-semester assessments and document that assessment in a way that would facilitate dialogue in Spring and spur improvements in our courses and programs. I was especially concerned about the workload of part-timers and how to bring them into the conversation and process more collaboratively. To help that process, I asked each full-time faculty member to pair up with a number of part-time faculty.

In early Fall 2012, each part-timer met with a designated full-time faculty member to identify and discuss the assessment cycle. For each course they taught, the part-time faculty were asked to identify, with the full-time faculty member’s assistance:
1. The SLOs that would be assessed
2. How these SLOs would be assessed (which methods of measurement)
3. How the results would be reviewed
4. What changes could be made in Spring 2013 to address any needed improvements

At the end of the semester, I reminded everyone to follow up and document their efforts, and to seek help from me or their designated full-time SLO contact if they needed it. I also shared a number of resources with them:

  • A copy of the Spring 2013 Assessment Plans & Fall 2012 Review form they would need to complete by the end of January
  • The SLOs for each course taught in the department
  • Principles of assessment we use in the department
  • Examples of direct assessment methods
  • Assessment tips
  • Results of student surveys given out that semester (administered by the department), showing how students think they did with each SLO in each course

To focus efforts on the positive, I reminded them of why we were conducting assessments at all:

“We all know that we have successes in our classes. That is a given. The assessment cycle is meant for us to identify where areas of improvement (dramatic or minor) can be made. Please focus on these types of items. SLO assessment is not an evaluation of you, so you do not need to consider how this will reflect on you as a faculty member. Consider how it impacts your students' learning and your teaching only. While I will be reviewing the results, it is to be able to identify where we can assist you in achieving your stated goals, not to determine whether you are doing your job. That type of evaluation is done every three years in our official evaluation cycle for faculty.”

To follow up, we took time at our department meeting on the January 11 FLEX day to share our plans for assessment activities for Spring 2013. In addition, the primary work was completed when the full-timers met with our part-timers to follow up and complete the assessment cycle for Fall 2012. As additional assistance, the full-time faculty have taken on the responsibility of filling out the online reporting forms for all classes and will follow this up by collecting evidence to be placed in binders within the department.

The SLO assessment process has a number of challenges, but it helps to have good direction and colleagues willing to share the load.