2013 May Highlights

Once a month, we highlight departments and programs from across the college where great things are happening, motivated by a desire to improve student learning.

May 2013

-- Megan Corry, faculty -- "Overall, the SLO process has moved from being simply a mechanical exercise to a meaningful teaching and learning evolution."

Paramedics practice advanced resuscitation during simulated scenarios while peers evaluate and provide feedback, building teamwork, communication skills, and self-reflection.

The Department of Health Care Technology comprises several individual clinical programs leading to a variety of careers in the Allied Health field. Four of our programs are accredited by national accrediting agencies, such as the Commission on Accreditation of Allied Health Education Programs (CAAHEP) and the Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM), and by state agencies such as the State Board of Pharmacy and the California EMS Authority. Because programmatic accreditation requires measuring outcomes such as retention, attrition, graduation, and job placement, as well as a regular cycle of using feedback from students, faculty, and the EMS community to guide program improvement, the use of SLOs was not a new process for most of our program coordinators. So at first the CCSF SLO process seemed more tedious than helpful. After all, we were already doing this, right? Except we were in our own little camps or "silos."

The CCSF SLO workshops and our department meetings on Flex days showed us the value of becoming part of a faculty learning community. We exchanged ideas, discussed how to incorporate licensure testing into our program SLOs, and shared our experiences from Flex day workshops. We discussed the importance of student feedback and how to improve return rates on graduate and employer surveys (both required by programmatic accrediting agencies). We also continue to work on incorporating SLOs into the annual program review process and on using student outcome data to determine needs when applying for Perkins CTE Grant funding.

In the coming year, we also plan to use individual course SLOs to look for redundancies or gaps that may suggest changes to existing courses and programs. In the many fields of Allied Health, we must keep up with changing technology and the demands of the work environment. Medical practice standards are always evolving, which requires our program faculty to maintain knowledge of new techniques, pharmacological agents, and life-saving devices. Expectations of the education program graduates will evolve along with these changing standards. On the horizon are plans to strengthen our links to the San Francisco community through participation in high school Health Academies and local community health fairs, and on first responder medical teams during large-scale events. In the past, our participation at these events was done because it seemed like a good experience for the students. Now we can more thoughtfully link these experiences to specific desired learning outcomes, assessing and revising as needed. Overall, the SLO process has moved from being simply a mechanical exercise to a meaningful teaching and learning evolution.

Paramedic students at a health fair.

   

-- Peter Stoffers, faculty -- "...there were several ways to teach students about academic policies that were equally effective."

The AAPS 104 Assessment Committee: Peter Stoffers, Amy Mack, Antonio Martinez, and Mandy Liang with their Department Chair Maria Heredia

Each semester, too many students end up on academic or progress probation. This can make them ineligible for financial aid, transfer, an associate degree, or other goals. For many years counselors have intervened to help students on probation. Until about seven years ago, the intervention counselors provided was a one-hour workshop with a counseling appointment to follow. A study by the research and planning department, published in Spring 2003, found that students on probation who attended this workshop were almost twice as likely to get off probation as those who did not attend. In Spring 2012 the New Student Counseling Department began to offer the Student Success Seminar (AAPS 104), a 0.5-unit class focused on serving students who are on probation. The requirements for AAPS 104 included written work and a counseling meeting focused on formulating a plan for academic success.

The initial assessment, completed during the course’s first semester, Spring 2012, focused on three student learning outcomes that described students’ knowledge of probation policies, student services, and educational goals. The results showed improvement in students’ understanding of the SLOs from pre-test to post-test. However, most of the pre-test scores for the SLOs focused on student services and educational goals were quite high. This may be because the class consists mostly of returning students, who had the opportunity to learn about these areas prior to AAPS 104. To improve both the AAPS 104 assessment and what students were learning from the class, it would be advantageous to focus on areas with which students were unfamiliar at the start of the semester. It was decided that the questions for the student services and educational goals sections would be revised, while the questions on probation policy would be used in the second half of the study. The goal for the second half of the study was to improve the course so students would learn more about probation policy; as a consequence, students would score higher on the post-test questions relating to probation policy.

The instructors of AAPS 104 had a robust discussion about how best to improve the class. Initially it was thought that one instructor might do significantly better than the others in conveying the SLOs, thus providing a model for how best to teach the topic; however, the results showed just the opposite. In the Spring 2012 administration, three of the four instructors had almost exactly the same post-test composite score on the academic policies SLO. These scores were only 0.05 apart on a 0-4 point scale, and the fourth instructor's score was just slightly below the others. Given that the instructors taught the course using different strategies, this result suggested that there were several ways to teach students about academic policies that were equally effective. Thus, the initial assumption that one teaching strategy would prove best was not supported. The group discussed what worked, in general, when teaching their class and read an article that described some empirically validated teaching strategies[1]. They decided that the most important types of teaching strategies were those that induced critical thinking, promoted student engagement, and made use of direct teaching strategies such as review time and breaking large topics into mini-lectures. Instructors agreed to add one strategy from each of the above categories to their class during the Spring 2013 semester. To determine whether the new strategies helped, instructors administered the assessment for the academic policies SLO as they had in Spring 2012. The results showed an improvement: on average, students learned more when the new strategies were employed.
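
To make the comparison concrete: the composite scores described above are simply per-instructor averages on a 0-4 scale, computed before and after the course. The short Python sketch below uses entirely made-up instructor labels and scores, not actual AAPS 104 data, and is only meant to illustrate the kind of arithmetic involved in averaging post-test composites and pre-to-post gains by instructor.

    from statistics import mean

    # Hypothetical records: (instructor, pre-test composite, post-test composite),
    # each scored 0-4 on the academic policies SLO. Not actual AAPS 104 data.
    records = [
        ("A", 1.5, 3.4), ("A", 2.0, 3.5),
        ("B", 1.8, 3.5), ("B", 1.6, 3.4),
        ("C", 1.7, 3.4), ("C", 2.1, 3.6),
        ("D", 1.9, 3.2), ("D", 1.4, 3.1),
    ]

    def summarize(rows):
        """Mean post-test composite and mean pre-to-post gain per instructor."""
        by_instructor = {}
        for instructor, pre, post in rows:
            by_instructor.setdefault(instructor, []).append((pre, post))
        return {
            instructor: {
                "mean_post": round(mean(post for _, post in pairs), 2),
                "mean_gain": round(mean(post - pre for pre, post in pairs), 2),
            }
            for instructor, pairs in by_instructor.items()
        }

    print(summarize(records))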

[1] Rosenshine, B. (2012, Spring). Principles of instruction: Research-based strategies that all teachers should know. American Educator, 36(1), 12-18.

 

-- Sean Laughlin, faculty -- "...brought our department closer both personally and professionally"

The Wellness Center

The Physical Education & Dance Department has completed its first SLO cycle, and we are pleased with the overall results. Because departments initially were not given any real guidance or resources to devise and implement an SLO process, we feel that our department really stepped up: we created our own plan, revised it along the way, and were able to implement the process and use the results to improve our overall teaching strategies. It actually brought our department closer both personally and professionally.

We developed an SLO committee that met every two weeks to discuss and establish processes and timelines that fulfilled both what the College needed and our own goal of implementing our plan and closing the loop by the end of the semester. We used all Flex days for department-wide meetings to establish course SLOs and course assessment criteria. We feel that using each semester as a complete loop keeps everyone involved and contributing, which aids everyone's overall understanding of and engagement in the process.

The department implemented two means of assessment (psychomotor and cognitive) that would allow us to collect and archive data as part of each course’s SLOs. These data serve as a “baseline” for each course, in addition to whatever each instructor may already be using to assess their own courses (e.g., midterms and finals).

Pros and Cons of Our First Run:

  • Timelines: We were pressed for time for each of the tasks that needed to be completed.
    • Improvements: Assessments will be conducted one month before midterms, at midterms, and one month after. This will give instructors more time to plan, collect, and record results.
  • Exit Survey: When we ran a successful preliminary early-semester survey (to test our system) using Scantron forms for a select number of courses, we were able to implement, collect, and tabulate the data very easily. When we administered the exit survey for all of our courses at the end of the semester, however, we were not only pressed for time to administer the survey, but collecting forms from each instructor and running them through the system to get our tally sheets was very time consuming, so the data were not available during our analysis and revision review process.
    • Improvements: We will use a Google Docs form to administer the survey and capture data electronically (a rough sketch of how such responses could be tallied appears after this list). Coincidentally, we did use a Google Docs form for PE 200 (Fitness Center), which has over 2,000 students enrolled; this form successfully collected orientation test data as well as exit survey data.
    • Instructors will give students the option of taking the survey on their own time or will bring their classes into one of the two computer labs we have in house.
  • Skills Assessment: The skills assessments for each class turned out to be very easy to implement and to collect and tabulate data for. One issue was that instructors were given the choice of picking three skills from a pool developed for each course; as a result, the skills tested varied from instructor to instructor, and assessment was subjective to each instructor.
    • Improvements: We will standardize the three skills that each assessment tests.
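
As a rough illustration of the tabulation step mentioned above (the filename and question columns here are hypothetical, not our actual survey), a short Python script like the following could tally exit-survey responses once a Google Docs form's results have been downloaded as a CSV file.

    import csv
    from collections import Counter, defaultdict

    # Hypothetical export of form responses; the real survey's columns
    # and filename will differ.
    RESPONSES_FILE = "exit_survey_responses.csv"
    QUESTION_COLUMNS = [
        "I improved the skills covered in this course",
        "I plan to continue this activity on my own",
    ]

    def tally_responses(path, questions):
        """Count how often each answer choice appears for each question."""
        counts = defaultdict(Counter)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                for question in questions:
                    answer = (row.get(question) or "").strip()
                    if answer:
                        counts[question][answer] += 1
        return counts

    for question, answers in tally_responses(RESPONSES_FILE, QUESTION_COLUMNS).items():
        print(question)
        for answer, count in answers.most_common():
            print(f"  {answer}: {count}")

Because the form writes responses directly to a spreadsheet, a tally like this can be produced the same day the survey closes, rather than after stacks of paper forms have been collected from each instructor and scanned.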