Below are examples of course level assessments across the curriculum. The last third of the page features program level assessments for both instruction and student services. The instruction PSLO examples employ alternatives to the usual "rolling up" of course level assessments, and the student services examples illustrate a variety of assessment approaches.
Evidence Based Decision Making: Course Level Assessment Plans, Instruments, Results, and Action Plans
Assessments— We assessed the two SLOs with three assessments: (1) students completed an Excel spreadsheet project, which was evaluated with a three-point analytic rubric emphasizing formulas, functions, formatting, and the printed product; (2) students were expected to score at least 70% on a 34-question true/false and multiple-choice midterm; and (3) for both SLOs, students took a survey in which they assessed their own abilities.
Action Plans—Students achieved the SLOs in all areas except one question on the survey: only 60% of students felt prepared to go on to the next level. As a result, we reconfigured the class so that students do more projects that require them to apply the skills covered in the class, and we shifted the grading weight toward the projects and away from the quizzes and homework. The next time we administered the survey, students reached the benchmark for feeling prepared to go on to the next level.
Assessments—We used the same three assessments for all four of our SLOs, which made for more efficient assessment. Students took a comprehensive pre/post test; we identified which questions pertained to which SLOs. Students also completed a visual and oral presentation, which was assessed with a 100-point checklist with ten criteria. And third, students assessed how well they had mastered the course's four SLOs with a survey.
Action Plan-- Students achieved the benchmark for all but one of the SLOs, and the shortfall on that one was negligible. Even so, we will review the questions for clarity and revise or replace them accordingly.
Assessment—The assessed learning outcome covers activities in several segments of WELL 701; therefore, we designed an assessment method that takes all relevant student activities (quizzes, practical palpation exams, etc.) into account. Scores on all activities were aggregated and converted to a performance percentage. We were gratified to find that the standard we set (90% of students would achieve an aggregated percentage of 70% or better) was exceeded.
Action Plan-- While the fact that no student scored below 80% tells us that we are doing a good job with this learning outcome overall, the fact that we created a broad, combined measure does not allow us to assess how well each segment of student experience is contributing to the overall good performance. Though we will likely focus on assessments of other course learning outcomes in the near term, we could refine assessment of this outcome in the future by designing methodology that focuses on each segment of student experience. By focusing in this way, we would gain more detailed information about which parts of our instructional model are strongest and which parts could be strengthened.
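The aggregation-and-benchmark logic described above can be sketched as follows. This is a minimal illustration, not the department's actual tooling; the activity names, point values, and student scores are hypothetical.

```python
# Sketch of an aggregated-score benchmark check: combine each student's scores
# across all course activities into one percentage, then test whether enough
# students clear the passing threshold. All data below is illustrative.

def aggregate_percentage(earned, possible):
    """Combine scores across quizzes, practical exams, etc. into one percentage."""
    return 100.0 * sum(earned) / sum(possible)

def benchmark_met(student_percentages, passing_pct=70.0, required_share=0.90):
    """True if at least `required_share` of students score `passing_pct` or better."""
    passing = sum(1 for p in student_percentages if p >= passing_pct)
    return passing / len(student_percentages) >= required_share

# Example: three students, scores aggregated across three activities each
# (quizzes, palpation exam, final), as (earned, possible) point lists.
students = [
    ([18, 42, 25], [20, 50, 30]),
    ([15, 40, 28], [20, 50, 30]),
    ([19, 47, 29], [20, 50, 30]),
]
percentages = [aggregate_percentage(e, p) for e, p in students]
print(benchmark_met(percentages))  # all three students are at or above 70%, so True
```

A per-activity variant of `aggregate_percentage` (one percentage per segment instead of one combined figure) would support the segment-level refinement the action plan mentions.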
Assessments—The SLOs range from technical mastery to choreography to fitness and, as such, require multiple assessment methods. Because students enter at different skill and experience levels, many of our assessments evaluate progress toward and achievement of personal goals. Progress can be evaluated in multiple ways: actively participating in and attending class so as to practice, and being able to make self-corrections. To improve, students critique their own videotaped performances and are later evaluated on the changes they've implemented.
No Action Plan Needed— The criterion was met for all of the SLOs.
Assessments—Students are to be able to identify and build individual and team skills related to conditioning programs for specific sports, evaluate various techniques, use equipment, apply scientific knowledge of conditioning, and improve or maintain acceptable and healthy body composition, flexibility, muscular strength, and muscular endurance. The assessments were oriented to individual students demonstrating improvement in most categories over the course of the class: one an individual assessment, and the other a Division Fitness Test (PEEP) that measures weight, blood pressure, flexibility, body composition, BMI, muscular endurance, and cardiovascular efficiency. Students were also examined visually to ensure that they assimilated the concepts and can apply them in a sport-specific setting. Students more than surpassed the benchmarks.
No Action Plan Needed— The criterion was met for all of the SLOs.
Assessments-- The same assessments were used for both the beginning and intermediate classes. The first SLO pertained to employing proper shooting technique. For the assessment, students had six opportunities to hit the target or land within range and attain a given minimum score. They also completed a survey in which they assessed their abilities. For the second SLO, students were expected to identify risk factors for heart disease and stroke and apply general fitness/well-being principles to their own lifestyles. To demonstrate their understanding, they took a quiz, and to demonstrate application of these principles, they assessed their own habits.
No Action Plan Needed— The criterion was met for all of the SLOs.
Assessments-- Using a rubric, we evaluated a text-based essay assignment that all participating sections assigned. This essay covered all of the SLOs—essay writing, critical thinking, integration of sources—except the metacognition SLO, which we have since inactivated because the information it yielded was obvious.
Action Plans-- Because students struggled with writing a strong thesis and adequately developing their ideas, we dedicated departmental meeting times to share instructional strategies. With the assessment of other composition courses, the department is now working on a departmental rhetoric so as to highlight best practices and adopt a common language around the teaching of writing.
Assessments: Students in JOUR 300 (now JOUR 320 and 330) were assessed twice during the course of the 2012 semester on how well they were able to recognize and evaluate sources for news articles. Students were asked to look at a selection of news stories in the context of doing their bi-weekly critique of the print version of their newspaper and then to identify who the expert sources were in each. This exercise helped them to key into not only what sources actually are, but also what kinds of sources are considered expert. (For example, the ASSC president would be considered an “expert” in a story on the opposition to plus/minus grading, since Associated Students officially weighed in on this topic. A random student on campus would not be an “expert,” but would still be useful as a source representing campus reaction.) Interestingly, students in JOUR 300 reached the benchmark for the SLO in the first assessment, but not the second. However, on closer analysis, the sample size for the second assessment was smaller than it should have been.
Action plan: The next time the assessment is conducted on this SLO, the assessment will be done in class to ensure originality of answers and to ensure that all students in the class participate in the assessment.
Science/ Math/ Technology
Assessments-- The Department has consistently developed General Education Biology courses (numbered BIOL 100–199) to provide students with a scope of information that will help them to objectively deal with the events and responsibilities of daily life. The primary goals of these courses are to promote scientific literacy and therefore good citizenship. SLOs for these courses are essentially identical and include specific information and problem-solving skills necessary to make decisions regarding personal nutrition, environmental resources, and health care.
We created SLOs for these courses based on unifying themes in Biology (evolution, scientific methodology, and ecology) that could be used to assess all of the 100-level courses. We created assessment criteria that set the bar high for ourselves, to encourage us to promote scientific literacy for all. Most of the 100s have completed several SLO assessment cycles:
Several common issues recur in examining reasons why students are not meeting the various standards. These include:
- Students come into the introductory level courses underprepared. More importantly, they arrive saying "I don't like science" or "I don't do well in science."
- Students do not integrate information from several sources when writing essays, research papers, and other assignments that require this skill. It is apparent that students write up the first result of Googling the question—regardless of whether that result addresses the question.
- Students have difficulty solving problems that require a multi-step process and/or quantitative skills.
Action Plans-- While continuing to convey the importance and fun of understanding science, we have several mitigations in progress:
- Divide challenging (quantitative) concepts into step-wise problems that guide problem solving.
- Incorporate more assessment measures that allow students to work collaboratively. (This was popular in Spring 2012 and resulted in increased scores.)
- Require students to analyze why each possible answer to a multiple-choice question is or is not correct.
Assessments— This was the math department's first attempt at the SLOAC. We identified Number Sense as our most important SLO and set out to assess it using several final exam problems scored with a rubric.
Action Plans-- Upon looking at the results we observed several things and made changes accordingly:
- In creating our criteria, we worded them to measure individual student performance rather than aggregate averages across students.
- We realized that the criteria were set unrealistically low and have since given more careful thought to the criteria for subsequent assessment plans in this and other courses.
- Looking specifically at results on certain problems, we saw two common errors: leaving units off answers that require them, and leaving fraction answers unsimplified. We have since added reminders in the instructions on exams and adjusted the instructions in exercises throughout the textbook.
- Students are struggling with estimation. We are continuing to look for ways to explain estimation and give students opportunities to practice it.
- We did not give students opportunities to look at and use the scoring rubric before taking the exam. We now show students, at the beginning of and throughout the semester, how certain problems are graded.
Assessments-- In the spring, we analyzed the results and determined that only 13% of students scored an average of at least 3.5, only 25% at least 3, and only 56% at least 2.
Action Plans-- Since the results were so dismal, we decided to repeat the same assessment in fall 2011 with the same criteria but with the following changes:
- We removed the radical models, agreeing that they are the least important, that they come at the end of the semester, and that many instructors had to de-emphasize them.
- We made more of an effort to remind students of the SLOs when the pertinent topics were covered, rather than only at the beginning of the semester.
The results in fall 2011 were much better, although still well below the criterion. We will continue to look for ways to improve student learning in this difficult topic and will most likely assess this SLO again in the future to see whether the changes have improved student success.
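The rubric summary reported above (the share of students whose average score meets each threshold) can be computed with a short sketch like the following. The score data here is hypothetical and does not reproduce the department's actual results.

```python
# Sketch of summarizing rubric results: for each threshold, report the share
# of students whose average rubric score is at or above it. Illustrative data.

def share_at_or_above(averages, threshold):
    """Fraction of students whose average rubric score is >= threshold."""
    return sum(1 for a in averages if a >= threshold) / len(averages)

# Hypothetical per-student averages across the scored final exam problems.
student_averages = [3.6, 3.1, 2.4, 1.8, 2.9, 3.5, 2.2, 1.5]

for threshold in (3.5, 3.0, 2.0):
    pct = 100 * share_at_or_above(student_averages, threshold)
    print(f"{pct:.0f}% scored an average of at least {threshold}")
```

Because the shares are cumulative (every student counted at 3.5 is also counted at 3.0 and 2.0), the percentages must be non-decreasing as the threshold drops, which is a useful sanity check on reported figures.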
Social Sciences/ Creative Arts
Assessments—We used the same three assessments for all three of our SLOs, which made for more efficient assessment. First, students took pre/post tests so we could see how much knowledge they gained from the class. Second, students wrote an essay that was evaluated with a rubric. Last, students took a survey in which they assessed their own abilities.
No Action Plan Needed— The criterion was met for all of the SLOs. We now have a benchmark with which to compare future assessments, and at this point, we don’t plan to make any changes until we assess again and note any drops in achievement.
Each semester I assess one SLO using direct measures, including a project rubric for one completed art project, a critique rubric for the presentation of the students' artwork, a research paper, and a quiz. I also use an indirect assessment method: an exit survey related directly to the class SLOs. This assessment plan is tailored specifically for Ceramics I. The Art Department faculty will work next on creating an assessment plan like this one that can be used across all studio mediums. This will help us create a template that will aid our adjunct instructors in completing assessments and enable us to map trends among the studio arts classes.
Evidence Based Decision Making: Instructional Program Level Alternative Assessment Instruments
- Cooperative Education-- Instead of analyzing course level assessment results that map up to the PSLO, Co-op faculty require students to complete an end-of-semester survey in which they assess their progress toward achieving their goals. Such assessments increase students' awareness of their own progress and give Co-op faculty insight into how students in general are faring.
- Respiratory Therapy -- Instead of analyzing course level assessment results that map to the PSLO, RT faculty analyze the percentage of students who pass the industry certification exams, attain associate's degrees, and gain employment in the field within six months of graduation. Career Technical Education (CTE) programs are especially encouraged to draw from such data, which also may be the data they report for accreditation purposes.
- Administration of Justice
- Program Exit Survey (Fall 2009)
- Majors self-assessment questionnaire
- Library Program/ Information Literacy ISLO --- Information literacy instruction is incorporated (or "infused") into all sections of ENGL 100 taught at Skyline College via two workshops given by a librarian. The information literacy assessment plan is an attempt to measure how well students have learned the basic components of information literacy after attending the two workshops. Three assessment instruments are currently being used: a rubric applied to ENGL 100 research papers (a direct measure), a rubric applied to an in-class exercise (a direct measure), and a student feedback survey (an indirect measure). Approximately 75 research papers, 150 exercises, and 700 surveys are collected once every three years. (The library follows this three year cycle in order to match the College's timeline for assessing institutional student learning outcomes.) These artifacts are scored and analyzed during the spring semester that immediately follows, and the major findings are presented to the English faculty for discussion. Data collected in fall 2011 has been analyzed and recorded in TracDat.
Evidence Based Decision Making: Student Service Program Level Assessment Plans, Instruments, Results, and Action Plans
Assessments— One of the PSLOs relates to Disabled Resource Center (DRC) students using their accommodations (i.e., test proctoring, textbooks in alternate format, and text-to-speech programs such as Kurzweil 3000). They implemented multiple assessment methods: (1) aiming for 60% of returning DRC students with verified print disabilities to submit their alternate media request forms at least two weeks before the beginning of the semester, and (2) tracking the number of students using accommodations, comparing the same semesters in the current and previous years. Most of the time they reached their benchmarks, but when they didn't, they arrived at creative and pertinent solutions that required resource requests.
Action Plans: In 2012-2013, they requested funding to support the implementation of a DRC orientation and the development of resource materials, including applying for a Program Improvement Fund grant. They also plan to evaluate the orientation in the semester after its implementation.
Among the other resources they requested were the following: (1) a full-time DRC Coordinator to help develop and implement the DRC orientation for new and returning DRC students; (2) sufficient counselor hours, daytime and evening, to meet with all DRC students to review and update accommodations; and (3) a better space for test proctoring—a separate DRC office connected to, but not shared with, the DRC's main office—so as to minimize distractions in the test proctoring area.
Assessments— One of the PSLOs relates to students being able to identify financial aid resources for which they may qualify, submit applications, and meet deadlines. The Financial Aid Office (FAO) used multiple assessment methods: (1) increase the number of Title IV awards and the unduplicated headcount; (2) increase the number of outreach/in-reach hours worked by the Financial Aid Ambassadors; (3) increase the number of outreach/in-reach events developed, coordinated, and attended; (4) increase the number of federal and state grants (Title IV and the Board of Governors Fee Waiver, "BOGFW"); and (5) increase the number of institutional scholarship applications submitted. These assessment methods were staggered over time, and as action plans were implemented, they positively affected the results for other assessment methods. Remarkably, they continually reached all of their benchmarks.
Action Plans— Among the action plans implemented was reclassifying campus ambassadors as financial aid ambassadors in 2010; thus, staff could commit to developing their expertise in financial aid resources and devote their hours to financial aid outreach and in-reach activities. In 2012, they offered more services during the financial aid application labs so students could complete and file necessary documents. And in 2013, they are updating the handbook for financial aid ambassadors and providing a sustained three-month training on FAO policies and procedures. As a result of their efforts, they have almost doubled the number of students receiving some form of financial assistance, far surpassing the benchmark on the College’s Balanced Scorecard.
Assessments—Students completed surveys after participating in activities such as transfer workshops, transfer fairs, and transfer field trips to UC Davis. For all but one SLO’s activity, students achieved the success criterion.
Action Plans—For the trip to UC Davis, students didn’t achieve the benchmark even though they generally felt more confident about applying to a UC after the field trip. Thus, counselors are considering whether the benchmark is reasonable given the field trip’s experiential nature, and will look into whether the survey instrument itself needs to be revised to reflect the field trip’s intent. Second, for the other activities, counselors noted some incongruities between students’ responses to the Likert-scale survey questions and their open-ended responses, so they will revise the Likert scales to start with strongly disagree (1) and end with strongly agree (5).
Assessments—Most of the TRIO PSLOs align with what is requested for federal reporting purposes, such as persistence, academic standing, and graduation/transfer. As such, the TRIO director is able to draw from the Blumen database for both assessment and federal reporting purposes. Also helpful are criteria definitions that are consistent with those of the Office of Education.
Action Plan—The TRIO Director will work with the Office of Planning, Research, and Institutional Effectiveness to create assessment methods for those outcomes that cannot be assessed through the Blumen database.