
ILOs: The Four Cs
Foothill's Rubric Assessment Model for Evaluating SLOs (FRAMES)
FRAMES is an acronym for Foothill’s Rubric Assessment Model for Evaluating SLOs. This project was initiated in Fall 2005 to create useful assessment tools to enhance and measure Foothill College's institutional student learning outcomes at the course level.
Each of the four FRAMES teams had between four and six faculty members representing a variety of departments across campus. Each team spent, or will spend, about a year developing and testing its rubric assessment tool and about a year using the tool to assess between 100 and 150 student artifacts from departments across campus.
Read about the work of the four teams.
FRAMES Critical Thinking Team
The FRAMES Critical Thinking team assessed over 150 student artifacts during 2006-07 and 2007-08 using the model they created in the 2005-06 academic year. Currently, members of the FRAMES Critical Thinking team plan to present workshops throughout the 2008-09 year to encourage the creation of assignments that call for critical thinking and to teach instructors how to use the rubric to evaluate critical thinking.
Foothill's Rubric Assessment Model for Evaluating SLOs
The purpose of FRAMES is to improve student learning on campus. Starting with one
of Foothill's four institutional student learning outcomes (SLOs), a team of faculty
designed an assessment rubric that they will now test on over 150 student artifacts
for evidence of Critical Thinking. More teams of faculty have volunteered to head
up the development of tools to assess Computation, Communication, and Community/Global
Consciousness SLOs.
FRAMES: Critical Thinking Phase I (completed)
The team identified and researched the resources, methods, and results of 24 other educational institutions, focusing on their student learning outcomes and assessments for evaluating critical thinking. From that research, a list of over 100 statements for evaluating critical thinking was created. Over the next few months, the team worked within the parameters of local knowledge and relevance and filtered the list of statements based on clarity, purpose, and semantics. When the filtering process was complete, a draft rubric was ready for initial testing.
Initially, the team applied the draft rubric to 10 assignments: 5 essays from the History department and 5 letters from the Adaptive Learning department. Several statements were modified, but overall the rubric worked for both sets of assignments. The team then tested the rubric on 11 more assignments: 6 essays from the English department and 5 short research reviews from the Biology department. The rubric continued to work well for both departments; however, the scoring system was altered to allow more flexibility in choosing which assessment statements to use in evaluating assignments. Finally, the team analyzed 9 more assignments using the rubric: 6 written artifacts from the Computer Information Systems department and 3 recorded performances from the Drama department. The rubric again worked well for both sets of assignments, and one seemingly repetitive statement was removed from the rubric.
In summary, the participating faculty agreed that they were better able to plan critical thinking assignments for their students when working backwards from the rubric. In addition, the team agreed that the rubric statements and scoring system were useful tools for assessment.
FRAMES: Critical Thinking Phase II (completed)
The FRAMES Critical Thinking group started the second phase of the implementation process by planning how to collect and assess 150 student artifacts using the Critical Thinking rubric. After defining the parameters by which to randomly select student artifacts, the team sent a letter during the second week of the quarter to the instructors of selected students, requesting that they choose an assignment that requires critical thinking and is due no later than Week 6 of Spring Quarter 2007. The randomly selected student artifacts came from students with at least 60 quarter units earned at Foothill College. When an instructor had the student's work in hand, they forwarded it to the team. All identifiers were removed from the students' work before evaluation, and all assignments were handled confidentially within the Critical Thinking team. Only 64 artifacts were collected during Spring 2007, so the project was extended to Fall 2007 in order to collect a larger sample for the benchmark year.
During Fall Quarter 2007, another 100 artifacts were collected using the same parameters. During the second week of Winter Quarter 2008, each artifact was labeled and again distributed to two members of the Critical Thinking team for evaluation. The Critical Thinking team evaluated the artifacts during Winter Quarter 2008 and examined the results along with other FRAMES teams during Spring Quarter 2008.
FRAMES: Critical Thinking Results
The scoring scale that the FRAMES Critical Thinking team used was as follows:
- Excellent: 14-16 points
- Good: 11-13 points
- Minimally Competent: 8-10 points
- Deficient: 0-7 points
Based on preliminary research from the Instruction and Institutional Research Office, we knew that critical thinking was a challenge for many of our students. Using a sample of 157 critical thinking assignments from across the campus, the team found that the mean score was 9.2 and the median score was 9.5. On average, our students are minimally competent in critical thinking based on the FRAMES Critical Thinking rubric. More importantly, what do we do with this revealing benchmark data? The total FRAMES Critical Thinking results are listed below, followed by a small sketch of the scoring arithmetic:
- Sample size (N=157)
- Mean = 9.2, Median = 9.5
- 35 of 157 (22%) Excellent
- 36 of 157 (23%) Good
- 39 of 157 (25%) Minimally Competent
- 47 of 157 (30%) Deficient
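For illustration only, the short Python sketch below shows how individual rubric scores could be mapped to the four bands above and summarized into the same kind of statistics the team reported. The band thresholds are taken from the scale listed here; the scores in the example are a hypothetical placeholder, not the actual 157 artifact scores, and the sketch is not part of the FRAMES process itself.

```python
from statistics import mean, median

# Score bands from the FRAMES Critical Thinking scale (0-16 points).
BANDS = [
    ("Excellent", 14, 16),
    ("Good", 11, 13),
    ("Minimally Competent", 8, 10),
    ("Deficient", 0, 7),
]

def band_for(score: int) -> str:
    """Return the performance band for a single rubric score."""
    for name, low, high in BANDS:
        if low <= score <= high:
            return name
    raise ValueError(f"score {score} is outside the 0-16 rubric range")

def summarize(scores: list[int]) -> None:
    """Print summary statistics in the same form the team reported."""
    counts = {name: 0 for name, _, _ in BANDS}
    for s in scores:
        counts[band_for(s)] += 1
    n = len(scores)
    print(f"Sample size (N={n})")
    print(f"Mean = {mean(scores):.1f}, Median = {median(scores):.1f}")
    for name, _, _ in BANDS:
        c = counts[name]
        print(f"{c} of {n} ({round(100 * c / n)}%) {name}")

# Hypothetical scores for demonstration only -- not the actual FRAMES data.
summarize([15, 9, 12, 6, 10, 8, 14, 7, 11, 9])
```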
FRAMES: Critical Thinking Going Forward
The FRAMES Critical Thinking team, consisting of faculty from each division, felt strongly that the campus needed to address the critical thinking deficit in the classroom in order to meet the college's institutional critical thinking outcome. During Fall Quarter 2008, the FRAMES Critical Thinking faculty will offer a series of workshops in which they work with individual faculty to redesign their assignments to more strongly incorporate the critical thinking skills identified in the rubric.
FRAMES Computation Team
The FRAMES Computation team developed a rubric during the 2006-07 academic year, which it implemented in the 2007-08 year. The student artifacts that were collected are still being reviewed, and final results will likely be presented during 2008-09 and 2009-10.
FRAMES: Computation Phase I (completed)
Winter Quarter 2007: Faculty from Fine Arts, Chemistry, History, Economics, Math, and English started the process of building a computation rubric. After finding few available computation models for assessing student learning outcomes, the group decided to use course materials, course outlines, and college program planning documents to generate a list of criteria for assessing computation. The team noted that the process of reviewing course outlines and developing the sample rubric was productive and engaged colleagues in interesting conversations about learning.
Spring Quarter 2007: The team brought together their individual rubrics and began converging and refining them into one assessment rubric; the computation rubric was, however, broken down into two different areas of assessment: Processing and Calculating. One issue that emerged from this dialogue is whether the purpose of the tool is to measure what the student can do or to measure the correctness of an answer. The FRAMES Computation group currently leans toward measuring what the student can demonstrate, not necessarily the accuracy or correctness of the final answer.
Several other things became clearer during this process. First, the scoring system that the FRAMES Critical Thinking team developed will not necessarily work for FRAMES Computation. Second, when an Economics assignment was compared with an Art Drawing assignment, the clearly numerical Economics assignment met only 6 of the 15 assessment statements, while the more abstract Art assignment met 11 of the 15. The group felt that assignments may need to be assessed institutionally based on the criteria an instructor identifies as relevant, not on all 15 statements. Third, the process for collecting computation assignments institutionally needs to be well thought out. It would include having instructors submit a description of their assignment, a clear checklist of the computation criteria met on the Computation Rubric, and an assignment answer key with labeled objectives.
FRAMES: Computation Phase II (completed)
Fall Quarter 2007: The FRAMES Computation group started the second phase of the implementation process by planning how to collect and assess student artifacts using the Computation rubric. Using parameters similar to those used by the Critical Thinking team, the FRAMES Computation team will send a letter during the second week of Winter Quarter 2008 to the instructors of randomly selected students, requesting that they choose an assignment that requires computation and is due no later than the end of Winter Quarter 2008. The student artifacts will again come from students with no degree and at least 60 quarter units earned at Foothill College. The Computation team added a component to the FRAMES Computation process: an instructor rating of the presence of computation, as well as the quality or accuracy of computation, in the assignments. Because disciplines measure computation in varying ways, the instructors will also be asked to identify criteria from the draft rubric that they judge to be of value in assessing the computational skills of the assignment. The score will likely be calculated only from the criteria chosen by the instructor, as sketched below.
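As a rough, hypothetical illustration of that scoring idea (this is not an official FRAMES tool; the criterion names and the per-criterion point scale below are invented for the example), the following Python sketch scores an artifact only against the rubric criteria an instructor has marked as relevant:

```python
def criterion_score(ratings: dict[str, int], selected: list[str], max_points: int = 4) -> float:
    """Return a percentage computed only over the instructor-selected criteria.

    `ratings` maps a criterion to the points awarded (0..max_points);
    criteria the instructor did not select are ignored entirely.
    `max_points` is an assumed per-criterion scale, not taken from the rubric.
    """
    earned = sum(ratings.get(c, 0) for c in selected)
    possible = max_points * len(selected)
    return 100.0 * earned / possible if possible else 0.0

# Example: the instructor judged only two (hypothetical) criteria relevant.
ratings = {"identifies relevant quantities": 3, "carries out the calculation": 4}
selected = ["identifies relevant quantities", "carries out the calculation"]
print(criterion_score(ratings, selected))  # 87.5
```

The design point is simply that the denominator shrinks to the criteria the instructor selected, mirroring the team's intent to measure what the student can demonstrate rather than to penalize criteria that do not apply to the assignment.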
In addition, the team felt that not all courses were appropriate for measuring computation. The 2002-03 and 2005-06 program planning documents will be examined as a possible computation filter: programs and courses at Foothill College that rated the institutional computation outcome as highly important for their graduates may be selected first for the project.
Also of note, Hilary Ciment, Art Instructor, has made an excellent contribution to this project and the larger academic community by demonstrating the value of measuring computation in the Fine Arts Division, and sharing her findings with local and national colleagues.
FRAMES Communication Team
The FRAMES Communication team began developing a rubric in 2007-08, with implementation starting in 2008-09. Instructors will begin receiving requests for student artifacts during the Winter 2009 and Spring 2009 quarters. Because the project's goal is to measure institutional outcomes expected of graduates and transfers, artifacts will be used only from students who have no degree and more than 60 quarter units.
- Download the Communication Rubric (PDF, 58 KB)
FRAMES: Communication Phase I (completed)
Fall Quarter 2007: Faculty from several departments and divisions, after approval by the Academic Senate, met in December 2007 to kick off the FRAMES Communication
project. Over the course of 2007-08, the team of faculty from Physical Education,
History, Language Arts, and Communication developed the current draft Communication
rubric. During 2008-09, the team will use student artifacts to continue to finalize
and improve the current draft.
FRAMES: Communication Phase II (in progress)
Fall Quarter 2008: Narrative coming soon.
FRAMES Community/Global Consciousness Team
The FRAMES Community/Global Consciousness team will begin working on the last rubric during 2008-09 and begin implementation during 2009-10. The ultimate goal of the project is four fully implemented rubric assessment teams operating by 2009-10, each examining over 150 student artifacts on a 4-5 year cycle.
In the end, the information gained from the rubric assessments will inform future curricular and policy decisions that focus the campus on strengthening our students' achievement of our institutional SLOs.
If you are interested in joining the FRAMES Community/Global Consciousness Team, please let Rosemary Arca, Student Learning Outcomes Coordinator, know by email (arcarosemary@foothill.edu). Thank you for your consideration.
FRAMES: Community/Global Consciousness Phase I (completed)
Fall Quarter 2008: A team of diverse faculty met in December 2008 to kick off this project.
FRAMES: Community/Global Consciousness Phase II (not yet started)
Fall Quarter 2009:

Questions?
Please contact me!
Kurt Hueg, Associate Vice President, Instruction
Administration Building 1900