ALEE Center specialists meet with faculty in each program to design a study that includes the following three elements:

(1) Identifying research questions

(2) Selecting courses for assessment studies

(3) Creating or revising tools to measure student learning (e.g., rubrics)

Courses that focus on PLO(s) of interest

Start by reviewing the curriculum matrix to identify relevant courses that will be taught in the upcoming year. See Step 9 for more information about writing a curriculum matrix.

Courses that include assignments that demonstrate PLO-related skills

Meet with course instructors to review their assignments and assessment tools (e.g., rubrics, exam questions) to determine whether the course is appropriate for PLO assessment. The best courses for PLO assessment include assignments where students have had ample practice with the PLO and are being asked to demonstrate their mastery of the skill. Senior seminar papers or capstone projects are often a good place to start.

Courses that are new or have been recently redesigned

Courses that have recently participated in the Teaching and Learning Center’s Project REAL program are ideal candidates for assessment.

Courses with high DFW rates or known equity gaps

Including these courses can help identify specific areas for improving student learning and closing equity gaps.

Courses where assessment can measure student learning at different points of the program

Remember that the main goal of a PLO study is to assess student learning across the entire program. Thus, we recommend selecting intermediate courses (e.g., lower 100-level courses) to measure both the skills students learned in lower-division courses and how well students are prepared for upper-division coursework. To measure students' final skills and mastery of PLOs as they near completion of the program, select courses at the end of the curriculum. Both intermediate and capstone courses can be helpful for identifying specific areas for improvement.

Core courses that are taught by multiple faculty

Whenever possible, we recommend assessing more than one course offering, especially if the offerings are taught by different faculty. This allows us to collect more representative student data (e.g., across different quarters) and to measure the course's impact on student learning, rather than differences among individual instructors. Most importantly, when faculty work together on assessment and teaching by communicating with each other to align their assessments and rubrics, their participation has an immediate and long-lasting impact on the curriculum. This collaborative effort has resulted in the development of teaching tools and has streamlined curricula to better support student learning.

There are two primary ways to measure student learning: (1) direct evidence (faculty assessment of students’ skills); and (2) indirect evidence (student self-assessment of their skills). The ALEE Center recommends collecting both types of evidence, as they provide different types of insight regarding student achievement of PLOs.

Criteria-based rubrics are the primary tool that faculty use to collect direct evidence of student learning and skills. Rubrics specify the skills, tied to the PLO(s) of interest, that students are expected to demonstrate in an assignment. For each rubric criterion, instructors articulate their expectations or standards across four levels of proficiency.

Our approach to creating effective and valid rubrics is rooted in our faculty's teaching expertise: faculty write rubrics fully aligned with their pedagogy and assignments, based on their understanding of what specific, demonstrable skills constitute mastery of the PLO. We do not recommend starting with a generic rubric, as generic rubrics do not capture the specific standards and teaching approaches of our faculty.

Results from direct evidence might look like this: “87% of graduating seniors met faculty expectations in selecting relevant sources, but only 64% met faculty expectations in analyzing/synthesizing sources.”

To collect indirect evidence of students’ skills, students rate their own skills relevant to the PLO via a survey, which is collaboratively designed by faculty and ALEE Center specialists. This type of evidence is most useful when the survey questions are directly aligned with PLOs and match the specific skills that are directly assessed by faculty.

Course- or program-specific surveys also include questions about students’ preparation for the course and future courses, use of resources, and sense of belonging.

Results from indirect evidence might look like this: “79% of graduating seniors reported very good/excellent skills in selecting relevant sources, and 72% reported very good/excellent skills in analyzing/synthesizing sources.”

In addition, two campus-wide student surveys, the UC Undergraduate Experience Survey (UCUES) and the UCSC Graduate Student Survey, include self-reported ratings of students’ skills.

Notably, well-developed assessment tools can be reused for teaching and subsequent studies to measure change over time! These tools become embedded in instruction and, most importantly, shared across faculty.

ALEEC assessment specialists are also available for rubric design consultations and can be contacted at aleec@ucsc.edu.


Once you have finished designing your assessment rubrics, the next step is to collect data (Step 2).

Last modified: Feb 26, 2025