Detailed Examples of the Student Learning Outcomes Assessment Plan

The following examples are taken from actual University of Pittsburgh program assessment plans.

Learning Outcomes Developed

What will students know and be able to do when they graduate?

Exemplars
  • Students will demonstrate knowledge of current theories regarding the relationship of healthcare needs and outcomes to patterns of behaviors associated with certain subpopulations (defined culturally, economically, socially, and/or by age).
  • Students will refine critical skills in reading and interpreting literary texts and other cultural artifacts; convey interpretations of texts in formal academic writing; acquire comprehensive knowledge of the various periods, major writers and currents of thought in [foreign language] and [foreign language] literature; develop an understanding of the historical, material, social, and intellectual contexts which inform that literature.
  • By the end of the second year, students will develop basic patient examination and communications skills, including the ability to communicate clearly and effectively with patients.
Satisfactory
  • Graduates will be able to demonstrate mastery of theoretical content so that they provide safe, competent care.
  • Students will be able to apply knowledge of the basic concepts of [the discipline] to new fact situations.
  • Students will generate plausible interpretations of data from completed original research in [discipline].
Needs Improvement
  • Students will publish research findings in a timely fashion.
    • Explanation: Publishing research findings does not describe what a student will know or be able to do. However, this can be a good indirect measure of a student’s ability to conduct research and write.
  • Students are prepared to be employed in entry-level practice.
    • Explanation: This is too broad.
  • Students will satisfy the individual academic and/or career needs which motivated the students’ selection of this major.
    • Explanation: This is not a good description of what a student will know or be able to do.

Assessment Methods Developed

How will the outcomes be measured? Who will be assessed, when, and how often?

(There should be some direct evidence.)

Exemplars
  • Each year, four faculty members will review a sample of 20 videotaped speeches using established criteria. (Direct evidence.)
  • The reading list/statement and written examination for the comprehensive exams will serve as the basis for this evaluation. A committee of graduate faculty will evaluate a sample of these, using an agreed-upon rubric (attached) to focus on breadth of knowledge of the field, analytical skills, research skills, and methodologies. This assessment will be conducted every two years, beginning in AY 2010. (Direct evidence.)
  • Students will be evaluated through the Comprehensive Examination. The program will analyze pass rates, repeat rates, and failures. Sample exams will be evaluated in terms of writing quality and critical content. (Direct evidence.)
  • The program will analyze the first-author papers its graduates have written. An aggregate index of student publications will be assessed in terms of impact factors and citations. (Indirect evidence.)
Satisfactory
  • Assessment will be conducted by portfolio analysis (see end of table for description) of an assignment (or two smaller assignments) from selected students enrolled in [courses]. This will be evaluated every four years. (Direct evidence.)
  • Every 2 years, the department faculty will assess a random sample of 12-15 papers written by students in [course] and upper-level courses during the previous two terms. (Direct evidence.)
  • We will conduct an annual quality assurance survey. (Indirect evidence.)
Needs Improvement
  • Mastery of theory, methods, and institutions will be assessed by reviewing a sample of midterm and final examinations and student papers for these two classes.
    • Explanation: This is missing external validation – i.e., the method does not identify who will be doing the reviewing other than the classroom faculty. Also missing is any indication of how the Learning Outcome is being isolated in the exam.
  • Aggregate case study grades during senior year final semester will be used to measure this outcome.
    • Explanation: Grades are not a good method for measuring a Learning Outcome because (1) they do not isolate the Learning Outcome, and (2) they do not provide for external validation.

Benchmarks (Standards of Comparison) Developed

How well should students be able to do on the assessment?

(This should include numerical expectations; a brief illustrative sketch of the arithmetic follows the examples below.)

Exemplars
  • On a scale of Marginal, Acceptable, Capable, and Proficient, 70% of essays/exams should rate Acceptable or better and 40% should rate Capable or better.
  • Answers on a 1-4 Likert scale in the 22 questions under subhead 25 on the survey (“to what extent have your experiences in your doctoral program contributed to your knowledge, skills, and habits of mind in the following areas?”) should average 3 or better for at least 80% of survey respondents and 4 for at least 20%.
  • It is expected that 85% of our students will be ranked highly competent, and 100% will be ranked at least competent, on a three-point scale consisting of “highly competent,” “competent,” and “minimally competent.”
Satisfactory
  • 100% of students must achieve 80% or better on the criterion-referenced competencies.
  • We expect the majority of our students to attain OMET scores congruent with or above the school mean in all categories except “Graded my work fairly,” which does not accurately measure what we train our students to do: provide careful critical attention to their undergraduates’ work.
  • It is expected that each student will publish at least one peer-reviewed research manuscript as either a primary author or co-author. This manuscript should be of high quality, as assessed by both the journal in which it is published and by the student’s thesis committee.
Needs Improvement
  • Ninety percent of students taking comprehensive exams should be expected to pass. 
    • Explanation: Passing a comprehensive exam does not adequately isolate the Learning Outcome being assessed. To use comprehensive exams as an assessment method, the plan should explain how the particular Learning Outcome will be isolated, such as by using a rubric.
  • All students will pass the course with an average of B or better.
    • Explanation: Grades are not a good method for measuring a Learning Outcome because (1) they do not isolate the Learning Outcome, and (2) they do not provide for external validation.
  • 90% of graduate survey respondents should have encountered three or more courses or an internship where they used or developed communication skills related to a professional context or career goal.
    • Explanation: Taking courses or internships does not measure Learning Outcomes. However, student work generated in those courses – such as papers, presentations, and portfolios – is excellent material for measuring Learning Outcomes.
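
Benchmarks of this kind reduce to simple threshold arithmetic over rubric ratings. The following is a minimal illustrative sketch only – it is not drawn from any actual plan; the rating scale and percentages are borrowed from the first exemplar above, and the ratings themselves are invented – showing how a program might check attainment (in Python):

    # Hypothetical check of a rubric benchmark such as:
    # "70% of essays should rate Acceptable or better and 40% Capable or better."
    SCALE = ["Marginal", "Acceptable", "Capable", "Proficient"]  # ordered low to high

    def share_at_or_above(ratings, level):
        """Return the fraction of ratings at or above the given rubric level."""
        cutoff = SCALE.index(level)
        return sum(SCALE.index(r) >= cutoff for r in ratings) / len(ratings)

    # Invented ratings for ten sampled essays.
    ratings = ["Acceptable", "Capable", "Marginal", "Proficient", "Acceptable",
               "Capable", "Acceptable", "Proficient", "Capable", "Acceptable"]

    meets_benchmark = (share_at_or_above(ratings, "Acceptable") >= 0.70
                       and share_at_or_above(ratings, "Capable") >= 0.40)
    print(meets_benchmark)  # True for this sample: 90% Acceptable+, 50% Capable+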

Interpretation of Results

What do the data show?

Exemplars
  • 75% of the class scored 80% or higher on the relevant portions of the exam. The mean was 84.6%. The median was 86%.
  • Of the 32 students who graduated with a PhD from our department between 2005 and 2008, 16 (50%) have attained full-time, tenure-track positions. This figure surpasses that reported (49%) in the most recent study (2003-2004) of placement to tenure-track conducted by the [professional association]. According to the [professional association], the percentage of PhD graduates with definite employment between 1996 and 2006 has ranged from a low of 44% to a high of 58%. The [discipline’s] Survey of Earned Doctorates reports 55% definite employment. It is worth noting that this employment may not be tenure track, and may be part-time. Although we are most concerned with tenure-track placement, the figure for “definite employment” of our PhD graduates in the period under assessment here is 100% of students who responded to our survey (all but 5). Most of those who do not have tenure-track employment are employed full-time in the academy.
  • EVALUATED SPRING 2008: NEXT DUE TO BE EVALUATED SPRING 2011. Eight exams—the totality of those presented since written comps essays were instituted as part of our graduate program reform several years ago—were evaluated. 100% were assessed as demonstrating at least Proficient knowledge; only one was judged to be Exceptional.
Satisfactory
  • 78% of the students received 80% or greater on this project. Examination of these results revealed that students had difficulty with (1) formulating a testable hypothesis, (2) noting limitations in study design, (3) using references in the introduction and discussion, and (4) concluding more than the results indicated.
  • All 8 students who took the “Reprint Exam” passed it in a convincing fashion, showing they could satisfactorily understand and critically evaluate a current scientific report.
  • As a first step, we have looked at the cohort of 34 PhD recipients who graduated in the last 4 years. We were able to confirm that 32 of them (94%) had presented at least one talk at a professional meeting prior to being awarded their PhD degree, and 2 did not.
Needs Improvement
  • Class evaluations, grades, advising sessions.
    • Explanation: Grades are not a good method for measuring a learning outcome because (1) they do not isolate the learning outcome, and (2) they do not provide for external validation. 
  • The data will show how well our students are able to take the important information and skills developed through lecture and laboratory classes and put them to use in a much more independent setting. Also, we will attempt to identify other factors that indicate potential for success in independent study and research.
    • Explanation: This is not an interpretation of an assessment, but instead a description of what the program hopes to learn from the process.
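
The figures reported in these interpretations – the share of students at or above a cutoff, the mean, and the median – are straightforward to compute. A minimal sketch in Python, using invented scores for illustration only (no actual class data):

    from statistics import mean, median

    # Invented exam scores (percent), for illustration only.
    scores = [92, 86, 78, 88, 95, 81, 74, 90]

    share = sum(s >= 80 for s in scores) / len(scores)
    print(f"{share:.0%} scored 80% or higher; "
          f"mean {mean(scores):.1f}; median {median(scores):.1f}")
    # -> 75% scored 80% or higher; mean 85.5; median 87.0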

Use of Results/Action Plan

Who reviewed the findings? What changes were made after reviewing the results?

(Improvements to either the program or the assessment plan are useful outcomes.)

Exemplars
  • The data collected and analyzed by the doctoral committee suggests that Ph.D. students need more opportunities to publish and present research at professional conferences. In response, the doctoral committee plans to take the following actions:
    • Convene a faculty meeting in 2010-2011 to discuss ways to encourage and mentor doctoral student authorship and presentations of research at conferences.
    • Develop and distribute guidelines for doctoral advisors that include departmental expectations for doctoral student publications and presentations.
    • Create more department-level research and writing groups that include doctoral students.
    • Improve our distribution of information related to professional conferences, including posting upcoming CFPs on the departmental “doctoral student information” bulletin board.
    • Request that the School faculty annual review process include a section on or notation for co-authoring with doctoral students.
    • Make expectations for publishing and presenting clear at the annual doctoral orientation.
  • Survey results were distributed to all PhD program faculty. Faculty met to discuss interpretation of scores and means of addressing deficiencies in specific areas as part of the PhD program-wide curriculum revision process that began in 2007. A course was instituted in spring 2009 and is required for all first-year graduate students. This course assists in addressing the deficiencies identified in the first Primary Employer Survey. The survey will be sent to primary employers in early 2011. Employers will be asked to rate those students who graduated in 2008, 2009, and 2010.
  • Although we have attained consistently positive findings regarding achievement on this goal, we are not sure exactly what the specific contributing factors are beyond the overall quality of the program. Thus, the Program Director and the quality assurance committee will explore this question, beginning with a focus group session involving students nearing graduation in early Spring 2011. Information generated by such an assessment will help us target, nurture, and capitalize on the contributing factors.
  • In preparation for the 2009 term, it was determined that the “practical examination” should be divided into 3 sections to accommodate the size of the class and the need to evaluate certain performance factors at separate times during the evaluation period. The rubrics were adjusted for 2010 to reduce the practical examinations from 3 to 2, and these rubrics are included in Appendices B1 and B2.
  • As a result of the initial and continuing reviews, the Oversight Committee and Curriculum Committee revised the interim reporting requirements that students must meet while a project is underway. These quarterly reports provide an additional mechanism to support student progress toward the desired learning outcomes.
  • The Undergraduate Committee [UC] reviewed the findings for Outcomes 2 and 3. The results represented an improvement over the last time this exercise was conducted (2008): no papers this time were found to be Marginal, and a larger portion reached the level of Capable or better. However, the number of merely Competent papers remains unacceptably large, and we still did not see the numbers of Proficient essays we believe our students should be capable of producing. The UC interprets these results as evidence that: (1) voluntary faculty workshops held in Fall 2008 to raise awareness and share best practices regarding the pedagogy of research and writing have indeed made instruction and expectations in our capstone courses more consistent, with the result that we have “brought up the floor” for student writing; yet (2) full implementation of the new two-course sequence of writing seminars, approved by our department in Spring 2009 but lamentably still on hold pending our department’s return to full teaching strength, will be necessary in order to “raise the ceiling,” that is, to enable significant numbers of majors to gain the kind of proficient mastery of writing skills and [discipline] conceptualization to which we aspire. Beginning Fall 2010, the Undergraduate Director will send all faculty requesting assignment to [courses] a summary of the conclusions of the faculty workshops on research and writing pedagogy, and the standards developed by the UC to assess capstone seminar papers. Faculty are encouraged to use these resources as part of their course design process, and to share the standards with students as part of in-class discussion of the components of excellence in research and writing within our discipline.
  • [Course] will be substantially overhauled to include some [specific discipline] topics in this core class to better reinforce the material of [subsequent course].
  • Findings were reviewed by a faculty committee, which recommended that the assessment process be strengthened. In particular, the questions should be more focused on specific topics, the tests given more often during the program (in addition to the beginning and end), and students tracked through the program.
  • A new progression of standards will be implemented, reflecting that 80% of students should achieve a score of at least 80% and at least 90% of students should achieve a score of at least 75%.
  • An exit interview form was developed in spring term 2008 to collect responses from graduates about strengths and weaknesses of the program. Responses will be reviewed every three years.
  • After reviewing last year’s data, we plan to develop an instrument whereby upper-level courses are standardized in their goals and outcomes for students’ reading proficiency. Accordingly, the learning outcomes for reading comprehension will be expressly integrated into all syllabi, as will a discussion of reading strategies and approaches.
Satisfactory
  • The findings were reviewed by the steering committee, which decided to move our Professional Development course from the 4th to the 3rd summer to allow students to apply for research support earlier in graduate training.
  • Findings were evaluated by the Course Director and sent to the Assessment Coordinator and Department Chair on 7/15/09. In the future, the Course Director of [course] will further emphasize [specific disciplinary content].
  • Modifications will be made to the course next year to ensure sufficient instruction and practice in all areas of [disciplinary] analysis. In addition, instructions for the completion of this assignment will be modified to clarify the requirements.
  • We decided to add another lab practical and assignment into [course] to give the students more exposure to this skill set to better prepare them for the final exam. COMPLETED FOR FALL 2009 Delivery. To be assessed again 12/10.
  • Changes made as a result of the assessment include the addition of more writing assignments to lower-level classes to prepare students to write effectively in upper-level courses.
Needs Improvement
  • The program administration and the faculty review student performance twice a semester. Following this review, changes may be made to the student’s overall plan of study (i.e., additional coursework), and increased oversight from faculty mentors may be recommended. Based on the evaluation letter, students may be placed on probation or terminated from the program.
    • Explanation: This describes remediation for students who fall short of the standards, instead of describing changes to the program.
  • Changes are made accordingly as needed.
    • Explanation: This does not provide any details.