Answers to all of your frequently asked Assessment questions
I need help! Whom do I contact?
Where can I find my previous reports, my Program Student Learning Outcomes (PSLOs) and other information?
All previous reports, a list of the current PSLOs, as well as other information can be found on the Assessment Canvas page. If you are not granted access to the page through the link, please email the Assessment staff so we can add you as a “student” to the “course.”
What is the assessment schedule?
UMN Morris follows a 3-year assessment cycle. All General Education categories, Program Student Learning Outcomes (PSLOs), and Campus Student Learning Outcomes (CSLOs) are assessed over the course of 3 years. The day before classes begin in the Fall semester is the deadline for Academic Program Annual reports (for the previous academic year) and Assessment Plans (for the upcoming year). More details about the assessment cycle can be found on the Assessment Schedule page, while specific dates for the current year can be found on the Assessment Calendar.
How can we make assessment easier and more useful?
Many UMN Morris programs have developed excellent strategies for making assessment easier while producing more useful data for program improvement. Some strategies to consider:
- Communicate, communicate, communicate. The more faculty/staff discuss the goals, needs, and achievements of their students, the more likely all program members are to be working toward common outcomes in a coordinated manner. The easiest and most useful way to write an annual assessment report is to discuss assessment data for an hour, then write a short summary of the discussion.
- When assessing student work, use pre-made rubrics, or modify them to better fit your needs. AAC&U rubrics and other resources can be found on the Assessment Canvas page.
- Modify assignments or activities that you already use in your classes or programs. That allows your assessment work to fulfill needs within your course/program.
- Even if you don’t grade students on their fulfillment of student learning outcomes, gather data about their achievement of those outcomes at the same time that you grade them on their assignments or reflect on their activities. Canvas allows grading rubrics to include components that do not count toward the grade, or this can be done manually.
- Assess as many program student learning outcomes (PSLOs) as possible within a single capstone course or activity, such as a senior seminar. This will capture data about student skills at the end of the program and consolidate data collection. You may be able to finish all of your assessment during one conversation about student capstone projects.
What makes a good assessment tool?
Any assignment or activity, whether graded or ungraded, can be an assessment tool. Some things that make a good assessment tool:
Direct link between the tool and the outcome being assessed
- Skills or knowledge needed to succeed at assessment should match the PSLO
- Unlike the overall grade, assessment of outcome proficiency should not include factors like lateness or spelling, unless that’s the PSLO being assessed
Fair and reliable
- Avoid assessments that disadvantage students because of characteristics such as disability, first-generation status, or ESL status
- Choose assessments that produce similar results for students with similar skills, regardless of context or who grades the work
What’s the difference between “grading” and “assessing”?
Under some circumstances, such as a mastery grading system, “grades” and “assessments” might be the same. Generally, however, a student’s grade is not the same as their proficiency at a particular learning outcome. For example, the Paleontology program might have a program student learning outcome (PSLO): “Students will be able to describe the circumstances under which fossils form and the processes that prevent fossil formation.” The program assesses proficiency on this PSLO through a lab write-up in Paleontology 1001 where students are asked to identify the processes that led to the formation of several fossils. The grade for this lab consists of the following: 6 points for correctly identifying the processes, 2 points for spelling and grammar, 2 points for correctly handling the fossils. It is possible for a student who does not have a strong understanding of fossil formation (receiving only 3 of the 6 possible points for correctly identifying the processes) to nonetheless get a passing grade on this lab if they received full points for spelling and correctly handling the fossils. Many grades that students receive, whether for labs, exams, or whole courses, include points/credit for aspects of learning that are unrelated to the PSLO being assessed.
In contrast, the assessment of the Paleontology PSLO could be done quite simply by noting, while grading the lab, whether students showed proficiency in the PSLO. This could be done by setting the number of points that would be considered “proficient” (for example: students are considered proficient if they received 4 out of 6 points for correctly identifying the fossilization process), or it can be simply a statement by the professor as to which labs did or did not show proficiency in the PSLO, based on criteria such as “student stated that fossils are more likely to form in undisturbed substrates” or “identified at least three different processes that lead to fossilization.”
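The arithmetic behind the Paleontology example can be sketched in a few lines. This is a hypothetical illustration (the point values and the 4-of-6 proficiency threshold come from the example above; the function names are invented): the grade and the proficiency check draw on the same scores but answer different questions.

```python
def lab_grade(process_pts, spelling_pts, handling_pts):
    """Overall lab grade out of 10 points (6 + 2 + 2)."""
    return process_pts + spelling_pts + handling_pts

def is_proficient(process_pts, threshold=4):
    """PSLO proficiency depends only on the 6 points for
    correctly identifying the fossilization processes."""
    return process_pts >= threshold

# A student with a weak grasp of fossil formation (3/6) who
# earns full points for spelling (2/2) and handling (2/2):
grade = lab_grade(3, 2, 2)     # 7/10, a passing grade
proficient = is_proficient(3)  # False: not proficient on the PSLO
```

The same lab thus yields a passing grade and a “not proficient” assessment, which is exactly the gap between grading and assessing described above.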
What is the difference between program assessment and program review?
Program assessment is an annual process wherein academic and co-curricular programs reflect upon their programmatic learning goals, gather data on student achievement of those goals, and discuss how to best improve their program on the basis of those data. Program assessment is facilitated by the Assessment staff in the Office of Institutional Effectiveness and Research. The Assessment staff includes faculty and co-curricular program staff who have part-time appointments in assessment while maintaining their presence in the classroom or co-curricular program.
Program review occurs once every five years for academic programs. Program review runs through the Office of the Vice Chancellor for Academic Affairs. It is a more intensive process where faculty/staff review data relevant to their programs, reconsider their program goals, and engage in a dialog with administrators, faculty, and students to shape their program for the future.
Why are the faculty/staff in each program asked to discuss assessment plans and data? Can’t we just have one person fill out the form?
The primary purpose of assessment is to improve student learning. Assessment data is only useful when it is used to improve curricular and co-curricular programs. The meaning of assessment data, and how to use that data to improve the program, must be decided as part of a larger discussion about the goals and needs of the program. When faculty/staff discuss the goals, needs, and achievements of their students and programs, they are better able to work toward desired outcomes in a coordinated manner.
How do we assess a broad or general PSLO?
If the PSLO has many facets or is very broad, try to create a window on your program, rather than opening the whole wall. One small slice of the PSLO is enough. So, if you’re looking at “communication” in general, it’s not necessary to assess oral, written, and multimedia communication; you can pick one. If you make programmatic changes on the basis of this assessment, however, make sure that you run a comparable assessment during the next cycle so you can determine the efficacy of those changes. If you assess oral communication and change your program on the basis of that data, but assess written communication during the next cycle, you will not be able to determine whether your change was effective.
What if one (or two or three) of our faculty/staff are on leave this year?
Even if your program is understaffed, the assessment show must go on. The assessment process is about improving programs based on the data you’ve gathered. All program members should be part of the conversation about what changes would be helpful to better meet program goals.
Should we assess incoming and graduating students? Pre- and post-courses? Only graduating seniors?
Minimally, you should be assessing whether your graduating seniors have achieved your program goals (either through a capstone experience or in the one course that covers that goal, if there is one). You may be able to assess all or almost all of your PSLOs through one faculty conversation about your students’ capstone project.
How do we assess interdisciplinary programs with few dedicated courses?
You are assessing the whole program, not just those classes with a course designator related to the program. It is probably easiest to assess the program through the capstone experience or by evaluating students’ capstone work, even if it technically took place outside of your program (for example, if you allow capstone courses from other majors to count toward your program when they include sufficient content related to your program’s focus). You could also use required classes, regardless of designator, to assess the program. Some of your program’s required classes may enroll many non-majors; you can pull out assignments from just the program’s majors to assess the program’s learning outcomes.
Many interdisciplinary programs are small, but even if there are only a few (or one) students in your capstone course, they can be used to assess the program. Remember, this is not a small sample; it is a small universe.
You keep saying “data.” What does that mean? Must we use statistics?
You do not need to assess using standardized exams or assignments that produce numerical data. You must have some criteria for determining whether a student has achieved a program outcome. In your report, you will be asked how many students did or did not meet those criteria. Beyond that, you do not need to focus on quantitative data. You do not need to run statistics unless you want to.
What happens to our assessment reports?
Assessment reports are the basis for making data-informed decisions about the improvement of curricular and co-curricular programs to better support student success. Assessment reports are gathered and stored on the Assessment Canvas page, where they are available to instructional faculty and staff as well as our accrediting body. The data from the assessment reports are summarized in two annual campuswide assessment reports, “Academic Program Assessment at the University of Minnesota Morris” and “Co-Curricular Program Assessment at the University of Minnesota Morris.” These reports are available on the Assessment Canvas page, under "Previous Reports." The data from these reports are used by faculty, staff, and administrators at all levels to make data-informed decisions about improving our campus and our assessment process so we can better serve our students.