Office of Assessment
Raynor Memorial Libraries, 326H
1355 W. Wisconsin Ave.
Milwaukee, WI 53233
Welcome to the Marquette Office of Assessment's Resource Center.
This page contains tools and resources to assist faculty and staff throughout the assessment process. The Office of Assessment at Marquette provides guidance, assistance, and technical expertise to ensure that intended student learning and development outcomes in all curricular and co-curricular programs are defined, aligned to educational and developmental experiences, measured, and, most importantly, applied to efforts to innovate, improve, and maintain excellence, in accordance with the Program- and Institutional-Level Student Learning Assessment Policies.
What are Student Learning Outcomes (SLOs)?
Student Learning Outcomes (SLOs) are statements that describe the key observable and measurable knowledge, abilities, or dispositions a program expects students to acquire as a function of the experiences within that program. SLOs should be focused on what students should know, what students should be able to do, or how students should think about or value certain ideals. They should not focus on what the program, faculty members, or staff will provide or do for students.
How do I write SLOs?
SLOs should be student-centered, observable, and measurable.
A helpful formula for writing SLOs is the A [Audience], B [Behavior], C [Condition], D [Degree] format. For example:
“As a result of [Condition], [Audience] will be able to [Behavior] [Degree].”
"Upon graduating from the program, students will be able to integrate self-awareness, counseling roles and reflective practices into a professional counseling identity."
A framework such as Bloom’s Taxonomy, a hierarchical model used for classification of learning objectives (and their associated educational experiences and assessments) into levels of complexity and specificity, is particularly useful for identifying behaviors and writing quality SLOs.
How do I know if my SLOs meet the standard of quality?
Here are some best practices for writing SLOs:
What's the difference between an expected and aspirational SLO?
More information on aspirational outcomes can be found in this workshop presentation: https://www.marquette.edu/assessment/workshop-1.php
Does my program have to assess and report on every SLO every year?
No. Each year, programs should select at least two outcomes (typically two to three) for which they will report assessment information and data at the end of the academic year. You are not required to report results on every outcome every year.
What resources are available to help me write SLOs?
The Office of Assessment is happy to consult with faculty and staff in support of writing effective SLOs. The National Institute for Learning Outcomes Assessment (NILOA) also provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher. In particular, please reference the following NILOA resources specific to SLO writing:
What is a curriculum & experience map?
A curriculum & experience map documents the educational experiences intentionally designed to support student progress toward program outcomes.
What does a curriculum & experience map look like/include?
A curriculum & experience map is a detailed structure of program courses and learning experiences.
An effective curriculum & experience map:
How do I make a curriculum & experience map?
The Office of Assessment has created this Curriculum & Experience Mapping presentation to help you get started.
What should I consider in creating my program’s curriculum & experience map?
Here are some markers of quality and utility we recommend in designing your program’s curriculum & experience map:
Why should my program have a curriculum & experience map?
A curriculum & experience map links intended cause-and-effect experiences described in the program theory to intended SLOs and relevant assessment measures, providing an essential blueprint for planning your assessment activities. Without a map, it is much more difficult to conduct meaningful assessment work for your program.
What happens if I make a curriculum or SLO change? Do I need to submit an updated curriculum & experience map?
Yes! If your program has revised an outcome, created a new outcome, or made curricular changes, please submit an updated curriculum & experience map and assessment plan.
What resources are available to help me create a curriculum & experience map?
The Office of Assessment is happy to consult with faculty and staff in support of developing a curriculum & experience map. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher.
What do you mean by data collection?
Data collection is the gathering of relevant evidence about how well students are meeting the intended outcomes. Your program assessment process must include collecting direct assessment data; it might also include collecting indirect assessment data as supporting information. Direct methods of assessment draw evidence from actual student work, whereas indirect methods rely on proxy indicators that require interpretation, such as graduation rates. We recommend focusing on collecting direct data in support of your program’s intended SLOs.
Program-level assessment measures should be selected, modified, or designed to measure program-level intended SLOs.
How do I know what data to collect?
In considering your SLOs, designed with specificity and measurability in mind, how will you know your program’s students are meeting the intended outcomes? The data you collect should serve as evidence that students are meeting the intended SLOs. It can also be helpful to consider what data would be convincing to you and your colleagues in your program.
Setting benchmarks for each of your SLOs is a critical step in making the data useful in the analysis. That is, given your program’s experiences and the numerous factors outside of the program’s control, what percentage of students does your program realistically believe should achieve the standard set for a particular outcome? Benchmarks serve as a guide against which the actual number of students meeting the goal for a particular outcome can later be compared. For example, a benchmark might state that 80% of students will score 3 or higher on a 4-point rubric. Baseline data (students’ level of achievement on an SLO when they enter the program) are also valuable to collect.
Additional data collection considerations:
What data collection methods should I use?
There is no perfect assessment instrument; capturing the complexity of student learning requires multiple methods of assessment, including qualitative, quantitative, or mixed methodologies, to provide appropriate evidence. Decisions regarding your data collection methods should consider both the potential quality and the practicality of data collection. Keep in mind a doable-ideal spectrum, balancing appropriate rigor with feasibility.
Specific suggestions for data collection methods include pre/post measures, capstone reflections, exam scores, and rubric scores.
Here are two versions of a successful program assessment framework that are both popular and effective:
When should I collect data?
Program-level assessment information for all program-level intended SLOs should be collected, when possible, in the final experience of a program. If SLOs are assessed on a cycle, each SLO currently being assessed should have a measure, and each SLO scheduled for a future cycle should have a measurement plan.
Why is data collection important?
The data collected during this process provide evidence for whether students are progressing towards the intended learning outcomes consistent with program expectations. This evidence drives the pursuit of excellence, innovation, and improvement of student learning and development. Results are intended to inform decisions about program, co-curricular, and institutional level content, delivery, and pedagogy.
What are some resources available to help me collect data?
The Office of Assessment is happy to consult with faculty and staff in support of gathering relevant evidence about how well students are meeting the intended outcomes. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher.
Who evaluates my assessment data?
Assessment is a collaborative activity, involving all constituents with a stake in improving student learning, including students, faculty, staff, alumni, administrators, and any other stakeholders who can provide unique information useful to the goals of assessment.
All faculty within a program are given access to the results and have opportunities to provide input into the interpretation of the information. However, it is often helpful to have a single person or small team provide some initial interpretations to start the conversations.
What am I looking for in my assessment data?
Interpret your results in the context of what you expected to see (benchmarks). Again, be sure to set expectations before looking at any data. Identify the potential reasons that these results are meeting or not meeting your expectations. The interpretation should be aligned to the program-level intended SLOs and be appropriate based on available evidence.
What do I do with my findings?
Why is data analysis important?
When you receive assessment data, they remain raw information until you make meaning of them. Taking the time to analyze and reflect on the findings allows you to make meaning and identify opportunities for program improvement.
What are some impactful reflection questions in contemplating the holistic assessment results?
Reflecting on your methods and measures, results, and interpretations across the assessment process as a whole can provide critical insight for enhancing future learning. Some suggested reflection questions include:
What are some resources available to help me analyze and reflect on my data?
The Office of Assessment is happy to consult with faculty and staff in support of reflecting on the implications of the evidence collected via assessment and using interpretations to inform decisions about student learning and development experiences. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher.
What do you mean by implementing change?
Assessment by itself is an insufficient condition for learning and improvement; the information produced by assessment must be carefully reflected on, interpreted, and acted upon. It is through a process of educational and developmental innovation and improvement that assessment realizes its potential.
What kinds of changes could my data suggest?
Your data could suggest any of the following modification strategies:
What if my program doesn't need any changes?
An integral component of the assessment process is engaging in continued growth. Reflecting on a plan for continued growth for your outcomes ensures a strong commitment to our students’ learning and development. When data indicate that SLOs are being met, some ideas to consider moving forward include:
What do I need to consider when making changes to my program assessment plan?
Some things to keep in mind when planning program assessment changes could include the following:
To keep the assessment process meaningful by linking it to the innovation and improvement of educational and developmental experiences, the assessment process itself should be regularly evaluated and refined as insights emerge from assessment and educational practice.
What are some resources available to help me implement change?
The Office of Assessment is happy to consult with faculty and staff in support of making warranted changes in the course, program, co-curricular or institution-level experiences to enhance learning and assessing the impact of these changes on student learning and/or development. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher.
What happens to my assessment report after I submit it?
After you submit your program assessment report via the Qualtrics survey, you will receive a copy of the report. Your assessment report is first filed with your program’s historical record of assessment reporting. After both Part One (fall) and Part Two (spring) of the report are received, the comprehensive report is reviewed by a member of the University Assessment Committee (UAC) with the goal of providing constructive feedback to further strengthen your program’s assessment efforts.
How is my assessment report evaluated?
The Office of Assessment has developed a rubric that reflects what an assessment report looks like at various stages of development, including beginning, developing, compliance-focused, aspirational, and exemplary assessment practices. View the rubric here: Program-Level Assessment Process Rubric.
What is the process for receiving my assessment report feedback?
Feedback will be shared with Program Assessment Leaders (PALs) by the Office of Assessment once available.
Who should I be communicating with about my program's assessment results and evaluation?
A helpful question to ask yourself is: What are the key things we need to know about our program and communicate to other parties?
Plans for sharing results, interpretations, and suggested applications with multiple stakeholders should be articulated within a program, including relevant details (e.g., with whom, methods of communication). Some criteria to be mindful of include:
How should I document and communicate my assessment results?
At a very minimum, your assessment plan and results should be discussed at a department/unit-wide meeting. Additional documentation suggestions include creating a system for internal storage and access, distributing an annual summary/report, and/or partnering with the Office of Assessment to create a spotlighted assessment story.
What do our accreditors look for?
Our institutional accreditor, the Higher Learning Commission, looks for three main factors when evaluating the quality of the assessment process for academic programs.
The assessment process at Marquette University is designed to help your program meet all three of these criteria in the most efficient and effective ways possible for each program.
What are some resources available to help me tell our story?
The Office of Assessment is happy to consult with faculty and staff in support of sharing your assessment story. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher.
What are the major milestones in the Marquette Assessment process?
Why is the assessment report process broken into two parts?
Starting in 2021, we began administering the assessment process in two phases with the goal of distributing the workload across the year to make it more manageable and actionable. Additionally, the two-part process allows assessment to be considered as an integral and intentional component to your educational program(s) throughout the year.
What information am I required to submit for Part 1, by January 31st?
Part 1 asks you to submit your Student Learning Outcomes (SLOs), Curriculum Map, and Assessment Plan.
What information am I required to submit for Part 2, by June 1st?
Part 2 incorporates your responses from Part 1 and additionally asks you to submit your data results, interpretations, and plans for changes.
How does the Office of Assessment collect assessment report information?
A Qualtrics survey link is sent to the identified Program Assessment Leader (PAL) for both Parts 1 and 2. Your responses to Part 1 are incorporated into your Part 2 survey.
I’m a Program Assessment Leader (PAL) for several programs. Will I receive a link for each program?
Yes. Please contact us if you do not receive survey links for all your programs.
What if I don’t have any students in my program currently? Do I still need to submit an assessment report?
If no students are enrolled in your program, you are not required to submit any data collection or analysis. Instead, concentrate on providing the other components of your assessment plan: a report that shows your outcomes, how your experiences are linked to those outcomes, and how those outcomes will be assessed.
What should I do if I am having issues with the Qualtrics survey, including receiving an error message that the survey link has expired?
Please contact us at firstname.lastname@example.org for Qualtrics survey assistance.
What should I do if I have questions or if I’m unsure how to complete my assessment report?
We are happy to set up a meeting with you! Please contact us at email@example.com.