Assessment Resources

2023-2024 Assessment & Key Dates

  • January 31st: Part 1 Assessment Reports DUE
  • June 1st: Part 2 Assessment Reports DUE

2023-2024 Assessment Learning Objectives

Welcome to the Marquette Office of Assessment's Resource Center. After leveraging the content here, you will be able to:

  • Write effective expected and aspirational SLOs
  • Create a Curriculum Map relevant to your program(s)
  • Design appropriate assessment data collection methods
  • Interpret your assessment results
  • Leverage your data to implement impactful program change
  • Communicate your successes to compel further continuous improvement

This page contains tools and resources to assist faculty and staff throughout the assessment process. The Office of Assessment at Marquette provides guidance, assistance, and technical expertise to ensure that intended student learning and development outcomes in all curricular and co-curricular programs are defined, aligned to educational and developmental experiences, measured, and, most importantly, applied to efforts to innovate, improve, and maintain excellence, in accordance with the Program and Institutional Level Student Learning Assessment Policies.

Assessment Cycle Modules

  • 1-Student Learning Outcomes (SLOs)
  • 2-Curriculum and Experience Mapping
  • 3-Designing and Selecting Methods
  • 4-Data Analysis
  • 5-Implementing Change
  • 6-Sharing Your Assessment Story


What are Student Learning Outcomes (SLOs)?

Student Learning Outcomes (SLOs) are statements that describe the key observable and measurable knowledge, abilities, or dispositions a program expects students to acquire as a function of the experiences within that program. SLOs should be focused on what students should know, what students should be able to do, or how students should think about or value certain ideals. They should not focus on what the program, faculty members, or staff will provide or do for students. 

How do I write SLOs?

SLOs should be: 

  • Specific: focused on specific knowledge, abilities, or dispositions. Each statement should be sufficiently detailed to convey exactly what the ideal student “looks like” at the end of a program and what is indicative of success. 
  • Measurable: convey the knowledge, abilities, or dispositions such that they are able to be evidenced. Each statement should convey exactly what a student could do to demonstrate their attainment of the intended SLO and what is considered success. 
  • Reasonable: within the specific context of an SLO, the level of knowledge, ability, or disposition expected or aspired to from students should be realistic. It is not helpful to set unrealistic expectations for students or programs. It is also problematic, however, to set expectations too low. Every SLO should aim to strike a balance between these two extremes. 

A helpful formula for writing SLOs is the ABCD format: A [Audience], B [Behavior], C [Condition], D [Degree]. For example: 

“As a result of [Condition], [Audience] will be able to [Behavior] [Degree].” 

  • Audience: Who is the target audience (i.e., students)? 
  • Behavior: What is the work to be accomplished by the learner? (Should include an appropriate action verb) 
  • Condition: What are the conditions/constraints in which the learners will be expected to perform these tasks? 
  • Degree: How will the behavior need to be performed? To what specificity or measure? 

Example: 

"Upon graduating from the program, students will be able to integrate self-awareness, counseling roles and reflective practices into a professional counseling identity."

  • Audience: students 
  • Behavior: integrate self-awareness, counseling roles and reflective practices 
  • Condition: Upon graduating from the program 
  • Degree: professional-level counseling identity 

A framework such as Bloom’s Taxonomy, a hierarchical model used for classification of learning objectives (and their associated educational experiences and assessments) into levels of complexity and specificity, is particularly useful for identifying behaviors and writing quality SLOs. 

How do I know if my SLOs meet the standard of quality? 

Here are some best practices for writing SLOs: 

  • All program-level intended SLOs are student-centered, specific, and measurable. 
  • Outcome descriptions clearly describe what the ideal student "looks like" when the outcome has been met. 
  • The intended SLOs are important to the field, the program, the students, and the faculty, and aligned with institutional goals and values, such as the Marquette Beyond Boundaries strategic plan. 
  • The intended SLOs are developmentally structured (e.g., introductory, reinforcing, and mastery level; Bloom’s Taxonomy; Fink's Taxonomy of Significant Learning; any other developmental structures). 
  • At least one intended SLO is focused on aspirational assessment. 

What's the difference between an expected and aspirational SLO? 

  • An Expected program-level intended SLO is a statement that describes the key knowledge, abilities, or dispositions a program currently expects students to acquire as a function of the experiences within that program. Expected program-level SLOs aim to maintain the high quality of existing efforts to facilitate program-level student learning/development. 
  • An Aspirational program-level intended SLO is a statement that describes the key knowledge, abilities, or dispositions students WILL acquire in the future, given planned changes to the program. Aspirational program-level SLOs aim to drive new efforts to facilitate program-level student learning/development (e.g., new curricula, new educational experiences, new pedagogies, new structural components, etc.). 

More information on aspirational outcomes can be found in this workshop presentation: https://www.marquette.edu/assessment/workshop-1.php 


Does my program have to assess and report on every SLO every year? 

No. Each year, programs should select at least two outcomes (typically 2-3) on which to report assessment information/data at the end of the academic year. You are not required to report results on every outcome every year. 

What resources are available to help me write SLOs? 

The Office of Assessment is happy to consult with faculty and staff in support of writing effective SLOs. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher, including resources specific to SLO writing. 



What is a curriculum & experience map? 

A curriculum & experience map lays out the educational experiences intentionally designed to advance student progress toward program outcomes. 

What does a curriculum & experience map look like/include? 

A curriculum & experience map is a detailed structure of program courses and learning experiences. 

An effective curriculum & experience map:  

  • Links the courses/programming to intended SLOs (noting the degree to which each SLO is covered) AND to relevant assessment measures. 
  • Articulates expected results of measures based on existing curriculum/experiences (a minimal example follows this list). 
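
For illustration only, a minimal map for a hypothetical three-course sequence might look like the following (I = introduced, R = reinforced, M = mastery; the course numbers, measures, and expected result are placeholders, not prescribed values): 

  • COURSE 101: SLO 1 (I), SLO 2 (I); measure: embedded exam questions 
  • COURSE 201: SLO 1 (R), SLO 2 (I); measure: graded project rubric 
  • COURSE 301 (capstone): SLO 1 (M), SLO 2 (R); measure: capstone portfolio rubric; expected result: 80% of students score “satisfactory” or above 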

How do I make a curriculum & experience map? 

The Office of Assessment has created this Curriculum & Experience Mapping presentation to help you get started. 

What should I consider in creating my program’s curriculum & experience map? 

Here are some markers of quality and utility we recommend in designing your program’s curriculum & experience map: 

  • Includes a detailed structure of comprehensive program course/experience offerings, covering both in- and out-of-classroom experiences. 
  • Presents a reasonable rationale for the choice and sequence of experiences across the program. 
  • Details an intended cause-and-effect chain (i.e., program theory) of educational experiences for each program-level intended SLO across multiple timepoints during the program. 
  • Provides compelling evidence that both students and faculty are aware of and value the program-level structures.

Why should my program have a curriculum & experience map? 

A curriculum & experience map links intended cause-and-effect experiences described in the program theory to intended SLOs and relevant assessment measures, providing an essential blueprint for planning your assessment activities. Without a map, it is much more difficult to conduct meaningful assessment work for your program.   

What happens if I made a curriculum or SLO change? Do I need to submit an updated curriculum & experience map? 

Yes! If your program has revised or created a new outcome, or made curricular changes to your program, please submit an updated curriculum and assessment plan. 

What resources are available to help me create a curriculum & experience map? 

The Office of Assessment is happy to consult with faculty and staff in support of developing a curriculum & experience map. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher. 


What do you mean by data collection? 

Data collection is the gathering of relevant evidence about how well students are meeting the intended outcomes. Your program assessment process must include collecting direct assessment data; however, it might also include collecting indirect assessment data as supporting information. Direct methods of assessment draw evidence from actual student work, whereas indirect methods rely on proxy indicators that require interpretation, such as graduation rates. We recommend focusing on collecting direct data in support of your program’s intended SLOs. 

Program-level assessment measures should be selected, modified, or designed to measure program-level intended SLOs. 

How do I know what data to collect? 

In considering your SLOs, designed with specificity and measurability in mind, how will you know your program’s students are meeting the intended outcomes? The data you collect should serve as evidence that students are meeting the intended SLOs. It can also be helpful to consider what data would be convincing to you and your colleagues in your program. 

Setting benchmarks for each of your SLOs is a critical step in making the data useful in the analysis. That is, given your program’s experiences and the numerous factors outside of the program’s control, what percentage of students does your program realistically believe should achieve the standards set for a particular outcome? Benchmarks serve as a guide that can later be compared with the actual percentage of students who meet the set goal for a particular outcome. Baseline data (students’ standing on an SLO at the beginning of the program) are also valuable to collect. 
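
For example, a program might set a benchmark that 80% of graduating students will score “satisfactory” or above on a capstone rubric for a given SLO, while collecting the same rubric score early in the program as baseline data (the 80% figure here is purely illustrative). 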

Additional data collection considerations: 

  • Program-level assessment information should be collected across multiple levels of the program. 
  • Where possible, data should be collected so they can be disaggregated by subpopulations to inform equitable assessment and educational/development practices (see the sketch below). 
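
To make disaggregation concrete, here is a minimal sketch assuming hypothetical column names ("student_group", "rubric_score") and a hypothetical cutoff score; substitute your program’s actual fields and standards: 

```python
# Minimal sketch: disaggregate SLO attainment by subpopulation.
# Column names and the cutoff are hypothetical placeholders.
import pandas as pd

scores = pd.DataFrame({
    "student_group": ["A", "A", "B", "B", "B"],
    "rubric_score":  [3, 4, 2, 4, 3],
})

CUTOFF = 3  # minimum rubric score counted as meeting the SLO

attainment = (
    scores.assign(met=scores["rubric_score"] >= CUTOFF)
          .groupby("student_group")["met"]
          .mean()   # proportion of each group meeting the SLO
)
print(attainment)
```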

What data collection methods should I use? 

There is no perfect assessment instrument; capturing the complexity of student learning requires multiple methods of assessment, including qualitative, quantitative, or mixed methodologies, to provide appropriate evidence. Decisions regarding your data collection methods should consider both the potential quality and the practicality of data collection. Keep in mind a doable-ideal spectrum, balancing appropriate rigor with feasibility. 

Specific data collection methods could include pre/post measures, capstone reflections, exam scores, and rubric scores. 

Here are two versions of a successful program assessment framework that are both popular and effective: 

  • Student-level Baseline Data, Intervention, and Re-Assessment: A student begins the program with a baseline assessment of an SLO, participates in the designed programmatic experiences, and then retakes the same assessment. The two assessment points are compared to interpret changes in program-level student learning/development (a sketch follows this list). 
  • Program-level Baseline Data, Intervention, and Re-Assessment: Students completing the program take an assessment of an SLO as a baseline. The program then changes its educational experiences, and a subsequent cohort of students takes the same assessment upon completing the program. The two cohorts’ results are compared to interpret changes in program-level student learning/development. 
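
As a sketch of the first (student-level) framework, with purely hypothetical student identifiers and scores, the comparison of the two assessment points might look like this: 

```python
# Minimal sketch of the student-level framework: compare each student's
# baseline SLO score with their re-assessment score after completing
# the program experiences. All identifiers and scores are hypothetical.
baseline = {"s1": 2.0, "s2": 3.0, "s3": 2.5}
reassessment = {"s1": 3.5, "s2": 3.0, "s3": 4.0}

changes = {sid: reassessment[sid] - baseline[sid] for sid in baseline}
mean_change = sum(changes.values()) / len(changes)

print(changes)                           # per-student growth on the measure
print(f"mean change: {mean_change:+.2f}")
```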

When should I collect data? 

Program-level assessment information for all program-level intended SLOs should be collected, when possible, in the final experience of a program. If SLOs are assessed on a rotating cycle, each SLO currently being assessed should have a measure in place, and each SLO scheduled for future assessment should have a measurement plan. 

Why is data collection important? 

The data collected during this process provide evidence for whether students are progressing towards the intended learning outcomes consistent with program expectations. This evidence drives the pursuit of excellence, innovation, and improvement of student learning and development. Results are intended to inform decisions about program, co-curricular, and institutional level content, delivery, and pedagogy. 

What are some resources available to help me collect data? 

The Office of Assessment is happy to consult with faculty and staff in support of gathering relevant evidence about how well students are meeting the intended outcomes. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher. 


Who evaluates my assessment data? 

Assessment is a collaborative activity, involving all constituents with a stake in improving student learning, including students, faculty, staff, alumni, administrators, and any other stakeholders who can provide unique information useful to the goals of assessment. 

All faculty within a program are given access to the results and have opportunities to provide input into the interpretation of the information. However, it is often helpful to have a single person or small team provide some initial interpretations to start the conversations. 

What am I looking for in my assessment data? 

Interpret your results in the context of what you expected to see (benchmarks). Again, be sure to set expectations before looking at any data. Identify the potential reasons that these results are meeting or not meeting your expectations. The interpretation should be aligned to the program-level intended SLOs and be appropriate based on available evidence. 

What do I do with my findings? 

  • Share results collaboratively with identified stakeholders, who should have opportunities to provide feedback and suggestions based on the shared materials. 
  • Consider sharing your initial data results with students and/or faculty/staff of a different program; doing so can strengthen the interpretations for both programs. 
  • Compare benchmarks with actual results. For example, what percentage of students needed to meet each outcome for the program to consider the educational experiences successful (i.e., the outcome is achieved for a large number of students; e.g., 80% of students must achieve a satisfactory assessment score for the outcome to be considered met)? What was the actual percentage of students who met each intended outcome? (A sketch of this comparison follows this list.) 
  • Identify a clear and compelling rationale such that the interpretation would withstand reasonable criticism from knowledgeable stakeholders. 
  • Align your interpretation to the program-level intended SLOs and ensure it is appropriate based on the available evidence. 
  • Articulate the specific challenges and opportunities concerning implementation of the suggested applications. 
  • Present a clear and compelling explanation of why any changes will be successful given the challenges and opportunities identified. 
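
Continuing the 80% example above, the benchmark comparison itself is simple arithmetic; here is a minimal sketch with hypothetical rubric scores and a hypothetical “satisfactory” cutoff: 

```python
# Minimal sketch: compare actual SLO results against the benchmark.
# The benchmark, cutoff, and scores are hypothetical illustrations.
BENCHMARK = 0.80    # expected proportion of students meeting the outcome
SATISFACTORY = 3    # minimum rubric score counted as satisfactory

scores = [4, 3, 2, 4, 3, 3, 1, 4]   # one rubric score per student

actual = sum(s >= SATISFACTORY for s in scores) / len(scores)
print(f"actual {actual:.0%} vs benchmark {BENCHMARK:.0%}")
print("benchmark met" if actual >= BENCHMARK else "benchmark not met")
```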

Why is data analysis important? 

When you receive assessment data, they remain raw data until you make meaning of them. Taking the time to analyze and reflect on the findings allows you to make that meaning and identify opportunities for program improvement. 

What are some impactful reflection questions in contemplating the holistic assessment results? 

Reflecting on your methods and measures, results, and interpretations across the assessment process as a whole can provide critical insight for enhancing future learning. Some suggested reflection questions include: 

  • In terms of program-level outcome and assessment, what are the current strengths of your program?  
  • What are the potential areas for growth?  
  • What steps are you going to take over the summer to prepare for next year's assessment cycle? 
  • What is your plan of action for when re-assessment does, or does not, meet the desired results? 

What are some resources available to help me analyze data? 

The Office of Assessment is happy to consult with faculty and staff in support of reflecting on the implications of the evidence collected via assessment and using interpretations to inform decisions about student learning and development experiences. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher. 


What do you mean by implementing change? 

Assessment by itself is an insufficient condition for learning and improvement; the information produced by assessment must be carefully reflected on, interpreted, and acted upon. It is through a process of educational and developmental innovation and improvement that assessment realizes its potential. 

What kinds of changes could my data suggest? 

Your data could suggest any of the following modification strategies: 

  • Add/Remove/Change courses offered to, or required of, students 
  • Add/Remove/Change educational experiences within courses 
  • Add/Remove/Change educational experiences outside of courses
  • Provide professional development for faculty/staff
  • Add/Remove/Change assessment methods
  • Add/Remove/Change communication with students/faculty/staff
  • Evaluate the implementation fidelity of existing experiences 

What if my program doesn't need any changes? 

An integral component of the assessment process is engaging in continued growth. Reflecting on a plan for continued growth for your outcomes ensures a strong commitment to our students’ learning and development. With data indicating your SLOs are being met, some ideas to consider moving forward include: 

  • Setting a higher expectation for the percentage of students the program expects to achieve the outcome 
  • Revising outcomes to expand/increase what students must do to meet the outcome 
  • Assessing a new outcome (either through rotation or creating a new one) 
  • Reassessing the outcome for another year to ensure you have stable experiences over time (this is only acceptable for two consecutive years) 

What do I need to consider when making changes to my program assessment plan? 

Some things to keep in mind when planning program assessment changes could include the following: 

  • Who will be involved in planning and executing these changes? 
  • When will these changes occur? 
  • What resources will be necessary? 
  • What other information will you need before you make these changes? 
  • How will you go about enacting these changes? 
  • What results do you hope to see from these changes? 
  • How are you going to share the data, interpretations, and expected changes and with whom do you plan to share them? 

To keep the assessment process meaningful by linking it to the innovation and improvement of educational and developmental experiences, the process itself should be regularly evaluated and refined as insights emerge from the assessment and educational process. 

What are some resources available to help me implement change? 

The Office of Assessment is happy to consult with faculty and staff in support of making warranted changes in the course, program, co-curricular or institution-level experiences to enhance learning and assessing the impact of these changes on student learning and/or development. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher. 


What happens to my assessment report after I submit it? 

After you submit your program assessment report via the Qualtrics survey, you will receive a copy of the report. Your assessment report is first filed with your program’s historical record of assessment reporting. After both Part One (fall) and Part Two (spring) are received, the comprehensive report is reviewed by a member of the University Assessment Committee (UAC) with the goal of providing constructive feedback to further strengthen your program’s assessment efforts. 

How is my assessment report evaluated? 

The Office of Assessment has developed a rubric that reflects what an assessment report looks like at various stages of development, including beginning, developing, compliance-focused, aspirational, and exemplary assessment practices. View the rubric here: Program-Level Assessment Process Rubric.

What is the process for receiving my assessment report feedback? 

Report feedback will be shared with Program Assessment Leaders (PALs) by the Office of Assessment once available. 

Who should I be communicating with about my program's assessment results and evaluation? 

A helpful question to ask yourself is: What are the key questions we need to answer about our program, and what do we need to communicate to other parties? 

Plans for sharing results, interpretations, and suggested applications with multiple stakeholders should be articulated within a program, including relevant details (e.g., with whom, methods of communication). Some criteria to be mindful of include: 

  • A clear and compelling rationale for sharing is articulated. 
  • Stakeholders have opportunities to provide feedback and suggestions based on shared materials. 
  • Results are collaboratively shared with students and/or faculty/staff of a different program to strengthen the interpretations for both programs. 

How should I document and communicate my assessment results? 

At a very minimum, your assessment plan and results should be discussed at a department/unit-wide meeting. Additional documentation suggestions include creating a system for internal storage and access, distributing an annual summary/report, and/or partnering with the Office of Assessment to create a spotlighted assessment story. 

What do our accreditors look for? 

Our institutional accreditor, the Higher Learning Commission, asks three main questions when evaluating the quality of the assessment process for academic programs. 

  1. Do academic programs have high-quality, publicly accessible program-level student learning outcomes?
  2. Do academic programs have a well-developed plan that aligns with institutional expectations for assessing student progress towards program-level student learning outcomes?
  3. Do academic programs enact their plans for assessment in ways that inform and drive improvements to the program? 

The assessment process at Marquette University is designed to help your program meet all three of these criteria in the most efficient and effective ways possible for each program. 

What are some resources available to help me tell our story? 

The Office of Assessment is happy to consult with faculty and staff in support of sharing your assessment story. Additionally, the National Institute for Learning Outcomes Assessment (NILOA) provides robust, open-source resources for higher education professionals working in assessment, whether new to the practice or in need of a refresher. 


General Assessment Frequently Asked Questions (FAQs)

What are the major milestones in the Marquette Assessment process? 

  • November: Part 1 of the assessment report released 
  • December: Assessment Open House 
  • January 31st: Part 1 of the assessment report due 
  • April: Part 2 of the assessment report released
  • June 1st: Part 2 of the assessment report due 

Why is the assessment report process broken into two parts? 

Starting in 2021, we began administering the assessment process in two phases, with the goal of distributing the workload across the year to make it more manageable and actionable. Additionally, the two-part process allows assessment to function as an integral and intentional component of your educational program(s) throughout the year. 

What information am I required to submit for Part 1, by January 31st? 

Part 1 asks you to submit your Student Learning Outcomes (SLOs), Curriculum Map, and Assessment Plan. 

What information am I required to submit for Part 2, by June 1st? 

Part 2 incorporates your responses from Part 1 and additionally asks you to submit your data results, interpretations, and plans for changes. 

How does the Office of Assessment collect assessment report information? 

A Qualtrics survey link is sent to the identified Program Assessment Leader (PAL) for both Parts 1 and 2. Your responses to Part 1 are incorporated into your Part 2 survey. 

I’m a Program Assessment Leader (PAL) for several programs. Will I receive a link for each program? 

Yes. Please contact us if you do not receive survey links for all your programs. 

What if I don’t have any students in my program currently? Do I still need to submit an assessment report? 

With zero students enrolled in your program, you are not required to submit any data collection or analysis. Instead, concentrate on providing the other components of your assessment plan: a report that shows your outcomes, how your experiences are linked to your outcomes, and how those outcomes will be assessed. 

What should I do if I am having issues with the Qualtrics survey, including receiving an error message that the survey link has expired? 

Please contact us at assessment@marquette.edu for Qualtrics survey assistance. 

What should I do if I have questions or if I’m unsure how to complete my assessment report? 

We are happy to set up a meeting with you! Please contact us at assessment@marquette.edu.