Assessment and Learning Outcomes
Providing training and resource management to encourage evidence-based decision making in the Division of Student Affairs.

The Student Affairs Assessment and Learning Outcomes Group provides training and guidance on the practice of assessment, the design of learning outcomes, and the use of evidence-based findings to guide practice. The group manages Division-wide projects and resources with the goal of promoting data-driven decision making. The purpose of the Division’s Assessment and Learning Outcomes Steering Committee is to provide leadership to the Division on issues of assessment and learning outcomes.

Goals:

  • Serve as a resource to division staff for assessment and learning outcomes;
  • Provide on-going training and professional development opportunities;
  • Share assessment and learning outcomes activity across the Division;
  • Advance best practices.

Resources

Interested in learning more about assessment? Here are some resources to consider.

University of Maryland Resources

  • Institutional Research, Planning, and Assessment (IRPA): IRPA provides essential information about UMD for the purposes of decision-making, policy analysis, strategic planning, mandated reporting, and academic program review. They capture data on faculty, staff, students, and finances and use it to characterize the institution in quantitative ways using nationally accepted definitions.
  • Campus Assessment Working Group (CAWG): Organized through IRPA, CAWG collects data from students and alumni to inform and enhance the campus experience.  There are three subgroups you can join if you are interested in assessment at an institutional level. They also have reports on a wide range of topics of interest.
  • Campus Counts: Also managed through IRPA, Campus Counts provides summary reports of publicly available university data (particularly enrollment).

 

Other Useful Resources

  • Student Affairs Assessment Leaders: A wide range of assessment resources for Student Affairs professionals, including readings, plans, trainings, syllabi, job descriptions, learning outcomes, assessment design, and more.
  • Assessment Conferences: A listing of higher education learning outcomes and assessment related conferences and events.
  • Center for the Study of Student Life (Ohio State University): Includes national resources and training webinars.
  • Assessment Commons: A wide range of resources for higher education outcomes assessment, including assessment handbooks, assessment of specific skills or content, a listing of assessment pages for individual institutions, and general resources and best practices.
  • Council for the Advancement of Standards in Higher Education (CAS): Includes general standards, learning and developmental outcomes, shared ethical principles, and characteristics of individual excellence for a range of services in higher education, many of which focus on student affairs units.
  • ACPA Commission for Assessment and Evaluation: The Commission promotes assessment skills and knowledge to facilitate and support student learning, development, and effective student affairs practice.  This site includes the ASK Standards (Assessment Skills and Knowledge).
  • NASPA Knowledge Community for Assessment, Evaluation, and Research:  The Knowledge Community encourages and supports student affairs professionals as they assess learning, evaluate programs, and research theory and practice. By providing quality education and networking opportunities, the KC strives to serve as a driving force in the movement towards improved student learning.

Reports

More to come!

Survey Platforms

Qualtrics

Qualtrics is a web-based survey tool available for use by all UMD faculty, students, and staff to support teaching and research on campus. Surveys can be created and distributed by anyone with UMD credentials. Staff members are encouraged to use Qualtrics as their primary survey tool but can use other UMD-supported tools when necessary (OrgSync, Google Forms). Staff members should avoid the use of other online tools such as SurveyMonkey.

The Qualtrics University Support and Training page provides webinars and training videos on a range of topics helpful to new users.

SAALOG has developed some resource sheets to highlight additional features that may be of interest as you use Qualtrics.

Survey Management:

  1. Copying a Project: Did you administer a survey last year that you would like to administer again?  Don’t recreate it! Copy it and make any updates in the new version.
  2. Sharing a Project: Allow others in your office to edit or distribute the survey.

Survey Appearance:

  1. Customized Survey Header: Use this resource sheet to include your office or program logo in the survey header.
  2. Starting the Survey in the invitation email: You can include a survey item in the invitation email that gets sent to respondents.

Editing Surveys (routing, responses, CAS capture):

  1. CAS Authentication: Require students to log in through CAS in order to access the survey; save their DirectoryID to tie to responses.
  2. Deleting cases: Did a student not take the survey seriously? This resource sheet outlines the steps you need to take if you want to remove a small number of responses so they do not appear when using Qualtrics reports. (Note: if you need to remove a large number of cases, this is not your best option.)
  3. Redirect upon survey completion: Do you want your respondents to be redirected to another website when they finish the survey? Use this resource sheet!
  4. Exporting Data to CSV: Use this resource sheet to download your survey responses to a CSV file, which can be saved as an Excel workbook; a sketch of loading the export for analysis follows this list.
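
If you will work with the exported file outside of Qualtrics, here is a minimal sketch in Python (using the pandas library) of loading it for analysis. The file name program_survey.csv is a hypothetical placeholder for your own download, and Qualtrics exports often include extra header rows of question text and metadata below the column names, so adjust the skipped rows to match your file.

  import pandas as pd

  # Hypothetical file name; replace it with your downloaded Qualtrics export.
  # Exports often place question text and metadata in the rows directly below
  # the column names; skiprows drops those rows if they are present.
  responses = pd.read_csv("program_survey.csv", skiprows=[1, 2])

  print(responses.shape)   # number of responses and number of columns
  print(responses.head())  # preview the first few records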

Reporting:

  1. Managing Public Reports: Need to share a Qualtrics report with a colleague?  This will show you how!
  2. Filtering Reports: Maybe you are only interested in the responses of first year students to your survey.  You can filter the report to show only these responses!
  3. CrossTabs: You can conduct cross tabulations in Qualtrics! For example, compare whether similar proportions of men and women report attending a given program.
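
The same kind of comparison can be reproduced outside Qualtrics once your data are exported. Below is a minimal sketch using pandas; the column names (gender, attended) and the records are hypothetical stand-ins for your own data.

  import pandas as pd

  # Hypothetical example data; in practice this would come from your Qualtrics export.
  responses = pd.DataFrame({
      "gender":   ["Woman", "Man", "Woman", "Man", "Woman", "Man"],
      "attended": ["Yes",   "Yes", "No",    "Yes", "Yes",   "No"],
  })

  # Row-normalized cross tabulation: the share of each group reporting attendance.
  crosstab = pd.crosstab(responses["gender"], responses["attended"], normalize="index")
  print(crosstab.round(2))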

 

Interested in a feature that's not listed?  Email stamp-assessment@umd.edu to request a resource sheet be created!
 

 

Survey Best Practices

Survey Development Best Practices

Included below are general and campus-specific tips to help you develop surveys. 

Critical questions to consider before beginning

  • What do you want to know?
    • This is not a question about ALL the things you could possibly ask (a brainstorming activity), but a fundamental exploration of what is important to know (not just interesting) to advance your work. Understanding the core purpose of your survey should guide every step of the process, from determining the target audience to the questions posed, the distribution method, and how results will be reported. Don’t skip this step!
  • Who are you going to ask?
    • Is your population narrowly defined (e.g., students who participated in your program), or do you want to generalize your findings to a larger population (e.g., graduate students)? Knowing who your audience is will help determine the kinds of questions asked and the distribution method.
  • How do you plan to use the data collected? (also see later section on reporting results)
    • Begin with the end in mind. What possible implications for policy and practice will result from your data? Who will want to see and use the information gathered?
  • Is there existing data that might already provide helpful information?
    • Data is already collected from every corner of our campus. In addition, national studies already explore topics that may be of interest to you (e.g., leadership, alcohol use, engagement). Is there anything you are planning to ask in your survey that you already know or can find from another source?
    • For information about other data sources, visit the resources page.
  • Is a survey the best option for you?
    • Surveys are just ONE way to collect data. They may seem like a quick and easy solution, but may limit you in the types of information you can gather and the depth you can reach. Would a focus group, direct observation using a rubric, or another option better serve your core purpose?
    • To learn more, download "Selecting a Method."
  • When do I need to consult with SAALOG (Student Affairs Assessment and Learning Outcomes Group)?
    • While we do not require approval for most surveys to be created or distributed, we do recommend you consult with a SAALOG member (see link on the left) for projects that are Division-wide or major campus initiatives, when you will be asking for confidential data from the Registrar (e.g., emails or other personal identifiers), or if you are planning on sending a survey to more than 2000 people. SAALOG members can help you with survey development and distribution.

Survey design

  • A project manager should shepherd the process forward
    • When developing a new survey, it is best to designate one point of contact.  This person will be responsible for moving the process forward, soliciting and evaluating stakeholder feedback, and determining the items included in the final survey.  Of course, committee members or other stakeholders provide critical feedback, but assigning someone primary responsibility for overseeing the project will help ensure best practices are followed.
  • Develop good questions
  • Obtain feedback from multiple sources (pilot, expert on topic, expert on surveys)
    • Once you have finalized a draft of your survey, it is best to solicit feedback.  There are a variety of options for doing so, and it is best to solicit feedback through more than one of the following methods:
      • Pilot your survey – are there student employees who work in your office?  Ask them to complete the survey, noting any items that were confusing and places where the response they wanted to give to a question wasn’t available.  You might also want to ask students about the “flow” – does the progression of the survey make sense?
      • Ask a topical expert – perhaps there is someone else on campus with more advanced knowledge of some of the questions asked on the survey than you.  That’s ok!  Ask them if they wouldn’t mind reviewing the survey for you and making sure your survey items are appropriate.
      • Ask a survey expert – you can also ask for feedback from someone who is an expert in survey design. The members of SAALOG can help provide feedback on the types of questions and general design of your survey.
  • Sampling
    • Consider the questions you are interested in. Are you hoping to get a sense of what services all university students feel are missing on campus with regard to your unit’s function? Are you hoping to get feedback regarding the learning outcomes you established for the 40 students who participated in your program? Having a clear idea of what body of students you would like to generalize to will help inform your sampling procedure. Learn more about "Sampling and Sampling Sizes."
  • Length
    • Considering the length of your survey is important.  The shorter your survey, the more likely students will complete it.  For program assessments, it is best if the survey takes less than 5 minutes to complete.  Consider the types of questions you are asking as well; open-ended items tend to take more time to respond to than multiple response questions.
  • Response Rates
    • Having a high response rate is important for being able to “trust” the data you collect. Low response rates introduce bias into your results and make it challenging to interpret them accurately. With the move to conduct more surveys online, students are being asked to complete more surveys than in the past, and response rates are dropping. In general, you should strive for your survey to have at least a 25% response rate (of course, what is considered "acceptable" will vary according to target population and survey purpose). Learn more tips on how to solicit an acceptable response rate; a quick response-rate and sample-size calculation is sketched after this list.
  • Incentives
    • One method to increase response rates is providing incentives. As a state institution, we face some limitations in terms of what we may provide to students. See this document for incentive suggestions specific to the University of Maryland and other considerations.
  • Timing
    • Think carefully about when you want to distribute your survey to students.  If it is a program evaluation, sending students the survey closer to the end date of the program will increase the likelihood that they will respond.  If your survey is not tied to a specific program, consider the academic calendar, stressful times during the semester, and religious holidays when determining your administration timeline.
  • Dispersal methods (paper, online, mobile)
    • There are a variety of ways to get your survey in front of students. You could email students a link to an online survey – this will likely increase the number of students you are able to reach and will allow you to easily remind them to complete the survey. You could distribute a paper survey to your students. This may be more costly and limits the complexity of the survey design; however, response rates are generally much higher for paper surveys than for online surveys. You could also design a mobile survey that students can complete on a smartphone or mobile device. (Someone in SA) has (iPods? iPads?) that can be checked out for an event. Mobile surveys should be extremely short and include no open-ended items.
  • Qualtrics
    • Qualtrics is a survey platform provided to everyone at the University of Maryland. It is an advanced survey design tool; however, the available support focuses on building surveys in Qualtrics, not on providing feedback about survey content. The Division of IT offers workshops on how to develop surveys in Qualtrics.
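
For the Sampling and Response Rates points above, here is a minimal sketch of two back-of-the-envelope calculations: a survey's response rate and a rough target sample size for estimating a proportion, using the standard formula n = z^2 * p * (1 - p) / e^2 with a finite-population correction. All counts and the population size below are hypothetical.

  import math

  invited = 1200     # hypothetical: students who received the survey invitation
  completed = 342    # hypothetical: usable responses received
  response_rate = completed / invited
  print(f"Response rate: {response_rate:.1%}")   # 28.5%, above the 25% guideline

  def required_sample(population, margin_of_error=0.05, z=1.96, p=0.5):
      """Completed responses needed to estimate a proportion at 95% confidence."""
      n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
      # Finite-population correction for smaller target populations
      return math.ceil(n0 / (1 + (n0 - 1) / population))

  print(required_sample(population=4000))   # 351 responses for a +/-5% margin of error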

Human Subjects

  • Consent: When engaging in an assessment, you should first ensure that the students have consented to be a part of this process. Alerting them to how their responses will be used and providing the contact information of the staff member in charge of the assessment are key things to consider.
  • Confidentiality: When performing any assessment, it is a best practice to assure those who are responding that their responses will be kept confidential. That is, you may know who expressed what opinion for any given project, but you may not release this information to others. This helps to ensure that students will give you their honest feedback.
  • Institutional Review Board (IRB): Sometimes, an assessment that you conduct may contribute knowledge that would be of use to the whole field.  You may wish to publish these findings or present them at an external conference.  By gaining IRB approval before you begin your project, you ensure that these avenues are open to you should you decide to pursue them.  It is the role of the IRB to protect human subjects (i.e., the students), and to that end they will review your research design thoroughly to ensure it meets their ethical guidelines.
  • FERPA (Family Educational Rights and Privacy Act): Under FERPA, you may not release information that includes identifiers. That is, if you release the results of your assessment publicly, you must first ensure that the students who responded to your request for feedback (be it a survey or another method) cannot be identified through your report.
  • Data Security: Adequate provisions must be made to maintain the confidentiality of identifiable information and data collected. Data should be stored in locked file cabinets/offices and be password protected. Personally identifying data should be deleted from data sets when appropriate or after a set period of time and access to data should be limited. 
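
As one concrete step toward the Data Security and FERPA points above, the sketch below removes direct identifiers from a data set before it is shared. The column names and records are hypothetical, and dropping identifiers supplements, rather than replaces, secure storage and limited access.

  import pandas as pd

  # Hypothetical data set; in practice this would come from your survey export.
  responses = pd.DataFrame({
      "directory_id": ["terp01", "terp02"],
      "email":        ["terp01@umd.edu", "terp02@umd.edu"],
      "class_year":   ["Sophomore", "Senior"],
      "satisfaction": [4, 5],
  })

  # Drop direct identifiers before sharing results beyond the assessment team.
  deidentified = responses.drop(columns=["directory_id", "email"])
  deidentified.to_csv("program_survey_deidentified.csv", index=False)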

Obtaining sample data

  • Types of data you can obtain
  • Process (Lisa, Registrar, existing data)
    • Issues of approval and access are important to consider early on in the assessment process. Please review this document to learn about the when, how, and why of gaining access to student data.
  • Listserv v. confidential data

Reporting and using the findings

  • Best practices
    • When reporting your data, it is important to not just provide raw numbers, but to provide context and interpret the findings for the reader. It is also recommended that you review your results with others to find additional meaning in the data.
    • Results that are not shared are less likely to be used, which in turn makes the survey process purposeless. Consider these best practices in writing reports and sharing your results; a brief example of reporting percentages alongside respondent counts follows this list.
  • Maintaining records
    • If you left your position and your replacement wanted to duplicate your survey, could they do it? While a copy of the survey and a report of the results should be readily available, you should also leave detailed notes regarding sampling, distribution, analysis (including any coding, syntax, etc.), and reporting (who it was sent to, notes from presentations given, questions you received that would inform future surveys, etc.). If your survey was developed in Campus Labs, you can also enter project notes there.
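
As a small illustration of reporting with context, the sketch below pairs each percentage with the number of respondents behind it so readers can judge how much weight to give each figure. The column names and records are hypothetical.

  import pandas as pd

  # Hypothetical example data; in practice this would come from your survey export.
  responses = pd.DataFrame({
      "class_year": ["First year", "First year", "Sophomore", "Senior", "Senior", "Senior"],
      "would_recommend": ["Yes", "No", "Yes", "Yes", "Yes", "No"],
  })

  # Report the share who would recommend the program alongside each group's size.
  summary = (
      responses.groupby("class_year")["would_recommend"]
      .agg(respondents="count", percent_yes=lambda s: (s == "Yes").mean() * 100)
      .round(1)
  )
  print(summary)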

Next Steps / The Assessment Cycle

  • Assessment as an iterative process
    • As evidenced in the assessment cycle, assessment is iterative in nature. Findings should support programmatic changes, which in turn inform new or revised assessment measures.
  • Incorporating assessment into your planning process
    • To be most powerful, assessment should be tied to your annual planning processes. When you are determining goals for the year (or longer-term strategic planning), asking “how will we know if we are successful?” will help drive the assessment process forward. Take time each year to ask, “what do we need to know to improve our services (or better understand our students)?” and map out timelines for your assessment measures.
    • Here is a helpful assessment plan template to assist!

 

 


Assessment Consultants

SAALOG members (see link on the left) are available to assist with any assessment questions you may have, including assessment planning, writing learning outcomes, method selection, instrument development, or general assessment or learning outcomes training.

While we do not require approval for most surveys to be created or distributed, we do recommend you consult with a SAALOG member for projects that are Division-wide or major campus initiatives, when you will be asking for confidential data from the Registrar (e.g., emails or other personal identifiers), or if you are planning on sending a survey to more than 2000 people.

Committee Members

SAALOG (Student Affairs Assessment and Learning Outcomes Group) is composed of Division of Student Affairs staff who possess knowledge, skills, or experience that will contribute to the Steering Committee’s efforts; membership is by invitation. A broader coalition of staff who have an interest in assessment and/or learning outcomes may join the learning community to receive information about professional development opportunities, relevant articles, and resources, and to engage in an ongoing dialogue.

Chair:

Steering Members: