Oh @#!%

By: Dr. Sherry Woosley & Matthew Venaas | July 13, 2017

The numbers don’t lie. Here’s what to do when you wish they did.

“This can’t be,” they think. “This doesn’t add up.”

But it is. And it does.

The simple truth is that assessment results do not always come back as expected. Maybe the numbers do not match those from previous years. Perhaps the data refute longstanding conventional wisdom in the department. The results can be inconsistent, either within or across the assessment, or they can contradict existing theory or best practices. Or, worse yet, the results show no improvement in outcomes despite numerous programs that should have an impact. Whatever shape these unexpected results take, they fall under the umbrella of what is commonly known as an Oh @#!% Moment, and once one occurs, a strategy for what happens next is needed.

How did we get to this place? All assessment projects should start with clearly stated goals and objectives, proceed by establishing the best assessment methods for the project, and end by collecting and analyzing the data. This is the point where unexpected results rear their ugly heads. Essentially, these can include anything that is surprising. They can be good or bad, or simply not what the department wants. Perhaps they do not match what everyone thinks they should be, or they just do not make sense. Regardless of their form, unexpected results usually draw attention or surprise simply because they are unexpected. But they may also trigger confusion, panic, or, in high-stakes environments, dread.

Despite how often unexpected results occur in assessment, the topic is not discussed widely. Perhaps it is pride, or a whistling-past-the-graveyard mindset, but all too often conference sessions, committee meetings, and department reports describe results that are neatly packaged. Results, both expected and unexpected, are explained rationally, and the findings are coupled with clear implications and suggestions. Any confusion is glossed over.

Behind that exterior, though, is the complexity that comes from results that are muddled or do not point to obvious actions. Unexpected results raise emotions, including panic, confusion, and worry. Even sharing unexpected results may require additional information or alternative strategies. When housing and residence life professionals face unexpected results, decisions must be made about whether, what, and how to share them with broader audiences. In the midst of all the unsettling feelings, having a set of strategies to draw upon in that moment makes the process easier to manage.

1. Team Spirit
A primary strategy for handling unexpected results is to rely on a trusted team.

The right colleagues can help think through both the results and the implications of the findings. The team need not be formal, nor must it meet as a group, but any professional facing unexpected results benefits from having a core group of people to turn to. The potential members may vary depending on the project, but a valuable team includes the specific dispositions, knowledge, and viewpoints that allow its members to contribute effectively.

In terms of disposition, team members need to be calm and thoughtful to ensure that emotions do not cause panic. Group members should also be trusted so that discussions can be transparent without fear of consequences. Team members should be curious, asking questions and exploring alternatives. The ability to remain calm, thoughtful, trusted, and curious will encourage smart thinking.

The team should also include good coverage of specific knowledge areas. At least one team member should have a solid understanding of the methods used for research and analysis. This person can raise questions related to methods and propose alternative explanations for results based on those methods. Other team members, who may also have helped design the assessment, need a deep understanding of its content, allowing them to serve as subject-matter experts. These people can ask questions about content issues that could affect results. For example, if the assessment is related to training, the group should include someone familiar with the goals of the training, the theory behind it, the topics covered, and the methods used. Specific knowledge about the training could clarify results or point to alternative perspectives on them. Or, as another example, if the project involved student participants, the team should have a person with a deep understanding of the student experience. That person can describe findings from the student perspective, including alternative understandings of interview language, survey questions, or student comments.

Finally, the team needs to include someone who knows the context of the assessment. Specifically, someone needs to understand not only how the assessment was conducted but also the campus context in which it was conducted. That person would be aware of current issues, major events, and important campus conversations.

Along with knowledge, a strong team will include a variety of perspectives. Specifically, three perspectives should be represented: a big-picture thinker, a detail person, and a challenger. These categories are common in many discussions and are critical to effective discussions of assessment results. The big-picture thinkers ensure that results are evaluated for usefulness and importance, and they consider how the results fit with department goals. The big-picture perspective also helps prevent analysis paralysis and dilemma-tizing. The detail-oriented people play an important role because their ability to dig into results can surface the nuggets that explain high-level unexpected findings. Their probing can also reveal deeper patterns that show which results are anomalies and which are widespread. Finally, the role of challenger is integral to the team, and not just to avoid groupthink. Challengers may use questions to test the traditional narratives that reinforce what was done in practice or during the assessment project. Though challengers are often involved in the design and analysis phases, some should also be included in the discussion of results.

Continue reading Oh @#!% in the July + August 2017 issue of the Talking Stick, the official magazine of ACUHO-I (the Association of College and University Housing Officers-International).