Are You Prepared to Answer? How Do You Know?

Before big data analytics, real-time dashboards and user-friendly survey software, the vice president to whom I reported taught me one of the most important lessons of my career. He asked, “How do you know?” I was sitting across from him and had just outlined some urgent steps I thought we needed to take to respond to constituent concerns. My morning had been filled with calls and emails from dissatisfied constituents. As I began to enumerate who had contacted me and share their concerns, I realized (which had been his point) that even if their views were consistent with those of a larger percentage of our constituency, the number of people I had heard from was not representative. Though they were vocal, I had no way of telling whether their sentiments reflected a predominant view. To make changes based solely on such a small sample would be unwise. More information was needed.

The lesson, of course, was to have answers the next time I was asked, “How do you know?” From that day on, I viewed collecting data and insights as central to decision-making and program design. No matter your field, understanding your marketplace is essential to your success. What you know about people’s beliefs, how they behave and what they value is paramount to making sound decisions. However, before you can answer how you know what you know, you have to make sure that the information you gather is both useful and defensible. I am going to stick to the basics here, but you can start with these guiding questions:

  • What do you want to know? Whether you are assessing quality, trying to answer a specific question or seeking insights, clearly define your purpose.
  • What data will you need? Save yourself time and headaches by making sure that the data you plan to collect will serve your purpose.
  • How will you use the data you collect? How you plan to use the data will help determine which methods you choose and how rigorous your process needs to be.
  • Are you certain your collection methods are sound? For your data to be trustworthy, you have to consider many factors, including validity, reliability, neutrality and user design.

Recently, I took two surveys. Both were poorly designed. I took one survey following a visit to a national chain. My experience at the store was positive, but the survey required that I answer all questions, even those not applicable to me. That company’s market researchers are gathering data, but if they are using that information to assess operations, they are doing so based on flawed information. The other survey followed a customer service call with a national provider. My interaction with the customer advocate had been positive, but before I even spoke to someone I was on hold for 30 minutes. The survey I completed after the call did not ask about wait time, nor did it include any means of offering additional concerns. Again, this company is collecting data, but it could have a completely false sense of how it is doing.

Avoid making these common mistakes. Instead, design flexibility into your research instrument.

  • Questions should be optional unless the answers are absolutely necessary for your data to be credible. Offering a “not applicable” option is a simple solution.
  • Provide an opportunity for respondents to express additional concerns. Letting them tell you what they want you to know may provide your most useful feedback.
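To make these two checks concrete, here is a minimal sketch in Python of how a survey definition might be validated before launch. The schema and field names (`required`, `options`, `open_ended`) are hypothetical, not from any particular survey tool: it flags required questions that lack a “not applicable” option and surveys with no open-ended comment question.

```python
# Minimal pre-launch validation of a survey definition (hypothetical schema).
# Each question is a dict with: text, a required flag, answer options,
# and an optional open_ended flag.

def validate_survey(questions):
    """Return a list of design problems found in a survey definition."""
    problems = []
    for q in questions:
        # A required question must give respondents an escape hatch
        # when the question does not apply to them.
        if q["required"] and "Not applicable" not in q.get("options", []):
            problems.append(f'Required question lacks "Not applicable": {q["text"]}')
    # The survey should let respondents tell you what they want you to know.
    if not any(q.get("open_ended") for q in questions):
        problems.append("No open-ended question for additional comments")
    return problems

survey = [
    {"text": "Rate your checkout experience", "required": True,
     "options": ["Poor", "Fair", "Good", "Excellent", "Not applicable"]},
    {"text": "Rate the in-store pharmacy", "required": True,
     "options": ["Poor", "Fair", "Good", "Excellent"]},  # no N/A option: flagged
]

for issue in validate_survey(survey):
    print(issue)
```

Running a check like this before launch catches exactly the two flaws in the surveys described above: forced answers to inapplicable questions, and no place to raise an unanticipated concern.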

Most of all, be prepared to answer the question, “How do you know?” Your decisions and proposals will be more credible.

Closing notes:  I invite you to contact me for further conversation about gaining insights through data collection or how I might help you develop your organization’s strategic capability through learning.

The Value of Knowing What is Valued: Lessons for Liberal Arts Colleges

Honoring Questions about the Value of College
Before I left to pursue graduate studies at Northwestern University, I led constituent relations at a small liberal arts college. Much of my work entailed being tuned in to alumni and parent perceptions and communicating about issues of mutual interest. Not surprisingly, around 2008, with tuition rates rising and the economy weighing on the job market, questions about the affordability of college, which college administrators and staff had heard with increasing regularity, began to transform into questions about the value of college.

One of the capacities important in constituent relations is to search for the realities in perceptions. I’d found that you can learn a lot by attending to questions, concerns and criticism, looking for patterns and exploring, “What is going on here?” But, as someone who had greatly valued my own undergraduate experience and spent most of my career in higher ed, honoring the questions around the value of college challenged me. My deep appreciation of and commitment to education clouded my view of what was happening. I needed to give more credence to how people were feeling and what factors were shaping their views. With some effort, I set aside my beliefs and began to dig into this perception, listening carefully and reading everything I could to understand what was affecting the perceived value of a college education.

The Value Equation
There are two sides to the college value equation. On one side is the investment students and their families make, financial and otherwise. On the other side are the outcomes (i.e., benefits) of a college education. To increase perceived value, you can reduce costs, increase benefits or, as many colleges have attempted, tinker with both sides of the equation.
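The equation described above can be sketched as a simple benefits-to-cost ratio. The figures below are purely illustrative, not drawn from any actual institution; they just show the two levers at work:

```python
# Illustrative sketch of the value equation: perceived value rises as
# outcomes (benefits) rise relative to the investment (costs).
# All figures are hypothetical.

def perceived_value(benefits, costs):
    """Benefits-to-cost ratio: higher means outcomes loom larger than investment."""
    return benefits / costs

# Two levers: reduce the cost side, or increase the benefit side.
baseline = perceived_value(benefits=100, costs=80)
lower_cost = perceived_value(benefits=100, costs=70)
more_benefit = perceived_value(benefits=120, costs=80)

print(f"baseline: {baseline:.2f}")       # 1.25
print(f"lower cost: {lower_cost:.2f}")   # 1.43
print(f"more benefit: {more_benefit:.2f}")  # 1.50
```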

In my role, I did not have influence over college costs, but I could try to improve how we communicated outcomes. A common criticism, even from within the higher education community, is that we’re not very good at communicating outcomes. (There are lots of reasons for this, but I’ll have to explore them in a future post.)

I decided to collect what I called “value data” to embed in constituent communications. I wasn’t sure what I would find, but I began to look at our performance on instruments such as the Higher Education Data Sharing Consortium (HEDS) surveys, the National Survey of Student Engagement (NSSE), the Collegiate Learning Assessment (CLA), Educational Benchmarking, Inc. (EBI) surveys and the ACT Student Opinion Survey. I also arranged for us to conduct video interviews with parents talking about their sons’ or daughters’ college experience. Both the data and the anecdotal evidence indicated we were doing well on outcomes. Of course, there were areas where we were strong and areas where attention was needed, but on the whole, we had good stories to tell. We began to use “value data” and outcome anecdotes in what we said, printed and promoted.

Discovering Most Valued Outcomes
Then I began to wonder which outcomes were most important to parents. (I focused on the moms and dads because of my responsibility for parent relations.) It’s one thing to generate positive outcomes, but to increase value, you have to produce the outcomes that your constituents deem most important. In 2012, I conducted a survey of parents of incoming students to see what outcomes they hoped to see when their students completed their degrees. I compiled a list of outcomes that we already measured through various assessment instruments and asked parents to rank them. I also gave them an opportunity to write in outcomes not on the list just in case we were missing an outcome they sought. Given all the articles that had been written about the “employability” of college students, I was prepared for gainful employment to receive the highest ranking. It didn’t. What did? The ability to think analytically and critically.
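A ranking survey like the one above can be tallied in a few lines. The sketch below uses average rank (lower is better) over hypothetical responses and outcome names; it is not the actual 2012 data or method, just one straightforward way to find the most-valued outcome:

```python
from collections import defaultdict

# Tally a ranking survey by average rank (lower = more valued).
# Responses and outcome names are hypothetical.

def average_ranks(responses):
    """responses: list of ordered lists, most-valued outcome first."""
    positions = defaultdict(list)
    for ranking in responses:
        for position, outcome in enumerate(ranking, start=1):
            positions[outcome].append(position)
    # Sort outcomes by average position, best (lowest) first.
    return sorted(
        (sum(ranks) / len(ranks), outcome) for outcome, ranks in positions.items()
    )

responses = [
    ["critical thinking", "gainful employment", "communication"],
    ["critical thinking", "communication", "gainful employment"],
    ["gainful employment", "critical thinking", "communication"],
]

for avg, outcome in average_ranks(responses):
    print(f"{outcome}: {avg:.2f}")
```

Whatever the tally method, the write-in option matters just as much: it is the check that the list of outcomes you are ranking is not missing one your constituents care about.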

Please note that I surveyed parents whose sons and daughters had been accepted and were enrolling. They had specific reasons for choosing a liberal arts college. Nevertheless, the results informed us about what mattered to them. More than anything else, these parents wanted their students to develop the ability to think analytically and critically. Did they want their students to be employable at the end of their college career? Of course. Gainful employment ranked in the top ten of desired outcomes. Were there outcomes that parents wanted that did not rank in the top ten? Yes. Critical to our ability to increase perceived value is understanding expectations. Survey results like this enable us to be more strategic about reporting on the outcomes that are most important to our constituents.

Know what is valued by your constituents and you will be one step closer to increasing value.

If you would like to learn more about increasing value for your constituents, contact me.