“O would some power the gift to give us to see ourselves as others see us.”
“Are the villagers carrying torches?”
—A scientist beginning to review her 360 results
360 surveys have strong potential to improve science communications, but getting the most from the surveys requires thoughtful planning and analysis. This first part of a two-part posting outlines 360 potentials and problems, and notes steps to plan and launch your survey.
360 Survey Uses, Potentials
We like to use 360 surveys because they consistently generate useful results. Scientists we work with in leadership training, teambuilding, and strategic planning use their 360 results to make meaningful gains, improving their leadership and communications performance. Their survey results help them see more clearly how their communications with others advance—or obstruct—their science.
360 surveys provide the person being surveyed with data about their communications and leadership effectiveness, gathered from respondents. The surveys use the “360” label because they usually elicit data from a mix of people around the person being surveyed: people who manage, collaborate with, are clients of, and/or are managed by that person.
Once thought radical, 360 surveys are now in widespread use in most industries. Numerous vendors offer 360 survey packages online. Software like Survey Monkey makes it possible for amateurs to design their own surveys with minimal investments of time and effort. Some companies conduct 360 surveys on key managers and employees and use the results when making promotion and advancement decisions. Others enable survey recipients to control the process themselves, selecting respondents more for learning and development purposes.
Scientists in particular benefit from the 360 survey process. If done well, the surveys translate the important but difficult-to-describe competencies of “communications” and “working relationships” into data. Scientists can take the time to analyze the often-rich data in their survey results, and develop and implement thoughtful action plans. Some scientists we’ve worked with have taken their 360 results data far beyond the percentage and mean scores the survey software produces, creating scatter diagrams and exploring standard deviations.
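Going beyond the summary scores is straightforward once the responses are exported. Here is a minimal sketch of that kind of analysis, using only the Python standard library; the question wording and scores are invented for illustration, and closed-ended responses are assumed to be coded 1 (“Strongly Disagree”) through 5 (“Strongly Agree”):

```python
# Hypothetical example of analyzing raw 360 scores beyond the packaged report.
# Question text and scores are invented; responses are coded 1-5.
from statistics import mean, stdev

responses = {
    "Listens to input before deciding": [5, 4, 4, 5, 4, 3, 5, 4],
    "Explains technical results clearly": [5, 2, 5, 1, 5, 2, 5, 2],
}

for question, scores in responses.items():
    # A high standard deviation flags questions where respondents disagree
    # sharply with each other -- often the most interesting findings.
    print(f"{question}: mean={mean(scores):.2f}, sd={stdev(scores):.2f}")
```

In this invented data the second question has a large standard deviation: respondents are split, which a mean score alone would never reveal.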
360 Survey Shortcomings, Dangers
Of course, 360 results do not reflect respondents’ opinions with perfect accuracy. Some respondents are wary of the surveys’ claim of confidentiality and so restrain their responses. Some use the surveys as a weapon, communicating long-held grudges. Others take the surveys as an opportunity to advance their own position, e.g., in open-ended comments like, “Harry is the best boss I ever had. Signed, George.”
It’s also quite possible for 360 surveys to cause significant damage. The whole premise of receiving “objective” survey feedback on one’s communications effectiveness glosses over the gravity of the personal issues 360 surveys may address. Listening skills, showing respect, working well with others—the behaviors surveys describe can be difficult to think about and discuss, entangled as they are in ego and identity.
A few companies have put 360 surveys to ill use, reassuring respondents and recipients that the process is confidential but then using the data later to influence advancement and layoff decisions. More often, the harm people cause with 360 surveys is unintentional, the result of careless planning, sloppy survey design, or processes that fail to ensure respondent confidentiality.
Getting The Most From Your 360 Survey: Ten Tips For Planning and Launching
It takes some clear thinking and planning to avoid the problems and shortcomings, and get the significant results 360 surveys offer:
- Clarify outcomes. It’s most useful to begin work with 360 surveys not by reviewing the packaged survey offerings online or drafting survey questions but by listing what kinds of information you want the survey to provide and clarifying how you want to use the survey. If the survey is being initiated not by you but by your organization and your influence on survey design is limited, it’s still very useful to clarify what you’d like the survey to do for you, what aspects of your communications you’d like it to spotlight, and how you anticipate using the results.
- Clarify survey ownership. Some companies retain 360 survey data in Human Resource files; others enable recipients themselves to receive the only copy of the data. Both approaches can be effective for different purposes as long as they are made clear and then scrupulously maintained. Changing the uses and circulation of survey data mid-stream is ineffective and unethical.
- Clarify respondent confidentiality. Some 360 surveys don’t try to achieve respondent confidentiality, using the process more as an open reporting device. More often, 360 surveys strive to ensure respondent confidentiality to maximize the value and validity of responses. Respondent confidence in your assurance of confidentiality will enhance the quality of their responses. Take the time to discuss your plans for ensuring confidentiality with several of your actual respondents to ensure that you’ve devised an effective strategy.
- Consider the number of respondents. It usually takes a minimum of eight respondents to both provide reasonable respondent confidentiality and generate enough responses to detect patterns and trends in your communications. It’s fine to increase the number of respondents to 15 or 20, but often difficult to analyze results beyond that.
- Maximize response rate. A survey of 10 respondents that has a 100% response rate is much more valuable—and valid—than one of 100 respondents that has a 10% response rate. Work to increase your response rate by selecting respondents thoughtfully and encouraging them to be open and frank in their responses. Keeping your survey as short as possible maximizes response rates. It’s tempting and easy to add questions that are “interesting.” However, each question you add has the potential to reduce your response rate.
- Link your desired outcomes with the survey questions. Pre-packaged surveys generate tidy, graphically impressive results and enable you to compare your responses with those of people in other organizations. However, many of the statisticians we’ve worked with have scoffed at packaged survey statistics as misleading. They’ve challenged the practice some of these packages follow of reporting mean scores while neglecting to show how the data points are distributed. Also, science jobs seldom lend themselves to comparison across organizations because they usually evolve in ways shaped by specific, project-related tasks.
- If you’re designing your own survey, take the time to do the online tutorials. Customizing your survey usually makes it more possible to link survey questions with your desired outcomes, but it does take a bit of effort. The neatness and agility of Survey Monkey and other online tools can mask the ambiguity of the questions you formulate. To ensure that your questions are addressing the items that interest you, pre-test them on a few people and discuss their thoughts.
- Tune your questions. Organize your questions into 5–8 groupings of 5–6 questions per group. Use a five-point Strongly Agree–Strongly Disagree scale for responses to closed-ended questions. The difference between an “Agree” and a “Strongly Agree” response matters, both to respondents expressing their sentiments and to scientists who want to perform at the highest levels. Use mostly closed-ended five-point-scale questions, but also include several open-ended questions to capture whatever the closed-ended questions miss. Remind respondents that the survey software reports their open-ended comments verbatim, so they should be careful not to use catch phrases that may identify them.
- Consider using two-part questions. Two-part questions ask respondents not only to rate each item on the five-point Strongly Agree–Strongly Disagree scale but also to rank its importance on a five-point scale from Very Important to Very Unimportant. This question form often shows the person being surveyed that he/she is performing tasks effectively that respondents don’t consider important while neglecting tasks that respondents rate as high priorities.
- Ask respondents to email you when they’ve completed the survey. That way you can protect their confidentiality while also keeping tabs on your response rate. If your response rate lags, ask all respondents again to complete the survey. Allow about a week for respondents to complete their efforts.
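The statisticians’ complaint about mean scores is easy to demonstrate. In this minimal sketch (all data invented, responses coded 1–5), two questions earn the identical mean score, yet one reflects consensus and the other a sharply split audience:

```python
# Hypothetical illustration: identical means, very different stories.
# Scores are invented; 1 = "Strongly Disagree" ... 5 = "Strongly Agree".
from collections import Counter
from statistics import mean

consensus = [4, 4, 4, 4, 4, 4, 4, 4]   # every respondent chose "Agree"
polarized = [5, 5, 5, 5, 5, 3, 3, 1]   # mostly "Strongly Agree", one "Strongly Disagree"

for label, scores in [("consensus", consensus), ("polarized", polarized)]:
    # Both means are 4.0; only the full distribution reveals the split.
    print(label, "mean:", mean(scores),
          "distribution:", sorted(Counter(scores).items()))
```

A packaged report showing only the two 4.0s would treat these questions as interchangeable; the distributions show they call for very different follow-up.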
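The two-part question format lends itself to a simple performance-versus-importance gap analysis. Here is a sketch over invented data, assuming both scales are coded 1–5 with 5 high; the task names are hypothetical:

```python
# Hypothetical gap analysis for two-part 360 questions.
# Each entry: (mean performance score, mean importance score), both on 1-5 scales.
results = {
    "Writes clear progress reports": (4.6, 2.1),
    "Keeps collaborators informed of changes": (2.8, 4.7),
    "Runs efficient meetings": (3.9, 3.8),
}

# Negative gap = respondents rate the task important but performance low.
gaps = {task: perf - imp for task, (perf, imp) in results.items()}
for task, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    flag = "PRIORITY" if gap < -1 else ""
    print(f"{task}: gap {gap:+.1f} {flag}")
```

Sorting by gap puts the neglected high-priority tasks at the top of the action-plan list, and large positive gaps flag effort spent on tasks respondents don’t value.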
Next: Analyze Your Survey Results, Create Effective Action Plans
Do most people accurately predict their 360 results? What’s the difference between an “Agree” and a “Strongly Agree” response? What’s considered “Excellent” in 360 responses? Should I communicate with respondents after I’ve received my 360 survey results?
Even very intelligent people can make big mistakes interpreting their survey results. We explain why this happens and provide insights about how to work with your survey results most effectively in Part 2 of this article.