Each year thousands of faculty produce their annual report using Digital Measures. Each year hundreds of institutions review those reports. Each year many of those institutions are missing out on the opportunity to improve the process, increase the value of the information they gather, and enhance faculty satisfaction with the system. The missing ingredient is a post-process survey.
Surveying faculty requires some thought and planning, but it isn't difficult and it can produce tangible results.
If you are not currently collecting feedback from your faculty, here are some foundational things you should consider:
Pick one process to start
If you leverage Digital Measures for multiple processes (Annual Review, Accreditation Reporting, Promotion and Tenure, Sabbatical Review, etc.), trying to tackle all of them at once may cause confusion and frustration. Identify one process to start with and execute the survey process completely before you review additional processes.
Know what you want to know and who you plan to ask
Before you begin collecting feedback, you should have clarity around the types of decisions, actions, or sentiment you want to address based on the responses. If you want to know how people feel about the process, you would ask different questions than if you wanted to capture responses to specific enhancements. We will share some sample questions you can leverage later in the article.
You will also need to know who you plan to survey. Do you want to ask faculty about their experience creating an annual report or do you want to survey Department Chairs about whether the report that is generated is helping them make better decisions? Also, if you provide training or education on how to perform a process, you may want to segment your audience. Faculty who are trying to complete an annual review without any training will likely give you very different input from someone who has received some preparatory support or documentation.
Inform faculty of your plans
Letting faculty know that you will be asking for their feedback will help improve the quality of the responses. Tell them that you plan to survey, when you plan to survey, and most importantly WHY you want to survey. This will provide context for your effort and result in greater awareness and engagement. It will not hurt to explicitly call out how the results of the survey will be used to directly benefit your faculty. “What’s in it for me?” is an important motivator.
Consider the timing of your survey
Any feedback you collect on a process should closely follow the end of that process. If you leave too much time between the end of the process and your outreach to faculty, their recollection of the experience will degrade. At the same time, put yourself in their shoes. If your Annual Review process closes on a Friday and you send the survey out immediately after the deadline, it is unlikely that many faculty will take the time to respond that evening or over the weekend. If you are leveraging an online survey tool, you are likely using an email invitation to trigger that response. CoSchedule performed a review of 10 email response rate studies by marketing leaders and found the following:
- The best days to send email, in order of preference, are Tuesday, Thursday, and Wednesday.
- The best times of day to send your email, in order of preference, are 10:00am, 8:00pm to midnight, 2:00pm, and 6:00am.
Close the loop
This step is actually two in one. First, take action on the feedback you collect. To be sure, you will get some faculty who say “Stop doing Annual Reviews”. We know that is not an option. If you have planned your survey well though, you will get data and suggestions that point to clear opportunities for improvement. Take them. Second, inform your faculty about the changes you made based on their input. This is perhaps the most important action of all. Telling your faculty how you used the information they gave you to improve will increase the likelihood that they will respond again when you begin reviewing other processes. Doing this builds a cycle of trust.
Creating an Actionable Survey
After you have taken care of the basics, you will want to craft your survey. Remember, your survey should result in a decision or action that can be acted on and communicated back to faculty.
Here are some important points that will inform your survey design:
- Consider the number of people you plan to survey and the amount of time/resources you have to analyze the results.
- Quantitative questions will be easier to break down and many survey tools can provide cursory statistical breakouts for you. They will also allow you to pivot your data on certain audience demographics (training or no training, faculty rank, or department) that may prove insightful.
- Qualitative questions will give you more depth, but will take more time to aggregate and will take faculty more time to complete. Survey Monkey has provided a nice overview of the two question approaches.
- We strongly recommend keeping your survey short with a maximum of 10 questions. We have found our sweet spot to be around 5 questions. The longer your survey is, the more likely your faculty will not complete it.
We recommend leveraging a majority of quantitative questions and peppering in a few qualitative questions.
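To illustrate why quantitative questions make analysis easier, here is a minimal sketch of pivoting scores by a demographic field such as training status. The departments, flags, and scores are made-up example data, not a real Digital Measures export:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical responses: (department, received_training, 1-5 ease score)
responses = [
    ("Biology", True, 5),
    ("Biology", False, 2),
    ("History", True, 4),
    ("History", True, 5),
    ("History", False, 3),
]

# Group the scores by the training flag, then average each group.
by_training = defaultdict(list)
for dept, trained, score in responses:
    by_training[trained].append(score)

averages = {trained: round(mean(scores), 2) for trained, scores in by_training.items()}
print(averages)  # {True: 4.67, False: 2.5}
```

Most survey tools will produce this kind of cross-tab for you; the point is that a numeric scale plus a demographic column is all you need to spot, say, an untrained group struggling with the process.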
- College: Text Box
- Department: Text Box
While anonymous surveys may sound better, we recommend collecting at least a minimum set of details so you can analyze your responses across groups. If everyone is satisfied with your process except for one college or one department, your follow-up is clearly narrowed down and you can maximize your time.
You could ask faculty to self-identify with their email address if they would be open to taking part in optional user testing or feedback on the changes coming out of the survey.
We used a Likert Scale for many of these questions. Qualtrics wrote a short piece on the appropriate number of points to use in a Likert scale question.
- Overall, how easy was it for you to compile your annual report in Digital Measures? (Likert: 5 - Extremely Easy, 4 - Very Easy, 3 - Somewhat Easy, 2 - Slightly Easy, 1 - Not Easy at All)
- The training I received made using Digital Measures easier. (Likert: 7 - Strongly Agree, 6 - Agree, 5 - Somewhat Agree, 4 - Neither Agree nor Disagree, 3 - Somewhat Disagree, 2 - Disagree, 1 - Strongly Disagree)
- Do you feel your annual report accurately reflected the contributions you made this past year? (Likert: 7 - Very Accurate, 6 - Accurate, 5 - Somewhat Accurate, 4 - Neither Accurate nor Inaccurate, 3 - Somewhat Inaccurate, 2 - Inaccurate, 1 - Very Inaccurate)
- Was the data loaded on your behalf accurate? (Likert: 7 - Very Accurate, 6 - Accurate, 5 - Somewhat Accurate, 4 - Neither Accurate nor Inaccurate, 3 - Somewhat Inaccurate, 2 - Inaccurate, 1 - Very Inaccurate)
- There was a clear location to capture all of my activities. (Likert: 7 - Strongly Agree, 6 - Agree, 5 - Somewhat Agree, 4 - Neither Agree nor Disagree, 3 - Somewhat Disagree, 2 - Disagree, 1 - Strongly Disagree)
- How likely is it that you would recommend this annual review process to a colleague? (This is a 1-10 scale and a way for us to see a Net Promoter Score.)
- Please offer one specific suggestion of how your annual reporting process could be improved. (open text box)
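For the recommendation question above, the Net Promoter Score falls out of the raw responses with simple arithmetic. A minimal sketch with made-up scores, using the standard NPS convention that 9-10 counts as a promoter and 6 or below as a detractor:

```python
# Hypothetical 1-10 responses to the recommendation question.
scores = [10, 9, 9, 8, 7, 6, 3, 10]

promoters = sum(1 for s in scores if s >= 9)   # 9-10: promoters
detractors = sum(1 for s in scores if s <= 6)  # 6 and below: detractors

# NPS = % promoters minus % detractors, on a -100 to 100 scale.
nps = round(100 * (promoters - detractors) / len(scores))
print(nps)  # 25
```

A survey tool will usually report this for you, but knowing the formula makes it easy to recompute NPS for a single college or department when you slice the results.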
Additional resources from Digital Measures
We believe you should be surveying your faculty to improve your processes. To that end, we are happy to facilitate your survey and provide you with the results. We have access to a third-party tool (SurveyGizmo) and are happy to leverage it on your behalf. Contact your Client Success Manager. We can create the survey and provide you a link to give to your target audience.