Following are the speaking notes I used as a panelist sharing reflections from the field at the 2023 Canadian Society for the Study of Education (CSSE) Congress at York University. I proposed a four-factor model for organizations to evaluate and understand the response rates of their survey administrations.
Introduction
Surveys are a common tool used by school boards to meet a variety of needs, including ministry reporting, local planning, public engagement, and evaluation. Whereas large-scale data collection once required specialized training and access to specialized tools, the availability of desktop tools such as Google Forms, or popular apps such as SurveyMonkey and Alchemer, has made the process easy to access. Unfortunately, “easy access” to these tools has simultaneously created the expectation of “easy development”, “easy administration”, and “easy reporting”. It is often appreciated too late that the construction of a question, the decision of who should participate, and the limitations on how the data can be summarized all benefit from theoretical and experiential reflection. When poorly constructed, poorly communicated, and poorly administered, a survey can become a process that damages relationships, conceals issues, and impedes an organization’s decision-making capacity.
As noted by Doan, Conley, and Martinovic (2020), there are inequities in education research across school boards in Ontario. Larger school boards, particularly in the GTA, have research departments as part of central office staff, while smaller school boards tend to disperse research and legislated data collection to other portfolios as supplemental tasks. Within boards with research staff, the engagement of those staff is similarly variable, often as a result of where the department is positioned in the organization as well as the organization’s perception of the value that research staff bring to their roles and offer the organization. While it would be reasonable to expect that board researchers are engaged at the outset in the development of surveys and data collection tools, this is not always the case. Many times, researchers are drawn in only after a survey has been administered, as though their only value is in data analysis. Unfortunately, there is often little that can be done to mitigate or compensate for the limitations that are formalized in pre-survey decisions. Through discussions and reflections on these kinds of issues with colleagues in research networks such as AERO (Association of Educational Researchers of Ontario) and DUG (the Ontario Data User Group), a four-factor model of awareness, access, availability, and trust has emerged: one that supports the development of data collection when used pre-emptively, and supports root-cause analyses of limitations and constraints when used after the fact.
Overview of Four Factors
Following is a brief description of each factor in the model:
Awareness (Communication): This factor focuses on the communication and promotion of the survey, ensuring that potential respondents are well-informed about its existence, purpose, and significance. Effective communication strategies can increase awareness and motivate participants to engage with the survey.
Access (Equity): This factor pertains to the mode of administration and accommodations provided for the survey. By offering various formats and options that address different needs and preferences, researchers can make it more convenient for participants to engage with the survey and increase the likelihood of receiving responses.
Availability (Schedule): This factor refers to the participants’ schedule and their willingness to allocate time for completing the survey. By considering the timing, frequency, and duration of the survey, researchers can create a more participant-friendly experience and improve response rates.
Trust (Relationship): This factor signifies the relationship between the survey participants and the organization administering the survey. Trust plays a crucial role in participants’ willingness to share their opinions and experiences, and it extends beyond the data collection tool and administration considerations.
Underlying each factor are relationships, both those that already exist and the ways in which those relationships will be built up or torn down. Following is a more detailed reflection on each factor, how it relates to relationships, and how it expresses trust.
Factor Descriptions and Discussion of How They Contribute to Trust
Awareness: communication and promotion of the survey
Practices
Awareness is a critical factor in driving survey response rates, as it ensures that potential respondents are informed about the survey’s existence, purpose, and relevance. Effective communication and promotion strategies can increase awareness, motivating participants to engage with the survey and provide their input. To build awareness, a variety of practical strategies can strengthen the communication plan, such as:
- Utilize multiple channels: Employ a mix of communication channels, such as email invitations, social media posts, website announcements, or physical mailings, to reach a wider audience and increase the likelihood of reaching potential respondents.
- Personalize messaging: Tailor the messaging to resonate with the target audience by addressing their specific concerns or interests, using language that they are familiar with, and highlighting the potential impact of their participation on the issues that matter to them.
- Translations: Make the survey available in a variety of languages that are common to the community that is being engaged.
- Emphasize the survey’s significance: Clearly communicate the survey’s purpose, objectives, and relevance to the target audience. This can help potential respondents understand the importance of their participation and encourage them to respond.
- Send reminders: Use follow-up emails, notifications, or other reminders to keep the survey top-of-mind for potential respondents. Be mindful of the frequency and timing of reminders to avoid annoyance or disengagement.
- Leverage existing relationships: Partner with well-respected individuals or organizations within the target population to help promote the survey and lend credibility to the research effort.
Impact on Response Rates and Relationships
It is straightforward to reflect on how a weak or inconsistent approach to communication can result in low response rates. Although participants may have opinions and experiences they would be interested in sharing, it is difficult to participate in a survey they are unaware of. Not only is an opportunity to engage lost, but there is also a risk of marginalizing the community the survey was intended to engage. If an organization does not have a clear appreciation of its communication gaps, it can create the impression that the disengagement is a community issue rather than an organizational issue. In addition to raising awareness of the survey, there is another channel of communication simultaneously at play: the communication of the organization’s awareness through the content of the survey.
While the communication plan is designed to raise participants’ awareness of the collection opportunity, the survey tool itself communicates the organization’s awareness of the community it is engaging and the issues it is seeking feedback on. When raising awareness about a survey, the standard invitation highlights why a person is being invited, how the collected data will be used, the authority the organization has to gather the information, and contact information for anyone with questions. Moving past this front matter, the framing of questions, the selection and exclusion of response categories, and the choice of question format deliver organizational messaging and considerations of visibility and representation.
With respect to identity questions, organizations that are unaware of their communities risk silencing participant voices by excluding their representation from question categories. Response misattribution is also a considerable risk when surveys do not include open-ended response options: participants who cannot find themselves represented in the question categories have no opportunity to identify themselves and may be left selecting a category that misrepresents their experience. Reflecting on question categories provides an opportunity to identify how the survey could, intentionally or not, silence communities.
Given the weight of communication that the survey design carries, at least in the education context, posting the questions for public review prior to administration provides opportunities for building relationships. Posting questions for review supports transparency, builds accountability, and engages potential participants. For some participants, the survey may be the first time they have encountered a concept, making the survey tool their first point of contact or engagement. Public review also provides an opportunity for the organization to be made aware of its blind spots, as well as an opportunity to own the issue, make a change, and honor the engagement.
Access: mode of administration and accommodations
Practices
Access addresses the ease and convenience with which potential respondents can engage with the survey. By offering a variety of administration modes and accommodations, organizations can address and support different needs and preferences, ultimately increasing the likelihood of receiving responses. Providing strong support for participant access involves incorporating a variety of practical strategies such as:
- Multiple modes of administration: This can include online surveys, telephone interviews, mailed questionnaires, or in-person interviews. Offering multiple modes of administration allows participants to choose the option that best suits their preferences and circumstances. For example, online surveys are often more convenient and accessible for many respondents due to their ease of use and flexibility, while telephone or in-person interviews may be preferred by those who are less comfortable with technology or have accessibility concerns.
- Accommodations options: Another important component of access is ensuring that the survey is inclusive and accommodating for individuals with varying needs. This may involve providing alternative formats such as large print, Braille, or audio recordings for visually impaired participants, offering translations for non-English speakers, or ensuring that online surveys are compatible with screen readers and other assistive technologies. By addressing these accessibility concerns, organizations can create a more inclusive survey experience and encourage a wider range of participants to respond.
- Survey construction: The design and structure of the survey itself can impact access. This includes factors such as the length of the survey, question format, and navigation. A well-designed, user-friendly survey that is easy to understand and complete can significantly improve response rates by reducing the barriers to participation.
Impact on Response Rates and Relationships
Whatever advantages are gained through a clear communication plan, they will be negated by any barrier to access. Making community members aware of opportunities to provide feedback, and then offering the survey only in English when they are not fluent in English, or only online when they have no access to a computer, effectively revokes the opportunity to share experiences. It is straightforward to see how these experiences lead quickly to disengagement.
Availability: participant’s schedule and willingness
Practices
Availability addresses the participants’ ability and willingness to allocate time for completing the survey. By considering the timing, frequency, and duration of the survey, researchers can create a more participant-friendly experience that encourages higher response rates. Practical considerations include:
- Timing of administration: Distributing a survey during a busy period or a time when potential respondents are less likely to be available can result in lower participation. To maximize availability, organizations should consider factors such as holidays, weekends, or peak work hours when scheduling the survey distribution. Offering a sufficient response window also ensures that participants have enough time to complete the survey at their convenience.
- Survey Frequency and overlap: Over-surveying a target population can lead to survey fatigue, making individuals less inclined to participate. Striking a balance between gathering necessary data and not overwhelming respondents is crucial. Consideration should be given not only to when the last survey was administered but whether there is any overlap with other data collection schedules.
- Survey length: The duration of the survey also plays a role in participants’ willingness to engage. Lengthy surveys can be off-putting, leading to lower response rates or incomplete submissions. To maximize availability, organizations should aim to keep surveys concise and focused on the most important questions. If a lengthy survey is unavoidable, providing an estimated completion time and progress indicators can help manage expectations and encourage respondents to finish the survey.
By addressing availability through thoughtful scheduling, frequency, and duration, researchers can create a more respondent-friendly survey experience and improve response rates.
Impact on Response Rates and Relationships
Building awareness of data collection opportunities and ensuring appropriate accommodations can easily be scuttled when community schedules and timelines are a blind spot for organizations. When there are competing demands on participants’ time, they are forced to decide how to prioritize the time to complete the survey. This act of triaging time is quick, and in most cases the decision is not in favor of the survey. This quick assessment of how much importance to place on participating is made in terms of the relationship the person has with the organization. Typically, it is only those with a very strong relationship with the organization (either good or bad) who participate in the survey rather than dismissing the invitation.
Trust: relationship with the organization administering the survey
Practices
Trust is a critical factor related to response rates, as it influences the willingness and motivations of participants to share their opinions, experiences, and personal information. Establishing and maintaining trust between survey participants and the organization administering the survey is crucial for ensuring higher response rates and obtaining reliable data. Some of the considerations include:
- Transparency: Researchers should clearly communicate the survey’s purpose, objectives, and the organization behind the study. Providing details on how the collected data will be used, stored, and protected can alleviate concerns and increase participants’ willingness to engage. Additionally, it is essential to inform participants about any potential risks, benefits, or consequences associated with their participation.
- Confidentiality: Researchers must ensure that participants’ responses are kept confidential and that any personally identifiable information is protected. This may involve using secure data storage and transmission methods, anonymizing data, and implementing robust data protection policies. Clearly communicating these measures to participants can enhance trust and encourage more candid responses.
- Rapport: Building rapport may involve using personalized communication, expressing gratitude for participation, and demonstrating empathy and understanding of participants’ perspectives. If possible, providing a contact person or a support team that participants can reach out to with questions or concerns can further strengthen trust and rapport.
- Reputation and credibility: Researchers should ensure that their organization has a strong track record of ethical conduct, transparency, and expertise in the subject matter. Collaborating with well-respected partners, institutions, or experts can further enhance credibility and trust in the survey.
Impact on Response Rates and Relationships
The issue of trust has two facets for consideration. As highlighted in the previous section, there are actions that can be taken in an attempt to assure or gain the trust of participants. On the other hand, the relationship participants have with the organization, based on past experiences and engagement, is a significant part of the calculus of whether to participate. Every invitation to share information through a survey implicitly asks participants to trust the organization with the information and experiences they share. Consequently, it is important to take time to reflect on whether the organization is worthy of that trust. This reflection is about more than the statements made to cover privacy or research ethics; it is about the relationships that individuals and communities have had with the organization. Understanding these relationships requires a high level of engagement with communities and an inclination to “own it”. Following are reflections on how past actions and relationships influence the decision not to participate.
- Failure to act: When participants have shared their experiences and do not see decisions or actions being informed by what was shared, they will be less inclined to participate in future surveys.
- Silenced: When experiences and opinions that are shared through a survey are not surfaced in a report or presentation, community members will be less inclined to participate in future surveys.
- Myth of objectivity: Whoever administers the data collection will either leverage existing relationships or be hidden by the institutional form. Central coordination of data collection has benefits in terms of economies of scale and access, and is periodically presented as being “objective”. However, the feedback being sought is often subjective and centered in the relationships community members have with the organization. When relationships are leveraged to administer the data collection, there is the potential for a more meaningful invitation and an opportunity to start to heal broken relationships. That healing begins with acknowledging the issues the data collection is trying to understand. In some cases, this opportunity will only become available in subsequent data collections, once community members can see whether the commitment expressed has been manifested through action.
- Dismissed due to response rates: When feedback is dismissed due to small sample size, it silences and marginalizes the feedback that has been provided by those who participated. Not only does this make participants feel silenced, it also validates the non-participation of community members who are cynical about whether the process will lead to action, change or opportunities. While non-response bias is a critical consideration for the kinds of decisions that can be made based on the data, dismissing the data entirely on this point risks creating a feedback loop of non-participation where new participants feel marginalized and less inclined to participate and previous participants feel justified in their non-participation.
- Myth of satisfaction: The myth of satisfaction arises from the fallacious assertion that non-participation in a survey reflects a lack of concern or issues in the community. It is built on the idea that only those with issues or concerns are motivated enough to participate. It does not require a trained ear to discern that the silence of disengagement sounds exactly the same as the silence of satisfaction.
- Repetition of collection: Returning to a community to ask the same questions under different initiative titles, motivations, and reporting requirements strengthens the perception that previous feedback has not been heard. Survey fatigue carries important relationship considerations.
- Own it: Difficult feedback needs to be owned, which means it is heard and it is understood. In the moment of feedback, hearing and understanding are achieved through active listening; it is not a time for rationalization, correction, or the sharing of competing stories. In the process of reporting, owning it means communicating results in such a way that all who participated can see themselves represented in the report and know they have been genuinely considered in the decision-making process.
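To make the tension around small sample sizes concrete, the interaction between a low response rate and non-response bias can be sketched with a small simulation. Every number below is hypothetical, chosen only to illustrate how a skew in who responds distorts what a survey observes:

```python
import random

random.seed(42)

# Hypothetical community of 1,000 members; 60% are satisfied (True).
# These figures are illustrative only, not drawn from any real survey.
population = [True] * 600 + [False] * 400

def responds(satisfied):
    # Assume dissatisfied members are three times as likely to respond:
    # a purely illustrative weighting of 5% vs. 15%.
    return random.random() < (0.05 if satisfied else 0.15)

responses = [member for member in population if responds(member)]

response_rate = len(responses) / len(population)
observed_satisfaction = sum(responses) / len(responses)

print(f"Response rate: {response_rate:.1%}")
print(f"Observed satisfaction: {observed_satisfaction:.1%}")
print("True satisfaction: 60.0%")
```

The sketch illustrates both cautions in the bullet above: with a single-digit response rate and responders skewed toward those with concerns, the observed satisfaction falls well below the true community figure, so the bias is real; yet the responses collected still carry the actual experiences of those who participated and should not simply be discarded on account of the rate.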
When engaged in this work, it is important to be aware of a seemingly negative consequence of successfully building trusting relationships that support data collection. As communities see their experiences reflected in the reporting, decisions, and actions of an organization, they are more inclined to engage or re-engage in future data collection. This renewed and increased engagement often means the sharing of more negative experiences, in the expectation that they will be heard and that actions will be taken. While community members may see this as a positive step in their relationship with the organization, an increase in negative feedback can feel demoralizing to the staff of an organization working to support their communities. As difficult as this work is, the opportunity for organizations to build trust with communities and increase response rates does not begin with the communication of the current survey opportunity, but rather with the actions and follow-through that resulted from previous surveys. With this in mind, every decision in the survey development process becomes an act of building trust and restoring relationships. Given these considerations, when the factors of awareness, access, and availability are accounted for, the response rate can be viewed as a proxy measure of trust and of the health of community-organization relationships.

