Three Ways to Analyze Write-in Responses to Update Survey Items

Bridget Yuhas, Allison BrckaLorenz, John Zilvinskis — At the most recent Association for Institutional Research (AIR) conference in Washington, D.C., Center for Postsecondary Research analyst Allison BrckaLorenz and project associates Bridget Yuhas and John Zilvinskis presented a session on analyzing write-in survey responses to improve response options. Aimed at IR professionals seeking guidance on how to approach write-in analysis, the session examined three questions on the Faculty Survey of Student Engagement (FSSE) that include write-in response options (Table 1), applying three different analytic approaches to more than 7,000 total responses.

Table 1

FSSE Question                                                Responses

The presenters each undertook a different method of analysis to examine these responses. Allison read through every response, hand-coding and grouping the data into four categories: common, less common, uncommon, and frustrating. Bridget loaded the data into NVivo, a qualitative analysis tool, read through each response, and coded responses into categories unique to each question. John used a data mining tool, RapidMiner, to search for frequently occurring words in each response set.
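The word-frequency approach John applied in RapidMiner can be approximated in a few lines of Python. The sketch below is illustrative only: the sample responses are invented, the stopword list is minimal, and the actual FSSE data and RapidMiner process are not reproduced here.

```python
from collections import Counter
import re

# Hypothetical write-in responses, invented for illustration;
# the real FSSE response sets are not shown in this article.
responses = [
    "I teach part-time as an adjunct instructor",
    "Adjunct faculty, part-time",
    "Clinical professor and part-time lecturer",
]

# A tiny stopword list; a real analysis would use a fuller one.
STOPWORDS = {"i", "as", "an", "and", "a", "the"}

def word_frequencies(texts):
    """Count how often each word appears across all responses."""
    counts = Counter()
    for text in texts:
        # Lowercase and keep hyphenated terms (e.g., "part-time") intact.
        words = re.findall(r"[a-z]+(?:-[a-z]+)*", text.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts

freqs = word_frequencies(responses)
print(freqs.most_common(3))  # the top terms and their counts
```

Even this simple count surfaces the dominant terms ("part-time", "adjunct") that a researcher might then consider as new closed-ended response options.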

Without conferring beforehand, the researchers all came to similar conclusions about which terms appeared most frequently in the responses. However, NVivo yielded perhaps the most complete and actionable results, allowing for more nuance than RapidMiner and a more evidence-based comparison of response groups than reading and note-taking alone. While each method has pros and cons (Table 2), the best method for other researchers will depend on their reporting needs, decision-making structures, intended uses for the data, and the resources of the office where they work.

Table 2

Method of Analysis                     Pros                                         Cons
Options for editing these items included revising the question stem, adding "rollover" definitions to the online survey, adding response options, and expanding the parenthetical examples. Other challenges discussed included the lack of a universal language for describing the varied positions and roles in higher education and the ever-changing landscape of higher education degrees, courses, and types of employment. Click here to view the PowerPoint presentation from this session, and here to learn more about FSSE in general.


Yuhas, B., BrckaLorenz, A., & Zilvinskis, J. (2017, May). Using write-in responses to improve survey measures. Program presented at the AIR Annual Forum, Washington, DC.