ABSTRACT

The University of Toronto and Dalhousie University used dictionary- and sentiment-based text analytics developed by Explorance to analyse qualitative student evaluation of teaching data. Both teams analysed large volumes of student comments at their respective institutions. Text analytics proved particularly effective at narrowing down and flagging potential comments of concern for human review. Such comments were rare but important to identify; they included offensive student comments, reports of problematic instructor language or conduct, and serious concerns regarding mental health and/or safety. The authors reflect on lessons learned in the design and use of text analytics in the context of student evaluations, including the need for transparency about a tool’s assumptions and for associating text analytic themes with specific comments during human review. A number of practice and policy recommendations also emerged, including the need to make data usable by stakeholders and to engage stakeholders with relevant expertise and context so that identified comments of concern can be acted upon in a timely manner.