4 tips on getting college students to fill out course evaluations

I get close to a 100% response rate on my course evaluations. Why? I apply what I know about social psychology, usability, and user engagement:

– I send an email announcement with the course evaluation link. Students are more likely to pay attention to a message from an authority figure and regular fixture in their lives (me) than to an anonymous form email from the administration.

– I create a course assignment in our Learning Management System (Canvas) containing the evaluation link, due on the final day of the course evaluation window. This is the same design pattern that I have used the entire semester to remind them of deliverables and nudge timely submissions. They have formed the habit of checking off the to-do list that is visible in the LMS sidebar. (A rough automation sketch appears after this list.)

– I offer 1 extra credit point to each student if the class reaches a 100% response rate on the evaluation survey. This is advertised in the email announcement, in the LMS course assignment, and in my in-class lecture. A participation incentive that recruits actually want is a key motivator in all of my survey research, and it works here too. Plus, it activates both altruism and self-interest: students help their classmates by taking the survey themselves and by personally encouraging others to take it as well.

– I give them 10 minutes in the last class meeting to fill out the course evaluation, if they haven’t already done so. Most have already formed the intention to act based on my previous steps but have not yet acted on it. Making it a class activity with dedicated time lessens the inertia that lets them put it off in favor of other urgent deadlines. I step out of the room, though, to mitigate the social pressure of having me present as they fill it out.
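
For instructors who want to automate that LMS nudge across multiple sections, here is a minimal sketch using the Canvas REST API’s create-assignment endpoint (POST /api/v1/courses/:course_id/assignments). The host, token, course ID, evaluation URL, and due date below are placeholders, not values from my own setup.

    # Minimal sketch: create a Canvas assignment that points students to the
    # course evaluation link, due on the last day of the evaluation window.
    # All identifiers below are placeholders to fill in for your own course.
    import requests

    CANVAS_BASE = "https://canvas.example.edu"        # your institution's Canvas host
    API_TOKEN = "YOUR_API_TOKEN"                      # generated under Account > Settings
    COURSE_ID = 12345                                 # hypothetical course ID
    EVAL_URL = "https://evals.example.edu/my-course"  # placeholder evaluation link

    payload = {
        "assignment[name]": "Complete the course evaluation",
        "assignment[description]": (
            f'<p>Please fill out the course evaluation: '
            f'<a href="{EVAL_URL}">{EVAL_URL}</a></p>'
        ),
        "assignment[submission_types][]": "none",      # nothing to submit; it is a reminder
        "assignment[points_possible]": 0,
        "assignment[due_at]": "2024-12-06T23:59:00Z",  # last day of the evaluation window
        "assignment[published]": "true",               # make it visible in the to-do sidebar
    }

    resp = requests.post(
        f"{CANVAS_BASE}/api/v1/courses/{COURSE_ID}/assignments",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        data=payload,
    )
    resp.raise_for_status()
    print("Created assignment:", resp.json()["html_url"])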

The end result is that my course evaluations are more balanced than if only the students with a grudge against me filled them out. I can trust the results as a true cross-section of students’ assessments of my work.

‘What Drives SMiShing Susceptibility? A U.S. Interview Study of How and Why Mobile Phone Users Judge Text Messages to be Real or Fake’ – paper at SOUPS 2024

My PhD student Sarah Tabassum is here with me at the Symposium on Usable Privacy and Security in Philadelphia, PA, USA, presenting our paper during Tuesday’s Mobile Security block: “What Drives SMiShing Susceptibility? A U.S. Interview Study of How and Why Mobile Phone Users Judge Text Messages to be Real or Fake.”

For this study, we interviewed 29 people (half students, half from off campus) about how they make sense of the flood of strange messages they receive on their phones. Texts with links were commonly seen as “fake” (bad news for the political campaign trying to advertise a pre-primary rally!).

As an Apple user, I was surprised and pleased that Android owners get interface warnings of possible spam or scam texts (see pic). However, there’s no way to report such messages. (iPhone has a “Report Junk” option, but neither platform has a dedicated “Report Smish” button.)

Screenshot of an Android phone screen showing the “Why this Looks Like Spam” notification for a text message claiming to be from Chase bank.

Our SPEX Lab group is now thinking about how to better support mobile users in making sense of these messages and learning how to spot scam SMS texts (“smishing” = SMS + phishing).

Something to know – scammers now often will not send a fake link in the first text. Instead, they “soft sell,” building trust with a series of messages. Once you reply, THEN they text the link to steal your credentials – or call and claim to be security personnel investigating the text!

  • Sarah Tabassum, Cori Faklaris, and Heather Richter Lipford. 2024. What Drives SMiShing Susceptibility? A U.S. Interview Study of How and Why Mobile Phone Users Judge Text Messages to be Real or Fake. In Proceedings of the 20th Symposium on Usable Privacy and Security. Retrieved June 25, 2024 from https://www.usenix.org/conference/soups2024/presentation/tabassum-sarah

“A Framework for Reasoning about Social Influences on Security and Privacy Adoption” – new for CHI 2024

This framework gives structure to what is known in the literature and the SIGCHI community about the social-psychological drivers of security and privacy adoption.

Pleased to be getting a publication out from my thesis work! This short paper and poster recap the initial work to synthesize a framework that provides structure to the growing literature on social cybersecurity.

Many usable security solutions exist (such as using password managers or reporting phishing scams), but people often are not fully aware of what they do, or do not use them regularly. A conceptual model of the adoption process will help us identify where people get stuck and how to leverage social influences to encourage secure behaviors. We will be able to form and test hypotheses and improve our designs.

Toward this goal, we have developed a framework that synthesizes our design ideation, expertise, prior work, and new interview data (N=17) into a six-step adoption process with path relationships, associated social influences, and obstacles. 

This work contributes a prototype framework that accounts for social influences at each step. It adds to what is known in the literature and the SIGCHI community about the social-psychological drivers of security adoption.

Future work (from my lab, but hopefully others’ too) should establish whether this process is the same regardless of culture, demographic variation, or work vs. home context, and whether it is a reliable theoretical basis and method for designing experiments and focusing efforts where they are likely to be most productive.

  • Cori Faklaris, Laura Dabbish, and Jason I. Hong. 2024. A Framework for Reasoning about Social Influences on Security and Privacy Adoption. In Extended Abstracts of the ACM Conference on Human Factors in Computing Systems (CHI EA 2024), May 11-16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 13 pages. Available at: https://corifaklaris.com/files/framework_chi2024.pdf