Idea # 1 for future work: Investigating the role of resistance in cybersecurity adoption

I’ve noticed a peculiar pattern – or more accurately, non-pattern – in all my studies of usable security. At every step of adoption, people exhibit resistance to adopting cybersecurity measures (such as installing password managers or creating unique passwords for each online account). I expected to see a lot of resistant attitudes in people who do not adopt security practices, or those who only adopt security practices if mandated to do so. But resistance is also high among research participants who have voluntarily adopted security practices and who seem very engaged with cybersecurity overall.

As it happens, I have already developed a measure of security resistance that can help in these studies. Take the average of participants’ Likert-type survey ratings on these four items (1=Strongly Disagree to 5=Strongly Agree) [handout]:

  • I am too busy to put in the effort needed to change my security behaviors.
  • I have much bigger problems than my risk of a security breach.
  • There are good reasons why I do not take the necessary steps to keep my online data and accounts safe.
  • I usually will not use security measures if they are inconvenient.
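For researchers who want to automate the scoring step described above, here is a minimal sketch in Python. The function name and validation checks are my own illustration, not part of any published scale materials; the scoring itself follows the averaging rule given above.

```python
def score_sa_resistance(ratings):
    """Score the SA-Resistance scale as the mean of the four item ratings.

    `ratings` holds one participant's responses to the four resistance
    items on the 1-5 Likert scale (1=Strongly Disagree ... 5=Strongly
    Agree). Higher scores indicate greater resistance.
    """
    if len(ratings) != 4:
        raise ValueError("Expected ratings for all four resistance items")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("Ratings must be on the 1-5 Likert scale")
    return sum(ratings) / len(ratings)

# Example: a participant who mostly agrees with the resistance items
print(score_sa_resistance([4, 4, 3, 5]))  # 4.0
```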

So far, I have found that resistance alone is not a reliable differentiator of someone’s level of cybersecurity adoption. For example, in my working paper describing the development and validation of the SA-13 security attitude inventory, I find that my SA-Resistance scale (the one described above) is significantly negatively associated with self-reported Recalled Security Actions (RSec), but also significantly positively associated with Security Behavior Intentions (SeBIS). By contrast, in a more recent survey (forthcoming), I found that a measure similar to SA-Resistance was significantly positively associated with a self-report measure of password manager adoption, but significantly negatively associated with a measure of being in a pre-adoption stage similar to intention. Faye Kollig, a research assistant during the 2021 REU program, also found no significant differences in resistance among participants in our interview study identifying commonalities in security adoption narratives.

At the same time, adding these resistance items to those measuring concernedness, attentiveness, and engagement (the SA-13 inventory) appears to create a reliable predictor of security behavior. In a study at Fujitsu Ltd., Terada et al. found a significant correlation (p<.001) between SA-13 and actual security behavior for both Japanese and American participants that was stronger than the corresponding correlation for SA-6. The authors speculate that this is because of the inclusion of the resistance items.

Is it consistently the case that resistance only helps to differentiate someone’s level of adoption if it is balanced against other attitudes? Is some other mechanism responsible? I hope to follow up on these results with a student assistant when I join UNC Charlotte’s Department of Software and Information Systems this fall as an assistant professor.

Coincidentally, the New Security Paradigms Workshop has a theme this year of “Resilience and Resistance.” I may submit to the workshop myself, but I also hope that other prospective attendees will find my resistance scale of use in their work.

‘A Self-Report Measure of End-User Security Attitudes (SA-6)’: New Paper

This month is a personal milestone – my FIRST first-author usability research paper is being published in the Proceedings of the Fifteenth USENIX Symposium on Usable Privacy and Security (SOUPS 2019).

I will present on Monday, Aug. 12, in Santa Clara, Calif., USA, about my creation of the SA-6 psychometric scale. This six-item scale is a lightweight tool for quantifying and comparing people’s attitudes about using expert-recommended security measures. (Examples of these include enabling two-factor authentication, going the extra mile to create longer passwords that are unique to each account, and taking care to update software and mobile apps as soon as patches are available.)

The scale itself is reproduced below (download the PDF at ):

  • Generally, I diligently follow a routine about security practices.
  • I always pay attention to experts’ advice about the steps I need to take to keep my online data and accounts safe. 
  • I am extremely knowledgeable about all the steps needed to keep my online data and accounts safe. 
  • I am extremely motivated to take all the steps needed to keep my online data and accounts safe.
  • I often am interested in articles about security threats. 
  • I seek out opportunities to learn about security measures that are relevant to me.

Response set: 1=Strongly disagree, 2=Somewhat disagree, 3=Neither disagree nor agree, 4=Somewhat agree, 5=Strongly agree. Score by taking the average of all six responses.
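The scoring procedure can be sketched in Python as follows; the function name, the check on item count, and the sample data are illustrative assumptions of mine, not part of the published scale.

```python
from statistics import mean

def score_sa6(responses):
    """Score SA-6 as the mean of the six item ratings (1-5 Likert scale).

    Higher scores indicate a more positive attitude toward using
    expert-recommended security measures.
    """
    if len(responses) != 6:
        raise ValueError("SA-6 requires responses to all six items")
    return mean(responses)

# Hypothetical participants' responses to the six items
participants = {
    "P1": [4, 5, 3, 4, 4, 5],
    "P2": [2, 3, 2, 3, 1, 2],
}
scores = {pid: score_sa6(r) for pid, r in participants.items()}
print(scores)  # P1 averages about 4.17; P2 averages about 2.17
```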

If you are a researcher who can make use of this work, please download our full research paper and cite us as follows: Cori Faklaris, Laura Dabbish and Jason I. Hong. 2019. A Self-Report Measure of End-User Security Attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). USENIX Association, Berkeley, CA, USA. DOI: 10.13140/RG.2.2.29840.05125/3.

Many thanks to everyone who helped me develop and bring this project in for a landing, particularly Laura and Jason, Geoff Kaufman, Maria Tomprou, Sauvik Das, Sam Reig, Vikram Kamath Cannanure, Michael Eagle, and the members of the Connected Experience and CHIMPS labs at Carnegie Mellon University’s Human-Computer Interaction Institute. Funding for our Social Cybersecurity project is provided by the U.S. National Science Foundation under grant no. CNS-1704087.