‘Normal and Easy: Account Sharing Practices in the Workplace’ – new paper for CSCW 2019

Drumroll … I now am a co-author on an archival publication in the lead venue for social computing!

Our paper, “Normal and Easy: Account Sharing Practices in the Workplace,” is being published this month in Proceedings of the ACM on Human-Computer Interaction, Vol. 3, CSCW. This is part of the ACM Conference on Computer-Supported Cooperative Work and Social Computing – which is what most of my life’s work in information technology and media has revolved around.

However, as much as I might want to be present, I am also practicing good self-care this fall – and part of that is limiting my travel so that I don’t run myself ragged trying to be in different places while also keeping up with my research and personal life! So, my advisor Laura Dabbish is presenting this research on Wed., Nov. 13, in Austin, Texas, USA.

For this research paper, we conducted two online surveys. In Study 1, we asked people a series of open-ended questions to elicit their sharing practices and start to zero in on their key pain points. In Study 2, we collected a series of closed-ended items to gather specific details about how and why people shared digital accounts with colleagues and what their specific struggles were with those activities. We have posted these survey protocols on our website at https://socialcybersecurity.org/files/WorkplaceSharing_OpenEndedShort_Qualtrics.pdf and https://socialcybersecurity.org/files/WorkplaceSharing_ClosedEndedLong_Qualtrics.pdf.

Our results demonstrate that account sharing in the modern workplace serves as a norm rather than a simple workaround (“normal and easy”), with the key motivations being to centralize collaborative activity and to reduce the work needed to manage the boundaries around these collaborative activities.  

However, people still struggle with a number of issues: lack of activity accountability and awareness, conflicts over simultaneous access, difficulties controlling access, and collaborative password use. (Hands up, anyone who has a sticky note taped in their work space to share passwords for key accounts?)

Our work provides insights into the current difficulties people face in workplace collaboration with online account sharing, as a result of inappropriate designs that still assume a single-user model for accounts. We highlight opportunities for CSCW and HCI researchers and designers to better support sharing by multiple people in a more usable and secure way.

This is a BIG paper, so I’ll stop restating the abstract and send you to the link on our website: 

  • Yunpeng Song, Cori Faklaris, Zhongmin Cai, Jason I. Hong, and Laura Dabbish. 2019. Normal and Easy: Account Sharing Practices in the Workplace. In Proceedings of the ACM on Human-Computer Interaction, Vol. 3, CSCW, November 2019. ACM, New York, NY, USA. Available at: https://socialcybersecurity.org/files/CSCW2019_NormalAndEasy.pdf 

‘A Self-Report Measure of End-User Security Attitudes (SA-6)’: New Paper

This month is a personal milestone – my FIRST first-author usability research paper is being published in the Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019), a USENIX venue.

I will present on Monday, Aug. 12, in Santa Clara, Calif., USA, about my creation of the SA-6 psychometric scale. This six-item scale is a lightweight tool for quantifying and comparing people’s attitudes about using expert-recommended security measures. (Examples of these include enabling two-factor authentication, going the extra mile to create longer passwords that are unique to each account, and taking care to update software and mobile apps as soon as these patches are available.)

The scale itself is reproduced below (download the PDF at https://socialcybersecurity.org/sa6.html ):

  • Generally, I diligently follow a routine about security practices.
  • I always pay attention to experts’ advice about the steps I need to take to keep my online data and accounts safe. 
  • I am extremely knowledgeable about all the steps needed to keep my online data and accounts safe. 
  • I am extremely motivated to take all the steps needed to keep my online data and accounts safe.
  • I often am interested in articles about security threats. 
  • I seek out opportunities to learn about security measures that are relevant to me.

Response set: 1=Strongly disagree, 2=Somewhat disagree, 3=Neither disagree nor agree, 4=Somewhat agree, 5=Strongly agree. Score by taking the average of all six responses.
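For anyone scoring SA-6 programmatically, the averaging step above can be sketched in a few lines of Python. This is just an illustrative helper (the function name and validation are mine, not part of the published scale):

```python
def score_sa6(responses):
    """Compute an SA-6 score as the mean of six Likert-item responses.

    Each response is an integer from 1 (Strongly disagree)
    to 5 (Strongly agree).
    """
    if len(responses) != 6:
        raise ValueError("SA-6 requires exactly six item responses")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("Each response must be on the 1-5 Likert scale")
    # The published scoring rule: the simple average of all six items.
    return sum(responses) / 6

# Example: a respondent who mostly agrees with the items
print(score_sa6([4, 5, 3, 4, 4, 5]))  # → 4.166666666666667
```

Higher averages indicate more positive security attitudes; since all six items are positively worded, no reverse-scoring is needed.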

If you are a researcher who can make use of this work, please download our full research paper and cite us as follows: Cori Faklaris, Laura Dabbish and Jason I. Hong. 2019. A Self-Report Measure of End-User Security Attitudes (SA-6). In Proceedings of the Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019). USENIX Association, Berkeley, CA, USA. DOI: 10.13140/RG.2.2.29840.05125/3.

Many thanks to everyone who helped me develop and bring this project in for a landing, particularly Laura and Jason, Geoff Kaufman, Maria Tomprou, Sauvik Das, Sam Reig, Vikram Kamath Cannanure, Michael Eagle, and the members of the Connected Experience and CHIMPS labs at Carnegie Mellon University’s Human-Computer Interaction Institute. Funding for our Social Cybersecurity project is provided by the U.S. National Science Foundation under grant no. CNS-1704087.

Tips from my online survey research in 2018

I have a special affinity for quantitative research. Specifically: I LOVE online survey work! It’s a very efficient method to gather lots of data at scale that is almost automatically formatted for easy analysis and visualization.

Even though I have past experience with designing, collecting and analyzing survey data around current events, politics and marketing, my academically focused work this year really sharpened my survey skills. I used Qualtrics almost exclusively as the online platform for creating and administering surveys, and I recruited larger samples than I ever have before using Amazon Mechanical Turk (MTurk), Qualtrics’ own panel aggregation and a study pool run by Carnegie Mellon University’s Center for Behavioral and Decision Research (CBDR). My collaborators and I also recruited participants through US-based SurveyMonkey; Prolific Academic, a UK-based company that also recruits US-based workers; and the QQ and SoJump survey platforms based in China.

The increased stakes and many unknowns I encountered as my research evolved led me to reach out on social media, at workshops and in my own labs for help and for new ideas. I also reached out to MTurk workers for advice on fine-tuning my surveys, and I signed up myself as an MTurk worker to see how it looks from the other side. Below, I share some of what I learned during this year’s academic work that may also benefit your own survey research.
