I did a podcast! ‘Cybercrime Conversations #12 – Social Cybersecurity’

It was a pleasure to speak this week with Rod Graham, an assistant professor of sociology and criminal justice at Old Dominion University, about my Social Cybersecurity research and my lifelong journey to Carnegie Mellon University. We also talked a fair amount about Zen Buddhism – it turns out he and I have that in common, too. Small world!

My ideas for ‘Theory-Driven Interface Design Strategies to Address “False News” on Social Media’

I have enjoyed my work for the past two years on our Social Cybersecurity project at the Human-Computer Interaction Institute at Carnegie Mellon University. Building on my news and social media background, I’ve also been working on some specific ideas for design strategies to address viral hoaxes, rumors and disinformation/misinformation in social computing systems. Many thanks to HCI faculty Niki Kittur and Geoff Kaufman for providing ideas for prior work to incorporate into these strategies, and to Kathleen M. Carley for her perspective as a computational sociologist.

Poster for Knight Foundation site visit to Carnegie Mellon University, April 8, 2019. Abstract: Non-expert users and experts such as journalists alike can have trouble judging the quality of the content and sources that they encounter in social media. Current interface designs may not be leveraging what we know about how users perceive and judge information when they are multitasking or quickly scanning a display. Our work aims to create new design guidelines for helping busy users to assess false news, unverified rumors and hoaxes in two contexts: (1) helping users to make their own judgment of which specific content should not be trusted; and (2) aiding users in judging the credibility of information sources found in social media.

Today I will debut these ideas in public for the first time and discuss them with people from the Knight Foundation. Onward!

Tips from my online survey research in 2018

I have a special affinity for quantitative research. Specifically: I LOVE online survey work! It’s a very efficient method to gather lots of data at scale that is almost automatically formatted for easy analysis and visualization.

Even though I have past experience with designing, collecting and analyzing survey data around current events, politics and marketing, my academically focused work this year really sharpened my survey skills. I used Qualtrics almost exclusively as the online platform for creating and administering surveys, and I recruited larger samples than ever before using Amazon Mechanical Turk (MTurk), Qualtrics’ own panel aggregation and a study pool run by Carnegie Mellon University’s Center for Behavioral and Decision Research (CBDR). My collaborators and I also recruited participants through US-based Survey Monkey; Prolific Academic, a UK-based company that also recruits US-based workers; and the QQ and SoJump survey platforms based in China.

The increased stakes and many unknowns I encountered as my research evolved led me to seek help and new ideas on social media, at workshops and in my own labs. I also reached out to MTurk workers for advice on fine-tuning my surveys, and I signed myself up as an MTurk worker to see how things look from the other side. Below, I share some of what I learned during this year’s academic work that may also benefit your own survey research.
