The Importance of Showing Up: College Edition

I spent this weekend talking over email and in person with students who (for very valid life-issue reasons) have completed – or want to complete – a flexible-time, low-cost, online-only path to breaking into a new career in tech.

Bluntly: Unless your last name is Gates or Zuckerberg, that’s not going to work. Experience as both a student and teacher has shown me two reasons why in-person classes are so vitally important: 

  • Social skills and networking. Most students do not realize what a precious gift these are at the college level. You’ll learn a tremendous amount by simply talking with classmates and staff in passing, developing relationships, and supporting each other. Doors will open that you never knew existed.
  • Time discipline! Very, very few students can function at the level needed without the built-in structure forcing them to set aside FOCUSED time blocks for the commute, the class, and group meetings. Logging in at home is simply not a sufficient substitute. You won’t absorb enough. You won’t keep up.

Social interactions and time discipline help you shape your competitive advantage. What will set you apart from the 100, 500, 1,000 resumes that flood in for one open position? Will it be that you showed competency (still important) or that you excelled in some fashion (what employers really want to see)? What’s your “special sauce”? Does an acquaintance who already works for the employer (maybe alumni, or a former colleague on a group project, or someone who worked with you in a club) have a reason to believe in your potential? Do you have external references whom the employer will trust and know by name, and who have recently spent time working with you? Did you prove your persistence and drive by achieving something not everyone can do, such as complete an accredited college degree with a 3.5 GPA or higher? Do you walk into an interview with the confidence that comes from setting such a difficult goal and achieving it?

Teachers can only help so much

I try to make it easy for students to work around life issues – maybe their car breaks down, or they are sick and don’t want to infect anyone, or a sibling is getting married in a foreign country. For these students, I offer makeups for in-class activities, and I post audio and video recordings of my lectures, so that they can watch while sniffling in bed or listen to my voice on commutes.

But I have also learned that students who enroll and never show up are, essentially, planning to fail — in the course, and afterward. It’s not only because they miss so many in-class points, such as presentations and exams. It’s really not that no one knows them, because colleges build in enough online interaction through our learning management software, Discord, Slack, and email communications to make others aware of their presence. No, these students have demonstrated to others that they do not show up, and they do not keep up. No one trusts them. No one wants to pick them for a group project. No one has a reason to go out of their way to help them. They have no peer to explain an assignment to them, to help them study, or to commiserate over a difficult concept. There’s no one to go to the trouble of Zooming them into a group discussion. And certainly no one is reminding them that TOMORROW is the midterm or final, when 20% or more of their grade is up for grabs.

Hat tip to Marc Allan for the image suggestion!

Make a conscious effort to show up

Plan to succeed. Car-pool to campus. Arrange a babysitter. Talk with your boss about time off for attending class. Switch work shifts. Go to the club meetings. Go the extra mile in group projects. Don’t overload yourself – take only 1-2 classes per semester if you work full-time. Take out that loan – it’s an investment in yourself and relieves your money worries. Cancel streaming. Study on Saturdays. Cut back on drinking and drugs. Save as much as you can in a 529, 401(k), IRA, or other tax-advantaged account for the day when you can attend college full-time.

Your future self will thank you.

Policy on Use of AI Tools for my course syllabus, version 1.0

Ever since ChatGPT arrived, I have been talking with my students and colleagues about how best we can use it and other AI-powered creative tools, such as DALL-E and Stable Diffusion, in our work. I have also discussed with students, in particular, how these AI tools could mislead them (for example, by “hallucinating” output that looks and feels like a real-world search result or blog post but is composed of made-up information). This is partly because I feel strongly that students should be prepared for a working world where these tools are rapidly becoming commonplace, and partly because talking about it helps me work out my own thinking about their rightful place in our workflows.

Today seemed like a good day to formalize my thoughts into a written policy for my courses. I credit the blog post linked below with inspiring my wording. But the impetus is the sheer number of conversations I’m having with instructors who suspect AI tool use in coursework this month.

Here is what I have come up with:

“In this course, students are allowed to use tools such as Stable Diffusion, ChatGPT, and Bing Chat in a manner similar to their use of non-AI references, templates, images, or body text, such as those in assigned research papers or obtained via internet search. This means that (1) no student may submit as their own an assignment or exam answer that is entirely generated by an AI tool. And, (2) if students use an AI tool to generate, draft, create, or compose any portion of any assignment, they must (a) credit the tool, (b) identify what part of the work is from the AI tool and what is from themselves, and (c) summarize why they decided to include the AI tool’s output.”

Thanks to a timely comment from Jeff Bigham on Mastodon, I am contemplating adding the following sentence to the above, and retitling the section “Policy on use of AI and Other Creative Tools”:

“The same requirement to credit the use of tools for generating, drafting, creating, or composing work toward deliverables also applies to use of creative tools such as Grammarly and Canva.”

Reference consulted for the above: Kristopher Purzycki. 2023. Syllabus Policy for Using AI Tools in the Writing Classroom. Medium. Retrieved March 17, 2023 from https://medium.com/@kristopherpurzycki/syllabus-policy-for-using-ai-tools-in-the-writing-classroom-8accab29e8c7

Idea # 1 for future work: Investigating the role of resistance in cybersecurity adoption

I’ve noticed a peculiar pattern – or more accurately, non-pattern – in all my studies of usable security. At every step of adoption, people exhibit resistance to adopting cybersecurity measures (such as installing password managers or creating unique passwords for each online account). I expected to see a lot of resistant attitudes in people who do not adopt security practices, or those who only adopt security practices if mandated to do so. But resistance is also high among research participants who have voluntarily adopted security practices and who seem very engaged with cybersecurity overall.

As it happens, I have already developed a measure of security resistance that can help in these studies. Take the average of participants’ Likert-type survey ratings on these four items (1=Strongly Disagree to 5=Strongly Agree) [handout]:

  • I am too busy to put in the effort needed to change my security behaviors.
  • I have much bigger problems than my risk of a security breach.
  • There are good reasons why I do not take the necessary steps to keep my online data and accounts safe.
  • I usually will not use security measures if they are inconvenient.
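Scoring the scale is just the arithmetic mean of the four ratings. As a minimal sketch (the item keys and example data below are illustrative, not part of the published instrument):

```python
# Score the SA-Resistance scale: the mean of four 5-point Likert ratings
# (1 = Strongly Disagree ... 5 = Strongly Agree). Item keys are made up
# here for readability; any consistent labeling works.

def sa_resistance_score(ratings: dict) -> float:
    """Average one participant's four SA-Resistance item ratings."""
    items = ["too_busy", "bigger_problems", "good_reasons", "inconvenient"]
    for item in items:
        if not 1 <= ratings[item] <= 5:
            raise ValueError(f"{item} must be rated 1-5, got {ratings[item]}")
    return sum(ratings[item] for item in items) / len(items)

# Example: a participant who largely agrees with the resistance items
participant = {"too_busy": 4, "bigger_problems": 3,
               "good_reasons": 4, "inconvenient": 5}
print(sa_resistance_score(participant))  # 4.0
```

Higher scores indicate stronger resistance; a score near 1 indicates little stated resistance to adopting security measures.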

So far, I have found that resistance alone is not a reliable differentiator of someone’s level of cybersecurity adoption. For example, in my working paper describing the development and validation of the SA-13 security attitude inventory, I find that my SA-Resistance scale (the one described above) is significantly negatively associated with self-reported Recalled Security Actions (RSec), but also significantly positively associated with Security Behavior Intentions (SeBIS). By contrast, in a more recent survey (forthcoming), I found that a measure similar to SA-Resistance was significantly positively associated with a self-report measure of password manager adoption, but significantly negatively associated with a measure of being in a pre-adoption stage similar to intention. A research assistant during the 2021 REU program, Faye Kollig, also found no significant differences in resistance among participants in our interview study to identify commonalities in security adoption narratives.

At the same time, adding these resistance items to those measuring concernedness, attentiveness, and engagement (the SA-13 inventory) appears to create a reliable predictor of security behavior. In a study at Fujitsu Ltd., Terada et al. found a correlation between SA-13 and actual security behavior for both Japanese and American participants (p<.001) that was stronger than that for SA-6. The authors speculate that this is because of the inclusion of the resistance items.

Is it consistently the case that resistance only helps to differentiate someone’s level of adoption when it is balanced against other attitudes? Is some other mechanism responsible? I hope to follow up on these results with a student assistant when I join UNC Charlotte’s Department of Software and Information Systems this fall as an assistant professor.

Coincidentally, the New Security Paradigms Workshop has a theme this year of “Resilience and Resistance.” I may submit to the workshop myself, but I also hope that other prospective attendees will find my resistance scale of use in their work.