Episode 82:
Sarah Hartman-Caverly
and Alexandria Chisholm

This episode features two librarians who have developed a digital privacy toolkit they call Digital Shred. Sarah Hartman-Caverly is a reference and instruction librarian and Alexandria Chisholm is an assistant librarian, both at Penn State Berks. They both have a healthy interest in digital privacy, and they developed a series of workshops for students on managing one’s digital identity. Those workshops have spawned a website with a bounty of digital privacy resources for students, librarians, and other educators. One of our favorite librarians, Melissa Mallon, talks with Sarah and Alex about their entry into the world of digital privacy, how they help students understand the value of digital privacy, and the kinds of resources they’ve collected for Digital Shred.


Digital Shred, a privacy literacy toolkit
Privacy Workshop Series
@Digital_Shred on Twitter
Hartman-Caverly, S., & Chisholm, A. (2020). Privacy literacy instruction practices in academic libraries: Past, present, and possibilities. IFLA Journal. [open access]
“Version Control,” Sarah Hartman-Caverly’s 2017 speculative fiction [open access]
Six Private I’s Privacy Conceptual Framework
Privacy literacy collection (professional presentations and publications)
Alexandria Chisholm
Sarah Hartman-Caverly
Leading Lines episode 62 with Chris Gilliard


[0:01] (music) 

Derek Bruff: [0:07] This is Leading Lines. I’m Derek Bruff. As I record this, we’re wrapping up our eighth week of the fall semester at Vanderbilt University. Unlike some universities, we decided to go ahead with an on-campus fall semester. And it’s been going surprisingly well. With a host of safety precautions, including weekly mandatory COVID testing for our students, we’ve been able to avoid any kind of pivot to remote teaching and learning. We still have a few weeks before the on-campus portion of the semester wraps up just before Thanksgiving, but I’m hopeful we’ll make it without a spike in cases.  

[0:42] As our faculty and students find their footing in a fall semester full of online courses and hybrid instruction, the Leading Lines team has reconvened to continue our explorations of educational technology in higher education. Thanks for bearing with us in our erratic publication schedule this summer and fall. I’m not sure we’re quite ready to return to our regular schedule, but we do have some interviews lined up for you that we’re excited to share and they’re not entirely about the pandemic.  

[1:10] First up is a conversation with two librarians who have developed a digital privacy toolkit they call Digital Shred. Sarah Hartman-Caverly is a reference and instruction librarian and Alexandria Chisholm is an assistant librarian, both at Penn State Berks. They both have a healthy interest in digital privacy and they developed a series of workshops for students on managing one’s digital identity. Those workshops have spawned a website with a host of digital privacy resources for students and librarians and other educators. One of our favorite librarians, Melissa Mallon, talks with Sarah and Alex about their entry into the world of digital privacy, how they help students understand the value of digital privacy and the kinds of resources they’ve collected for Digital Shred. (music)  

Melissa Mallon: [2:02] I am very pleased to welcome two amazing librarians here to join us on the Leading Lines podcast. We have Sarah Hartman-Caverly and Alex Chisholm, who are librarians at Penn State Berks. We are going to talk a little bit about their work on privacy literacy and some of the things that they have done related to that. So welcome, Sarah and Alex.  

Sarah Hartman-Caverly: [2:29] Thanks. It’s great to be with you.  

Melissa: [2:32] So maybe we can start by hearing a little bit about what led you both to start working with privacy literacy or even, you know, did something happen or did you have a passion in this area, or what kind of led you in this direction? Maybe we could start with Sarah.  

Sarah: [2:52] Sure, happy to. So I started working on privacy literacy instruction prior to my current appointment at Penn State Berks and prior to meeting Alex. I was working at a community college at the time. And as I recall, it came right after the Edward Snowden disclosures of the global surveillance grid. So that occurred in, I think, 2013. And by 2014, I was delivering a standalone workshop called “Is Big Data Big Brother?” I was doing that as an opt-in workshop for students. So as you might imagine, in that kind of a setting, not a whole lot of students came, but the students that did come and I had a very rich conversation about what all of their social media data capture meant, not just in the context of their day-to-day lives, but now in the broader context of understanding exactly how that could be siphoned up by our government and other governments through this third-party doctrine that we were learning about as a result of the Edward Snowden disclosures.  

[3:52] Not long after that, the institution that I was working at started looking at adopting learning analytics software, particularly the Starfish system, which at the time was owned by Hobsons. I’m not sure where it’s at right now. It was marketed as a student intervention and student success product, but because I had already started thinking critically about data capture generally, and student data capture in particular, and the various forms of vulnerabilities and disparate impacts that could have on the particular students I was working with, I came to those discussions at the college with a slightly more critical bent than the average faculty member. So I started doing a little bit of rabble-rousing with library faculty, and then actually led a faculty learning community looking specifically at learning analytics and the capabilities of not just Starfish, but our learning management system in general, and putting a sort of privacy and intellectual freedom and free inquiry spin on that based on the grounding that I had in library ethics.  

[5:02] And that inspired me also to undergo my first solo scholarly endeavor, which started out as a traditional academic analytical autoethnography exploring these kinds of technology and student data capture systems and libraries. And it ended up being published in a volume as a speculative fiction sci-fi story. So that was kind of my pathway and, long story short, I ended up voluntarily leaving that position and joining the library faculty at Penn State University Libraries at the Penn State Berks location. And that’s where I met Alex. 

Melissa: [5:42] Awesome. That is fascinating and I love to kind of hear about the different paths that your interests took depending on whatever the current disaster related to privacy was. So that’s really great, very interesting. Thank you. And how about you, Alex?  

Alexandria Chisholm: [5:59] Sure, so my journey is a little bit shorter than Sarah’s at the moment and I kind of started from a more personal angle. Ever since I was a child, when I started learning about marketing and advertising and how manipulative it could be, the psychology behind it always fascinated me. So then when I was in college, things like Facebook started and ad tech was burgeoning at that time, and I noticed the personalized advertising coming up. With that personal interest, I started paying a little bit more attention to digital privacy and things like that. From the professional angle, I just came at it from data literacy initiatives, initially. So I had attended an IMLS-funded web conference a couple of years ago, I think it was back in 2016, where there were a few presentations by actually high school librarians talking about privacy and how it related to data literacy. 

[6:57] And that’s when I started really thinking like, oh, there is an angle for me to bring this into my professional work and kind of combine that personal interest. And around that time, I had been playing around with the idea of a privacy literacy session for first-year seminar, which is a group that I liaise with; I’m the FYS coordinator at Penn State Berks library. I had been considering adding that as an option to our workshop programming. And at that time, in 2017, Sarah started and I learned that she had an interest in this as well. And I figured, you know, she has a longer run with dealing with the topic, I wonder if she could help me get this started and, you know, how that collaboration would work. And thank goodness I did. It’s been fabulous since then. But that’s what really started all of this for us, that first FYS workshop that we did. And it’s just gone on from there, which has been great. 

Melissa: [7:56] So I want to loop back to talking a little bit more about your experience with the first-year seminar because I’m really interested in hearing some of your perspectives on working with first-years, but then also upperclassmen as well and what the differences there look like. But before we get into that, I think it would be really good to talk about your co-creation, Digital Shred. It’s a privacy literacy toolkit that the two of you created. It sounds like it was a confluence of events that brought you two together, which is amazing. So what made you decide to create this tool and can you talk a little bit about sort of your goals for it? 

Alex: [8:33] Sure. So I definitely know it was a hard sell for Sarah at the beginning because she was afraid of the maintenance that would be involved. But it came about with our Digital Shred workshop that we created, which is actually all about mitigating harms and taking steps to kind of erase any sort of past behaviors as best that you can online. And as we were working on that workshop and we wanted to create an activity where students would be able to digitally shred past behaviors, we realized that we didn’t want to create the tools and the instructions for how to go in and change your privacy settings and how to use all these different tools. It seemed overwhelming to create them in house.  

[9:17] And so we realized, talking through this, that there was potential to house all of the how-to’s on a website. And then we also realized this was a great opportunity to kind of support different educators, whether that’s librarians or other educators, getting into these topics. We had already begun our first collaborative scholarship, which was all about privacy literacy practices in academic libraries. And feedback that we were getting back from our survey was that people didn’t have the time to learn about these topics. They didn’t feel confident and they didn’t know where to start. There were no materials to work with, which is true when you look in Creative Commons licensed locations. So we realized that this was an opportunity to kind of support our Digital Shred workshop with those how-to’s but also to create a space to house and curate teaching materials and case studies that you can use in classes. I will pause and let Sarah interject if she has anything additional I might have missed.  

Sarah: [10:13] I think Alex captured a lot of it. And while we’re talking about Digital Shred, the workshop, I also want to make sure we acknowledge our third external collaborator. We worked with another librarian, named Alexandrea Glenn, who at that time was at Susquehanna University. And so yeah, so I just wanted to acknowledge her collaboration on that workshop installment, Digital Shred. But then there’s Digital Shred, the toolkit and Digital Shred, the Twitter feed which maybe we’ll talk about. Exactly as Alex described and exactly, Melissa, as you were kind of picking up on, there’s always this confluence of forces not really giving us much of a choice.  

[10:49] So we had conducted this survey-based study. We were in the process of, actually, we had submitted our manuscript, which is now published in IFLA Journal. So we had gone through all of the analysis, all of the writing. And as Alex described, one of the really clear needs being articulated, even by other librarians leading in this area of practice, the privacy literacy area of practice, was that this is an overwhelming area to maintain current awareness in. It’s an overwhelming area to try to think about developing your own how-to guides in. And as Alex and I talked through all of those challenges, we were like, why would you ever develop a standalone “how to manage your Facebook privacy settings” guide when Facebook has its own terminology that points you to the current method on how to do this? Not to mention these how-to methods are always changing as the platforms change, as the Terms of Service evolve. So it’s kind of a task to think that, as a librarian trying to teach students about these different technologies and systems, you can maintain your own independent materials. Better to have digital pointers that point back to the documentation on the native platforms, so that as the documentation updates, your own materials update in real time.  

[12:05] So that’s the approach we were taking with our own workshop guides, which we built out on the LibGuides platform, which, no irony intended, is all hosted by Amazon Web Services, right? So we have all those implications. At the same time, Alex really convinced me there is a need that similar efforts, such as the Library Freedom Institute, really aren’t filling in this capacity. So LFI does really excellent work on specific technological solutions and specifically critiquing state surveillance, which is part of the profile that we undertake with Digital Shred and our privacy literacy work, but we think it’s only part of the story.  

[12:44] And so a lot of our effort has gone into how do we get people to think about this problem in a slightly broader way? And how do we think critically about the technological solutionism that we see in some other privacy advocacy spaces? So Alex and I, I think, take a much more metaphysical bent. It is much messier and more complex, maybe less fulfilling, in the sense that you can’t just unload like a ten-step solution on your students. But I think it gets them asking some of the questions that they will need to continue to ask going forward, because this landscape is dynamic, it’s not static. And so the solutions we might be able to promise today are not going to be applicable going into the future. But if they know the types of questions they should be asking and the critical dispositions they should be bringing to their relationships with their personal data and their technology and app use, then it gives us some hope for the future that maybe, through some combination of, yes, the regulatory environment and the technology sector and the user base, we might be able to see a healthy privacy culture revival that we think is so intrinsic to some of the other areas of individual and social well-being that we talk about in our workshops.  

Melissa: [13:57] This is fantastic. I love that what started out as just a couple ideas in your heads then evolved into a workshop. And then this powerful OER that is available for all educators to use is amazing. You’re absolutely right, Sarah. This is not an area that’s going to stop being an issue, right?  

Alex: [14:18] Yeah, it really enables us to free up teaching space to get past the how-to, just like Sarah was talking about. And our workshops are all theoretically grounded. And one of the reasons why we don’t ever want to get too heavily into the how-to’s is because of something called the control paradox. You don’t want to teach students about these tools that they can be utilizing and give them this false conception that they have control over their data online. Because when you dig deep enough, you understand that there are things you can do to reduce harm, but it’s very limited. We can’t see the back-end of these technologies and how they’re collecting our data. We can’t control that. Our privacy settings in these social media platforms are very limited in what we actually can do there.  

[15:01] And so instead of focusing on that and having them leave a workshop feeling empowered and that they’re safer online, which coincidentally is feedback we sometimes get at the beginning of a privacy workshop, like, “I’m already smart about my own privacy here.” But instead of that, they can then leave having this framework to ask questions and to think about future behaviors and make intentional choices based on their personal preferences.  

Melissa: [15:25] When I talk to students about privacy, especially in relation to their social network and their activities online, it inevitably comes down to a convenience versus personal security type of paradox. And I’ve noticed over the years, the personal security piece is definitely getting a lot more traction than the convenience piece. And I was curious, in the years that you’ve been doing these workshops and that you’ve been building the Digital Shred site, have you noticed any differences? Have students commented on being a lot more critical now than they were a couple of years ago, or have you noticed trends there? 

Sarah: [16:06] I would say, so we ask in the first-year seminar embedded privacy workshop, one of the reflection statements we open with is what are you already doing to protect your privacy? And I would say this semester more so than ever, and Alex, correct me if I’m wrong, we’re seeing more responses about VPNs and, to a similar extent, Incognito mode. And then we ask students what do these two technology solutions do and how might they be complementary? They can articulate what a VPN does versus what Incognito mode does with respect to your cached browsing history as opposed to the browsing activity that your ISP can see. And that’s been an interesting shift.  

[16:44] But as far as their sense of personal security, one of the other areas that Alex and I talk with them about is sentiment shaping and sentiment analysis. So it’s very common now, even in like the generalizations I used to hear, like, “oh, privacy intrusions are often about advertising and about directing your purchasing behaviors and nudging your purchasing behaviors.” And I didn’t understand why some people would have a fairly blasé attitude with respect to those kinds of behavioral nudges. But one of the most powerful anecdotes for me that has come out of our workshops from a student was when a student explained that they had been in a relationship with their significant other for a certain period of time and now on their social media ads, they were starting to see ads for engagement ring purchases. And I thought that’s a great example of not just a purchasing nudge, but a really intimate life choice nudge. And you know, Alex and I, being ladies of a certain age, will often share that we’re seeing fertility treatment advertisements in our social media. Again, is fertility treatment a consumer product of a sort? Yes. But it is also reflective of this really fundamental intimate life choice that I don’t think it’s any advertiser’s or social media platform’s business to intrude upon. 

[18:04] So we try to get them to understand how the more that’s known about you, the softer a target you are for that kind of sentiment shaping and that kind of intrusion into all these different spheres of your life. Not just your data privacy, not just your credit score, right? Not just identity theft, those kinds of traditional harms that we think of. So our focus, especially in that initial privacy workshop, is on getting them to understand what is a positive case for privacy. What if all this technology went away tomorrow? Like God forbid, we have a solar flare. None of this technology works. Do privacy problems go away?  

[18:42] Privacy and privacy challenges are intrinsic to the human condition. Alex and I have done a lot of research in the area and we would argue, although we engage the critique, that they’re universal to the human condition in some respects. And so when we’re talking about the positive case for privacy, we take it beyond this data capture zone into thinking about what we call the Six Private I’s. And this is a framework that we developed kind of first as a way to organize some of our scholarship in a lit review, and then we were immediately like, wow, this is such a great teaching tool, it’s such a great metaphor. So we talk about their identity, their intellect, their integrity, both bodily integrity and contextual integrity, which is a theory from Helen Nissenbaum, their intimacy, and their interaction and isolation at the very outer edge of their private sphere. So essentially their social privacy and their ability to withdraw into seclusion.  

[19:34] And so we say, even if, and again God forbid, even if your identity is never stolen, your credit score is never implicated. Like you go through life never being canceled or called out on social media. Privacy is still something that’s valuable to your lived experience and to the human condition, in general. And so we’re really trying to shift everyone’s thinking toward that positive case for privacy and understanding not just what the sort of more common harms that we talked about, again, in the general discourse are. But what are the hidden harms once you look at a particular privacy case study and start to unpack all the implications using that Six Private I’s framework? 

Melissa: [20:14] I love that you’re focusing on helping students see that it is more than just the harms that we hear about in the news. I mean, I think it’s kind of becoming ubiquitous at this point that, oh, there’s some data breach from this bank. But to think about that more analog piece of privacy too is really important, and that’s a conversation that isn’t always combined with the data piece as well. I love that and I’m excited to hear about the framework. Have you explored any of this within disciplines or within upper-level courses? 

Alex: [20:49] Currently, that’s definitely in the pipeline and something we’re working to really roll out sometime soon, some faculty-facing training. But currently we just have our four-part workshop series that we’ve been operating with. Outside of that, I would say it’s been informal ways that we can inject privacy topics into different classes. So with kinesiology students, talking about biometrics. A lot of our students and our faculty participate in research with things like Fitbit technologies, and so being able to purchase items for our collection, but then also being able to use examples of privacy topics related to biometric data collection. That’s the extent that I’ve done that in subject areas. I can let Sarah speak to her experience in subject areas as well.  

Sarah: [21:35] Yeah, so with those other three additional workshops, you’ve got Digital Leadership, which is really an undergraduate, sort of preparing-to-graduate focused workshop. So these are definitely advertised for first-year students because they have a workshop attendance requirement as part of their first-year experience. But we partnered, and actually I should say Alex has partnered, with different units on campus. So that’s a partnership we have with Career Services. And any student, and frankly any affiliate of the college, is welcome to attend. But it really focuses on helping students understand you’re preparing to leave the undergraduate bubble of Penn State Berks. You might be thinking about grad school. You might be thinking about looking for a job. You might be an entrepreneur. And so you’re now looking at representing yourself as a business entity to a public clientele. What can you be thinking about in terms of your online persona and your social media presence? So I would say it’s much more sort of upper-level student focused, although we do have a lot of first-year students that participate. And then the other two workshops are our Digital Wellness workshop, which focuses on thinking out the place of technology in your overall well-being. And that’s a great example of the magic of our collaboration, I think, which I’d love to talk more about too, just in general. Because literally Alex and I were sitting in her office, thinking we’re updating Digital Leadership, and then realizing, oh no, this is its own workshop. 

Melissa: [23:02] This is a new thing, yeah.  

Sarah: [23:03] Wellness uses the metaphor of the “wellness wheel” and we adapt that to think about it from a particular digital or technology-driven context. Like in what way does your relationship with technology impact all these other spheres of your personal well-being? So that’s Digital Wellness, which again is kind of student-level agnostic. And then Digital Shred, which Alex has already talked a lot about, is kind of a complement to Digital Leadership. Digital Leadership is very future-oriented in thinking how do you want to be portraying yourself online in order to optimize your success in your personal life as well as in your professional or ongoing academic life. And Digital Shred says alrighty, you’ve been through Digital Leadership, you know what you’re thinking about going forward. Maybe it’s time to look back at your digital past, your digital exhaust, and think critically about are there things that you need to securely delete, securely shred, leave behind you, denounce and disclaim, make a whole new social media handle, you know? And so that’s what that workshop’s about.  

[24:06] And we use a really fun metaphor. I actually liaise with our engineering, business, and computing programs, which includes our security risk analysis and cybersecurity analysis and operations programs. And so we use an actual intelligence community directive. It’s a government document. You can go out and find it, I forget which number it is, but we can put it as one of our links, if you want. And it’s a process that an intelligence agency would go through if they think that it’s possible that they’ve experienced some kind of sensitive information breach. So we use that as the metaphor for the activity, the damage assessment activity, in the Digital Shred workshop.  

[24:41] Which pulls in kind of another dynamic that Alex and I try to bring into these learning experiences for students. We already talked about, you know, our philosophy is to not reinvent the wheel if we don’t have to, because we’re all too busy for that. But if you can bring in an authentic information artifact from the so-called real world beyond the ivory tower, all the better a learning experience for students to help them understand this does have real-world application. You might be an SRA or a SIAT major who might actually do this work on behalf of an intelligence agency at some point in your career, and here you are learning about it as an undergraduate student at Penn State Berks. So that’s another kind of instructional design approach that we have: we’re never trying to create something for the sake of creating something. If we can, we go out and find a preexisting resource, like the wellness wheel or this intelligence community directive. And then we say, oh, that’s beautifully applicable to this learning activity or this learning outcome that we’re trying to achieve with our students. How can we adapt that and really apply it to this model of Digital Wellness or Digital Shred that we’re trying to get them to understand and really participate in? 

Melissa: [25:51] So that’s good and I think that gives a good transition into the next thing that I want to talk about. You know, so much of this is information that can be used in different contexts, and having that kind of big-picture framework in mind and then applying it is probably the best way to go forward with this, right, rather than making it so narrow. And so that leads a little bit to my next question. You mentioned earlier that the privacy toolkit is very encouraging in terms of sample assignments or activities that can be used. So do you have any strategies or suggestions for educators who are wanting to incorporate that into their class? So let’s say I’m teaching a history course focusing on the industrial revolution, what kind of advice would you give that instructor if they wanted to try to incorporate some tenets of privacy literacy or the Six Private I’s framework? 

Sarah: [26:50] So my first thought is jump into your nearest library database or discovery tool and pop in “industrial revolution” in one search box and “privacy” in the next search box. And I’m only, I’m speaking way outside of my league here, but I’m only saying that because I think there was a concomitant transformation of the domestic sphere along with the industrial revolution. Because prior to that you had cottage industries and a lot of work happening in the home and a lot of gender differences in the way that work was performed, which all has implications for privacy and what is the private sphere and what is the public sphere. So certainly, you could trace the impact of that kind of a broad-based manufacturing and economic transformation and its impact on individual experience of privacy and what is home life and what is work life. And find out, in a company town, what kind of influence does my employer have over my personal decisions that need to be mine, and all of that, which we’re all still dealing with now that we’re in the next industrial revolution, right? 

[27:50] So that would be my first thought. Like we talked about, Alex and I try to incorporate authentic information artifacts into our instruction, but as Alex mentioned earlier, we also are incorporating a lot of theory. If someone hasn’t already written that scholarly paper, then that instructor needs to write it because there’s stuff to unpack there. And then my next thought would be, so you talked about history. We sort of have some insights, hopefully I’m not sharing too many cards here, that there’s folks doing really interesting privacy literacy work with post-custodial archives and use of primary sources in undergraduate and even graduate and scholarly research as well. And this takes us beyond the realm of history, but how to protect yourself as a scholar when you’re doing open digital research, right? So I think there is some interesting work in that space. Unfortunately, neither Alex nor I liaise with the history programs at Berks, so I don’t know how much more she’d be able to say to that. But my first inclination would be, you know, we might not have a whole lot specific to history in the Digital Shred toolkit, but if you’re looking for a way to incorporate privacy into fill-in-your-blank discipline, great Boolean search, right there.  

Alex: [29:03] I’m not going to lie, history’s way out of my lane in terms of how to incorporate privacy. I will say that I’m confident there’s ways to incorporate it into almost any class. And one thing with using our toolkit, one suggestion I would have, like a practical suggestion, would be our case studies tab. We categorize them by a lot of different disciplinary type areas like health, criminal justice, consumer profiling. And so I think that one of the best things people could do is try to build some current awareness and some practices that can help them stay up to date in this really overwhelming area, because the more you immerse yourself in it, the easier it’s going to become to find these examples. We have some colleagues that we work with at Penn State that are really starting to dive a bit more into these topics. And it’s really fun to watch them send out an email and be like, “oh my gosh, did you ever hear about this example? I’m using it in this class”. So the more that you build that self-efficacy and how you can keep up to date yourself, the more I think it’s going to be seamless to incorporate topics in. 

Sarah: [30:06] And I’ve never taught for credit or for grades, but I would just say have the students do it, put it out to the students. What is the relationship between privacy and the industrial revolution? Discuss. 

Alex: [30:16] We definitely put a lot on our students. We put a lot of trust and a lot of expectation on them, even in workshops, and they never disappoint. And the most fun part, we’ve taught this class, the privacy workshop for our first-year seminar students. We’ve taught it dozens of times to over 500 students. And every class is different and unique and interesting. And because we shape our entire learning experience around the students and let them kind of drive the decision on where we’re going, it’s very fascinating. And so I would say have confidence in your students’ ability to speak to these issues and let them drive the experience. It’s been very rewarding for us. We always learn something. So that’s another thing I would encourage. So Sarah’s suggestion, ask them. Yeah, put the work on them. They always do great things in our experience.  

Melissa: [31:07] A couple things that I heard you say as you were talking about that one. So Sarah, you mentioned some of the more scholarly issues that could come up in relation to open scholarship, open data. And I’m thinking especially for graduate programs where maybe a student is going on in the academy, thinking about their own work and how they’re contributing to information behind a paywall or information that’s free and open for everyone to access. So that’s really great. And I really like what you said about putting a lot of this in the hands of the students. I’m so with you there, to let the students run with it and make the connections on their own is really gratifying to see. It’s just amazing. So I love that and that got me thinking a little bit about something that I was interested in when I was looking around in the Digital Shred toolkit, this idea of crowdsourcing a little bit. And so I notice that you’re using Padlet in some really interesting ways. And I was wondering if you could talk a little bit more about that and what that looks like. 

Alex: [32:12] Sure. So Padlet is one of our favorites, if anything we overuse it; it’s just so intuitive and easy for anybody to use, and it embeds nicely in different websites. So I would say we love reflection questions, or maybe I love reflection questions. Sarah does too, I’m sure. But we love to give students the opportunity to really think through these things on a personal level. And Padlets are this anonymous way for students to share out thoughts. And I’ve seen this be really effective on some really sensitive topics outside of my privacy workshops. And it just enables them to share out without any concern for who may think one thing or the other. And it also allows us, especially in a Zoom environment, to share out in real time while they’re working on some of the responses. So if you’re ever worried about the crickets in a classroom, Padlets are great tools to use, because you’re allowing them to share anonymously and you can be talking about what’s coming up and the trends you’re seeing as they’re doing that. So as much as I don’t like hearing myself talk, it is a way to fill silence and not have crickets, if you’re afraid of that or have teaching anxiety.   

[33:28] But we really love using it for our reflections, in particular. And then also as they’re experiencing tools in real time, whether they’re exploring their personal ad profiles or one of our favorite learning objects, Click, Click, Click. They can then share out reactions in real time while everybody’s working at their own pace. So that’s really how we use them in classes. We definitely rely on them a lot more heavily now that we’re remote, for sure. Because another favorite of ours is to have stations around the room with the big Post-it notes. When that was eliminated, we started using the shelf format in Padlet, which allowed us to kind of replicate that in an online environment.  

Melissa: [34:09] So as we’re sort of coming into the end here is there anything else that you would like to say about your efforts in this area? You mentioned that one of your kind of future possibilities is looking at more faculty training and doing some more professional development. What are some of your other next steps? Do you have grand ideas or dreams for taking over the world? Because I would be in favor of that, I think you guys would do an excellent job.  

Sarah: [34:34] We do have some really exciting things in the pipeline right now. Alex mentioned working with collaborators in Penn State University Libraries, so we have some other librarians spread across multiple campus sites, including University Park, working on some peer-facing learning materials around privacy literacy. That’s shaping up to be an interesting virtual, semi-synchronous, semi-asynchronous three-part series. The first part will be an adapted version of the privacy workshop. The second workshop is working-titled “Intellectual Privacy,” to really drive home some of these connections between privacy and intellectual freedom; I rely a lot, I think, on Neil Richards’ work there. And then the third installment is tentatively working-titled “Privacy in the DX” or “Privacy in the Digital Transformation.” So really bringing those two initial workshops together, thinking about what this means with respect to our use of technology for teaching and learning, and library systems for delivering resources and services in the university libraries. 

[35:35] So we’re super excited about that, and we’re excited to collaborate with some of our peers on developing and delivering those experiences. And then, I don’t know how much we can say at this point, but we’ll be collaborating on a co-edited volume. So we’ll be putting out some calls for chapter proposals for that in the very near future, so we’re excited there. And we’re in the thick of transcribing and analyzing interviews for a qualitative follow-up to our survey-based study, which came out in IFLA Journal just a bit ago. So that’s really, I think, where our combined strengths lie as far as the scholarship side. We both have undergraduate degrees in anthropology, so we’re kind of going back to our roots there with transcription and coding and digging into those really nitty-gritty fun details. Other than that, I don’t know, Alex, any other grand plans and dreams and schemes? 

Alex: [36:26] No, I mean, our big goal is just to constantly be encouraging other librarians and educators to create their own materials and to put them out there as OERs. Something we heard in our survey-based study over and over again was that people were really hungry to find materials they could adapt, because they didn’t feel like they had the expertise or time to develop things on their own. So we’re hoping with all of the professional development workshops and professional presentations that we’ve been doing that we can continue to encourage people to build their self-efficacy and learn about these topics, and know that they don’t have to be tech wizards, they don’t have to be coders or anything like that to delve into this. And I think Sarah and I have kind of proven that in our approach, with having it be a bit more theoretical and metacognitive and metaphysical, as opposed to being straight-up tech solution-based.  

[37:18] So I know that’s really our big goal, to just inspire people to continue to create and share. There’s a lot of space in this area, and we’re just hoping that it continues to get traction and that people talk about this, particularly in academic libraries, or in academia and higher ed in general. There’s not really any other sort of stakeholder that’s talking about privacy on campuses; that was another thing that came out of our survey as well. So we’re really well positioned as ethically trained professionals to discuss these topics. It’s a part of our code of ethics, and it’s something the ALA just recently added to the Library Bill of Rights, to educate about this. So we’re really in a great position to be talking about it, and it’s just so important. So that’s our overarching big goal, I would say.  

Melissa: [38:05] I love that and I’m so inspired listening to you talk about that because I think you’re right that librarians, especially in academia, are really well positioned to talk about this. And to sort of bridge the gap between the student perspective and the faculty perspective and really be the advocate on campus. So that’s amazing and I’ll be excited to hear more about your future publication, as well. That’s really exciting.  

[38:31] One other question that I’m going to ask you, which is one that we ask at the end of every Leading Lines interview, and I may break the rules a little bit and put a tiny spin on this. So the question we usually ask is, what is your favorite analog ed tech tool? But I think I might put a spin on it and say, what is your favorite analog ed-tech-ish, sort of, but maybe privacy-related tool? 

Sarah: [39:18] Mine would be the question.  

Melissa: [39:20] I like it.  

Sarah: [39:22] And as Alex has already described, we like to make our students think. A lot of people assume that your students are not ready to think about some really deep metaphysical, ethical conundra. But they are, they’re hungry for it. That’s what they came to college for. So when in doubt, when Zoom goes down, when you can’t connect over email, find a way to ask them a question.  

Alex: [39:49] I would completely agree, any reflection question. I already expressed that earlier. I love reflection questions and I overuse them. And students never disappoint. They’ve got thoughts and, like Sarah said, they love to be challenged, but I don’t think we’re always comfortable challenging them. 

Melissa: [40:05] And that goes back to what you said earlier, you’re putting it in their hands. You’re giving them that opportunity to develop, kind of guiding them in that direction. And that’s so important. Thank you both so much for taking time out of your day to talk with me for the podcast. These are such important issues. But thank you for the work that you’re doing here. It’s really important and it’s fascinating.  

Alex: [40:27] Well thank you so much for having us. This was wonderful. Thanks for allowing us to ramble for almost an hour about our work.  

Sarah: [40:36] Yeah, we really appreciate the opportunity, and I definitely pinch myself every day that I get to do this work and that I get to do it with someone like Alex. 

Alex: [40:43] Yes, we have very complementary skills, which works out well. (music) 

Derek: [40:50] That was Sarah Hartman-Caverly and Alex Chisholm, both librarians at Penn State Berks. As long-time listeners know, I teach a first-year writing seminar on cryptography and privacy at Vanderbilt. So I was taking careful notes during their conversation with Melissa. I was struck by the story Sarah related, the one about the student who started seeing ads for engagement rings in his social media. Sarah said, the more that’s known about you, the softer a target you are. I can totally see sharing that example with my students in the spring as a way to help them understand their own relation to digital privacy.  

[41:24] For more on this topic, check out the Digital Shred website, as well as some of the publications that Sarah and Alex mentioned. And listen to my interview with Chris Gilliard, known as @hypervisible on Twitter, back in episode 62. You can find links to all those resources in the show notes. And you can find those show notes, as well as all our past episodes, on our website, leadinglinespod.com. Leading Lines is produced by the Vanderbilt Center for Teaching and the Jean and Alexander Heard libraries. This episode was edited by Melissa Mallon and Rhett McDaniel. Look for new episodes when we publish them. I’m your host, Derek Bruff. Thanks for listening. And be safe. (music) 
