Episode 62: Chris Gilliard

In this episode we talk with Chris Gilliard, Professor of English at Macomb Community College. His scholarship concentrates on privacy, institutional tech policy, digital redlining, and the re-inventions of discriminatory practices through data mining and algorithmic decision-making, especially as these apply to college students. Chris talks with Derek Bruff about some of the problems and concerns about educational technologies that may not be immediately visible to others.


Links

• Chris’ website, hypervisible
• Follow Chris on Twitter
• November 2018 CBC Radio interview with Chris, “Bad Algorithms are Making Racist Decisions”


Transcript

[0:01] (music)

Derek Bruff: [0:05] This is Leading Lines. I’m Derek Bruff. I get pretty excited about technology’s potential to enhance what we do in higher education. I’m always looking for creative ways to use technology to support student learning. And I like sharing these ideas here on the podcast and elsewhere.

[0:23] I’ve learned, however, that if I focus too much on the potential, I can miss some of the problems that technology has caused. I’ve found it important to read and follow the work of those who are a little more critical of educational technologies, those who can see the problems and concerns that aren’t immediately visible to others.

[0:40] One of the people I follow in this space is Chris Gilliard, professor of English at Macomb Community College in Michigan. His research focuses on privacy and surveillance and as he says, the reinvention of discriminatory practices through data mining and algorithmic decision-making.

[0:57] You may know him better by his handle on Twitter, “hypervisible,” where he shines a light on some of the dark sides of technology. When we spoke this summer about privacy and surveillance in the educational technology space, I was struck by how lighthearted Chris was in spite of the heavy subjects. I really enjoyed our conversation and I think you will too.

Derek: [1:21] Thank you so much for being on the podcast today, Chris.

Chris Gilliard: [1:23] Oh, thanks for having me, really appreciate it.

Derek: [1:26] I want to start by looking back a little bit. Can you tell us about a time when you realized you wanted to be an educator?

Chris: [1:33] Oh, wow. I’ll try. I mean, I sort of wound up in this field by default because I hate everything else. (laughs) You know, I knew I wanted to go to grad school, and I had a couple of tremendous mentors in my undergrad. And I started going to grad school, and I just kept going until the end. And then I looked up and thought about all the things I didn’t like to do. But also, I had taught while I was in grad school, and it was something that, you know, I enjoyed, having that interaction with young people and getting to talk about ideas and things like that. And so that is kind of where I wound up. I mean, it’s one of the few things that I’m somewhat good at that I can get paid for. (laughs)

Derek: [2:33] That’s always helpful. Well, say more about your journey because I’m curious how one ends up as a professor of English at a community college studying privacy and surveillance. That’s, that’s an interesting job description, actually.

Chris: [2:49] Yeah, I mean, there are not that many of us that I know of. I mean, I’ve always been interested. The kind of narrative I usually provide about privacy and surveillance is that growing up black and in Detroit, you know, with the family and the parents that I have, I was always made pretty keenly aware of the degree to which surveillance is used against particular communities. Whether that’s the FBI monitoring the Civil Rights Movement or police departments in Detroit, you know, monitoring, surveilling, and ultimately terrorizing black residents.

[3:34] And so that’s always been a concern of mine, something I’ve thought about and read about and been taught about. And I wound up at my current position at a community college by kind of a circuitous route. Yeah, I’ve taught at some R1 places and small liberal arts places, but for various reasons they were not for me.

[4:03] And yeah, I almost left teaching, actually, but got my current job, where I’ve been for about 12 years. It’s a little bit better suited to my particular set of skills. But, you know, seeing some of the effects of surveillance and student data, seeing how students are surveilled at colleges, and seeing how lack of access to technology and to broadband affects the ways students can do their work at a community college has given me a great deal of perspective and kind of helped me start to think about some of the things that I talk about quite a bit now.

Derek: [4:52] And I’ve got more questions about your students, but I want to circle back to something we were kind of joking about before we hit record. As I said, you are sometimes hard to find out about online. And you know, for instance, your Twitter handle is “hypervisible.” And I don’t, I don’t know that your name is actually on your Twitter profile anywhere. What’s listed as your name is the phrase, “One Ring doorbell to surveil them all.”

Chris: [5:21] (laughs) Right.

Derek: [5:24] Can you, can you tell us what that means?

Chris: [5:27] My Twitter handle, and I gotta give credit where it’s due: Autumm Caines helped me come up with it. She’s a buddy of mine and we sit around sometimes and bat around ideas. Sometimes people change their Twitter handles as the seasons change, you know, so they’ll have a Halloween-themed one and a Christmas-themed one and things like that. And I think it was sometime shortly after Christmas, as the season was winding down, and I thought I needed to change my Twitter handle from whatever it had been before that. And, you know, I’m a pretty vocal critic of Amazon and of surveillance structures in general. And I’m very troubled, to say the least, by the surveillance network that Amazon is building.

[6:24] And so we happened upon that play on The Lord of the Rings that most people are somewhat familiar with. And it turns out it’s drawn a lot of attention. (laughs) Like, some people who happen to be writing about Amazon have approached me just because of my Twitter handle. So yeah, I just thought it was catchy, but also accurate, you know.

Derek: [6:56] Well, let’s get into that a little bit. What is Amazon doing and what are some problems that you see in the surveillance network that they’re building?

Chris: [7:03] Oh gosh. I mean, what is Amazon not doing?

Derek: [7:06] (laughs)

Chris: [7:08] I mean, they have, you know, Amazon Web Services, which provides the infrastructure for government surveillance in the form of the CIA, and ICE for that matter. They provide the infrastructure for companies like Palantir and things like that. And obviously, I don’t wanna underplay it, they do surveillance on their own customers to better serve them. You know, more things to sell.

[7:52] On that long list, they have a technology called the Ring doorbell, which, basically, is a doorbell with a camera. But unfortunately that description is not very apt; it does a lot more than that. They also partner with police departments, and this has been reported in Motherboard and by CNET and in many other places.

[8:35] So they partner with police departments to offer incentives for people to buy the doorbell, in the form of, say, police giveaways and things like that. Or in some cases even offering police free doorbells to run incentive programs. And they also partner with police departments by giving police departments materials, bait packages. (laughs) Okay, I can tell by looking at your face, maybe you’re not familiar with this.

Derek: [9:12] Bait packages. Yeah. I mean, so the idea behind the Ring doorbell, in theory, is that it’s a security measure, right? So, like, when I’m not at home and maybe Amazon has dropped off a package on my front porch, I have a camera there and that lets me know it’s arrived. And if someone steals it, I have a capture of their face, right? That’s the kind of ostensible reason for this technology, right?

Chris: [9:35] (laughs) Theoretically, yes. Yeah.

Derek: [9:38] Well, I guess, and again, I mentioned that I teach this first-year writing seminar and we talk about privacy and surveillance. And one of the things that I run up against with my students is this kind of notion that, “Hey, you know, I’m 18 or 19,” I teach traditional-age college students, “and I don’t have anything to hide, right? So why should I worry about this?” And so a pitch like, “Hey, wouldn’t it be great to have a camera on your front porch, to add a little extra layer of security?” seems like a pretty easy sell, I think, for my students.

Chris: [10:11] Mhhm.

Derek: [10:12] And so I’m imagining, like, how would you problematize that for someone, to help them see that there are actually some layers here that aren’t quite so on the up and up?

Chris: [10:22] Well, there’s a lot of different ways. I mean, first of all, the “nothing to hide” argument is very thoroughly debunked by lots of people. I mean, Daniel Solove wrote a whole book on it. But ultimately, that’s not how rights are supposed to work. (laughs) It’s not that people are only supposed to exercise rights to the extent that they think that they should have them, you know?

Derek: [10:51] Yeah.

Chris: [10:52] I mean, ultimately that leads to some pretty troubling aspects of society, right? If we only exercised rights to the extent that we thought we should have them or we thought we needed them or whatever.

[11:10] But you know, one of the ways to think this through with students is to help them look at ways that it’s already being used against them. So a very simple thing I do, and I mean, even today, in 2019, lots of people still don’t know this: you show students, either on Google or, if they have an iPhone, the map that’s typically on by default, the map that shows everywhere they’ve been in the last couple months, right? (laughs) People don’t know that those things exist.

Derek: [11:50] Your iPhone has one of these, probably. Unless you’ve turned it off.

Chris: [11:53] Yeah. So I can’t tell you the number of students I’ve shown this to…

Derek: [11:56] (laughs)

Chris: [11:58] …who are creeped out and immediately turn it off, you know. And that’s like an entry point, right? Because if you don’t know that’s happening, or you haven’t welcomed it or actively invited it, you know, that’s kind of troubling. But also, think about student analytics, or the ways that companies or schools monitor students’ behavior. I mean, there’s a variety of ways, whether it’s by looking at their social media or tracking their behavior on the web, or tracking their behavior on an LMS, right?

[12:37] Once you show students some of these things, they all of a sudden lose that sort of “I’ve got nothing to hide” idea, if they had it in the first place. Because ultimately, I mean, I think that framing, the “I’ve got nothing to hide” framing, is useful from the perspective of people who want to surveil you.

[13:03] I mean, so it’s not that people necessarily have something to hide, right? It’s that we should be in control of what information is revealed about us and not have it extracted. And most people agree with that on principle. They’re just not aware of all the various ways in which information is extracted from them and how that’s going to be used to form narratives about them that are beyond their control. Which, again, is another thing that students, I shouldn’t just say students, people don’t like this, right?

Derek: [13:39] Yeah.

Chris: [13:40] Like when you ask people, you know, who should get to tell the story about you? Or should you have some control over the story that’s told about you, when you apply for a job, when you apply for a mortgage, when you meet somebody in your life?

Derek: [13:57] (laughs) Right.

Chris: [13:58] Most people think that that’s fair, that they would be the ones to tell that story, not random bits of data that were extracted from them and recombined in some ways.

Derek: [14:09] Sure. Yeah. So I have two short stories. One is when I started dating the woman who is now my wife, a few years ago, she looked me up on Facebook to see, you know, like, “Who’s this guy I’m going out with and is he legit?” I pay a little bit of attention to my privacy settings, and I had my Facebook locked down well enough that all she saw were my RunKeeper posts of when I go jogging. And so she thought, “Hey, this guy likes to work out, that’s good.” So that worked in my favor, actually, but I had paid a little attention there. But to contrast that, one of the things that I shared with my students is, depending on how you access Facebook, there’s a way to kind of get in there and Facebook will tell you what it thinks it knows about you. And I don’t know how complete that is, but it’ll tell you some things. It’ll tell you it thinks you’re in this demographic or this age category, or this political persuasion, or this kind of consumer.

[15:12] And my students don’t use Facebook a ton, so it didn’t have a huge impact on them. But it did convey to them the idea that here’s what Facebook thinks it knows about you, based on all the data that it has access to, right? On the platform, and on all the other platforms that it interacts with. And it’s going to show you stuff in your feed based on all this, right? So if you’re using Facebook as your window to the internet, you’re getting a very particular view of things, entirely determined by what Facebook thinks it knows about you.

Chris: [15:43] Right.

Derek: [15:44] And you’re not actually seeing the world as it is. You’re seeing this very filtered version of it, based on a story, as you say, that you really don’t get to control, right? Facebook is controlling that story.

Chris: [15:55] Yeah, and I think that’s very important too, because there is the misconception that we give this data to Facebook. I mean, I think people in our field know, but lots of people outside of it don’t, right? I mean, Facebook’s got a category for people who have never used Facebook or aren’t currently on Facebook or whatever, right? Non-registered users. And this is one slice of the ways that we’re consistently tracked, both by the government and by private companies, you know, as we go about our lives, in ways, again, that I think most people who aren’t fully invested in this stuff don’t appreciate.

Derek: [16:45] Yeah. Well, let me ask about another example that’s in the news this week, as we record this in mid-July, and I think you tweeted about it this morning, actually. There’s this app called FaceApp that’s actually been around for a couple of years, but it’s getting a lot of attention this week because, apparently, you can upload a photo of yourself or anyone else, and one of the things you can do with this app is it will project what you’ll look like in the future, like 30 to 40 years out. And the photos I’ve seen are incredibly realistic. I don’t know if my friends will actually look like this when they’re old, but it’s an impressive piece of technology, and it’s kind of gone viral this week as people share these photos of themselves as old people. My initial reaction was: this worries me.

Chris: [17:27] Yeah.

Derek: [17:28] And I’d love to hear your thoughts on either that app in particular, or apps like that that want to take our photos and do fun stuff with them. If you were talking with your students about this app, what kind of concerns would you share with them? Or what would you want them to be attentive to when using apps like this?

Chris: [17:48] So, I mean, my general policy: there are pictures of me on the web, but I did not put them there. I mean, I don’t know how to accurately describe what I do. I don’t like this term, but some people might use the term “public intellectual.” I do talks and occasionally write and things like that. You can’t actually do that and not have pictures of yourself up, because it will become some kind of, you know, sort of performance, some Kaufmanesque performance or something like that. So I kinda gave up the idea that I could do what I do and not have pictures of myself up. Personally, I don’t put pictures of myself up on the web, but I would encourage people, when they do, to think about what exactly those photos are going to be used for.

[18:50] So if my students ask me about this app, or we’re talking about it in class, or anything like that: I’m not trying to give the machine more ammunition, if that makes sense. Because ultimately, and I don’t know about this app specifically, but often when apps like this are used, they’re a good way of obtaining training data for some kind or other of facial recognition application. Which, again, often winds up being used in contexts that the people doing something they think is relatively innocuous, like, “Haha, let’s see what I look like in 40 years,” right, don’t anticipate. People don’t anticipate that the harmless and funny thing they thought they were doing might wind up helping to subjugate Uyghurs in China. But that’s often what happens, you know?

Derek: [19:58] Yeah, yeah.

Chris: [19:58] So to the extent that we are able to not do that, I think we should object.

Derek: [20:06] Well, and I’m also reminded, I saw some headlines recently about some convenience stores, gas stations somewhere, that were adding facial recognition technology to their front doors. And basically they won’t let you in if you’re on a list of known shoplifters, for instance.

Chris: [20:26] Yeah.

Derek: [20:27] And I mean, that bothers me for a number of reasons, but I’m also aware that facial recognition is not great, right? Like, you can get a lot of false positives. And in fact, a lot of facial recognition software, as I understand it, is trained on white people’s faces, and so its accuracy gets much worse when it’s looking at people of color. And so that’s where my mind goes when I hear FaceApp, or these things where, yes, if you are giving them more data to train on, who’s going to be using that facial recognition software and what kind of problems might be caused by false positives? And I’m thinking this is part of the objection to something like a Ring, the doorbell camera, right?

Chris: [21:11] Yeah. Well, I have lots of thoughts on that. I mean, my biggest problem, I guess, is not whether or not it works accurately. I mean, that is an issue, but I think perfect surveillance is still evil and bad and wrong, right? Not just imperfect surveillance. I mean, I don’t want to live in a society where everyone, every organization, every group, every individual, every private company, the government, all have cameras everywhere constantly “identifying” you, and I put that in quotation marks. Even if it were perfectly accurate, that’s a pretty scary and frankly awful scenario. I think it’s understated, the extent to which obscurity matters in a society for us, and I don’t mean us, I mean everyone, to maintain some semblance of freedom, right? Which means you can go about your business day-to-day, whether that’s something embarrassing or just mundane, and not be subject to everyone knowing what you’re doing at all times. I mean, there are legitimate reasons. Let me give you a very innocuous one. So I go get a coffee every day, okay? And, you know, maybe I don’t want to get one for all of my coworkers. So I go and get one before work, and don’t tell anyone. So, like, I come to work having consumed my coffee, and no one’s the wiser.

Derek: [23:14] Nothing wrong with that, right? Nothing wrong with that.

Chris: [23:17] Yeah, there’s nothing wrong with that, right? But in a society where we are all tracked? And this is, like, the most harmless example I can think of. I mean, I have dozens of harmful examples (laughs), but it’s actually no one else’s business whether or not I got a coffee before work, and the fact that I’m able to go and do that without broadcasting it to every other party on the planet is actually a good thing. And again, as we get into other things: whether your insurance company wants you to get a coffee, whether the government wants you to get a coffee, whether people who sell coffee think you should’ve gotten one there, you know, whether you are in line behind someone who is on a gang database. I mean, there’s all kinds of, and again, these are lots of dystopian scenarios, but it’s not like they’re fantastical things I made up in the middle of the night. I mean, these things happen already, you know.

Chris: [24:30] And so more surveillance will make these things worse. And, by the way, be more harmful to communities that are already kind of over-surveilled and vulnerable.

Derek: [24:46] Are there other assignments or readings or activities that you do with your students that help them think more deeply about some of these issues and see their relevance?

Chris: [24:56] Well, one of the things I have students do, or encourage students to do, is just think about it, you know, in terms of their future aspirations, right? Whether that be, you know, a four-year college or a job or things like that. And to think about the ways that monitoring and tracking affect those. So for instance, an example I find myself coming back to a lot lately: there was a recent article in the Wall Street Journal about colleges that were sending emails to students with a tracking pixel embedded in them. And so it would report back to the school when the student opened the email. And the colleges were using that as part of the metric for gauging interest in the college. So college “X” sends you an email, you know, expecting some interaction. And if you open it right away, that helps them figure out, they say, whether you’re interested, and if you don’t open it right away, you get a lower gauged-interest score. Well, this is obscene and problematic. (laughs) But, you know, people who are applying to four-year places, right? Like, this is something that’ll affect them.
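
[Editor’s note: for readers curious about the mechanics Chris describes, a tracking pixel is just a tiny, usually invisible image hosted on the sender’s server and referenced in the email’s HTML. When the mail client renders the message, it fetches the image, and that request is the “open” signal. Below is a minimal Python sketch of how such an open-tracking endpoint could work; the pixel path and the “uid” parameter are hypothetical names, and this illustrates the general technique, not the specific systems reported by the Wall Street Journal.]

```python
# A minimal sketch of an email "open tracking" endpoint, for illustration only.
# The /pixel.gif path and the "uid" query parameter are hypothetical names.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from datetime import datetime

# A 1x1 transparent GIF: the smallest valid image a mail client will fetch.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The email's HTML embeds something like:
        #   <img src="https://tracker.example.edu/pixel.gif?uid=12345"
        #        width="1" height="1">
        # so the mere act of the mail client fetching this image tells the
        # sender that (and when) a specific recipient opened the message.
        uid = parse_qs(urlparse(self.path).query).get("uid", ["unknown"])[0]
        print(f"{datetime.now().isoformat()} email opened, recipient uid={uid}")
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PixelHandler).serve_forever()
```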

Derek: [26:28] Let me jump in: say a little bit more about why that is egregious. Because again, on the surface, it seems like a bit of utility for the college to know.

Chris: [26:34] Absolutely. So, you’re asking me questions none of which have short answers. (laughs) There’s lots of problems with this; I’ll confine it to two. First problem: they didn’t ask permission. They are doing it, again, surreptitiously. So this is one of my big beefs, you know, which is that colleges, universities, K through 12 for that matter, educational institutions borrow the worst practices of surveillance capitalism, of platforms like Facebook and Google and Amazon and things like that. They borrow their worst practices, you know, under the guise of innovation or of retention or things like that.

[27:27] And one of the worst practices of surveillance capitalism, you know, is extraction, which is the idea that because a technology has the ability to take some bit of data that you have, it has the right to do that, or it can do that without any kind of informed consent. And consent, the way it’s currently used and understood, is problematic, but I don’t have a better term at this moment.

[28:05] But the school never said to individuals, right, “Well, we’re going to send you some emails, and based on how soon you open them, this is going to help us figure out whether you want to go here or not.” So, I mean, they’re essentially spying on students. They’re observing them in a way that they never asked permission for and the student doesn’t know about.

Derek: [28:33] Right.

Chris: [28:36] So that is spying. (laughs) And again, what if that were some other context, right? “So it’s not illegal,” right? Which is a thing that people often say. So a thing I like to do often is switch contexts. Say that colleges and universities were looking through students’ garbage, because they figured out that there’s a correlation between students’ garbage and what kind of student they’re going to be. In a lot of states it’s actually not illegal to look through someone’s garbage. (laughs)

Derek: [29:19] But if I’m a student, and I’m struggling in Intro Bio or something, and my professor comes to me and says, “Well, you know, we were looking through your garbage last night, and based on past data, students with your refuse pattern are unlikely to perform well in Intro to Biology.” That would be a little weird.

Chris: [29:39] Almost everyone would know that that was wrong.

Derek: [29:42] Like, why are you in my garbage?

Chris: [29:44] Almost everyone would be unnerved, rightfully so, by that, and realize, like, that’s terrible behavior. We probably shouldn’t do that.

Derek: [29:54] But to put a fine point on it, that’s what a lot of the analytic systems in our course management systems do, right?

Chris: [30:00] (laughing) It’s true.

Derek: [30:02] We’re going to look at all the data we have access to, because we have access to it, not because we asked you or because you gave it to us intentionally. We’re going to look at the data and see what we can make of it. And again, I think a lot of the intentions are good, right? In many cases, colleges are trying to improve retention rates, right? And trying to intervene with students who need help. But I think your first point, at least, is about, again, this is not the best term, but “informed consent,” right? The students don’t know it’s happening. They haven’t signed onto this. They haven’t told the entity that this is okay.

Chris: [30:39] I mean, for instance, and again, these are imperfect examples, but many states, I can’t say most, I actually don’t know the number of states, have laws that say that if I’m going to record a conversation, I need to inform you. But spying on email is somehow different because? You know, and there’s a question mark on the end of that.

Derek: [31:11] Right, right.

Chris: [31:14] I don’t think it’s different. I mean, there’s obviously years of laws and habits and decisions and things like that where people treated it as different, and laws treated it as different, I think probably poorly. I mean, I don’t agree with them, I’ll put it that way.

[31:37] But here’s my second point, which is that this practice, and lots of others, falls under what I might term “digital redlining.” Because we’d have to ask ourselves: who has the luxury of opening their email right away? Who’s got broadband, who’s got lots of free time, who’s got email, who has email on their phone, you know, all these things, right? And when we start to answer that question, we realize, oh, okay, well, some students have jobs, some students have two jobs, some students have jobs and kids, some students have family obligations, on and on and on. Some students don’t have broadband, some students don’t have email on their phone, some students only have a data plan, right? You know.

[32:31] And by the way, when I say “some students,” lots of times what I mean is some black and brown students, some poor students, some rural students, right? Typically, the people who don’t have those things, or have less of those things, or have less access to these things, are black, brown, poor, rural, you know.

[32:55] And so while the person who thought this was a great idea isn’t necessarily saying, and I put it that way because I don’t know that they’re not, but while they’re not necessarily saying, “I’m going to do this to disenfranchise students who don’t have access to these things,” that is in effect what they’re doing, because they haven’t thought this out.

Derek: [33:25] Yeah. So, gosh, I have so many questions for you. I just have to ask this because I ran into it in my class. Talking about helping students understand how these systems of surveillance work, and thinking about the example of a tracking pixel in an email, right? And how students are not informed that this is something that’s being done to them. I’ve had students come back when I’ve given examples like that, and they said, “Well, that’s just how the internet works, right?”

Chris: [33:56] (laughs)

Derek: [33:57] Like, if you’re gonna use Facebook or go online, you’re going to be tracked and they’re gonna do creepy stuff with it. That’s just how it works, right? And there was this remarkable lack of concern on their faces, when they’re like, “Oh yeah, that’s just how online works.” So what do you say to a student like that?

Chris: [34:13] Well, I mean, I wrote an essay about this, “Pedagogy and the Logic of Platforms,” that’s in EDUCAUSE Review.

Derek: [34:22] Ok.

Chris: [34:23] But using one of my favorite tricks, I mean, let’s shift the context, right? Like, throughout history, people have said many things that sound like that. “Oh, well, that’s just how life is, that people are enslaved.” “This is how life is, that women don’t get to vote,” right? (laughs) “It’s just how life is that we can’t have a black president.”

[34:36] So, you know, as someone who finds myself online a lot, railing against some of these practices, people say that a lot about a variety of these practices. But the short answer is: that’s the lie that companies tell you in order to keep doing things the way they are, right? So targeted advertising, I think, is one of the largest ills of the way the internet works right now. It’s actually not that effective, right?

Derek: [35:29] Right.

Chris: [35:30] It doesn’t work better.

Derek: [35:31] Right.

Chris: [35:33] The only metrics that say it works better are provided to us by Facebook, right? (laughs) So, like, you could do almost all this stuff and not spy on people, right?

Derek: [35:46] Right, right, right. Well, and I think, to your point, sure, maybe that is how it works now, but it’s our internet too, right? Like, if there are better ways to do things, we should be able to advocate for better ways and try to make some change.

Chris: [36:02] Yeah, I mean, and also, it’s the way it works because of a particular set of choices, right? It’s not the only way that it could work. The way it works now is a result of a particular set of choices and policies, mainly dictated by the people it’s going to benefit the most. And so, again, it’s not that it has to work that way. It is true that that’s the way it currently works; it’s not true that that’s the way it needs to work.

Derek: [36:34] Well, on that note: here on Leading Lines, the podcast is ostensibly about the future of educational technology. Not so much predicting it, because I think that’s a fool’s errand, but trying to shape it. So, given your perspective, where would you like to see educational technology go in the next five years?

Chris: [36:58] Well, I mean, I’d like to see it go away from surveillance and tracking. I’d like to see it go more towards asking people what they want, things like that. Talking to students, you know, talking to them about what they think is going to work for them, rather than using some conglomeration of data to make wholesale assumptions. But ultimately, what I typically ask or request of people when I talk about this stuff is to consider the implications, small and large scale, of using particular technologies and asking students to use them, or deploying certain technologies against students. I mean, as you say, I think it’s a fool’s errand, often, to assign intent to people about what they were thinking when they helped build a technology, and I try not to do that typically. But I think it’s very important for us educators to not think about intent, but just to think about effect. And often, when things are cool or useful, that question comes third or fifth, it’s way down, you know, however long your list is. So that’s what I would encourage people to do in the future.

Derek: [38:47] Thank you, Chris. I’ve got one more question.

Chris: [38:50] Yeah.

Derek: [38:52] We ask all of our guests this last question. On the podcast we talk a lot about digital technologies and digital educational technologies. Do you have a favorite analog educational technology?

Chris: [39:05] I mean, my favorite’s the pencil. (laughing) I’m not gonna lie. Pencil, just straight old school. You know, there’s probably somebody listening who’s like, “That’s not technology.” It definitely is.

Derek: [39:12] (laughs) Did you say there’s not anyone listening?

Chris: [39:22] No, I said there’s probably someone who’s listening (laughing) that will say that’s not a technology.

Derek: [39:28] All the things are technologies. But it’s also true that there’s probably not anyone listening to my pencil right now.

Chris: [39:39] (laughs) Yes, no one is listening to your pencil, but in terms of like my, you know, my favorite, maybe that is not the answer you’re looking for.

Derek: [39:53] No, it’s good. It’s good.

Chris: [39:56] I think that’s it.

Derek: [39:57] Well, I can imagine that you probably, in your classes, sometimes have some intentional technology-free spaces for your students?

Chris: [40:05] Yeah, absolutely. And I think, again, we need to consider, I mean, we didn’t get into this much, but how we ask students to use tech and when. And also, you know, I have a kid, and, I mean, very short story, but the upshot being: I think it’s important that we help students figure out how they best learn. Some students are going to take notes on their laptop, and some are going to write them down, and some are going to take them with their phone, right? And each of those is okay. I think it’s part of our job to help students figure out which one works best for them. And so, yeah, that’s really important.

Derek: [41:02] Well, thank you, Chris. This has been a really great conversation. Thanks for coming on the podcast today.

Chris: [41:06] Oh yeah. Thank you. Appreciate it. (music)

Derek: [41:12] That was Chris Gilliard, Professor of English at Macomb Community College. I keep thinking about his comments about the tracking pixel in that college admissions email and how it disenfranchised certain groups of students. I’m wondering about other ways technology is used in higher education that have unintended, but discriminatory consequences. And I’m grateful for scholars like Chris Gilliard who can help us see the systems we live in through different lenses. If you’re a Twitter user, I highly recommend following Chris there. His handle is hypervisible and you can find a link to his Twitter profile in the show notes, along with a few more links to his work online.

[41:18] I would love to hear your thoughts on this interview and the ways that you teach students to think critically about the technologies they use. You can reach out via email at leadinglinespod@vanderbilt.edu or via Twitter, where our handle is @leadinglinespod. If you’re a regular Leading Lines listener, I have a favor to ask. I just checked on iTunes, and while we have a five-star rating, which is awesome, there aren’t any public reviews for the show. If you wouldn’t mind, could you take two minutes and leave a review for Leading Lines? That would go a long way toward getting Leading Lines in front of more potential listeners, because algorithms. Thank you.

[42:24] Leading Lines is produced by the Vanderbilt Center for Teaching, the Jean and Alexander Heard Libraries, and the Associate Provost for Educational Development and Technologies. This episode was edited by Rhett McDaniel. Look for new episodes the first and third Monday of each month. (music) I’m your host, Derek Bruff. Thanks for listening. (music)
