In this episode, we talk with Mike Caulfield, director of blended and networked learning at Washington State University Vancouver and head of the Digital Polarization Initiative at the American Democracy Project. Mike talks about some of the shortcomings of the way information and web literacy have been traditionally taught, the moves and heuristics he and his colleagues at the Digital Polarization Initiative are teaching their students, and the strategies they’re using to help students rethink how they make sense of sources and information. If you have any interest in fake news, fact-checking, viral content, or just helping students find and work with sources, you’ll find this interview engaging and practical.
- Hapgood, Mike Caulfield’s blog
- @holden, Mike Caulfield’s Twitter account
- Digital Polarization Initiative
- Web Literacy for Student Fact Checkers
- A Short History of CRAAP
Derek Bruff: This is “Leading Lines,” I’m Derek Bruff. Twice this year on the podcast, we’ve had guests mention the work of Mike Caulfield. In both cases, our guests were talking about teaching students Web literacy. In both cases, the interviews were conducted by Leading Lines’ producer and one of my favorite librarians, Melissa Mallon.
I asked Melissa recently, “Hey, I actually know Mike Caulfield from conferences and Twitter. Do you want to see if he’ll come on the podcast?” Well, Melissa said yes, then Mike said yes. So I’m very excited to share our conversation with Mike Caulfield on this episode of Leading Lines.
Mike is the director of blended and networked learning at Washington State University Vancouver and head of the Digital Polarization Initiative at the American Democracy Project.
Mike talks about some of the shortcomings of the ways information, media, and Web literacy have traditionally been taught, and the moves and heuristics he and his colleagues at the Digital Polarization Initiative are teaching their students. If you have an interest in fake news, fact‑checking, viral content, or just helping students find and work with reliable sources, I think you’ll find this interview engaging and practical.
Derek: Mike, thanks for speaking with us today about Web literacy and related topics. I’ll start off by asking what is the Digital Polarization Initiative and why is it important?
Mike Caulfield: It’s a multi‑campus initiative that grew out of a nine‑school pilot. It aims to retool the way we teach media, information, and Web literacy, particularly in regard to the current challenges we’re seeing in the online space around information.
Derek: You’re trying to change how we teach these things. What’s not working about the way universities try to teach these forms of literacy?
Mike: There are a couple of assumptions built into a lot of information literacy instruction that don’t really apply to what students are dealing with on a daily basis. The first thing is the difference between the scarcity of information and the scarcity of attention.
Our traditional idea of propaganda comes from thinking about an Orwellian state controlling information. The idea with that sort of information literacy is that information is scarce. You get the information that’s put in front of you, and you have to dig deeply, look deeply, and try to find the hidden contradictions in it.
Notice that maybe last week we were never at war with East Asia, and this week we’ve always been at war with East Asia. What’s up with that? You really dig into the piece itself, because information is scarce and attention is rather abundant. But we live in a world where information is abundant and attention is scarce.
That means most of the work that you’re doing is figuring out what to read, what to pay attention to, what to invest your emotions and your intellect in.
The deep critical reading that comes out of that is still important, but if you make wrong decisions in the first 20 seconds about what you’re going to invest your time in, you can be the deepest critical reader you want and you’re still going to end up more confused than clarified. That’s thing number one.
The second piece of it, and this comes out of the attention piece, is that a lot of what we teach students to do is fairly involved: 20 minutes with a document, something like that. It’s what we would think of as analysis, and because we’re academics, we’re really attracted to the idea that we’re going to become better analysts.
As a matter of fact, what you find when you look at people who are really good at this is they’re not necessarily better deep analysts. They have better heuristics. They have better rules of thumb. They’ve built them in at the level of habit.
When they’re approaching information, they’re able to make very quick decisions that even very intelligent deep analysts can’t do. Part of what we do is we try to focus on those heuristics. Then the third thing…
Mike: Did I say there were two things? It’s three, or maybe point 2B: we want to tie those heuristics specifically to things done in the Web domain. We know from working in education that domain transfer is the million‑dollar question for a lot of this stuff. Yet we know what environment students will be practicing these skills in, and we need to look at that environment.
We need to match the heuristics to the affordances, the actual digital affordances of the environment they’re practicing them in, and try to build that as a habit.
Melissa Mallon: I have a follow‑up question related to point 2A, I guess.
Melissa: You’re talking a little bit about the depth of analysis that we’ve found ourselves trying to teach our students. I’ve had a lot of conversations about this with librarians in the last year or so. There used to be this method of Web evaluation that was very prescriptive: there were checklists, or the CRAAP test, which is a set of criteria people can use to analyze or dig deeper.
Those were almost meant to be quick, I think. Like, “Here’s five things to look at,” and then you can move on and decide if it’s good for your research or not.
It seems like we’re almost rubber‑banding back and forth here. Do you have any thoughts on that move, as you suggest, toward a quicker analysis, but without being quite so prescriptive about it?
Mike: Yeah, I could talk all day about the CRAAP test.
Mike: The CRAAP test has probably done students more damage than good. In fact, when we look at the assessments we give students and we see students making bizarre decisions, very often we can pinpoint that they’re using things they had been taught via CRAAP. There are a couple of things about CRAAP. CRAAP is actually not that quick: there are 26 questions associated with the original version of CRAAP.
Melissa: That’s true, that is true. If you did dig into each one.
Mike: And a piece of that is, if you actually know how heuristics work, the idea of heuristics, of course, is that less information is better. You actually make better decisions on less information than on more information.
The problem with CRAAP is, if you look at those 26 questions and those six criteria, you get a very mixed picture of a site, and at the end of it you just throw up your hands. Because, well, this is professionally presented, but it’s a dot‑org, and the author has an email address I can reach.
Mike: These are surface heuristics, and there are too many of them. They’re very surface‑level ‑‑ things that can be faked, things that can be counterfeited ‑‑ and there are too many of them. So what you find is that it doesn’t actually help students, because when you have 26 criteria, it’s as good as having no criteria. It’s maybe even worse, because you get overloaded and you just pick the ones that seem to support what you already think.
When we talk about heuristics, we’re not talking about anything where you sit down and do a checklist. I actually wrote a couple of blog posts on why CRAAP goes so wrong. It’s interesting ‑‑ there was research on the predecessor to CRAAP, “Co‑coa,” or…I forget its name…
Mike: …research on the predecessor, two years after it, that showed that students were just overwhelmed by it. When we talk about things, we don’t talk about that level. We talk about things like this: if there’s a breaking story ‑‑ if you just see in your feed that Winona Ryder had died ‑‑ it’s not going to be on one blog.
We have a heuristic called “check other coverage,” and it just says, “Hey, for things where there should be multiple sources, are there multiple sources?” There are three directions you can go from there. You go and find there is no coverage, and you say, “OK, well, this is highly suspicious,” and maybe you go further.
Or you find, “Well, it’s not Winona Ryder that died, it’s actually the drummer from the British psychedelic band ‘The Pretty Posies,’” or something. OK, maybe the coverage isn’t broad enough, but you know there’s something a little suspicious, and you know you’re going to have to dig deeper. Or you find it is widely reported.
Then, luckily, since you’ve gone in that direction, maybe you get a better story about it. The version that comes to you in the most viral way often isn’t from the better source. By searching to see this other coverage, you find an entry point into that story that is better, more in‑depth, better reported.
“Check other coverage” is a simple one we do. “Go upstream to the source,” sometimes called “find the original,” is another one we do. If you’re looking at something where the context is important, don’t look at the block‑quoted version of it. Click through, go up, and find the original source, and see what it looks like in context.
If you’re looking at a photo, try not to find just that photo. Try to find the photo set that it comes out of. Try to find the series of photos. Try to find the one that’s being published by the photographer themselves that may have a caption on it, that is the photographer’s explanation of it.
These are the sorts of heuristics, and they’re very anti‑checklist. As a matter of fact, if you look into the theory of heuristics, there’s a lot of stuff out there that explains why checklists go so wrong, and we try to address that. We’re talking 60‑second decisions in a lot of these cases.
Derek: I think you’ve just identified two of your four moves.
Mike: That’s right. So which one…
Derek: Tell me more about that.
Melissa: That’s perfect.
Mike: We did check other coverage. We did go upstream to the source. Investigate the source is another one: just know what the source is. That can get a little complex, but one of the things we’ve realized is we try to radically simplify these things.
When you go to investigate the source, you could take a checklist approach. It could be, “How is this funded? Is it transparent?” Again, you start to get overwhelmed.
One of the things we encourage students to do is before they investigate the source, imagine in their mind what they think this source is. Click through to find out what it is and if they were wrong, rethink it.
You’re reading something, you think it’s a news source, and you’re like, “Oh, OK, I just have to check what the source is.” Then you click through and you find out it’s a blog run by the American Beverage Association.
That doesn’t mean the research cited in there is wrong. It means your original impression was based on a different conception. You have to throw that away and start from scratch, or you just go and find a better source. We ask students to investigate the source and really test whether their understanding of what they were looking at was correct.
There are a couple of things we look at there. We have three things we talk about in terms of transparency, machinery of care, and bias. But really: are you surprised?
The other one is circle back. We don’t talk a lot about circle back in a lot of these interviews I do because it’s not apparent why that’s so important until you watch the students do it.
One thing you’ll find the students do is they’ll pick an original search term, they’ll pick an original path, and they will just follow that down, [laughs] and down, and down, even as it gets incredibly confusing.
Take the idea that you go and find this is the American Beverage Association, and it’s actually reporting on stuff related to the welfare of Pepsi. Instead of keeping on down that path, maybe just circle back and say, “OK, well, now that I know there’s this study and this is the result, maybe I can formulate a new search based on this result.”
One of the things that we find, it’s a fascinating piece about the Web, very often students will have the right answer 15 seconds into their search, 20 seconds into it. You leave them with the Web for three minutes and they have the wrong answer. Why is that? Because the answers that they’re getting first are actually fairly standard answers from recognized authorities.
As you start to navigate down these paths on the Web, you start to get to more and more fringe stuff. You start to get to opinion, you start to get to all these other things. Because of the way our minds work ‑‑ because of recency effects ‑‑ the things you just looked at are weighted more heavily than the things you looked at first.
The nature of the experience, having the most trustworthy stuff up front and the least trustworthy at the end, causes you to wrongly weight some of these different views of things. Circling back is important to make students realize that if they feel like they’re going through a labyrinth, they need to just back up and try again.
Melissa: I’ve also noticed there’s this weird middle ground. On one end are the more traditional safe sources that maybe they’ve been taught all their lives; on the other end are the sensational, tabloidesque‑type resources. Then, in the middle, there are ones that seem like they could go either way.
I’ve observed students getting caught up in that middle ground of, “This seems legitimate, but I’m not so sure, because I don’t recognize it.” Maybe they don’t know whatever news organization it’s coming from, and so you see them get caught up. That lends itself to the idea of circling back.
If in your gut, you’re thinking this probably is a trustworthy and legitimate place to start, maybe go back to that. That’s OK.
Mike: That’s the thing. With most of these questions, we tell our students the Web is abundant. You have to have an abundance mindset when you approach the Web. What that means is, it could be the case that you’re on some nutritional‑supplement site that has a treatment of whether vitamin D wards off various forms of cancer or not.
It’s not like someone printed out this sheet of paper and you’re stuck with this or nothing. Maybe you look at that and you say, “Oh, this raises interesting questions.” But is it the best place? If it’s not, hit the back button.
Derek: I teach a first‑year writing seminar. I’m a mathematician by training and it’s a math course, but we do a research paper in this course. One of the things I always find challenging: it’s a course on, among other things, privacy and surveillance.
Students come in with a fairly limited background in these topics. By the end of the course, I want them to craft a thesis and make an argument and have an opinion.
To get there, it’s really tempting for them, at some point, just to say, “I’m gonna pick a thesis, and then I’m gonna go find some sources to support that thesis.” That’s not how we make meaning as scholars. We dig, we explore, we find connections, and eventually, we’re like, “Oh, I see a thing here. Let me see if I can name that and articulate that better.”
What I’m also hearing from you is that, if students are just trying to find five sources that back up a claim, that’s a very different set of behaviors than, “I have questions about this topic. Let me see if I can find some useful information on that.”
That requires exploration, searching back and following some trails for a while.
Mike: Part of check other coverage is trying to get people to realize, before they figure out where they are in the spectrum of opinion, what the broad consensus around this is. Can you identify the broad consensus?
We have these exercises where we just have them do the Google News search, scan the blurbs, and just say, “OK, now, before you even look at the best blurb, what seems to be the consensus about what happened here?” Just scan the blurbs, and get a sense of the consensus.
You might disagree with that consensus ‑‑ and many smart people disagree with consensus opinion ‑‑ but only a fool would disagree with the consensus without first figuring out what the consensus of people who are schooled in this is.
Getting students to think in that way is kind of a zoom‑out approach. That before we get into whether we think this happened or we didn’t, can we identify what the consensus story is, what the consensus narrative is, what the consensus findings are?
Then move past that, if we want, but move past it in a more knowledgeable fashion.
Melissa: How do students react to that? Do you feel like they’re coming along for the ride, for the most part? Is there a pushback?
Mike: I haven’t gotten much pushback in the classes I’ve taught individually. We have 45 other classes across this project, and in those 45 classes, we’ve seen some stuff in the assessment that shows some students do push back a little bit.
But given that we’re doing these two‑week interventions, less than you would think. We have one on gun control. It’s probably our hardest prompt.
It’s a MoveOn tweet that’s referencing a “Center for American Progress” article that is citing a Brookings poll that was done by Public Policy Polling.
Melissa: Whew. That’s a lot of layers.
Mike: That’s a lot of layers.
Mike: It’s actually a pretty difficult prompt. Mastery would be saying, “Well, look. There’s a bunch of intermediaries here that are putting a spin on it. The funding for the poll was provided by this organization, which has an agenda ‑‑ even if it’s a centrist agenda, it’s still an agenda. The polling organization does have a reputation for doing quality, non‑partisan polling.
If you look at the dozen other polls that have been done on the same issue, some of them find 10 percent higher than this particular poll, some of them find 10 percent lower, but they’re all within that range.” That would be mastery. That’s the response you’d just die to have.
Now students come in. Literally, their reaction was, “Yeah, this seems true because all my friends feel the same way. It’s only these old jokers who…
Mike: …disagree with gun safety,” or “This isn’t true. This is just more liberal whatever,” or “Yeah. [scoffs] Twitter. I’m going to read Twitter.”
Mike: These are very impulsive reactions, where they’re not going beyond what’s in front of them. By the end of the course ‑‑ I should say, the poll supports the idea that there’s actually broad support, even among gun owners, for various gun safety initiatives. That’s what the polls support.
I would say that, if you look at the data, you do see some people struggling with the fact that, “Hey, I looked into this. The poll is from this Brookings Institution, which is a research institution,” or “I found this fact check on it, which seems to summarize it. It seems like it’s true, but I don’t think it’s true.” You’ll still see some of that.
What’s interesting is you still see that they move from very low trust in it to moderate trust in it. They’ve moved a little, but their actual identity is going to…The world would be a horrible place if we didn’t have really rigidly stable identities that we could refer to on a daily basis. If your identity just shifted based on three hours of instruction, that would be a scary world.
Melissa: [laughs] Mm‑hmm.
Mike: You do see that they back off from the emotion a little bit. You do see that their trust level goes up on it, even if they’re still dismissive of it.
Yeah, I think it works. The more counterintuitive finding is that students become much less cynical after this. Everybody says, “Oh, well, you give students media literacy, they’re going to turn into these cynics,” or “Oh, you’ve got to teach students to debunk.”
We find the opposite. We find the students coming in and they don’t trust anything. Everything is, “Eh! Everybody has an opinion. Everybody has an agenda.” They’re very, very cynical. We find that after the course, they’re far less cynical.
They’re keeping the trust ratings of the stuff that’s unreliable down; those are staying down. But the biggest impact we’re seeing is that their trust in prompts that are true or credible is going up.
Melissa: That’s pretty amazing and great to hear, really.
Derek: I’m not surprised that they come in not trusting things. I get the same vibe from my students, that all news is biased and everyone’s got an agenda, so that’s a powerful result.
Do you have any sense of…I asked earlier about the research context, where students are pursuing a topic and trying to find useful information, but I think a lot of these tools are also helpful when news hits your feed and you’re trying to make sense of that bit of information.
Do you have any indication that students are processing that kind of reactive moment differently, thanks to these heuristics?
Mike: Obviously, we can’t spy on our students. [laughs] But there are a couple of indications that, with enough practice, it could have an impact. One of the things we saw was that when you compared the pre‑assessment and the post‑assessment, the students did better on the post‑assessment ‑‑ really a lot better. Because I actually executed the assessment and ran the protocol, the thing that surprised me was that they took less time.
On the pre‑assessment, they were just digging really deep into these things. And everybody, they’re like, “No, no, I can’t…” I’m like, “It’s the protocol, we can’t shut you down.”
Mike: On the post‑assessment, we had students done in half the time, and they just moved on. The reason I think that’s important is that a lot of people do not investigate the things that come into their feed, or they’ll share something and then say something along the lines of, “Well, I don’t know if it’s true or false, but I just liked it.”
Mike: I actually believe a lot of these dismissal behaviors happen because people believe that the process of finding out whether something is more or less reliable is such an involved, lengthy process that they could never do it with all this stuff in their feed. So they don’t do it. They can either get the quick sugar hit of sharing it, or reading it, or getting enraged, or whatever; or, in their mind, they could do this 20‑minute process, which is a complete buzzkill.
The fact that we’re showing them 60‑second processes, I think, is really important. The fact that they’re taking less time to solve these things at the end, and feeling more relaxed about it, given the nature of the news feed, is a really good thing.
Derek: Right, because if some news hits their feed and they think, “Oh, let me hit Google News and see what other people are saying,” 10 seconds later, you have a sense, right?
Mike: 10 seconds later, you have a sense. That’s part of what we say too, is with the heuristics, it’s…Sam Wineburg, who I worked with on some of this stuff, has this wonderful phrase, “taking bearings.” So much of what we’re trying to show the students is how to take bearings on a claim, or an issue, or source.
Sam’s analogy is imagine you just dropped in a random location, parachuting to a random location. Now, you got to get out of the woods. What do you do? If you’re really dumb or maybe just really impulsive, you just pick a direction. You bolt in that direction.
I think what most people familiar with that situation would tell you is that the better idea is to take a few minutes and figure out which way is north and which way is east. You’ll get going in a few minutes, but do the initial work before you start that longer journey.
Melissa: It will save you time down the line, right?
Mike: It will save you time in the end. Another thing we find related to papers that’s really interesting is go upstream to the source. [laughs] One of the things I’ve found in talking to faculty is they actually think that’s one of the more useful things.
They see a lot of students coming in who are, in other ways, very, very smart and hip to research, and yet they’re citing sources that are the coverage of the press release of the actual study, not understanding there are a few layers of distortion there.
The press release is sometimes helpful because it can give you a broad understanding ‑‑ you may have quotes from the researchers which help ground your understanding of it ‑‑ but you’ve got to go and make sure that these things match.
There are things in that way, too, that I think can be very helpful to research. They’re simple things. You know this: we teach the students, “Now, here’s how science works. Here’s statistical significance. Here’s what qualifies as good statistical significance. Here’s effect size.”
We teach them all of this machinery to do the detective work themselves. That’s the journey; that’s the taking off through the woods. But we don’t give them the three‑minute protocol before they start digging into that paper, to make sure that effort is being put into something useful.
Derek: I have so many questions.
Melissa: [laughs] That’s fine.
Derek: Can you say a little bit about the role Wikipedia can play in these heuristics?
Mike: Yeah. One of the things about our students coming in is that they’ve been taught in K‑12 not to trust Wikipedia. But what you see when you look at fact checkers ‑‑ Sam Wineburg and Sarah McGrew have done some of this work ‑‑ is that fact checkers use Wikipedia as their first stop.
If you do a Google search, it’s the best place to get a summary on an organization, on an initiative, on an event. Of course, it’s all sourced at the bottom. If you need a deep knowledge of an issue, you’re going to have to dig into those links.
Very often, you don’t need a deep knowledge. Very often, you just want to find out, “Hey, is this a right‑leaning site, or is this actually a white supremacist organization I’m reading?” You need that level, right?
Derek: Right. Is this a medical journal, or is this a pharmaceutical vendor?
Mike: Is this newspaper broadly considered to be a tabloid, or does this have a history of reliable journalism? That stuff, you can get from Wikipedia. Wikipedia is very good at that.
There are a number of reasons why students come in with that. One is, of course, that their teachers have taught them that. I think our understanding of the trustworthiness of Wikipedia and how Wikipedia works is stuck in 2006 for some reason.
Mike: People do not understand the vast efforts and progress that have been made in protecting articles from vandalism, and the whole processes around citation and so forth. When I actually show people some of the mechanisms that are in Wikipedia for the larger articles…Anybody can go and find a very small topic that gets hit 300 times a year and insert something into it.
I’m not saying that that’s not possible, but for the articles that you’re actually hitting, when I show people the mechanisms that are in place to protect those articles, they’re usually shocked. No one has shown them this. As a result, they’re skipping Wikipedia, which could be their best first stop for a lot of this.
Derek: Again, it’s quick to do the search.
Mike: Yeah, it’s really quick.
Derek: 10 seconds later, you have a general sense of what the source is about. Then you can decide, “Do I want to keep going, or is this a little…?”
Mike: You get to choose your precision with these things. If you look at the Wikipedia article and you think, “I’ll just find something else,” that’s great. If this is the only possible source for this thing, and the Wikipedia article seems to say this but you’re not sure about it, then maybe you have to dig deeper. You’ve got to decide what level of precision you want for this task.
Derek: You’ve got your bearings at that point.
Mike: You got your bearings, yeah.
Derek: You know which direction you want to go.
Derek: You’ve mentioned these courses that you’ve put together. What does it look like when you’re working with students around this? Are there activities you found helpful or structures? You’ve talked a lot about what you’re teaching them. How do you teach them these things?
Mike: We have the students bring in their laptops. Some students do it on mobile devices as well, but we prefer starting off with laptops. Some of the moves are easier initially on laptops because of your ability to get up into the omnibar and that sort of thing.
We have the students bring those in, and then we go through prompts. We just drop a prompt, and there’s kind of a rhythm to the class. Like, we’ll throw up “Swolebama” ‑‑ there’s a picture on the Internet of Obama where he’s all buff.
We announce it and say, “OK, here’s a picture of Obama. The caption reads, ‘Whoa, retirement for Obama has really worked out ‑‑ been working out with The Rock.’ Is this picture real or not?” Initially, the students always want to approach it via the thing in front of them instead of using the network of reputation that the Web provides. They’ll say, “Well, of course it’s faked. I can see the, whatever.”
In this case, it turns out to be fake, but we tell them, “You’ve got to do the work. If you want to tell me this is fake, you’ve got to tell me where the real picture was taken. Just go and find the real picture. Find the original.”
Then they go and do that, and about 75 percent of the students get through that prompt, which usually takes ‑‑ I don’t know, it’s fast ‑‑ maybe two and a half minutes or so.
We have a little signal for when they believe they’ve got it. Then we’ll just go rapid‑fire and get each piece of the story from the students. We use a little ticket system, where students try to get rid of their tickets, so we don’t have one person dominating the conversation. If students know something, they’re excited to get rid of one of their two Post‑it notes. I will say ‑‑ I’m not sure if I need to explain this; this is a teaching‑and‑learning podcast, so people probably know what I’m talking about.
It’s like what some people do with popsicle sticks. You have students write their names on two Post‑it notes, and when they want to answer a question, they have to raise one. You take it, and then you give participation points based on that.
Derek: When they’re out of tickets they’re…
Mike: …and when they’re out of tickets, they can only answer something if no one else has the answer. It really deals with the problem that students are coming in with different levels of this. It also makes students super excited when they have an answer ‑‑ they’re like, “I get to get rid of my ticket.” So we do that, and we’ll break it up into smaller questions.
We’ll say, “Caitlin, true or false?” She’ll say, “True,” and you take the ticket. “Brian, where’s this picture from?” Brian: “Hawaii?” Great, that’s the one.
Does anybody know where it was modified? Oh, Reddit. What’s the group’s name on Reddit? You’ll go through this and have the students do it in a very quick fashion. That quickness is really crucial to the pedagogy, because you’re trying to get the students to understand this is something very snappy ‑‑ they should be very directed in what they’re doing, find out the specific things they need to know, and not overcomplicate it.
And then you’ll have those prompts, and there’s a certain rhythm to it. You’ll do a bunch of really easy ones and then throw in a hard one that’s actually a little more nuanced, making sure that you’re building the students’ ability to sort out the quick stuff quickly, but not eroding their ability to pay attention when there’s a deeper story underneath.
You try to get that rhythm right. In general it’s, I don’t know, three out of four that are very quick, or maybe four out of five, and then you’ll hit one that’s a deeper question. And if the students want to talk about some of the social issues that come up, then you engage them on the social issues.
We have one that shows a bunch of stuff on Patch.com from when a hurricane was coming in, saying, “Hurricane making a beeline for New York,” and it’s fake. It’s absolutely fake, and it happens with every hurricane: all these stories saying, “Oh, the hurricane’s going to be a direct hit on your city.” Yeah, just like Johnny Depp is always moving to your town.
Derek: Yes. I remember that.
Melissa: I’m pretty sure he really lives in Nashville though. I saw an article.
Mike: With that one we go through it and say, “OK, what is the site it’s on? Is this true?” and you use those moves. You check other coverage, and the consensus is, “No, it’s going to hit somewhere mid-seaboard.”
You check the source and realize, “Oh, anybody can post to this particular news source. It’s a community news source; it’s not vetted, and there’s no editorial policy.” But then you press the students and say, “How might this be damaging to society?” And what’s interesting to me is that the students don’t know this at first. They don’t know this at first.
I think you can immediately see what would happen, but the students don’t see it, and you have to wait and be a little patient while they discover it: “Wow, if you show people repeatedly that hurricanes are going to hit their town, and they don’t hit their town, then (a) when an actual hurricane is making a beeline for it, they’re going to say, ‘They always say this.’”
They always say this. “Why should I leave?” And (b), they’re going to get the impression that in climate science and weather reporting, no one knows anything. So there’s a specific erosion that could put people in jeopardy, because in these events people die mostly when they don’t evacuate. It could kill people. It could literally kill people.
Then there’s this broader thing that could kill us all.
Derek: That was Mike Caulfield, director of blended and networked learning at Washington State University Vancouver and head of the Digital Polarization Initiative at the American Democracy Project. Mike had so many interesting things to say that Melissa and I had a tough time cutting this interview down for time.
If you’d like to hear more from Mike or learn more about the Digital Polarization Initiative, check the show notes for links. Mike’s blog is really fantastic. He’s quite prolific there, and his Twitter account is great, too. His new open‑access book, “Web Literacy for Student Fact‑Checkers,” is full of great resources, including the four moves that he talked about in the interview.
I’ve also put in a link to his recent blog post titled “A Short History of CRAAP” for more on the problems with the checklist approach to information literacy. You’ll find those show notes, as well as past episodes and transcripts, on our website, leadinglinespod.com. It is a dot com and not a dot org.
If you have thoughts about this episode, please share them either on the website or on Twitter, where we can be found @leadinglinespod or via email, firstname.lastname@example.org.
Leading Lines is produced by the Vanderbilt Center for Teaching, the Vanderbilt Institute for Digital Learning, the Office of Scholarly Communications at the Vanderbilt libraries, and the Associate Provost for Education, Development, and Technologies. This episode was edited by Rhett McDaniel. Look for new episodes the first and third Monday of each month.
I’m your host, Derek Bruff. Thanks for listening.
Transcription by CastingWords