Episode 9:
Paul-Olivier Dehaye

In this episode, we feature an interview with mathematician Paul Dehaye. Dehaye is known as the instructor of a 2014 massive open online course (MOOC) about massive open online courses that was mysteriously cancelled one week in. Dehaye is interviewed by John Sloop, Vanderbilt’s Associate Provost for Digital Learning, who met Dehaye at an Open edX conference last summer. Dehaye shares his perspective on that 2014 incident, and he comments on the role of for-profit companies in higher education, the future of online education, and the still untapped potential of MOOCs.

Transcript

00:00 background music

Derek Bruff:  00:09 Welcome to “Leading Lines,” a podcast from Vanderbilt University. I’m your host, Derek Bruff, Director of the Vanderbilt Center for Teaching. Leading Lines is produced by the Center for Teaching, the Vanderbilt Institute for Digital Learning, the Office of Scholarly Communications, and the Office of the Associate Provost for Digital Learning. One of the benefits of producing this podcast across a number of groups here at Vanderbilt is that we’re casting a wide net for interview subjects.

00:29 In this episode we feature an interview with someone I probably wouldn’t have picked for this podcast, but my colleague, John Sloop, Associate Provost for Digital Learning here at Vanderbilt, met Paul Dehaye at a conference earlier this year and found Dehaye’s perspective on higher education fascinating.

00:46 John really wanted to interview Dehaye for the podcast. I wasn’t thrilled with the idea, but I respect John a lot and I wanted to keep an open mind. See, I know Paul Dehaye as the instructor of a massive open online course, or MOOC, about massive open online courses, who, back in 2014, deleted his entire course one week in. The first week of the course, which was called Teaching Goes Massive: New Skills Required, ran like any other MOOC with some videos from Dehaye, active discussion boards and so on.

01:17 After a week, however, everything in the course vanished — the videos, the discussion boards, all the student posts, everything. Dehaye went radio silent: no word from him on his blog, on Twitter, or through the course platform. It was several days before he posted something of an explanation.

01:33 Reading his explanation back at the time, it seemed to me that Dehaye had deleted his course as a cross between a publicity stunt and a public art project. I gathered he wanted to get a few thousand people active in his MOOC, then pull the plug to make the point that when we participate in online, supposedly free courses, we don’t control our experience or our data.

01:52 Vendors like Coursera, which hosted his MOOC, are the ones who control all that. Those discussion forum posts that all his students wrote, all that intellectual work, was gone, as a way to prove his point.

02:03 I didn’t think the stunt was very respectful to the hundreds of students who enrolled in his course and Dehaye’s public statements, including those made on Twitter, didn’t reassure me of his intentions. So when John Sloop brought his name up as a possible podcast interview, I was rather skeptical, but as I said, I wanted to be open minded.

02:21 What follows is John’s interview with Paul Dehaye, conducted this past summer. It’s definitely interesting, and it makes clear that there was more to the story of Dehaye’s MOOC experiment back in 2014 than what I heard or understood. It also provides a little behind-the-scenes perspective on some of the MOOC mania of years past, and Dehaye raises some important questions about the future of online education and the still untapped potential of MOOCs.

02:45 background music

John Sloop:  02:47 So I’m here today with Paul Dehaye, who, by the time this podcast is first broadcast, will be a scientific collaborator working on a computer science and mathematics research project for the European Commission. He will be based, as he has been for a while, at the University of Zurich’s mathematics department. How are you? We’ll start off with just a hello. How are you, Paul?

Paul Dehaye:  03:14 I’m doing very good, thanks. I’m glad to be here.

John:  03:17 Yeah, I’m happy to have you joining me. I met Paul first at a conference a while back dealing with Open edX, and issues with Open edX, but I had known you before then because of some of your fame, notoriety, I don’t know which word would be better, caused by a MOOC that you had run on Coursera.

03:44 Let me give a little summary because the story is a bit confusing still, if you try to read about it on the Internet, and I wanted to hear a little bit about your thoughts on that.

03:55 You first came to my attention, and to a lot of people’s attention, outside of the mathematics world for a course on Coursera called Teaching Goes Massive: New Skills Required. Inside Higher Ed described this as a MOOC for those skeptical of MOOCs.

04:11 After a week — this is my understanding of the events as they unfolded — the course disappeared, and then it reappeared with no content, without answers, etc. You tweeted to your students, “Students, reflect on the fact that a technology company has now effectively replaced your teacher,” saying that you had been removed from the massive teaching course.

04:33 Then there’s a great deal of confusion in the online stories about this. Now, I don’t know how confusing it was at the time for you or your students. Coursera came out and apologized and said they were trying to complete the course; you claimed that you were still on the course but that content had been removed, and it goes on and on. Then it appears as if this was an experiment to let students understand how Coursera works, what universities do, etc.

05:02 One of the students wrote that you appeared to be conducting a social media MOOC experiment. Later, on MoPad, you wrote that the goal of this experiment was “to confuse everyone, including the university, Coursera, the Twitter world, as many journalists as I can and the course participants, the goal being to attract publicity. I want to show how Coursera tracks you,” end quote.

05:23 I know that came down. After this happened, George Siemens wrote a blog post. It was a congratulatory post, but a mixed review. He seemed to be a little confused even about what the overall purpose or goal was.

05:38 Now, talking to you today and reflecting on it, can you tell me a little bit about what you were up to and if you felt like you were successful raising the issues that you wanted to raise?

Paul:  05:49 Just one thing, there’s one quote you raised, the goal of this experiment was to confuse and so on. I didn’t technically say that. That’s a quote that appears in Inside Higher Ed. I’ve tried to get them to correct it, but they won’t correct it. I said something very similar, but not that.

06:10 The goal of removing the content was to get students to think first, also attract attention, indeed. Because at the moment I removed the content of the course, I was extremely confused about Coursera’s data practices. I couldn’t get any straight answers from them. Also, it seemed to me like they were not ready to respect European privacy laws.

06:37 That was my understanding at the time, but it was an extremely confusing time for me and there are still many parts of this that are confusing to me. I’ve tried to get clarification from Coursera or my university and it has been very difficult.

06:54 There are still many things that are completely unclear, in particular how my course was suspended, who made the decision to delete some of my posts or to silent-ban my account. Silent-banning means that, to me, it appears as if I am posting, but others just never see anything. Effectively, it’s like having my tongue cut out.

07:24 The reason I was trying to confuse people, and felt it was my only way out of this MOOC, was that I had little support around me. I had signed a lot of contracts that were very binding in terms of what I could say and what I couldn’t say about Coursera, about the contracts I had signed themselves, about the data policies of Coursera, all those things.

07:50 For better or worse, that’s the strategy I adopted, expecting that later on I would be able to clarify once some information bubbled up. That never actually happened or actually happened much, much, much later because there were all kinds of legal issues thrown in.

08:13 There was a disciplinary inquiry by my university that physically required me to not say anything. That’s why I didn’t answer journalists at the time, when they were investigating. That was in June 2014, by the way. June, July. It was later clarified, I feel, in the press in Europe. In Germany and Switzerland, to some extent, also in the French‑speaking press in Switzerland, but it has never made it back to the US.

John:  08:46 If it has, it has not made it back in a way that’s very easy to find. A quick sleuthing on Google, etc., really brings up the same stories from 2014.

Paul:  08:59 Right. That’s another big problem in terms of…my whole stance on this is very proactive with regards to data and how it can affect people and stuff from your past. That’s exactly an example of it.

09:16 Some incident happened in my professional past, some hot take was written on it within a couple weeks where it’s very unclear what’s going on, there is a misquote in Inside Higher Ed that drives a lot of coverage at the time and misleads a lot of people. Then, eventually, that’s the settled fact, if you want, of a Google search on my name.

09:41 If you look, there is five or six newspaper articles that have been written since in the biggest newspapers in Germany, I think the three biggest, and that doesn’t bubble up of course. It’s in German. The fact that I deleted my course was not planned. There were other parts of the course that were planned as experiments.

10:01 That’s why this conflation of quotes is very confusing to lots of people that are trying to understand what’s going on. There were other experiments that were completely announced to students as such. The course was completely experimental. I was completely open with students about that.

10:21 I saw a lot of parallels with another incident that happened at the same time in that weekend at the end of my first week of the MOOC, which was this Facebook emotion study, where Facebook actively manipulated users to try to make some of them sad and some of them happy.

10:40 It turns out that Coursera is also in certain ways manipulating courses, manipulating instructors, manipulating content, if you want, in ways that they can do because of their central location in this whole ecosystem. One has to be really aware of those things and think through the consequences long term for education in general.

John:  11:05 That likely brings us in a way to a new article that you have coming out in Academe. If we could, I’d like to talk about that for just a second. Here’s what I want to do, you have an article coming out in Academe on MOOC Surveillance and Control.

11:23 What I’m going to do is quote your conclusion and then ask you to tell us a little about what you’re arguing and how you get there because it’s a pretty interesting conclusion, it’s pretty forceful.

11:36 Here’s what you say, “In the end, we risk being collectively complicit in an unconditional intellectual surrender to venture capital‑funded educational disruption. Indeed, we risk directly contributing to the numbing effects of constant and indiscriminate surveillance. As professors, we should always insist that education remains emancipating and should resist the coercive logic of surveillance capitalism.”

12:05 Ending the quote there, so I’d like to hear a little bit about what you’re saying here. Most professors would agree with that last line, “We should always insist that education remain emancipating.”

12:15 My take, in reading you, is that some of us, some faculty, may think that what they’re doing is emancipating, but in fact they’re working within what you called the coercive logic of surveillance capitalism. Can you tell me a little bit about what your overall argument is?

Paul:  12:35 There’s no doubt that bringing education to the masses is a great goal and it’s extremely emancipating. The question is how exactly to do that and what the consequences are in the medium term. How is this short-term logic of bringing a course to a mass of people affecting things five years down the road?

12:57 What I explained in the article is the mechanics of how MOOCs can evolve and suddenly not seem as ideal as they are when you start them. We can talk about that logic or that mechanics, but MOOCs are collecting a lot of data and this data has to be monetized some way eventually.

13:22 I mean, you can think that they will always make money off the certificates, but that’s not the logic of venture capital. The logic of venture capital is that you should have returns on your investments that are huge, that are 10- or 20-fold, as much as possible. If there is money to be made from the data, there will be money made from the data. That’s it.

13:45 One of the things that is anticipated by the Coursera contract, one of the monetization strategies that has been added in a bunch of contracts, probably also with Vanderbilt and other universities, is transcript services. With the consent of the user, with the consent of the student, Coursera can start building a transcript service. Some service that will certify, “Yes, this student has this degree.”

14:15 Now, that’s starting to look very much like a digital rights management system, a DRM system to me. You don’t own your songs anymore. You don’t own your old records. You start renting them from Apple, and Apple constantly monetizes on that.

14:30 That’s an example of a danger to me: the fact that suddenly students would not own their degrees anymore. They wouldn’t be able to go to the Vanderbilt office, pay a small fee, and get a copy of their degree; instead, they would have to constantly pay to be listed in some global search engine for skills.

14:54 That’s potentially very problematic. Of course the employers would be really happy. You have a company that needs to relocate or to build a new extension somewhere. They need to find locals who have a very specific set of skills, and there you go. You find them right away.

15:14 You can see a whole host of data certifying that those students really have the skills. Most likely others in other locations have taken the same courses so it’s very easy to make sure that they have the same level. The employers are happy, but the students might not be so happy because they have to constantly pay to be listed there, or someone has to pay there and a lot of money is extracted from that.

15:43 That’s one possible future. Of course, it’s very pessimistic. I realize that view is very pessimistic, but I’m hopeful that others hear that and anticipate it and react to it. It doesn’t need to happen. If others see a danger, then we’ll react collectively and build other solutions.

John:  16:05 Your fundamental statement, your starting point is undeniable. Platforms for MOOCs don’t exist outside of an economic system. They have to have either venture capital or they have to operate on capital in some other way. edX operates on a very different system than Coursera, but it also has to have money coming into it.

16:26 There have to be ways of creating revenue streams, and I understand the direction you’re pointing in, in terms of monetizing it; the path it takes could be problematic. What I’m seeing more and more of, as we’re going forward, are more and more ways of attempting to credentialize the material in ways that are articulated to the universities themselves.

16:57 That might be a more positive way of going about this: if a university gives the credential rather than the credential coming through Coursera, one of your fears may be less likely to be realized, because students would come back to Vanderbilt or to Zurich or wherever to get their credentials. That might be a little different. There are still problems to think through, etc.

Paul:  17:21 Do you think so? Coursera has all the data. They could start just delivering an alternate certificate. They could just make it linkable with LinkedIn or with other employers, monster.com, Amazon, whatever you want. Amazon web services, Amazon Mechanical Turk. Suddenly you have Turkers.

17:44 I don’t know if you know what Mechanical Turk is. It’s a platform originally built by Amazon where people can do very, very basic tasks that are difficult for computers but easy for humans. All kinds of classification. It’s often used as a source of training data for artificial intelligence models. You can link the two. You can link Mechanical Turk with some form of degree that has been certified somehow by real instructors.

18:21 There’s nothing really to prevent Coursera from building that. That’s in the system. That’s in the logic of the contracts that the universities have signed. I’m sure that Vanderbilt will not build something like that, but if Coursera thinks there is value there to construct it, they will.

John:  18:45 Every university, I’m sure, has legal counsel look over these contracts, but because MOOCs have been a relatively new phenomenon, there are implications, I’m sure, in the legal contracts that most of us haven’t even thought through yet. Just because we couldn’t.

Paul:  18:59 They might have. Let me tell you my experience. I complained about some aspects of the contracts to the Zurich local regulator. The result is that, for a while, and possibly still, the University of Zurich hasn’t been using Coursera because it’s not legally compliant with local laws. Another university in Switzerland, which I won’t name, has renegotiated their contract because they’ve realized there were a lot of problems there.

19:26 A lot of those contracts were rushed through extremely fast two or three years ago. I’m quite convinced that legal issues were not the foremost concern. A lot of people were concerned at the time with copyright. They really wanted to retain copyright over the courses and just grant a license for Coursera to use the content, but keep the freedom to use it on other platforms.

19:59 I guess universities have been burned with all the scientific publishing and understand those issues now, but here there’s a whole host of other issues because the content they’re offering is being interacted with by a lot of people. So they’re creating a lot of very useful data around that content that can be very, very interesting.

20:20 Here is another example. The translations are crowdsourced. The translations are derivative works that belong to Coursera, even though they’re made by lots of individual students who are grateful to the universities producing the content originally. The translations belong to Coursera, at least in all the contracts I’ve seen.

John:  20:44 I’m assuming that a lot of universities have negotiated differences in some of the contracts. What you’re saying is that the majority of the contracts at base allow the translations to be owned by Coursera. Is that correct?

Paul:  21:01 Yes. Yes. An analogy that I’m thinking of, and that I find very useful and I think it will resonate with your own background…I’m picturing we’re in 1996, or something like that, at the “New York Times.” There’s this Internet. We have a lot of very valuable content. We pay journalists to go all over the world. We pay professors to learn at the best universities. Then they produce content and that content is useful to lots of people, but we don’t know how to monetize it, so let’s just put it on the web for free.

21:38 That’s essentially what universities have been doing and what newspapers did 20 years ago. The result is that the newspapers contributed to building a whole ecosystem around online advertising: programmatic advertising that survives on a lot of tracking of individuals.

21:59 Now the best solution newspapers have is to ally with Facebook and distribute their content on Facebook to try to get a cut of the revenue, but they’re just being cannibalized completely by those centralized platforms. It’s absolutely terrible what happened over 20 years in journalism. That happened because there are those software mediators between the reader and the journalist. We’re building exactly the same thing. We’re building platform intermediaries between students and teachers or universities.

John:  22:38 That analogy, actually, I think everyone is going to find very compelling. It makes a lot of sense to trace it back that way and think about us going forward. I want to move on a little bit to talk about platforms. You use Coursera as an example quite often, and Coursera is the largest platform provider for most English-speaking universities, as I understand it. We also have other options.

23:02 I met you at an Open edX conference; Open edX is an open-source version of edX. edX would also be an option for some universities as a platform. What do you see as the differences between these? I want to put you in the role of the overseer for a university, a chancellor, etc. You’re faced with MOOCs, the question of MOOCs or putting content online.

23:26 How would you choose among those types of platforms, or would you choose not to go in that direction at all? What would you choose to do?

Paul:  23:36 Framing it as the choice of the provost or the chancellor is already affecting the decision very much, because then that person has to decide based on something that will work for everyone. Ease of use becomes extremely important. Ease of deployment becomes extremely important. So they will go for the easiest solution.

23:59 That’s where there’s the most support, and that’s exactly what Coursera tries to answer. I don’t think it should be a decision of the provost. If a professor wants to use their own platform or another platform, they should be encouraged just as much as the other professors. Also, if it’s thought of as a solution that a provost has to come up with on their own, it’s pretty restrictive as well. You know a lot of other professors in similar positions at other universities.

24:33 I’m sure there are alliances of universities, just as there are in Europe. Why not build communal solutions that we own, that we really own? Joint ventures between, I don’t know, Vanderbilt in Nashville and your favorite other university that you collaborate with on a lot of other issues. Then offer your professors the possibility to host courses there, if you want.

25:03 Do the advertising jointly and save money that way, all those things. That’s really possible, but I’m not entirely sure that many provosts think it’s possible.

John:  25:11 What I would do is take it away from…I understand what you’re…I like the way you’re answering the question. You’re talking about a structural position, but if you were making the choice, from your position, you’re saying there are other creative ways of doing this: using open source, or using other types of platforms, making coalitions that are outside of that other structure.

25:33 That’s the direction you would prefer as a faculty member, as a person who’s doing that.

Paul:  25:39 Yeah. It would be a lot richer, too. Because if you’re a provost, pick up your phone and call another provost. Try to see if you can build something together. Two, three, ten different universities together. It will be extremely interesting, because then you will really be able to freely collaborate, away from a lot of issues that have to do with privacy and all those things. You really are in control, and you’re able to do more things.

26:08 You can do a course in one place, and share the data with another. You can do joint courses. You’re free from a lot of the shackles that would come from a centralized platform. That would be my impression.

John:  26:23 Just to make sure we’re clear on this while we’re talking about it, you don’t have a problem with MOOCs or online education in general. You have a problem with the way in which platforms have used data, and with the contracts between professors, universities, and an entity like Coursera. Theoretically, the idea of online content doesn’t bother you?

Paul:  26:47 No, not at all. On the contrary, I got into this mess, if you want, because I was really curious about what was possible with MOOCs. I still feel the area is completely under-explored; we’re barely touching the possibilities. You probably know of this movement called citizen science, where you try to push science into the hands of citizens.

27:16 You can try to find people to collect data, to think through solutions and all those things, put different skills together by just having the largest coverage you can. MOOCs can definitely help there. If you can teach a MOOC and simultaneously do a citizen science research project, that works perfectly together. That’s one example of things that will happen down the road, I’m convinced.

27:44 It’s already sort of happening in a few places, but there needs to be more of it. This offers fantastic opportunities. Imagine, you train people to collect data around them. You train 1,000 people all over the world. Those people will start interviewing or collecting data, whatever, about their local neighborhoods. This contributes to a research project.

28:09 It raises a whole host of other issues, about quality of data etc., but it’s a fantastic opportunity, I would find.

John:  28:20 Something just occurred to me, and I don’t know if you have any comment on this or not. It’s not about MOOCs. It’s not really about education. It’s about centralization of information.

28:32 Pokemon Go, big craze. Everyone’s doing it. I’ve got all sorts of friends out every weekend the last couple of weekends. My child is out of the house, so I don’t have the impetus to do this myself, but then I read online a lot of concern about how much information you’re turning over to Pokemon Go when you participate. Do you have any thoughts on this?

Paul:  29:02 Yes. I have a lot of thoughts. My thoughts align most with what Oliver Stone said recently. He recently said publicly that this was an instance of surveillance capitalism, exactly the same term that appeared in my piece in Academe. I think it is. I think it is. It’s a commodification of one aspect of life by big corporations, by centralizing and collecting a lot of data. It’s how you play, how you interact with your local world around you.

29:37 It’s trying to map that out. It’s trying to map your interactions with that world. And more insidiously, it’s trying to alter that. Now you start to see stores that pay for lures so they can attract gamers around those lures. You can use this augmented reality to start to shape the reality around you. Obviously, potentially a good thing, but it’s also potentially a bad thing. Who controls this should be the real question.

John:  30:10 [laughs] That always should be the real question. It’s…I don’t want to say humorous. It really is interesting. We all grew up in a world where we were warned through things like “Brave New World,” etc. that the technologies were going to be…we weren’t going to pursue that…We would never put ourselves under surveillance, just for its own sake, but that we would be seduced by fun, pleasures etc.

30:39 In a certain sense that’s exactly what’s happening, but we don’t do anything about it because we want to be a part of a community doing the same sorts of fun things.

30:49 Even with all the warnings we’ve grown up with, we seem to embrace it without a whole lot of thought. We need people like you, of course, to remind us, but it is, it’s interesting to me. Everyone you talk to about this will say, “Yeah, I get that, but I’m not seeing any ill effects right now, and this is fun and my kids want to do it.”

Paul:  31:12 People warning about those things, there are lots of them, but what is missing is people coming with convincing evidence of dangers. There is a reason that they are not coming with that. The reason is that it’s very hard to collect that. It’s very hard to actually have proof of those things.

31:32 If you were at the platform and your goal was to uncover evidence of the danger associated to the platform, you could, but you’re not and you don’t have access to a lot of the data there. It becomes a lot more difficult.

31:49 Let me give you an example. Lots of people are talking about a filter bubble, the fact that the Facebook news feed you see is tailored for you. It’s showing you things that you want because that’s what keeps you on Facebook.

32:02 Well, lots of people are claiming that this is having an effect on political discourse in the US with all the consequences that everyone can see, but that’s just words. It’s really hard to prove. It’s really hard technically to get into thousands of people’s news feed and see how they interact with it. It’s very hard unless you’re Facebook.

32:25 If you’re a Facebook engineer working at Facebook, you could do it, but the thing is you’re optimizing for other metrics which is engagement of the users with the content and all those things. It’s not Facebook’s priority to make sure that it’s having a positive effect on society overall. It’s Facebook’s goal to make sure they have revenue that they can show to their shareholders and all those things.

John:  32:50 But more than that, the argument about the filter bubble is an argument that was made earlier; Cass Sunstein made it in terms of cascading news sources and how we cut ourselves off from others, etc. We do that ourselves through our choices, because we have multiple choices, and now Facebook makes it even more pronounced.

33:13 I agree with you, but even if we could show, even if a Facebook engineer would come forward and show us exactly how that takes place, the step between saying “Yes, that takes place” and “That has led to a problematic political culture” is still one step removed. You still have a hard time convincing people of that.

33:30 Some people would say, “That’s not what the problem is; it’s overdetermined. There are multiple other causes.” Making that argument…I guess what I’m doing is agreeing with you. It’s very difficult to make that argument, both because some of it’s hidden from us.

33:46 It’s not the purpose of Facebook to make that argument for us, but secondly, these are complex issues we’re dealing with. Ideology is always complex. Making the link from one step to another, people don’t want to buy it. They don’t want to think that about themselves. They don’t want to think that, “I’ve filtered myself so far off that I’d become part of this dynamic.”

Paul:  34:06 Sure. Ideology is very complex, but it happens in all kinds of circumstances. It can end up discriminatory, maybe involuntarily, because drivers don’t go to poor neighborhoods to pick up riders. They don’t go around there.

34:25 That’s not really ideology; it’s just a practical consequence of people’s biases. Same thing with Airbnb: is it harder to rent something if you’re of a different ethnic origin than the majority? It’s going to be very hard to prove if you don’t really own the data, if you don’t control the data and what’s done with it. That’s my take.

34:49 Even if you do, even if you have a convincing argument based on data or based on social audits or things like that, the central platform can always present a different picture and control it. For instance, that happened with the filter bubble.

35:08 Facebook did its own study on the filter bubble and came up with its own conclusions. All kinds of people attacked the conclusions, but they don’t have access to the raw data or anything to counter the argument. Now, there is this Facebook filter bubble paper that sort of says, “Well, the filter bubble is not as bad as individuals’ own biases.”

35:31 That might not be the most interesting question and anyway you can’t really retest it. You can’t reproduce the experiment. You can’t do all kinds of…

John:  35:39 Because we have no access to that data.

Paul:  35:41 Right, and to go back to education, Coursera makes similar claims. I think at a partner conference, Coursera made a claim to professors and provosts and all those things that the quality of translations obtained by crowdsourcing was higher than, or equivalent to, the quality obtained by professional translators, at a much lower cost, of course.

36:08 That claim ended up on Twitter, but once it’s on Twitter it’s…you take a screenshot and then it can re-circulate. It somehow landed in the whole translating community, where a lot of people actually volunteer their time on Coursera to translate courses because they agree with the general goal of spreading education.

36:28 They were pretty upset because they were spending their own leisure time translating for free, even though there were professional translators, just to see a result that would say, “The translations are just as good.” That’s a claim that’s completely…no academic [inaudible], as far as I know, asked for the data, but I asked remotely and it was never given.

John:  36:55 Because that data is…you don’t have access to it. It’s valuable data, correct?

Paul:  37:01 Right. It’s valuable or it’s at least…the fact that they own it, it means they can say whatever they want. I don’t know if they’ve done an actual study to see the quality of the translations, but they’ve claimed it was the same quality.

John:  37:15 I’m going to switch gears a little bit. There’s a question we’re asking every person, and this is going to take you outside of the digital realm. In terms of education, what is your favorite non-digital education technology?

Paul:  37:33 I’m a mathematician, so I have to say chalk.

John:  37:36 [laughs] That’s because you’re a mathematician, right?

Paul:  37:39 Right. Chalk and blackboard. It has a completely different feel from any other kind of tool you could use to make a presentation. It’s much more physical to actually write with chalk on a board. It feels very manual and very ancient. You know that thousands of years ago people were doing the same. It’s great to write on a chalkboard.

John:  38:06 I talked to mathematicians at Vanderbilt about this quite a bit a couple of years back, when some of their classrooms were being refurbished, and they insisted not on whiteboards and markers but on chalkboards. It’s going to be the only building on campus with chalkboards.

Paul:  38:24 Yeah. That’s the same at many, many universities.

John:  38:26 Yeah. Is that about the tie to a tradition? The difference between a whiteboard and a chalkboard gets lost on me. What do you think the…is it just the tactileness of it? What is that?

Paul:  38:41 There is some aspect of that. I mean, it’s like asking a smoker what’s the difference between a cigarette and a nicotine patch. There is something about fiddling with your fingers that way, but also it feels more practical. It just does. It has a tone, a ring to it. I mean, a sound is associated with the chalkboard.

John:  39:08 I like that. I like the analogy. The tobacco analogy is probably one that won’t resonate with everyone because there are a lot of non-tobacco users, but we’re talking about the delivery system for nicotine, and it’s very, very different for each person which one is the one they want to embrace.

39:24 Telling somebody who’s quit smoking that they could switch to dipping is not likely to work because that’s not where they’re taking their pleasure, so I like that.

39:32 You’re looking forward to your article coming out in Academe, which issue will that be coming out? When will that be making its debut?

Paul:  39:38 Next month, September or October.

John:  39:42 All right. I appreciate you joining me today.

Paul:  39:46 Sure. Thanks for inviting me.

39:47 background music

John:  39:47 Thanks.

Derek:  39:50 That was mathematician Paul Dehaye, interviewed by Vanderbilt’s Associate Provost for Digital Learning, John Sloop. I’ll second Dehaye’s recommendation of chalk. I’m part of the math department that John Sloop mentioned, and I’m happy that we’ve kept chalkboards in most of our classrooms.

40:04 It’s not just the tactile quality of chalk that I like. I appreciate the line quality it enables. Dry erase markers tend to make lines that are all the same thickness. With chalk, I can vary the line thickness for aesthetic effect, and as our guest Zoe LeBlanc mentioned in our last episode, aesthetics matter.

40:20 A quick word on MOOCs: we spent a fair amount of time discussing the past, present, and future of MOOCs in this podcast, and I’m of two minds about that. On the one hand, it was MOOCs that rocked the higher education landscape a few years ago, promising or threatening, depending on whom you ask, to change everything.

40:37 A lot of faculty, staff, and administrators are still making sense of MOOCs, and so I’m glad to talk about them here on the podcast. On the other hand, it seemed to me from the start that the disruptive potential of MOOCs was in most cases overrated.

40:49 They were interesting certainly and the potential to engage thousands of students at once in a shared learning experience presented all kinds of teaching and learning opportunities, but I was skeptical that they would revolutionize higher education as we knew it.

41:02 Now it’s been a few years, and higher education is pretty much where it was back in 2012 when MOOC mania hit. That mania did lead a lot of universities to pay more attention to their educational missions and to experiment with online learning and other educational technologies, and that’s definitely a good thing.

41:17 I’m not convinced that MOOCs will continue to change higher ed, but universities and faculty will continue to run MOOCs for particular purposes and audiences. I myself have two that I helped teach, both focused on preparing graduate students to teach well. I just don’t see MOOCs as the disruptive force some thought they would be back in 2012.

41:34 All that to say, I understand the concerns that Paul Dehaye raises about the dangers of outsourcing education to for‑profit companies, but I’m skeptical that institutions of higher education will do that, in part because some institutions got burned during MOOC mania and they are approaching vendor relationships with more intentionality these days.

41:52 That’s my takeaway from Dehaye’s warnings. There are dangers here certainly, but with awareness and some sensible choices, we can avoid or mitigate those dangers.

42:03 For links to some of Paul Dehaye’s writings, see the show notes. You can find those show notes on our website leadinglinespod.com. Please leave us a comment there or via Twitter where our handle is @leadinglinespod. I would really like to hear what you thought of the interview on this episode.

42:17 background music

Derek:  42:18 Look for new episodes of Leading Lines the first and third Monday of each month. I’m your host, Derek Bruff. Thanks for listening.

Transcription by CastingWords
