Improving Civic Discourse by Fostering Web Literacy: An Interview with Mike Caulfield
Mike Caulfield is the director of blended and networked learning at Washington State University, Vancouver (WSU). He regularly blogs about edtech topics at Hapgood. He is an expert on informal learning, online communities and open educational resources. Over the past decade, he has also become increasingly interested in civic literacy and digital citizenship, and has created many instructional resources that help teachers integrate digital citizenship skills into the classroom. If you are interested in navigating the post-truth society, I strongly recommend that you read his book ‘Web Literacy for Student Fact-Checkers’.
At WSU, Caulfield runs the Digital Polarization Initiative (Digipo), a cross-institutional initiative to improve civic discourse by developing web literacy skills in college undergraduates. Caulfield also maintains the ‘Four Moves’ blog, which comprises prompts for lessons on fact-checking. Instructors can use the material to teach basic web techniques for verification and source assessment.
What are the facets of digital citizenship, and why is it important to teach it?
I’d point out three – collaboration, participation, and critical consumption. We need to know how to work together effectively to solve problems and improve our communities. On an individual level, we need to know how to make our voices heard effectively and ethically, as well as how to foster environments that lead to greater dialogue and awareness of concerns outside our own experience. How to advance our aims and amplify the voices of people who may not be heard. And finally, critical consumption: we need to make good choices about what we give our attention to, about where and how we seek out expertise and perspective, about how we process what we read and view.
I say “critical consumption”, which is a term I’ve been using since maybe 2009. I think I first heard it from Howard Rheingold. But in the intervening years I have to say I’ve kind of come to hate the term, because it has behind it this “producer/consumer” dichotomy that I’ve come to see as dangerous. There’s this idea you encounter in the wild a lot that the goal is to move students from being “consumers” to “producers”. I used to espouse this myself. This is a very capitalist notion I think, the idea that production is the primary way people actualize. We used to call consumers “readers” of course. Or film or music lovers, or thinkers. If you say “Let’s move people from being consumers to producers!” I’m not sure why that’s a good goal. I think you can show some benefits of that in terms of pedagogy, but I think you also risk reinforcing some of the worst elements of academe. If I could have a world of better writers or a world of more discerning readers and thinkers, I’d take the latter in a heartbeat.
How has civic literacy changed with the rise of social media?
It depends what you mean by social media. But as more and more of our civic life moves online, it becomes more important that students have the skills to effectively collaborate, engage, and inform themselves online.
What that means can vary. But the main shift I see is what Zeynep Tufekci has been talking about. Online civic literacy used to be more concerned with access. Years ago I used to focus on giving my students the skills to publish. In my first digital literacy project, in 1997, the students wrote raw HTML. The fact they could put something on the web was stunning.
Now everyone has access, anyone can publish, and it is people’s attention that is the scarce resource. A lot of what you see now reflects that problem. Ubiquitous access to publishing technology has pushed the locus of power further downstream, in this very distributed way.
So all these things shift somewhat. Participation has to grapple much less with access and much more with how one invests time productively, ethically, fairly. I haven’t taught participation for a while, but if I taught it now I’d focus a lot more on messaging strategy than the techniques of production. Actually, I do teach participation a bit now in regards to Wikipedia editing, and this is the focus: what does this community value, and how do we show adherence to those norms while advancing personal and social goals? We did this stuff before, of course, but it’s become much more of the focus.
And critical consumption – well, again, we used to spend a lot of time teaching students how to deeply read documents. But in a world where attention is scarce most of your most profound decisions are not what you think about what you read, but what you give your attention to in the first place. In my opinion, education is woefully unprepared to teach students how to do this.
What triggers your personal misinformation spider-sense? When do you trust a source, and when do you decide to investigate?
I investigate everything I share and anything that I have a strong reaction to, even if I don’t share it. There are some exceptions, of course. If I truly know the source, then fine. But a premise of my work is that since it’s our attention that is being hacked, we are far better off doing twenty 30-second checks in a day than two five-minute checks. The point is to get these checks down to such a simple level that you don’t even decide to do them consciously. It becomes like checking your mirrors before changing lanes or using a turn signal.
There’s an interesting question in there as well, when you think “when to trust” something. Gerd Gigerenzer has some insights on how heuristics work, and one thing I’ve been thinking about is his idea of a “stopping rule”, which is something you use in search and sorting strategies. Broadly, when looking for something, how do I know I’m done, that what I’ve found is good enough? That’s really about search, but it’s part of a larger insight about investigation too. The prudent investigator knows when to stop, when the level of certainty is appropriate for the task at hand.
It’s interesting, because we often reward our students for the opposite, and give them the impression that the good student is the one that digs deepest. But teaching students that in a world where attention is scarce is unethical. We need to get students to think about that attention/certainty ratio, particularly since pushing a need for “certainty” is a well-known technique of disinformation actors.
Please tell us more about the Digital Polarization Initiative: Which campuses are currently involved in the project? How can students benefit? And how can educators become involved?
Well, it started out as a grassroots movement, around the textbook, and in its current incarnation it’s a twelve-campus pilot. If you are an American Association of State Colleges and Universities (AASCU) campus, you can join formally. But most people just use the materials we’ve been putting out, which I list here: https://hapgood.us/2018/09/04/a-roll-up-of-digipo-resources-4-september-2018/
What prompted your interest in teaching web literacy to students?
Well, as I mentioned, my first web literacy project was in 1997, having English Composition students write bios of underrepresented people for the web, something called the Persona Project. At that time, a lot of the thrill was the authentic nature of the learning.
Since about 2010, though, my main interest has been the civic ramifications of digital illiteracy in a time when so much civic life is mediated by the web. So that’s been a shift, I think. Authentic experience is still important, but the civic question is even more pressing.
What sets your lesson plans and instructional approach to web literacy apart from other digital literacy resources?
We focus on the filtering of information more than its evaluation. As such we replace more traditional analysis with quick web-native techniques.
I’ll give you an example. So we find a post that says Morgan Freeman has died. (Morgan Freeman dies once a year on the social web). Now we can have the students look at the page. Does it look professional? Well written? They can consider the claim. Was he old? Does it make sense he died in France? We can look at the author. Are they trustworthy?
Or we can open up another tab, search Google News, and see if other people are reporting it. And we use a heuristic there: breaking stories, when true, will be reported by multiple outlets. We get this answer in ten seconds, and with a minimum of cognitive effort.
Not everything resolves that quickly, but even when it doesn’t the process – which is built around utilizing the expertise of others more than “critical thinking” – gets you in a good position for what follows. So as an example we have one prompt that says a professor, dressed like a Black Bloc protestor, was charged with brutally attacking Trump supporters with a bike lock at a rally. And this turns out to be true. But in the process of checking for other coverage, you also get more of the story and a range of better sources to pull from that give a better perspective of what happened.
The one thing we *don’t* do is encourage students to read and evaluate anything without applying the quick techniques first. Those techniques are the way you find your footing for deeper analysis if necessary.
What is your opinion: Do we hold students to a higher standard than ourselves? Don’t we all repeat, repost or quote things without thoroughly checking them?
Well, yes. But I’d take issue with “thoroughly” there. The reason we don’t check things is because we’re so hung up on this idea of thorough checking. “Thoroughly” can do a lot of harm to students. I want “good enough” checking.
It’s like brushing your teeth. It’s better to brush your teeth every day than to brush them thoroughly, but only on the weekend. And yet we show students the equivalent of 30-minute brushing routines and wonder why they don’t do it. Show them how to do it in 60 seconds, and maybe they’ll bother.
To what extent are students confronted with misinformation and why do they struggle to sift through sources and claims?
The biggest problem is that many students have a high distrust of things they see, and believe the truth just isn’t knowable. People of all age groups consistently overestimate what percentage of news in their feeds is fake, by huge amounts. At the same time people fall for a lot of really basic stuff, partially because misinformation is environmental. Just seeing it out of the corner of your eye warps your perspective, creates deep feelings over time that you really can’t source.
I call this whole phenomenon Gullible Cynicism, and if you read your Arendt you know that it’s some of the most fertile soil out there for totalitarianism. So my goal in my work is not that students question more, necessarily, but that they question more efficiently, and feel less overwhelmed by the firehose.
Ultimately, we’re taking away the exhaustion of this sorting process, and helping them to resolve easy questions quickly, building some confidence that many facts are discernible if you let go of your ego and tap into relevant expertise.
Reading your work makes me wonder what percentage of the stream from your or my social media network is filled with misinformation. What’s your estimate?
It’s highly variable, and there are so many ways to measure it that it would be irresponsible to say. But the number is likely lower than you think.
In a way, the amount is the wrong end of the question. I’m sure that Facebook would tell you that only the smallest percentage of posts in Myanmar were anti-Rohingya conspiracy theories or hate speech, but those posts helped shape the narrative that fueled violence and support for ethnic cleansing. My take is that we see a lot less misinformation than we think we do, but are far more affected by it than we care to admit.
Do you have any data on how effective fact-checking training is? How can the critical analysis of sources become an everyday information habit?
We haven’t been able to follow students outside of the classroom and observe. And, in any case, I think the intervention has to be persistent across many classes to stick. But I know it can become a habit for some people, and for some people it already is.
Incidentally, I don’t think we need everyone to have this habit. It’s like vaccination – there’s a certain level of vaccination that gives you herd immunity to epidemics. We want enough people to have these skills that we have a herd immunity to nonsense. And that might happen at a much, much lower figure than 100%.
What is your vision for DigiPo? What are your plans for the initiative over the next couple of years?
Get a longer funding runway so I can staff up and get some sleep. And maybe take a vacation.
On larger goals, we say that our end goal is to change the way online information literacy is taught at the college level. So I would like these techniques to be as ubiquitous as CRAAP and RADCAB were in the mid-00s. But sleep sounds like a good goal too.
My last question brings me to a point that I have been trying to wrap my head around: How can we move beyond the echo chamber? What is the best way to point out misinformation in a way that allows the person who posted or shared it to take action without creating (further) alienation or distrust?
I think we overestimate the echo-chamber, actually. But the way I think about tribalism in general is it’s a way to filter information. It’s a way of dealing with information overload. To attack it, you need better and relatively effort-free alternatives to quickly sort information. And so we find in our classes that many students become less tribal in how they evaluate, because they have new filters. They go from “I would trust this video because Democrats did put coal workers out of work” to “This video has some good statistics, but on investigation turns out to be put out by the coal industry, I’d rather see a better source.”
As far as correcting people, think of the audience for the correction as being *other* people who look at the post and might be a bit more persuadable. Acknowledge that the feelings the person may be expressing with the post are valid, but that the particular thing shared is not a good selection. Let people know that it’s OK to express their feelings with these things, and point them to sources that help them do that while maintaining the integrity of the information environment.
I come from a family that engages in pretty heated argument. And that’s OK. But obviously if someone starts getting in another person’s face, bystanders will tell that person to chill. You get to argue your point, but you’ve got to follow the rules. So show them how to argue their point without destroying the discourse environment as we know it.
Mike Caulfield is the director of blended and networked learning at Washington State University, Vancouver (WSU). As director he helps professors master new modes of teaching while co-directing a team of academic services professionals. Before that he was employed by Keene State College as an instructional designer, and by MIT as director of community outreach for the OpenCourseWare Consortium. You can follow his work on Twitter @holden and through his blog Hapgood.