On Wednesday, May 22, Word for Word, Southern New Hampshire University’s (SNHU) online literary series, was honored to welcome Dr. Paul J. LeBlanc as our special guest. LeBlanc served as president of SNHU and is the author of two nonfiction books: “Students First: Equity, Access, and Opportunity in Higher Education” and, most recently, “Broken: How Our Social Systems Are Failing Us.”
Over more than two decades of LeBlanc’s leadership, SNHU grew from 2,500 students to more than 225,000 learners and is now the largest nonprofit provider of online higher education in the country.
LeBlanc immigrated to the United States as a child, was the first person in his extended family to attend college and is a graduate of Framingham State University (BA), Boston College (MA) and the University of Massachusetts (PhD). From 1993 to 1996, he directed a technology start-up for Houghton Mifflin Publishing Company, was president of Marlboro College (VT) from 1996 to 2003 and became President of SNHU in 2003.
LeBlanc was joined by Associate Deans Jacob Powers and Paul Witcover, and by Senior Associate Dean Derrick Craigie.
The following is Part II of an edited transcript of LeBlanc’s interview. To learn more about LeBlanc, take a look at Part I.
W4W: How do you envision AI and other emerging technologies shaping the future of liberal arts education and writing at SNHU and beyond?
PL: I’ve talked about this at lots of conferences. I think education is going to be a lot less about epistemology, about knowing: what do you know, how do you know it, what counts as knowing, and more about ontology, about being.
There was an amazing article in Science magazine about protein folding. Teams of doctorate-qualified medical researchers will spend years, a decade or longer, trying to solve one of these structures, which are the basic building blocks of life. DeepMind did them in a year, all of them. It’s an amazing story, because it unlocks unbelievable possibilities in the life sciences. One of the skeptical doctors said, “I didn’t think in my lifetime I would see this.” And in one year, DeepMind did all of it. So we have these amazing knowledge tools available to us.
But here’s the thing. If I’m right about moving from epistemology to ontology, which are really questions about being and how are you in the world, what does it mean to be a good human, how do you understand and navigate a world of others, et cetera, that’s the realm of the humanities. That’s what literature does. That’s what stories do.
When I was a kid reading in those libraries, I was traveling the world without ever leaving that room. Being a reader, being a literature major, forces you to live in the skin of other people. When I read “Anna Karenina,” I know what it means to be a woman in 19th century Russia. If you’re not a reader, you only lead one life, the one you’re in. But if you’re a reader, you sample many lives.
My hope is that a lot of the knowledge jobs that have a lot of status right now and get paid a lot are going to go away. I think that’s probably a good thing. As the famous computer scientist Stuart Russell at Berkeley says, if you had told our ancestors 300, 400 or 500 years ago, imagine this amazing future where you spend all day in a box. It’s called an office building, but it’s basically a big box. And when you go into the big box, you get a little glass box called a cubicle. And then in that little glass box, you spend the day staring at a little lit-up box called a computer screen. They would have run the other way. But that’s what a lot of the knowledge economy looks like.
What if instead we could move over to the existential questions we identify with the humanities? I’ll give you the example from Chris Dede at Harvard, who I mentioned before. If you imagine now that nurses have all this power and this ability, and the work of medicine is no longer about getting the right diagnosis and the right treatment, what happens is your caregiver does something much more important.
God forbid, Derrick, but I’ll use you as an example: “Derrick, I know this is a devastating diagnosis we’ve just come upon. Talk to me about the conversation you’re going to have tonight when you get home with your family. Talk to me about your community. Talk to me about your support system. How do you think about quality of life? We’re going to have some pretty tough decisions in front of us. How do you want to make those decisions, Derrick? Who’s going to be in your circle? How can I help you?”
Those are really human questions. AI can’t do that. AI is not going to give you that. So I argue in “Broken” that my hope is, as knowledge jobs get obliterated, we’re going to be able to flood our K-12 system with amazing teachers, social workers, coaches and support staff. We are going to be able to rebuild a mental health system that is completely decimated in America and flood it with great clinicians, social workers and psychologists. We could build a compassionate, caring, affordable system of geriatric care for an aging society. Anybody on this call right now, on this podcast, who has an aging parent and is struggling to navigate that system knows how hard it is.
We could have national daycare so women especially are not penalized for having children and families are not penalized for having children. All of that could be done. None of those jobs get done by AI. The problem in a knowledge economy is that those jobs aren’t valued. They’re not paid well. We don’t like to pay for them.
But in a world where knowledge work is now being done by much more capable AI systems, maybe that is an invitation to get back to jobs that actually make us feel better about ourselves, that actually improve the lives of others, that actually improve our communities.
I’m a big fan of an economist named Carlota Perez, who wrote a book called “Technological Revolutions and Financial Capital.” I know that does not sound like the typical book you would read in a creative writing program. But she’s really good at describing how, when a shift like this happens, you go into a very messy period where everything gets re-ordered. The things that used to be winning are losing. The losers become winners. Things you didn’t think were possible become possible. Whole new worlds spring up.
So if you think about the Industrial Revolution and technology, steam, electricity, that bundle of things, think about what happens. The landed aristocracy gives way to capitalists. Capitalism gets created, for good or bad. Cities explode in size. All of a sudden, all this new infrastructure gets built, because you have supply chains and global trade, because you have to feed those factories. The whole world changed. And the middle class was invented. Now you have a merchant class that just, again, explodes in size.
We’re in for one of those periods. And I don’t want people on this call to come away feeling like, “He’s giving me a headache.” But yeah, it’s going to be messy. What Perez would say to you is that when you’re in these periods, governments fall, wars happen, revolutions occur, because the re-invention is hard.
If you think about my example of doctors and nurses, do you think the American Medical Association is going to go down without a fight? We’re going to see this huge backlash against AI. The writers’ strike in Hollywood was in part a reaction against AI. Jeffrey Katzenberg, who’s the co-founder of DreamWorks, when he saw Sora, which is the text-to-video technology from OpenAI, his comment was something like, “Ninety percent of the world’s animators just got a pink slip. They don’t know it yet.”
Now, that’s terrible if you’re one of those folks. But you’ve just put that technology in the hands of millions of people and democratized the ability to do film.
I don’t want to be Pollyannaish about it. I can lose sleep just as easily as you about all the ways this could go wrong. Yes, as someone said, I’ve seen “Terminator.” That’s the nightmare scenario. But I do think that a world in which we can get pulled back into human work and we ask the question, “How do we be better humans?” is not such a bad world. It’s probably not a bad question to ask right now, because we ain’t doing that well.
If you’re worried about AI screwing up systems of society, which one’s working well? Just throw it out there. Which system is working well? I’ve argued that higher ed’s largely broken. I like to think that SNHU is trying to reinvent it and make it better. I think that health care is broken. I think government is broken. Maybe human intelligence isn’t all that great after all.
But AI will not replicate the human heart. And it won’t replicate judgment and wisdom the way Chris Dede talks about. And I think it’s a much higher-order skill to have that conversation—”Derrick, tell me about how you are going to get through this”—than it is to figure out what disease you have. That’s just knowledge work. I want human work.
And I think we’d all feel better about ourselves. Young people today are chronically lonely. Seventy-two percent of them say they’re extremely lonely, defined as, “I don’t have anyone I can turn to.” We’ve never seen that before. The highest has always been the elderly at about forty percent, and that has stayed constant.
Twenty-five percent of teenage girls in America have said they have planned their suicide in the last 12 months, and thirteen percent of those have tried it. Our young people are suffering.
I was on a panel today with somebody from MIT having this discussion. We were both in the camp of, like, there’s a version of the world where AI allows us not to work. Which is a weird thing, because I argue that humans need purpose. But what if you could work on the thing that you love? If you could write? Or paint? Or do poetry? What if you could create amazing gardens, or create fashion and music? That feels compelling. Those are all associated with the humanities and creative arts.
I actually have argued in various settings that AI will revitalize the humanities, but not in their current form. I think the humanities and the liberal arts are pretty sclerotic in the way that they are structured, and they need to be reinvented for the time in which we find ourselves.
W4W: Let me segue from that. I think there’s one force, or movement, or direction that you are articulating very well in relation to the possibilities that are opened with AI. But then, as you also pointed out, if you look at the systems around us, look at the world around us, there are forces that are moving in the opposite direction.
I want to put that in the context of, say, the George Floyd murder and Black Lives Matter, when afterwards there seemed to be a genuine effort to examine institutional racism and White supremacy, and to incorporate principles of DEI into a lot of institutions, such as this one, I’m proud to say.
But now we’re seeing pushback against that, where some states have outlawed discussion of history. Racism can’t be mentioned. Libraries are being attacked and shut down. I’ve been reading a book recently by a scholar named Joy Buolamwini, and she points out the ways—I’m sure you know all about this—that technology itself encodes systemic biases. So as we bring AI on board, how are we able to, number one, work against those regressive forces in society that seem to be gaining the upper hand, and, number two, keep from slipping back when AI itself is incorporating some of that same power structure just because it’s more or less inherited?
PL: I think social media took a terrible toll on our society, especially on our young people. And we’re paying a very high price for it. A lot of the mental health stuff I cited is tied to social media. But we had no smart policy discussion about it when it was created. We just thought, this is cool. We’re going to connect everyone. And everyone’s going to talk to more people so we won’t be in our little echo chambers. And it went absolutely the opposite way. We’re more connected, never more lonely. We’re in these echo chambers. And of course, all that technology got used to drive disinformation and a lot of the political strife we see in our country right now.
I think it’s exacerbated by a failure of K-12 education, underfunded, undersupported. So we don’t do critical thinking in any real way. We don’t do literacy training for kids. We don’t do civics education very much anymore, et cetera. We can unpack this at length.
AI gets trained, so there’s a lot of discussion about how we train it and about algorithmic and culturally hegemonic bias. Part of my team has worked on a global data consortium for higher ed, and one of the things we’ve done is work to bring in university partners from other cultures, including the global South.
I was in Rwanda meeting with the minister of education and the prime minister about how do we get Rwanda into this mix. Because if we train only on White, North American and European models, we are going to get, guess what? A very deep reflection of White, North American and European culture. Remember, ChatGPT was trained on Reddit and was trained on some White nationalist texts. I mean, it’s all there, right? So ChatGPT is essentially repeating humans and all their messiness and flaws.
But in the end, I believe that a lot of the folks who feel most disenchanted … have been made to feel like they don’t matter in the world any longer … We’ve got to figure out how to change the conversation and be engaged in conversation again with people who disagree with us.
W4W: Feeling like you matter. That’s something you bring up in “Broken.” I think it’s such an important quality of the liberal arts education, imbuing people with the ability to appreciate that other people matter. Empathy, I guess, might be another word for it.
PL: Mattering is a concept defined by Greg Elliott, a sociologist at Brown, as three things. It means I have a sense that you see me. That you recognize me. I’m seen. The second is that you invest in me. You’ve given me your time or your attention. The third is the sense that my presence helps shape the world we are in together. For example, in the case of an employee, it’s like, “Hey, it’s really nice that you feel like you know me, and it’s great that you’re sending me to conferences, but if my input never reshapes the organization, if I think there are things we could do better and I never see that heard and listened to, I don’t really feel like I matter.”
So it’s actually more complicated than just belonging. We all belong to things that don’t make us feel like we matter to them. I write a check to various associations. I like their journal. It’s important for me to go to their conference. But I don’t think I matter to them. They don’t make me feel like I matter, per se. Mattering is a whole other level of human connection.
You know whose mental health did not deteriorate one bit during the pandemic? The Amish. Why? They’re connected, faith-based, with a strong sense of community, all these things. And what we know right now from a mental health perspective is that “we” cultures are doing much better than “me” cultures. And America is a deeply seated “me” culture. Rugged individualism.
We weren’t always that way. There was a time when a town would take pride in its public school, and everyone would send their kids there.
And there’s just a whole bunch of things that were communal, where we all could come together to say, “Let’s rebuild Europe after World War II. Let’s build the interstate highway system.” When John Kennedy gets up and says, “We’re going to send a man to the moon within 10 years,” the nation rallied around that, right? And the whole nation stayed up late to watch Neil Armstrong step on the moon. When was the last time America did something, built something truly great that we all rallied behind?
We have kind of lost that swagger. And I think a lot of it was tied to this question of our common good. And I think the common good has been lost.
W4W: How do you get it back?
PL: The power of narrative. The power of story. I think Barack Obama was our last really great political storyteller. And he could tell the story of America, right? There’s no Red America. There’s no Blue America. There’s one America. He could tell the story of hope.
Now, as is said of politics, poetry meets prose when you start to govern. You run on poetry, you govern in prose. Sorry for all you prose writers. But the point here is that the big story, the story that inspires people, that makes your heart full, is so powerful. And I think we’re waiting for leaders who once again can tell stories of vision, of our future.
I think what’s happening right now is people are scared. And when people are scared, it doesn’t bring out the best in human beings.
W4W: Thank you so much for this, Paul. It’s been delightful. It’s been a privilege working with you over the years, and we look forward to seeing what you come up with next.
PL: I feel like I’m in these meetings right now, and they are in part a series of goodbyes for me as I wind down. And I always want to say thank you. It’s been such an honor to be your president. We did our last Commencement a couple of weeks ago. And it was so emotionally moving. Our students are the best. In my profession, graduation is my Christmas, my birthday and my New Year’s all rolled up into one. So thank you so much for having me. It’s been a delight. I’m going to stop now because I’ll get teary. It’s been a lovely 21 years at an amazing place with amazing students and amazing faculty and staff. So thank you.