
In his 1939 sermon Learning in War-Time, C. S. Lewis considered whether education should continue amid high-stakes global conflict. Is learning something that should be suspended during a war, saved only for times of peace and predictability? Or does the pursuit of knowledge, learning, thinking, and prudential judgment become more important during moments of upheaval?

Predictably and persuasively, Lewis argued for the latter. There is a link between Lewis’ concern for education in wartime and our present concern for education in an age of artificial intelligence.

At the time of this writing, there is an artificial intelligence arms race among the world’s largest technology companies. Even in the shadow of our most optimistic AI forecasts, we stand at an AI inflection point, one that raises a host of questions that extend far beyond labor relevance. And, if learning in a time of AI is just as important as learning in wartime, how shall we learn?

Leisure and Learning

In his essay Economic Possibilities for our Grandchildren, the famed economist John Maynard Keynes ambitiously imagined a world 100 years into the future. The essay, which preceded Lewis’ sermon by a decade, rightly predicted that technology would bolster productivity and output per person, but wrongly predicted a 15-hour average workweek by 2028.

Keynes’ 100-year forecasts aside, one of the more interesting questions raised in his essay did not concern technology, labor hours, or mass unemployment. Keynes offered a different question: “How will we occupy the leisure?”

The Greek word for leisure is scholē, the root of the Latin schola and, ultimately, our word “school.” It means something like “free time.” Classically, leisure was an opportunity for thinking, discussing, and learning. School.

With this etymology in mind, and with the conveniences afforded to us by increasingly powerful artificial intelligence, how will we leisure? How will we school? How will we learn?

The range of technological prognostication is wide, offering forecasts that range from Gilded Age optimism to scenarios of apocalyptic doom. And this range is evident in education, including Christian colleges and universities, where some argue for the unqualified embrace of AI and others for its outright rejection.

In this spirit, I want to offer five campus virtues that will best serve us in an AI era. I offer these virtues in the spirit of addressing the question: In a time of AI, how shall we learn?

Five Virtues in an AI Era

Pause. Pause is not the same as “slow.”  Nor is pausing equivalent to accepting mediocrity. Pausing means asking what the adoption of a given technology may do to us individually and communally. What are the spillover effects or unintended consequences?

One of the reasons we pause is that there is an above-and-beyond effect when we pair things with technology. Neil Postman made this point decades ago with television. When you combine television and politics, you don’t get television with politics, you get something new altogether.1 You get a different kind of politics. Similarly, when you combine food consumption with DoorDash, dating and relationships with Tinder, political elections with X, teenage bullying with Instagram, and education with generative AI, you get something new altogether.

Pausing, then, is a virtue: it allows us to think about what we might be creating as we introduce advanced technology into existing domains of our lives. In her thought-provoking article, We Should Be More ‘Amish’ About Technology, author Tish Harrison Warren highlights the importance of this virtue in a world of dynamic technological growth. The Amish, she writes, are often more discerning about how a given technology will help or harm their community.

Pain. In 2017, Richard Thaler won a Nobel Prize for his pioneering work in behavioral economics, which can be summarized as follows: “If you want someone to do something, make it easy.” Put another way, we can encourage and reinforce desirable behaviors by removing impediments—removing friction.

This is intuitive. Some things should not be hard. However, while minimizing if not eliminating friction in some areas of our lives is good—in other areas, friction is fundamental to our growth, resilience, meaning, and even delight.

For example, we need friction for intellectual growth and competence. Economist Kyla Scanlon has described two worlds being created right now: the friction-less digital world and the friction-full physical world. The former is about minimizing, if not eliminating, any kind of pain, friction, stress, tension, or struggle.

As an example, she references Cluely, a tech startup that encourages you to outsource your thinking and “cheat on everything.” But outsourcing thought atrophies cognitive muscles otherwise meant to grow under sustained intellectual tension. Since thinking, and thinking well, is fundamental to our humanity, outsourcing this capacity literally dehumanizes us.

And importantly, we will not experience spiritual maturation without friction. In Romans 5, Paul writes, “[B]ut we also boast in our sufferings, knowing that suffering produces endurance, and endurance produces character, and character produces hope, and hope does not disappoint us” (vv. 3-5). In James 1, believers are encouraged to consider “trials of many kinds” pure joy because “the testing of your faith produces perseverance” (vv. 2-4). Similarly, Amy Carmichael’s famous poem reminds us, “[C]an he have followed far / Who has no wound nor scar?”

In the context of a Christian university, I want to remove friction associated with applications, transcript evaluations, navigating our Learning Management System, onboarding new employees, paying student bills, etc. In Thaler’s language, I want to make these and many other tasks “easy.”

But for missional institutions of higher education, we cannot remove friction, or what Stanford Professor Rick Reis calls “productive discomfort,” from a student’s intellectual, social, moral, physical, and spiritual experience, since those domains are fundamental to their flourishing.

Purpose. Medical Ethicist Lydia Dugdale helpfully distinguishes between patient care that is holistic and works symphonically across a broad range of patient goals, and medical practice that is icy, algorithmic, and procedural. In other words, at its best, medical practice should be informed by a patient’s larger questions of meaning and purpose.

Similarly, to use technological tools appropriately, we must first settle questions of anthropology and teleology: what human beings are and what they are for.

I remember watching a middle school student tackle a math packet they had to complete outside of the classroom. Using an AI app on their phone, they quickly burned through the packet. For this student, the goal was simply to produce a solution to each problem. In contrast, a purposeful, holistic approach asks questions about the nature of the assignment, what the problems represent, how those mathematical tools can be applied to real-world scenarios, and what new knowledge might be gained. This approach does not seek merely to “get the right answer”; it seeks to learn, apply, and understand; it seeks meaning.

Prudential Judgment. AI grants us new capacities, but as those capacities grow, so does the importance of moral discernment and ethical application.

The Evangelist D. L. Moody once said, “If a man is stealing nuts and bolts from a railway track, and, in order to change him, you send him to college, at the end of his education, he will steal the whole railway track.”  Similarly, ethicist Martha Nussbaum reminds us that a good doctor is also a good poisoner.

So, our concern is not simply the skillful efficiency of AI, but more importantly, its moral application.

Technological superpowers at our disposal are not enough. Paired with malformed intentions, they represent an enhanced capacity for harm and destruction.

Proximity. Any Christian school worthy of the name must continually practice embodied community. This means being proximately present to our students and to each other. That is what makes us, us.

Our work is fulfilled through the currency of relationship, trust, and goodwill. When something is important, hard, or worthy of celebration, we show up. We will not outsource our care for one another and for our students to AI. My own university, for example, is unlikely to replace trained personnel with AI chatbots for counseling or spiritual guidance.

Where AI bolsters opportunities to be proximate and present, this is a welcome gain for student-centric Christian schools. More specifically, if AI can improve university efficiencies, minimize busywork, accelerate productivity, and enhance the end-user experience, the additional margin it creates for value-added presence is welcome. However, as an embodied, story-formed community, we should be wary of any application of AI that minimizes, discourages, or abrades proximity.

Virtues of Learning

In his prescient 1995 book The End of Education, Neil Postman outlines four “great experiments” in American education. The fourth, he writes, raises the question: “Is it possible to preserve the best of American traditions and social institutions while allowing uncontrolled technological development?”

The American experiment of democratic self-governance itself presupposes and depends upon a virtuous citizenry marked by responsibility, discipline, and moral fortitude. Democracy does not produce virtue so much as it requires it.

Similarly, education (schola, school) in an AI era depends upon an environment characterized by the virtues of continual assessment (Pause), productive discomfort (Pain), questions of meaning (Purpose), practices of moral excellence (Prudential Judgment), and people-centric embodied community (Proximity). This is how we “leisure”; this is how we learn in this moment.

Christian education is a multi-dimensional web of practices and outcomes. It has a diversity of aims organized around the idea that learning is fundamental to a good life and that our Christian faith gives shape and meaning to that learning process.

We are story-formed Christ followers. And we are educators. We are committed to learning. Learning in peacetime. Learning in wartime. Learning in AI time.

Footnotes

  1. This is an idea from his book Amusing Ourselves to Death (originally published in 1985; this link is to the 2005 reprint: https://www.amazon.com/Amusing-Ourselves-Death-Discourse-Business/dp/014303653X)

Kevin Brown

Asbury University
Kevin Brown is the 18th President of Asbury University.

One Comment

  • Joseph 'Rocky' Wallace says:

    Dr. Brown, thank you for your discernment and wisdom shared here. It is indeed a slippery slope–capable of redefining “normal life” like nothing we’ve seen in our lifetime, and in a matter of months from what is being forecast by some…And the university will need to be out in front, not lagging behind, on this grand adventure (or misadventure).
