
“Someone designed the furnaces of the Nazi death camps.”1 With this sentence, Roger Forsgren opens his article “The Architecture of Evil.” Although it was Hitler and his henchmen who unleashed death and destruction during the Second World War, it required railways, factories, warehouses, and machinery to enable the war effort. The article goes on to describe the life and work of Albert Speer, Adolf Hitler’s “chief architect,” underscoring that Hitler did not work alone. The truth is that engineers designed the technology that enabled the Nazi brutality. Speer later wrote that “my obsessional fixation on production and output statistics blurred all considerations and feelings of humanity.” He continues, “An American historian has said of me that I loved machines more than people. He is not wrong.”2

While Speer’s situation may appear to be a special case, the truth is that all engineering work involves moral choices and responsibility. Even programmers writing logical, mathematical code need to recognize that their creations are not neutral. Cathy O’Neil makes this case powerfully in her book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. O’Neil worked as a math professor until 2007, when a lucrative opportunity arose to apply her PhD training in mathematics at a hedge fund. Shortly thereafter, the financial crisis occurred and O’Neil found herself pondering the implications of her role as a “quantitative analyst” (or what is often referred to as a “quant”) in the finance industry. Reflecting on this, she later wrote: “The housing crisis, the collapse of major financial institutions, the rise of unemployment – all that had been aided and abetted by mathematicians wielding magic formulas.”3 Her disillusionment led her to participate in the “Occupy Wall Street” movement and eventually to write her book.

The fact is that technology is not neutral. Even computer algorithms, which may initially appear cold and neutral, reflect the values and assumptions of the people and organizations that construct them. “Big data” sifts through vast oceans of data to find patterns that are then used to inform decisions in areas as diverse as finance, banking, hiring, marketing, policing, education, and politics. While mathematical models make decision making more efficient, they can hurt the poor, target predatory ads, perpetuate racism, and discriminate against minorities, all while serving to make the rich richer.

For example, predictive policing algorithms trained on arrest data will focus policing efforts on already over-policed communities. Automated job recruiting systems that use data taken from current employee profiles can perpetuate hiring biases and further reduce diversity in the workplace. Using algorithms to predict recidivism and inform bail and parole decisions can also amplify existing bias. In the words of O’Neil, mathematical models are “opinions embedded in mathematics.”4
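The predictive policing example can be made concrete with a toy simulation (my own illustration, not drawn from O’Neil’s book or any real system). Suppose two neighborhoods have identical underlying crime rates, but the historical arrest record is skewed toward one of them, and patrols are always allocated in proportion to past arrests:

```python
# Toy sketch of a predictive-policing feedback loop (hypothetical numbers).
# Both neighborhoods have the SAME underlying crime rate, but the
# historical arrest data is skewed toward neighborhood 0.

def patrol_shares(arrests):
    """Allocate patrols proportionally to past arrest counts."""
    total = sum(arrests)
    return [a / total for a in arrests]

true_crime_rate = [0.5, 0.5]   # identical actual crime in both areas
arrests = [30.0, 10.0]         # historically skewed arrest record

for _ in range(10):
    shares = patrol_shares(arrests)
    # Arrests happen where officers are sent: patrol share x crime rate.
    new_arrests = [100 * s * c for s, c in zip(shares, true_crime_rate)]
    arrests = [a + n for a, n in zip(arrests, new_arrests)]

shares = patrol_shares(arrests)
print(shares)  # neighborhood 0 still receives 75% of patrols
```

Even after many rounds, neighborhood 0 keeps receiving three quarters of the patrols: the data-driven model faithfully reproduces the historical skew rather than the equal underlying reality.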

The training data used by machine learning systems can often reinforce or even exacerbate existing biases. Furthermore, the training process itself can introduce problems: the errors for minority populations are often larger, since statistical models fit to pooled data are mostly determined by the patterns in majority populations. Researchers are finding that fairness in machine learning is extremely challenging. I suspect this reflects a larger issue: it is not possible to completely describe fairness and justice in strictly mathematical terms.5
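The point about majority populations dominating a model can be shown with a few lines of arithmetic (a toy illustration of my own, with made-up numbers, not any real dataset). Fit a single shared value by least squares to data pooled from a large group and a small group, and the fit lands near the majority:

```python
# Toy sketch: a single model fit to pooled data is dominated by the majority.
# Group A (majority, 90 samples) has true value 0; group B (minority,
# 10 samples) has true value 5. The least-squares fit of one constant
# is the pooled average, which sits close to the majority.

group_a = [0.0] * 90   # majority group, true value 0
group_b = [5.0] * 10   # minority group, true value 5

data = group_a + group_b
model = sum(data) / len(data)   # the mean minimizes squared error

error_a = abs(model - 0.0)      # error on the majority group
error_b = abs(model - 5.0)      # error on the minority group
print(model, error_a, error_b)  # 0.5, 0.5, 4.5
```

The shared model errs by 0.5 for the majority but by 4.5 for the minority: nine times worse, simply because the minority contributes nine times fewer terms to the loss being minimized.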

Yet another issue is that many machine learning systems are proprietary, their inner workings protected by intellectual property and trade secrets. Such opaque systems carry hidden assumptions and values embedded within them. This leaves those affected by such systems without any explanation of, or recourse against, unfair decisions which may impact them.

As Christians, the fact that algorithms are not neutral should come as no surprise. We live coram Deo, before the face of God, and everything we do is a response of obedience or disobedience to him. In this sense, all of life is religious. Even our technical and mathematical work is a response to God with moral and ethical implications. Our big data algorithms and mathematical models can be directed in ways that are more obedient or less obedient to God’s intents for his world. As more decisions are informed by number-crunching computers, we will need to make sure that justice and transparency are maintained. A wise data scientist will heed the call of Proverbs 31:8–9 to “speak up for those who cannot speak for themselves” and to create software systems that “defend the rights of the poor and needy.”

“The Architecture of Evil” not only tells the story of Albert Speer but goes on to suggest that “Today’s engineers need a more well-rounded education – one that stresses not only the analytical skills necessary to be a good engineer but also the liberal arts that are necessary . . . for young students to grow and mature as citizens with responsibilities beyond the immediate technical concerns of their work.”6 Our computer science and engineering schools need to attend to such ethics and values if we hope to build a more just society.

As Christian colleges scramble to launch data science and analytics programs, we have a unique opportunity to create distinctive programs that go beyond churning out data wizards. Purely technical training in data science can lead to a type of tunnel vision which David Brooks has described as dataism.7 This reductionistic ideology is described by Yuval Noah Harari as follows: “You may not agree with the idea that organisms are algorithms, and that giraffes, tomatoes and human beings are just different methods for processing data. But you should know that this is current scientific dogma, and that it is changing our world beyond recognition.”8 The truth is that creation is far more than data; it is both diverse and integral, with normative implications. Christian colleges are well-positioned to train faithful and responsible computer scientists who recognize that not everything that counts can be counted, even with the most complex algorithms.

Christian engineers and computer scientists must see their technical work as a response to God, one in which even our mathematical models, algorithms, and digital designs need to enhance justice and show mercy as we walk humbly with our God (Micah 6:8).

An earlier version of this article by the same author originally appeared in Christian Courier.


  1. Roger Forsgren, “The Architecture of Evil,” The New Atlantis, Summer 2012.
  2. Albert Speer, Inside the Third Reich (New York: Simon & Schuster, 1997), 375.
  3. Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown, 2017), 2.
  4. O’Neil, Weapons of Math Destruction, 21.
  5. Derek C. Schuurman, Shaping a Digital World: Faith, Culture and Computer Technology (Downers Grove: IVP Academic, 2013), 105.
  6. Forsgren, “The Architecture of Evil.”
  7. David Brooks, “The Philosophy of Data,” The New York Times, February 4, 2013.
  8. Yuval Noah Harari, Homo Deus: A Brief History of Tomorrow (New York: HarperCollins, 2017), 373.

Derek C. Schuurman

Calvin University
Derek C. Schuurman is Professor of Computer Science at Calvin University in Grand Rapids, MI.