Jack Bandy is a Ph.D. student in Northwestern's Technology and Social Behavior (TSB) program, studying computer science, society, and the complex interrelationship between the two.

What is Technology and Social Behavior?

In less than two decades, we went from a world without Facebook, Google, Spotify, Netflix, iPhones, etc., to a world where most of us use these algorithm-based tools and services for hours each day. Despite their widespread use, we are only beginning to understand their social and ethical implications. Just as ecologists must study both the biological properties of animals and their emergent behaviors, it is important to study both the technical aspects of algorithms and their emergent behavior within society. As a recent paper in my field explained, fully understanding something like the Netflix recommendation algorithm demands that we study "the social environments in which algorithms operate."

Here are a few questions that people in this field are thinking about:

  • What happens when an algorithm chooses the news, movies, and music that a culture consumes?
  • Should an algorithm choose where police cars patrol? What are the implications of using such an algorithm?
  • How can we mitigate the human biases that are embedded not only in the data we use to train algorithms, but also in the structure of the algorithm itself?
  • How can we design algorithms that embed positive human values?

While many find these questions "interesting," they have real-world impact beyond mere intellectual amusement. Biased algorithms have been deployed for facial recognition, recidivism prediction, credit scoring, hiring, and more -- Cathy O'Neil's book Weapons of Math Destruction provides an alarming collection of real-world examples. People often assume life with these algorithms is better, but as Margaret Atwood put it poignantly, "better never means better for everyone. It always means worse, for some." Many algorithms that appear to make life better actually have a disparate impact on society: they favor some people and leave others out in the cold.

One of the biggest reasons I am pursuing my Ph.D. is to wrestle with the following question: how might we design algorithms to make life better for more people?

Research Interests

I have a broad interest in the way modern technology affects people's minds and bodies at various scales: the individual, the group, and society. A few specific topics include:


Quotes I Like

A few quotes that inspired or influenced me in some capacity:

"The human dilemma is as it has always been, and we solve nothing fundamental by cloaking ourselves in technological glory."

from Neil Postman's 1990 speech, Informing Ourselves to Death

"Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us."

from Neil Postman's prophetic 1985 book, Amusing Ourselves to Death

"But I don't want comfort. I want God, I want poetry, I want real danger, I want freedom, I want goodness. I want sin."

John the Savage in Aldous Huxley's Brave New World

"Whether we and our politicians know it or not, Nature is party to all our deals and decisions."

Wendell Berry

"The adoption of agriculture, supposedly our most decisive step toward a better life, was in many ways a catastrophe from which we have never recovered."

from Jared Diamond's famous essay, The Worst Mistake in the History of the Human Race

"Strap myself into a small rocket-room that is powered by the burnt remains of prehistoric kelp, in which I avoid dying by spinning a plastic circle wrapped in optional cow skin."

Zach Bornstein's description of driving a car, from A Typical Day


Find a PDF of my untrimmed CV here.


Some things you may or may not care about: