Call for Papers

At our conference at Purdue next July, we would like to open a conversation between Girardian thinking (especially René Girard's ideas about mimetic desire, sacrificial violence, and scapegoating) and issues that arise in connection with artificial intelligence. Theorists of AI sometimes speak of a "singularity," by which they designate an anticipated moment when such systems become self-aware. As AI assumes increasing prominence in our lives, a host of questions arise for those of us who regard Girard's ideas as important. Does self-awareness come with mimetic desire, as Girard claims it does for humans? If robots do become self-aware, and do desire, do that awareness and that desire necessarily entail conflict or violence, as they do in human communities for Girard? Are we sure that mimesis presupposes self-awareness? Could machines be hyper-mimetic without being self-aware? If we imagine machines modeling others, do they model others the way Girard shows humans do, or in a manner more aligned with programmed algorithms? If we imagine machines as appropriating desire, could humans begin taking machines as their models? We know humans already sometimes take machines as objects of desire. If machines borrow models, what would those models be? What will self-aware machines imitate? Other machines? Humans? Nearby or remote objects? A transcendental intelligence of some kind? Does consciousness presuppose mimesis, or vice versa?

We invite papers that probe these and related questions from a wide variety of disciplines. We require only that serious engagement with Girard's ideas be part of the mix. For example, Girard suggests that humans desire not according to objects or subjects themselves but according to other individuals who model those objects and subjects for us, and that such borrowed or appropriated desire almost always leads to violence. Girard also suggests that human communities are constituted, in their nature and origin, as systems for managing such borrowed desires (and the conflict that attends them); that the primary means of such social control is a multifarious variety of exclusionary behaviors, from individual projection to surrogate victimage and everything in between; and that a primary concern today remains how to avoid or dismantle such sacrificial lynching behavior. A third strain in Girardian thinking is the recourse to certain important texts (religious, literary, and the like) that expose such scapegoating practices and their history for us.

In this spirit, we invite papers from the fields of AI, robotics, theology, philosophy, anthropology, literary criticism, women's studies, historical studies, physics, biology, sociology, film studies, cognitive science, psychology, religious studies, environmentalism, political science, the internet of things, and any other fields or disciplines that touch upon (or re-conceptualize) these issues in ways that might help us advance serious reflection within the conversation we propose.

Abstracts (of at least 150 words) should be sent to Sandor Goodhart at goodhart@purdue.edu by March 15, 2020.