AI in anthropology education: Between threat and opportunity

By Pablo Ampuero-Ruiz – I remember very clearly the first lecture of my Master’s degree at Peking University in China. Professor Pan Wei was giving an introductory lecture on the cultural foundations of China’s political system, which started with a summary of historical materialism. ‘The introduction of a new technology,’ he said, ‘triggers sociocultural transformations.’ I remember feeling confused by such a bold statement, which went against everything I had learned during my undergraduate studies. As my research has moved towards studies of green technologies, I keep going back to this statement, which, in hindsight, seems much more revelatory. This has become particularly evident these days, when AI seems to be the hallmark of hypermodernity.

Artificial intelligence is one of those topics that tends to polarize. In conversations with colleagues, I often hear a deep concern that reflects a broader cultural change: if students can use AI to write their essays, what will happen to the quality of our teaching? Will ethnography and theory still matter if “ChatGPT” can deliver answers in seconds? These questions are legitimate—but they also risk framing AI as an enemy, when in fact it might be better understood as a tool.

We have been here before. When personal computers and word processors first entered universities, critics worried that students would stop learning how to write properly. Typing instead of handwriting, cutting and pasting instead of carefully drafting, spell-checkers instead of dictionaries—each of these was initially seen as a potential threat to “real” writing. And yet, over time, computers became indispensable tools for scholarship. They did not replace the intellectual labor of thinking, but they transformed the practices of editing, revising, and publishing. AI might be provoking similar anxieties today.

Of course, there is the possibility of cheating with AI. But then again, there has always been a possibility of cheating, whether through copy-pasting from the internet, buying essays, inventing data or other shortcuts. The real issue is not the technology itself, but how we as educators shape our expectations and assignments in ways that keep learning at the center.

At the same time, AI offers enormous potential to “level the playing field.” Many of our students—especially those writing in English as a second or third language—struggle with copy-editing and proofreading. Heck! Even we face structural challenges when trying to publish in the top journals, where English is the hegemonic language. AI tools can provide affordable and accessible assistance, allowing all of us to focus on the substance of our arguments rather than spending disproportionate energy on polishing grammar. In this sense, AI can help make higher education more inclusive and democratic, which, I hope we all agree, is a positive cultural change.

Yet, there is an important caution. No matter how “smart” AI may seem, it cannot replace the act of reading. Reading ethnography and theory is not just about extracting information; it is about following stories, encountering other worlds, and witnessing how anthropological arguments are constructed step by step. This process cannot be outsourced to a machine. To write anthropology is to engage—painfully, joyfully, and creatively—with texts, with field notes, and with the art of revision. AI may help us polish words, but it cannot teach us how to think through them.

Anthropologists studying AI—such as Nick Seaver on algorithmic cultures, Noortje Marres on testing and automation, or Jenna Burrell on opacity in machine learning—remind us that technologies are never neutral. They come with assumptions about efficiency, labor, and value, shaping the very ways we imagine knowledge. Their work shows that AI should not be taken at face value as a “tool” but should be seen as embedded in wider social, economic, and political relations. As teachers, we need to keep these reflections in mind when deciding how to use AI in our classrooms. In this sense, we should remain mindful of AI’s resource-intensive nature. Training and running large models require enormous amounts of energy and water, with significant socio-environmental consequences. If we, as anthropologists, are critical of extractive industries and unsustainable consumption, we should also reflect on the socio-ecological footprint of the technologies we adopt in our classrooms.

So how do we proceed? Perhaps the best way forward is not resistance, but thoughtful integration. We can experiment with AI in our teaching—for example, by using it to generate summaries that students then critique, or by asking students to reflect on the limits of AI when applied to ethnographic material. We can also use it ourselves to help with routine tasks, freeing time for what really matters: reading, discussing, and mentoring.

Rather than fearing AI, we might see it as an invitation to rethink what we value in anthropology education. Machines can help us with grammar and formatting. But they cannot read for us, they cannot feel the weight of a story, and they cannot replace the intellectual labor of learning to write and argue as anthropologists. That, still, is our craft.

Pablo Ampuero-Ruiz is Assistant Professor at the Department of Social and Cultural Anthropology of the VU.

The caption of the picture is: “Malinowski writing his notes in a laptop inside his tent during his fieldwork” (ChatGPT generated)
