Generative AI causes tremendous hidden harms to both society and the planet. How can we better understand the harms of this industry? How can we shift our technological realities into desired futures?
Over the course of my design master's thesis, I have been exploring the potential of tabletop role-playing games (TRPGs) to facilitate transformational experiences: TRPGs that enable sustained change, altering a person's view of themselves, others, and the world in significant ways to build toward greater awareness, peace, and justice¹. Within this endeavour, I have been making solo and collaborative prototypes to explore the affordances and limitations of the genre. Inside Precarity² is a TRPG that explores the question, "Who Pays the Cost of the AI Revolution?"

I had the honour of presenting and playtesting the game at the 2025 Umweltbeobachtungskonferenz (Environmental Observation Conference), organized by the Swiss Federal Office for the Environment. This year the theme of the conference was Künstliche Intelligenz in der Umweltbeobachtung und Berichterstattung: Innovation und Verantwortung (Artificial Intelligence in Environmental Monitoring and Reporting: Innovation and Responsibility). I was excited and nervous to attend with my prototype, as the conference had a distinctly optimistic attitude towards the gen AI industry, while my personal outlook and the perspectives in my prototype are highly critical of its unethical practices. I was, however, delighted that the first day opened with a keynote by Dr. Dorothea Baur, which shed light on both the potential of gen AI and, succinctly, the many risks it poses to society and the environment. This marked the official kick-off of the conference, and it was time to staff my booth and introduce guests to my work.

Over the course of the afternoon, I played Inside Precarity with three different groups of conference guests, and three other groups stopped by between sessions to discuss the project and ask questions about it.
During the game, players took on the roles of stakeholders affected by the gen AI industry, each with their own goals, motivations, and fears. These stakeholders navigated rising tension within the small community of Ashkin Oak, where an existing AI data centre campus is slated for expansion to maximize computational cycles and improve performance. The town and stakeholders in the game are inspired by towns in the United States and Ireland that face these very real negative impacts of data centres built in their communities. The game is partly a simulation of these situations, exposing realities that most of us don't think of when we interact with gen AI systems. But the game also intends to provoke people to consider how they would react in such a situation. The behaviour of different stakeholder groups is distinctly shaped by their agendas, power dynamics, and potential gains or harms, and the game is designed so that players unmistakably feel this. Rather than being told about these situations, they have the opportunity to embody those affected and understand the risks in a simulated, first-hand experience.


I found that attending a conference on innovative technology with a purely analogue, socially interactive prototype was highly provocative on its own. The booth stood in the middle of the room, unconnected to any power source, and was the only booth where guests could sit around a table together. The reaction to these factors alone was something I didn't anticipate. Much discussion emerged around the fact that people could simply come and sit to take a break, play something fun without a screen or bot interaction, and connect with the other guests directly. The presence and form of such a medium in this space had a powerful effect on its own.
Many discussions were also sparked by those who played the game. To my surprise, many of the conference guests remarked that they had no idea these were the kinds of infrastructures and situations behind gen AI technology. Many claimed that this was the first time they had ever heard of or thought about these potential problems, and that it changed the way they looked at services like ChatGPT. Though it wasn't brought up, I was curious whether their experiences changed the way they looked at other projects at the conference that rely on gen AI and the data centres behind it. These discussions proved to me that the game accomplished its goal of exposing these situations and provoking people to think through the implications.

The shortfall was that once the guests left my booth, the conversation ended. I had (and as of writing, still have) no way to continue the conversation or to see whether long-term reflections or changes in behaviour occur. It's delightful to see the short-term impacts on players and to engage in thoughtful conversations because of them, but I don't yet have a way to measure what kind of long-term repercussions the games have. As with many small-scale interventions or awareness projects, it takes more than one small product to bring about sustained change. Games on their own will not bring about long-term, wide-reaching change. But how can they be optimized to provoke the first steps? Could the game's systems, or the communities surrounding them, be designed for mid- to longer-term transformations? These are the next questions to explore in my work.
https://sylvanmcgowan.carrd.co/
https://www.linkedin.com/in/sylvan-mcgowan/
Footnotes
1. Bowman, S. L., Diakolambrianou, E., & Brind, S. (Eds.). Transformative Role-playing Game Design. Uppsala University Publications, 2025.
2. McGowan, S. Inside Precarity. Itch.io, 2026.