Integrating Ethics of Technology into a Serious Game: The Case of Tethics

Giannis Perperidis, Iason Spilios, Manolis Simos and Aristotle Tympas
Published/Copyright: July 17, 2025

Abstract

In the face of what are called “existential risks/threats” or the “polycrisis” – where digitalization, biomedicalization, and environmental degradation intertwine as existential threats – this article argues for a critical philosophy of technology as a necessary framework for ethical engagement. To address this polycrisis, we elaborate on an approach that integrates ethics and technology through serious gaming. Specifically, we introduce Tethics, a board game designed to engage players in ethical reflection on technological governance. By embedding critical philosophical insights into interactive gameplay, Tethics fosters a dynamic, experiential understanding of ethical dilemmas. Drawing on Andrew Feenberg’s critical theory of technology, we outline how game design can serve as a method for both refining philosophical inquiry and fostering public engagement. We argue that serious games may offer a unique means of interrogating and reconfiguring our relationship with technology, and can act as effective educational tools for the ethics of technology today. Finally, we argue that Tethics performs a philosophically therapeutic function, debunking the metaphysical underpinnings of the instrumentalist and determinist viewpoint that dominates the current understanding of technology.

1 Introduction

In this article, we argue for two interrelated points. First, we elaborate on a critical philosophy of technology to address the three main contemporary existential threats. More specifically, we implement this critical philosophy of technology into the creation of a serious game, the Tethics Board Game (henceforth Tethics), which we developed in the context of an Erasmus+ EU project (https://tethics.eu/). In this way, we attempt to show how a critical political philosophy of technology can function as the bedrock for an ethics of technology and provide an approach that fosters deeper understanding and experiential engagement with such an ethical stance. Second, we argue for a metaphilosophical point. Namely, we attempt to show how Tethics can be understood to belong to the genre of a dialogical interactive philosophical example that debunks the metaphysical, ahistorical underpinnings of an instrumentalist and determinist viewpoint that dominates the current understanding of technology.

In the paragraphs of this section that follow, we provide an introduction to the constitutive notions of the two aforementioned points. First, we adumbrate the contemporary existential threats that constitute the main thematic axes of the serious game. Second, we introduce the aforementioned, precritical philosophie spontanée regarding technology, and third, we summarize the premises of the critical philosophy of technology we employ as an antidote, along with the merits of its being translated into a serious game.

It has become an unfortunate platitude to state that rampant digitalization, biomedicalization advances, and ongoing environmental degradation constitute three existential threats [1] that are constitutively coextensive with technological civilization itself. The notion of “existential” succinctly encapsulates the three aspects of this contemporary danger, or, as it is called, “polycrisis.”[2] First, the gravitas: as existential, these threats do not concern parts of our well-being, but our being simpliciter. Second, the scope: they do not concern a particular group of people, but rather humanity as a whole. Hence, third, the urgency: the need for a transformative response at all levels has become more than pressing.

Namely, contemporary artificial intelligence technology – digital networks, algorithms, Big Data – is “an epochal technology now colonizing an increasing number of domains,” from cell phones and delivery trucks to banking and health care systems. It is “ubiquitous …: a taken-for-granted feature of modernity like running water or electricity.”[3] As such, it challenges entrenched interpretations of core moral values: surveillance systems compromise privacy,[4] manipulative use of information compromises autonomy and freedom,[5] the opacity of AI systems engenders biases that, in turn, compromise justice,[6] and social networking systems promote a radical individualism.

Moreover, the landscape of contemporary biomedicalization reflects the almost self-same ethical issues. Taking, for example, the so-called “genetic revolution” alone and the technologies that it “brought us … such as genetic screening, genetic pre-implantation and pre-natal diagnosis, gene therapy, cloning and genetic pharmacology,”[7] we understand how these technologies challenge – along with fundamental ontological and epistemological issues – the same core moral values in a similar way.

And last, but in no way least, there is the “ecological wounding,” about which Peter Sloterdijk wrote almost twenty-five years ago; a wound that “sets about to demonstrate that human beings … over the long term only misconstrue and ruin complex environmental systems, but can neither understand nor protect them.”[8] Similarly, in the same year, Mike Davis, in his eloquently titled Late Victorian Holocausts,[9] argued for the direct link between, on the one hand, the droughts and famines caused by El Niño in the last third of the nineteenth century and, on the other, the colonial policies of that period, thus demonstrating the constitutive character of science, technology, and politics in the creation of the contemporary climate condition. This idea is once again perfectly encapsulated by Sloterdijk’s recent imagery of a remorseful Prometheus; having bestowed fire and technics onto the constitutively lacking humans thrown into a hostile world, he regrets his gifts for turning humans into a collective of arsonists who set fire to ground and underground resources, thus undermining the possibility of their own existence.[10]

The critical philosophy of technology that we employ aims to debunk a precritical, entrenched, widespread philosophie spontanée regarding technology, according to which technology is identified with applied science, “[t]echnological artifacts are neutral tools, passively to be used by humans,” and “[t]echnological development is determined by an inner logic.”[11] The more robust versions of this instrumentalist and determinist viewpoint adhere to a metaphysical – that is, ahistorical and decontextualized – underpinning of the above theses, and understand the categories of the human, the world, and the technological in a more or less essentialist way.

Traditional moral philosophy is based on this viewpoint and therefore seems inadequate to address the contemporary existential threats. It constitutes an abstract, top-down, principles-based approach; as such, it comes either too early – focusing on issues like “the technoscientist’s responsibility” – or too late – focusing on issues of use. A critical philosophy of technology suggests a morally informed, bottom-up, social diagnosis of technology in the making; it seeks to open the black box of specific technologies and investigates the power relations at play among different actants in concrete contexts.

Integrating philosophy and games is crucial because it allows abstract philosophical theories to be tested, experienced, and internalized in a dynamic and interactive manner. Philosophical concepts, particularly those concerning ethics and technology, can often be difficult to grasp in purely theoretical terms. A board game provides a tangible, participatory framework through which players can engage with these ideas, confront ethical dilemmas, and witness the consequences of different philosophical perspectives in action. This approach fosters deeper understanding, critical reflection, and engagement with philosophical inquiries beyond the constraints of traditional academic discourse. By building a board game out of a philosophical theory, we transform philosophical reflection into an immersive, accessible, and dialogical experience, bridging the gap between theoretical knowledge and lived experience.[12]

In Section 2, we present the content of a critical philosophy of technology, which, at the same time, constitutes the theoretical foundation of Tethics. We focus on Andrew Feenberg’s critical theory of technology; we show in which way it has been informed by Science and Technology Studies, and argue that it provides the best theoretical foundation for the ethics of design approaches. In Section 3, we present the methodology for creating Tethics; namely, here, we explain how the aforementioned insights of a critical philosophy of technology have been implemented and translated into gameplay, while, in Section 4, we provide a general description of Tethics. Finally, we conclude by showing how the implementation of a critical philosophy of technology in the creation of Tethics contributes to a metaphilosophy of gaming.

2 A Critical Philosophy of Technology: The Theoretical Foundations of Tethics

The ethics of design has emerged as a crucial interdisciplinary field at the intersection of engineering and philosophy, addressing the moral responsibilities embedded in technological development. Within this discourse, various approaches seek to navigate the ethical implications of design.[13] These perspectives often materialize in methodologies like Value Sensitive Design, responsible research and innovation, and participatory design, all of which attempt to integrate ethical considerations into technological development.[14] Meanwhile, philosophical approaches within the scope of the so-called “empirical turn,”[15] particularly those rooted in phenomenology and critical theory, emphasize the social and political dimensions of technology, arguing that design is never neutral but always shaped by underlying values and power structures. Among these perspectives, Andrew Feenberg’s theory of the politics of technology stands out as a compelling foundation for an ethics of design. Feenberg demonstrates that different social actors impose distinct meanings on technology, revealing its inherently contested and value-laden nature. This insight provides a productive starting point for developing an ethical framework that accounts for the diverse social forces shaping technological artifacts. In this section, we present why we turned to Andrew Feenberg’s philosophy of technology for the development of our serious game on the ethics of technologies – beyond the fact that his theory has already been used to develop a discourse on games and, in this form, has been applied to World of Warcraft (WoW).[16]

Andrew Feenberg begins his approach to technology by asserting that modern technological advancements have elevated human beings to a quasi-divine status.[17] This does not imply, of course, that humanity has achieved immortality through technology. Rather, it suggests that technological development has enabled individuals to operate within their environments as if they were external to them, akin to deities who do not depend on their surroundings for continued existence. One of the defining features of modern technology is its capacity to reduce feedback, that is, the response of materials to the user. This renders the individual an autonomous agent, seemingly immune to the consequences of their actions. In this way, modern technology appears to challenge Newton's third law of motion – the principle that every action has an equal and opposite reaction – by redirecting any consequences in such a manner that they never reach the user.

In modernity, humans have come to see themselves as fundamentally distinct from nature, assuming a unique capacity to alter the natural environment without being affected in return. Feenberg, however, contends that this one-sided influence is an illusion, which he terms the “illusion of technique.”[18] Through this concept, the American philosopher of technology identifies a fundamental division between the natural and the human that characterizes modern societies. In pre-modern times, artifacts were imbued with communal values and fostered a reciprocal relationship with their users. By contrast, modernity, characterized by extreme specialization and disciplinary segmentation in technological production, has led to a disconnection between different fields of expertise. This separation has resulted in a conceptual division between values and so-called objective facts, stripping technology of any intrinsic ethical dimension.[19] The outcome is the neutralization of technology, which is now perceived as a mere tool for the arbitrary objectives of its users.[20] Consequently, the primary concern in technological development is no longer the ethical implications of an artifact but rather its aesthetic appeal, market distribution, and advertising strategy. This process reinforces the notion that technology is neutral, with efficiency emerging as the paramount value it must embody.

Feenberg takes as his starting point the premise that technology has been rendered neutral due to the complete dissociation of values from factual data in modernity. However, he argues that one way or another, the effects will manifest.[21] The most pressing consequence today is the environmental destruction caused by technological advancement. While feedback suppression may be possible on a micro-level, Feenberg asserts that when the temporal frame is expanded, the repercussions of technological action become apparent. In this context, the illusion that humanity can act upon nature without facing any counteraction is ultimately disrupted.

The illusion of technique isolates humans from their environment by severing the connection between facts and values. This division allows facts to dominate the discourse on technological progress, creating a fertile ground for essentialist perspectives that predict the inevitable overtake of humanity by an uncontrollable, self-perpetuating technology.[22] Feenberg, however, does not advocate for a reactionary rejection of technological development, nor does he support a neo-Luddite call for the dismantling of modern technology. Instead, he offers a rational critique of rationality itself, seeking to introduce social and political constraints into what has traditionally been considered a purely technological domain. His approach is informed by Science and Technology Studies (STS), which reveals the social foundations underlying technological advancement.[23] These studies refute the deterministic and technophobic narratives that depict technology as an autonomous force spiraling out of control. Instead, technological outcomes are shown to be the result of social and political choices rather than an inevitable fate.[24] Consequently, responsibility for technological destruction does not lie with technology itself but with its human creators and decision-makers.

Feenberg's argument dismantles the idea that technology follows a deterministic, linear trajectory toward the eradication of human agency. Since technological development is shaped by socio-political choices, each innovation embodies the interests of particular social groups that influence its design. Feenberg aligns with Langdon Winner in recognizing the political nature of technology but reframes this concept in terms of bias: every technological artifact is predisposed toward the interests of specific social groups that played a role in its creation.[25] As a result, technology is never truly neutral, as it evolves based on deliberate political decisions that have material consequences for those excluded from the design process. This explains why merely increasing the quantity of technology within the same value framework does not rectify existing societal issues. Instead, Feenberg argues that what is necessary is a transformation in the values embedded within technology itself. Such a transformation entails incorporating a greater diversity of values from a broader range of social groups into technological design. By doing so, technological artifacts will become biased toward the needs of a larger segment of the population, reducing the number of marginalized individuals excluded from their benefits. This process aligns with Feenberg's overarching vision for the democratization of technology.

At the heart of Feenberg's proposal is the concept of participant interests,[26] which represent the values that different social groups introduce into technological design. The broader the participation of different groups in shaping a technology, the less likely it is to exclude individuals from its benefits, making it more democratically oriented. Feenberg draws on social constructivism to “open the black box” of technological design, exposing the hidden biases embedded in its development. He demonstrates that technology is determined by political choices rather than following an inexorable linear progression. Each design embodies specific values that become embedded in what he terms the “technological unconscious.” Over time, people forget that certain technological functions were originally responses to particular social values that were translated into technical specifications. This process underlies Feenberg's assertion that “values are the facts of the future.”[27]

Feenberg's analysis of technological development also leads him to explore the potential for social resistance among marginalized groups whose interests are not reflected in current technological designs. Here, his critique intersects with Michel Foucault's approach to power and knowledge.[28] Foucault, according to Feenberg, demonstrated that artifacts and technological mechanisms do not merely function as neutral tools but actively shape social reality and influence human subjectivity. The formation of subjectivity occurs through implicit mechanisms that operate within the technological framework of a given society. Those who control technological design also shape individual agency and self-perception. This raises a critical question: To what extent can the average, non-expert individual influence the technological structures that shape their reality and, consequently, their ability to act as autonomous agents?

Feenberg addresses this question by exploring the possibility of resistance to technological domination. He draws on Michel de Certeau's insights, agreeing that every system of power creates its own spaces for maneuvering.[29] Non-expert users can shape technological development through everyday interactions, infusing technologies with meanings not initially intended by their designers. Over time, these meanings may gain widespread acceptance and become dominant values, influencing future technological designs through public discourse and social struggle. This process, which Feenberg terms “secondary instrumentalization,” is central to his “theory of instrumentalization”: in primary instrumentalization, the properties necessary for an artifact's creation are extracted and their sources decontextualized, while secondary instrumentalization consists in the systematization, mediation, orientation, and concretization that recognize users' capacity to redefine technological meanings.[30]

Feenberg does not discuss ethics extensively in his theory, because, for him, ethics consists in the values that participants insert into technological design. Thus, we argue that a politics of technology, such as the one Feenberg develops, takes ethical judgments into account as the values of the participants who influence technologies. But in order to become specific about the ethical implications of designs, we incorporated the “by design” approaches to ethics from the field of philosophy and engineering, especially Value Sensitive Design. This approach to ethics, engineering, and technology emphasizes the importance of integrating human values into the design of technologies. It recognizes that technologies have the potential to influence human behavior, relationships, and well-being, and that, therefore, their design should take ethical and moral considerations into account.

Accounts in this direction have been developed by Rob Kling,[31] Batya Friedman and Peter H. Kahn Jr.,[32] Wendell Wallach and Colin Allen,[33] and Batya Friedman and David Hendry.[34] Such approaches diverge from the conventional post-production ethical assessment of technical artifacts, which focuses on the deployment and utilization phases of production. Instead, Value Sensitive Design proactively instates ethical values during the design phase by formulating a policy framework grounded in specified values and elucidating methodologies for their translation into technical specifications. Significantly, this investigation addresses both the theoretical foundations and the practical instantiation of Ethics by Design, thereby contributing to a nuanced comprehension of its emerging role within the ethical discourse surrounding technologies, both analogue and digital. A notable aspect of this theoretical framework lies in its acknowledgement that an artifact's design, while influential, remains subject to potential technical alterations through diverse user interactions and audience engagement.

3 Methodological Considerations: Translating a Critical Philosophy of Technology into Gameplay

The design of the Tethics serious game presented a significant challenge. A key question was how to integrate the theoretical perspectives of Feenberg, STS, and in-design ethics into a single game that both addresses future existential risks and teaches the ethics of technology in an engaging and interactive manner. Such methodological concerns (how to bridge theory with game design) arise whenever one tries to build a serious game. Fortunately, scholars within this field have provided blueprints for how to proceed methodologically with the design of a serious game.[35] The structure of Tethics followed some of the principles found in this literature. In this section, we outline the process of transforming the theoretical frameworks presented in the previous part into a playable and intellectually stimulating serious game that not only conveys ethical considerations in technology design but also highlights potential future risks to humanity.

At the core of Tethics is the concept of player choice, which serves as a key mechanism for engaging participants. This feature directly translates theoretical insights from Feenberg and STS into gameplay mechanics. Specifically, it draws on Feenberg’s notion of “participant interests” and the STS concept of “relevant social groups” (Pinch & Bijker, Law, and Hughes). According to Feenberg, technological development is not merely a matter of explicit individual or collective selection between available features; rather, it is shaped by sociopolitical and ethical values that influence the design and adoption of technology. Social groups form around these values, each seeking to shape technological development in ways that align with their interests. To reflect this dynamic within the game mechanics, we designed four distinct social groups, each representing a different set of values that could theoretically emerge at any point in socio-political struggles over technology. These struggles, in Feenberg’s terms, contribute to the formation of a “technical code” that defines the dominant technological paradigm of an era.[36]

As well as deciding how to translate participant interests into a playable element, we needed to identify which existential threats and technologies to include. To this end, we turned to Ethics4Challenges: Innovative Ethics Education for Major Technological and Scientific Challenges (https://ethics4challenges.eu/), another Erasmus+ EU project of ours. This project provided us with a comprehensive account of the contemporary technological existential threats, as adumbrated in our introduction earlier, along with all the specific technological configurations and ethical issues we needed in order to create the specific game elements that follow.

A third key methodological aspect of Tethics was the development of in-game content, specifically the writing of the narrative texts associated with different gameplay outcomes. Since the game incorporates four social groups, six existential risks, and twenty-four key technologies, we created a total of 96 outcome texts. Each text explores the potential consequences of a given social group achieving dominance in shaping the values embedded in a particular technology. Additionally, 24 broader scenario texts were written to depict the overarching societal conditions that emerge when a specific group attains dominance. The questions guiding this process included: How would reproductive technologies or artificial intelligence (AI) evolve if market-driven interests dictated their design? Or, how would a technology incorporating transparency ethically address the existential risks posed by AI and environmental crises? These questions informed the creation of scenario-based texts that illustrate the potential ethical and societal implications of technological development under different value-driven influences.
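
To make the scale of this content concrete, the following is a minimal sketch in Python of the combinations that yield the 96 outcome texts and the 24 scenario texts. All names are placeholders rather than the game's actual groups, risks, or technologies, and reading the scenario texts as one per pairing of social group and existential risk (4 × 6 = 24) is our assumption for illustration.

```python
# Minimal sketch of the content structure described above; all names are placeholders,
# not the actual groups, risks, or technologies used in Tethics.
social_groups = ["State", "Market", "Research", "Public"]    # 4 social groups
existential_risks = [f"Risk {i}" for i in range(1, 7)]       # 6 existential risks
technologies = [f"Technology {i}" for i in range(1, 25)]     # 24 key technologies

# One outcome text per (social group, technology) pair: 4 x 24 = 96 texts.
outcome_texts = {
    (group, tech): f"Consequences if {group} shapes the values embedded in {tech}."
    for group in social_groups
    for tech in technologies
}

# 24 broader scenario texts describing societal conditions under a dominant group;
# reading them as one per (group, existential risk) pair (4 x 6 = 24) is our assumption.
scenario_texts = {
    (group, risk): f"Overarching conditions when {group} attains dominance, facing {risk}."
    for group in social_groups
    for risk in existential_risks
}

print(len(outcome_texts), len(scenario_texts))  # prints: 96 24
```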

Through these methodological choices, we sought to develop a game that demonstrates both the inherent politics of technology – stemming from the struggles between different social groups (Feenberg, STS) – and the integration of ethical considerations into technological design. More specifically, these methodological choices reveal the principles on which the game design is based. First, Tethics is a simplified simulation of opening up the black box of specific technologies and displaying the different interests and values at play. As we saw toward the end of the previous section, at the heart of Feenberg’s approach is the concept of participant interests, which represent the values that social groups introduce into technological design. In this way, the game makes it possible to understand a twofold thesis: that the responsibility for both technological development and destruction lies not with the technology itself but with its human creators and decision-makers; and that those who control technological design also shape individual agency and self-perception. Thus, in light of this opening up of the technological black box, and as mentioned in our introduction earlier, an ethical intervention can now take place in the making of specific technologies: the broader the participation of different groups in shaping technology, the less likely it is that individuals will be excluded from its benefits, thus making it more democratically oriented. Moreover, an additional merit of such an approach is that it recognizes that the design of an artifact, while influential, remains subject to potential technical alterations through diverse user interactions and audience engagement (Value Sensitive Design).

Finally, Tethics was designed to be adaptable. Educators and researchers can modify the game according to their specific teaching objectives by adjusting the selection of existential risks, social groups, and narrative texts. By doing so, the game can be tailored to address a wide range of social and ethical issues related to technological development, enabling players to critically engage with the power dynamics and ethical considerations that shape the technologies of the future.

4 Describing Tethics: A Critical Philosophy of Technology at Play

Tethics is a board game designed to engage players in the ethical challenges associated with the development and deployment of emerging technologies. In the process of designing it, we attempted to follow certain Game Studies dimensions, drawn from the classical literature of the field. Thus, this game blends strategic decision-making, role-playing, and philosophical dialogue, creating a dynamic environment for exploring the societal, cultural, legal, and moral implications of technological progress.[37] Set in a world where rapid innovation is reshaping all aspects of human life, the game immerses players in roles that represent key stakeholders – including technology developers, policymakers, corporate executives, researchers, and civil society representatives – each with their own set of interests, priorities, and ethical considerations.

A Tethics game round is a structured and interactive session in which players, representing different societal forces, compete to influence how a specific technology is designed and implemented based on various values. The round begins by choosing a Technology Card. This card represents a real-world technological domain (e.g., AI and the Environmental Crisis, Deep Brain Stimulation, Pandemic Testing, etc.). Each technology comes with a scenario that introduces an existential risk, such as environmental degradation or biomedicalization.

Each Technology has Aspect Cards, typically four per round. Each card presents a specific ethical or societal aspect of the technology.

Each player or team (representing a societal force: State, Market, Research, or Public) is dealt Importance Cards. These cards indicate how much importance they assign to each aspect based on their group's values (e.g., profit for Market, control for State).

Each player secretly selects any number of their influence tokens, if they hold any, to use for the round, and adds this number to the value of their Importance Card. The player with the greatest total wins the aspect. The Aspect Card’s outcome for the winning player is announced, and the other outcomes are read out so that players understand the different potential results. Each player then takes turns explaining why their societal force should dominate the aspect in question, based on the group’s ethical standpoint. This is a key discussion and learning moment in which players must justify their values, connect their priorities to the aspect, and possibly role-play their societal force.
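
A minimal sketch of this resolution rule, in Python, is given below. The class and function names (Bid, resolve_aspect) and the numeric values are ours for illustration only; ties are not modeled beyond taking the first highest bid and would, in practice, be left to the group's discussion.

```python
from dataclasses import dataclass

# The four societal forces represented in the game.
FORCES = ("State", "Market", "Research", "Public")

@dataclass
class Bid:
    """One player's commitment for a single Aspect Card."""
    force: str              # societal force the player represents
    importance_value: int   # value of the Importance Card assigned to this aspect
    influence_tokens: int   # number of influence tokens secretly committed

    @property
    def total(self) -> int:
        # Round rule: Importance Card value plus committed influence tokens.
        return self.importance_value + self.influence_tokens

def resolve_aspect(bids: list[Bid], outcomes: dict[str, str]) -> str:
    """Determine the winning force for an aspect and read out all outcome texts."""
    # max() returns the first highest bid; a tie would be settled by table discussion.
    winner = max(bids, key=lambda bid: bid.total)
    print(f"Winning force: {winner.force} (total {winner.total})")
    print(f"Outcome: {outcomes[winner.force]}")
    for force, text in outcomes.items():
        if force != winner.force:
            print(f"Alternative ({force}): {text}")
    return winner.force

# Hypothetical values for a single aspect of one Technology Card.
bids = [
    Bid("State", importance_value=3, influence_tokens=1),
    Bid("Market", importance_value=4, influence_tokens=0),
    Bid("Research", importance_value=2, influence_tokens=3),
    Bid("Public", importance_value=1, influence_tokens=2),
]
outcomes = {force: f"Outcome text if {force} dominates this aspect." for force in FORCES}
resolve_aspect(bids, outcomes)  # Research wins with a total of 5
```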

A key future possibility lies in the game’s flexible design, which allows it to be shaped by each group that plays it according to their specific needs. Specifically, we recognize the complexity of these issues and intentionally avoid giving specific answers or limiting the dialogue. The game does not oversimplify the topics; rather, it serves as a starting point for bringing out different perspectives, shaped by the roles players take on. While we have included selected texts related to particular technologies, the game is intentionally open to reconfiguration; it is, in essence, an open tool. This openness reduces future limitations and expands the range of possible uses. This flexibility is most evident in the discussion phase, which follows the thought-provoking questions we have included, and the role-playing structure reinforces this orientation; this is where much of the actual gameplay takes place. However, one limitation is that, as a serious game that also needs to be enjoyable, each play session cannot go on for hours. Since the game is also designed for students and younger participants, it cannot fully represent the theoretical framework in its entirety.

Tethics integrates a highly interactive system, with specific rules that place it within the “game” category of the relevant taxonomy,[38] in which the impact of decisions extends beyond the immediate outcomes of a single round. Actions taken in one round can lead to unforeseen challenges or opportunities in subsequent rounds, reinforcing the interconnected nature of ethical decision-making in the context of technological governance. This structure encourages players to think critically about the long-term societal and ethical implications of their decisions, as well as their evolving role within a rapidly changing technological landscape. The game’s design ensures that no two playthroughs are the same, creating a highly replayable experience that remains relevant and engaging across multiple sessions.

A key feature of the game is its card-based mechanic, which adds an element of unpredictability to the gameplay. Players draw cards that introduce variables such as public opinion shifts, corporate strategy changes, technological breakthroughs, or new regulatory developments. These cards compel players to adapt their strategies and reconsider their ethical stances, promoting a deeper engagement with the complex realities of technological ethics. Additionally, players can influence debates by appealing to reason, emotional arguments, or expert opinions, enhancing the intellectual and emotional investment in the game’s ethical discussions.

Tethics’ gameplay revolves around the delicate balance between cooperation and competition. While players may share common goals, such as addressing ethical dilemmas in technology, each player also pursues their own interests by representing the state, the public, the researchers, or the market, which may conflict with the collective objectives. For example, some players may prioritize corporate profits, while others might emphasize public welfare, environmental sustainability, or regulatory oversight. This creates a nuanced interaction between different stakeholder priorities, mirroring the complexities of real-world technological governance. Strategic alliances form and dissolve throughout the game as players navigate these competing interests, fostering negotiation, diplomacy, and social dynamics within the group. Stakeholders are competitive, as each wants their values to shape the technology, but outcomes are not inherently mutually exclusive. The game models dominance for educational clarity, but real-world scenarios and deeper gameplay discussions acknowledge intertwined influences and shared responsibilities. This nuance is part of what makes Tethics an effective ethical training tool rather than just a game.

Role-playing is central to the game’s immersive experience, allowing players to embody their assigned roles and engage in narrative-driven decision-making.[39] Through storytelling and argumentation, players justify their actions, defend their perspectives, and engage in ethical debates. This process not only fosters critical thinking but also promotes empathy, as players are encouraged to consider the viewpoints of others, including those with differing priorities or worldviews.

Designed for a wide range of audiences, the game is suitable for academic settings, corporate environments, and casual gaming sessions. It serves as a powerful educational tool for educators, policymakers, and professionals who seek to explore and reflect on the ethical dimensions of technological development. The game can be used in classrooms to teach responsible innovation, in corporate environments to reflect on ethical leadership, and in research settings to evaluate the societal impacts of technology. The modular structure of the game allows facilitators to customize scenarios based on industry, ethical frameworks, or specific technological concerns, enhancing its adaptability and relevance.

In the educational context, the trainer plays an essential role in maximizing the game’s potential for learning and ethical reflection. Acting as a facilitator, the trainer encourages players to critically examine the risks, benefits, and societal impacts of their decisions. The trainer guides discussions after each round, helping players evaluate the outcomes of their choices, and ensures that the game’s educational objectives are met. The trainer helps steer the narrative, prompting deeper exploration of the ethical and societal dimensions of the scenarios. Although the game can be played without a trainer, their involvement significantly enhances the depth of discussion and the learning experience.

Tethics addresses a wide range of pressing ethical issues related to emerging technologies, through well-structured decision making and non-verbal negotiations, in accordance with Ian Bogost’s notion of “procedural rhetoric.”[40] These include topics in artificial intelligence, climate change, energy, and biomedicine, with a strong emphasis on the intersection of technology and societal values. Players take on roles that represent different societal forces – State, Market, Research, and Public – each with distinct priorities and perspectives. The scenarios are designed to explore issues such as the ethical governance of AI, the role of technology in addressing environmental challenges, and the moral considerations surrounding biomedical innovations like artificial reproduction. Through these dilemmas, the game invites players to critically evaluate the societal impacts of technological advancements and the ethical frameworks that should govern them.

Tethics’ design ensures that it remains relevant and engaging over time, with high replayability driven by the evolving nature of technological and ethical dilemmas. The game’s expandable structure allows for future updates, including new stakeholder roles, advanced cards, and complex decision trees that reflect emerging technological issues. This ongoing evolution ensures that Tethics will continue to challenge players with new moral quandaries and provide a lasting platform for exploring the ethical dimensions of technology in an ever-changing world.

It is a tool for fostering critical thinking, empathy, and dialogue about the ethical implications of technological development. By challenging players to consider the moral dimensions of innovation, the game promotes a deeper understanding of the societal and environmental impacts of technology. It bridges the gap between entertainment and education, offering players an immersive experience that combines strategic gameplay, role-playing, and philosophical reflection. Whether played in an academic setting, a professional environment, or as a casual game, Tethics empowers participants to engage thoughtfully with the ethical challenges of the modern world and consider their role in shaping the future of technology.

5 A Critical Philosophy of Technology as a Metaphilosophy of Gaming

In the previous sections, we introduced the three existential threats, adumbrated their content, and argued for a critical philosophy of technology that can address them at root. Furthermore, we presented how this critical philosophy has been translated into a serious game, and we elaborated on its very content. In this way, we attempted to show, on the one hand, how a critical political philosophy of technology can function as an ethics of technology, and, on the other, how this ethics of technology – simulated in the context of a game – can inform the premises of the very philosophy of technology on which it is, in turn, based. Thus, in light of the above, we can now understand the metaphilosophical significance of a philosophical serious game. We aim to present this point by responding to two interrelated metaphilosophical questions. The first question regards the philosophical genre to which a philosophical serious game belongs, and the second regards the philosophical function the specific philosophical game performs. In the few paragraphs that follow, we address these two questions.

Regarding the first one, a serious game of the kind previously analyzed can be understood as a distinct philosophical genre, a distinct style of philosophical practice. If this is the case, then the question arises as to which type of philosophical practice this game belongs to, and which place it occupies within the spectrum of philosophical genres. According to MacIntyre’s typology, philosophy has appeared in an array of different literary genres throughout its history: dialogue (Plato), prayer (Augustine, Anselm), intellectual debate (Aquinas, Scotus), poetry (Dante, Pope), geometry (Spinoza), history (Hegel), novel (George Eliot, Dostoevsky, Sartre), and “that most eccentric latecomer of all the philosophical genre forms, the article contributed to a professional journal.”[41] Of course, and as MacIntyre would himself admit, this typology is only indicative and necessarily incomplete qua open-ended. For example, the essay (Montaigne), the aphorism (La Rochefoucauld, Lichtenberg), the fragment (Fr. Schlegel, Nietzsche), and the short story (Diderot, Voltaire) come to mind as possible additions. According to this line of thinking, the genre of the literary example – whether confined to a fragment, ensconced in a Platonic dialogue in the form of a myth, elaborated into a short story, or expanded into a novel – could be considered a further specific case.

In light of the above, the philosophical serious game can be seen to occupy a distinct place as the combination of the genre of the example and the genre of the dialogue. Namely, on the one hand, the game’s focus on a specific technological scenario that emerges from the gameplay as a result of both the four roles of key stakeholders promoting different objectives, and the card mechanic, constitutes the equivalent of the genre of the example. On the other hand, the communication between the players/stakeholders in the form of storytelling, argumentation, negotiation, alliance formation, and conflict constitutes the equivalent of the genre of the dialogue.

After the discussion of the first question – that is, the philosophical genre of the serious game – we can now turn to the second one and understand, in light of the above, the game’s philosophical function. Although it would be more than futile to even start mapping the variety of different philosophical approaches, methods, and traditions, we can, following Richard Rorty’s thought, roughly distinguish between two different conceptualizations of philosophy. According to the first, philosophy is considered an epistemologically foundationalist and metaphysically realist, quasi-scientific, autonomous intellectual enterprise. According to the second, philosophy is understood as a therapeutic activity; it can be understood as a contextualist stance aiming at debunking ahistorical presuppositions. In light of this distinction, the philosophical function of our game can be understood as a therapeutic one.[42]

Namely, the philosophical significance of this serious game lies at a metalevel: it lies in one’s very participation in it, and not in the outcome of any individual game match. In other words, it is not to be found in the, say, final victory of the “market” stakeholder/player or the “research” stakeholder/player. Rather, it is to be found in the very game’s role as an interactive example, that is, as a dialogue and a representation of the conflicting values and interests at play in a concrete situation.

More specifically, by the very playing of the game, the players distance themselves from the precritical understanding of technological development in terms of a necessary trajectory, itself grounded in a unique ahistorical value. Rather, the game itself is an immersion into a world of different forces, factors, values, configurations, and contingencies. In this way, the game has a therapeutic function. The scenario discussed throughout each individual game match is the equivalent of a philosophical example. This example does not work as an intuition pump, that is, as a thought experiment designed to elicit a unique moral intuition from the player’s ahistorically conceived rationality in response to a specific moral dilemma.[43] Rather, the game puts forward the idea of an irreducible context, of a world of embedded conflicting forces, and as such, it debunks the metaphysical assumptions in which the idea of a unique solution is grounded. This therapeutic effect is thus the result of an ongoing askesis: the more we play the game, the more we get accustomed to the idea of different values and interests at play, to the idea of a defining context, to the practice of seeing things differently in sharp contrast to the way we used to see them.

According to Levinas,[44] Socrates, in Plato’s Republic, is happy that Thrasymachus, though initially refusing to speak, at least does not leave the discussion. Similarly, playing the game means being in the discussion, and being in the discussion means being exposed (again and again) to the idea that the context has an overriding importance. And this importance seems to apply to every philosophical question and dilemma far beyond technology and the threats it poses. Hence, a critical philosophy of technology is a metaphilosophy of gaming.[45]

6 Conclusion

In this article, we argued for two interrelated points. First, we argued that a critical philosophy of technology can address the three main contemporary existential threats – the rampant digitalization, the unprecedented biomedicalization advances, and the ongoing environmental degradation. More specifically, in the context of this point, we adumbrated the entrenched, instrumentalist, and determinist philosophie spontanée regarding technology, and argued that the traditional moral philosophy that is based on this conception seems inadequate to address these existential threats. We argued that Andrew Feenberg’s critical theory of technology constitutes the best alternative to these traditional approaches. By explicating how it has been implemented and translated into the creation and gameplay of the serious game Tethics, we showed how this critical philosophy of technology provides, at the same time, the best theoretical foundation for an ethics of design; how this ethical approach works in concrete contexts addressing the contemporary existential threats; and how game design can refine philosophical inquiry, and foster deeper understanding of and experiential engagement with such an approach. Furthermore, in this way, we showed that an ethics of design can inform the critical philosophy of technology on which it is based.

Second, we argued for a metaphilosophical point. Namely, we argued that the philosophical serious game of Tethics can be seen to occupy a distinct place in the spectrum of the possible philosophical genres, as the combination of the genre of the example and the genre of the dialogue. Following Richard Rorty’s thought, we argued that the philosophical function of the game can be understood as a therapeutic one. Arguing that the philosophical significance of the game lies in one’s very participation in it, and not in the specific outcome of each game match, we showed that the game puts forward the idea of an irreducible context, of a world of embedded conflicting forces. Thus, by underscoring the primacy of the context, it aims at debunking the metaphysical – that is, ahistorical and decontextualized – underpinnings of an instrumentalist and determinist viewpoint that dominates the current understanding of technology.

Acknowledgments

The authors would like to thank Katerina Vlantoni, Marianthi Gritzioti, Apostolos Spanos, Giannis Koukoulas, Konstantinos Konstantis, Marilena Pateraki, Antonis Faras, Kornilia Papanastasiou-Toli, Elli-Danae Vartziotis, Maria Amiridi-Wiedenmayer, Panos Kazantzas, Tracey Strange, Tassos Frintzos-Vavlis, Ioannis Panousis, Aspa Panomeriti, Konstantinos Lekkas, Cristina Morar, Maria Jalwan, and Iro Dioti for their contributions to the successful completion of the TethicsGame project, which gave the idea for the publication of this article. The authors would like to thank the anonymous reviewers for their excellent comments. They would also like to express their sincere gratitude to Open Philosophy for waiving the article processing charges associated with this publication. This support is deeply appreciated.

  1. Funding information: The research presented in this article was undertaken with support from the EU ERASMUS+ Project ‘TethicsGame: A Serious Game for Innovative Education in Ethics of Technology’ 2023–2025.

  2. Author contributions: Giannis Perperidis, Iason Spilios, Manolis Simos, and Aristotle Tympas contributed to the conceptualization, methodology, writing of the original draft, and review and editing of the manuscript. Aristotle Tympas additionally provided supervision for the project. All authors have read and approved the final version of the manuscript.

  3. Conflict of interest: The authors declare that there are no conflicts of interest relevant to the content of this article.

References

Achterhuis, Hans. American Philosophy of Technology: The Empirical Turn. Bloomington: Indiana University Press, 2001. 10.2979/1092.0.

Akrich, Madeleine. “The De-Scription of Technological Objects.” In Shaping Technology/Building Society, edited by W. E. Bijker and J. Law, 205–24. Cambridge: MIT Press, 1992.

Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Cambridge: Polity, 2019.

Bogost, Ian. Persuasive Games: The Expressive Power of Videogames. Cambridge: MIT Press, 2007. 10.7551/mitpress/5334.001.0001.

Brey, Philip and Brandt Dainow. “Ethics by Design for Artificial Intelligence.” AI Ethics 4 (2024), 1265–77. 10.1007/s43681-023-00330-4.

Caillois, Roger. Man, Play and Games. Champaign: University of Illinois Press, 2001 [1958].

Coeckelbergh, Mark. AI Ethics. Cambridge: MIT Press, 2020. 10.7551/mitpress/12549.001.0001.

Coeckelbergh, Mark. The Political Philosophy of AI. An Introduction. Cambridge: Polity, 2022.

Davis, Mike. Late Victorian Holocausts: El Niño Famines and the Making of the Third World. London: Verso, 2000.

de Certeau, Michel. L’invention du quotidien. Paris: UGE, 1980.

Dennett, Daniel. “The Milk of Human Intentionality.” Behavioral and Brain Sciences 3:3 (1980), 428–30. 10.1017/S0140525X0000580X.

Dignum, Virginia. “Ethics by Design: Necessity or Curse?” In Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society, 60–6, 2018. 10.1145/3278721.3278745.

Feenberg, Andrew. Questioning Technology. London & New York: Routledge, 1999.

Feenberg, Andrew. Transforming Technology. A Critical Theory Revisited. Oxford: Oxford University Press, 2002. 10.1093/oso/9780195146158.001.0001.

Feenberg, Andrew. Between Reason and Experience. Essays in Technology and Modernity. Cambridge, MA: MIT Press, 2010. 10.7551/mitpress/8221.001.0001.

Feenberg, Andrew. Technosystem. The Social Life of Reason. Cambridge: Harvard University Press, 2017. 10.4159/9780674982109.

Feenberg, Andrew. “A Critical Theory of Technology.” In Handbook of Science and Technology Studies, edited by Ulrike Felt, Rayvon Fouché, Clark A. Miller, and Laurel Smith-Doerr, 635–63. Cambridge: MIT Press, 2017.

Feenberg, Andrew. “Ten Paradoxes of Technology.” In Technology, Modernity and Democracy. Essays by Andrew Feenberg, edited by Eduardo Beira and Andrew Feenberg, 37–54. Lanham: Rowman & Littlefield, 2018. 10.5040/9798881816759.ch-002.

Friedman, Batya, Peter Kahn, and Alan Borning. Value Sensitive Design: Theory and Methods. Seattle: University of Washington Press, 2003.

Friedman, Batya and David G. Hendry. Value Sensitive Design: Shaping Technology with Moral Imagination. Cambridge: MIT Press, 2019. 10.7551/mitpress/7585.001.0001.

Garvey, Shunryu Colin. “Unsavory Medicine for Technological Civilization: Introducing ‘Artificial Intelligence & its Discontents.’” Interdisciplinary Science Reviews 46:1–2 (2021), 1–18. 10.1080/03080188.2020.1840820.

Grimes, Sara M. and Andrew Feenberg. “Rationalizing Play: A Critical Theory of Digital Gaming.” The Information Society 25:2 (2009), 105–18. 10.1080/01972240802701643.

Halpern, Orit, Robert Mitchell, and Bernard Dionysius Geoghegan. “The Smartness Mandate: Notes toward a Critique.” Grey Room 68 (2017), 106–29. 10.1162/GREY_a_00221.

Hansson, Sven Ove. “The Ethics of Doing Ethics.” Science and Engineering Ethics 23 (2017), 105–20. 10.1007/s11948-016-9772-3.

Holtug, Nils. “Genethics.” In A Companion to the Philosophy of Technology, edited by Jan Kyrre Berg Olsen Friis, et al. Malden, MA: Blackwell, 2009. 10.1002/9781444310795.ch78.

Huizinga, Johan. Homo Ludens: A Study of the Play-Element in Culture. London: Routledge & Kegan Paul, 1960 [1950].

Juul, Jesper. Half-Real: Video Games between Real Rules and Fictional Worlds. Cambridge: MIT Press, 2005.

Kling, Rob (ed.). Computerization and Controversy. Value Conflicts and Social Choices. Burlington, MA: Morgan Kaufmann, 1996. 10.1016/B978-0-12-415040-9.50085-3.

Latour, Bruno. “Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts.” In Shaping Technology/Building Society, edited by W. E. Bijker and J. Law, 225–58. Cambridge: MIT Press, 1992.

Levinas, Emmanuel. Liberté et commandement. Montpellier: Fata Morgana, 1994.

MacIntyre, Alasdair. “The Relationship of Philosophy to its Past.” In Philosophy in History, edited by R. Rorty, J. B. Schneewind, and Q. Skinner, 31–48. Cambridge: Cambridge University Press, 1984.

McGonigal, Jane. Reality is Broken: Why Games Make Us Better and How They Can Change the World. London: Penguin, 2011.

Mitcham, Carl and Katinka Waelbers. “Technology and Ethics: Overview.” In A Companion to the Philosophy of Technology, edited by J. K. B. O. Friis, et al. Malden, MA: Blackwell, 2009. 10.1002/9781444310795.ch64.

Morin, Edgar and Anne Brigitte Kern. Homeland Earth: A Manifesto for the New Millennium. New York: Hampton Press, 1999.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018. 10.18574/nyu/9781479833641.001.0001.

Perperidis, Giannis. “The Politics of the City: Critical Theory of Technology and Urban Design(s).” Technology in Society 74 (2023), 102263. 10.1016/j.techsoc.2023.102263.

Perperidis, Giannis. “Designing Ethical A.I. Under the Current Socio-Economic Milieu: Philosophical, Political and Economic Challenges of Ethics by Design for AI.” Philosophy & Technology 37 (2024), 84. 10.1007/s13347-024-00766-4.

Perperidis, Giannis. “Openness of Designs and Ethical Values: Outlining a New Ethical Framework for Our Technological Future.” Ariadne 30 (2025), 299–318. 10.26248/ariadne.v30i.1898.

Pinch, Trevor J. and Wiebe E. Bijker. “The Social Construction of Facts and Artifacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other.” In The Social Construction of Technological Systems. New Directions in the Sociology and History of Technology, edited by W. Bijker, T. P. Hughes, and T. Pinch, 11–44. Cambridge: MIT Press, 2012.

Rorty, Richard. Philosophy and the Mirror of Nature. Princeton: Princeton University Press, 1979.

Rorty, Richard. Objectivity, Relativism, and Truth: Philosophical Papers, Volume 1. Cambridge: Cambridge University Press, 1991. 10.1017/CBO9781139173643.

Rorty, Richard. Truth and Progress: Philosophical Papers, Volume 3. Cambridge: Cambridge University Press, 1998. 10.1017/CBO9780511625404.

Rorty, Richard. Philosophy as Cultural Politics: Philosophical Papers, Volume 4. Cambridge: Cambridge University Press, 2007. 10.1017/CBO9780511812835.

Rye, Sara, Micael Sousa, and Carla Sousa. “Designing Effective Learning Games.” In Transformative Learning Through Play. Cham: Palgrave Macmillan, 2025. 10.1007/978-3-031-78523-8_4.

Schuster, Joshua and Derek Woods. Calamity Theory. Three Critiques of Existential Risk. Minneapolis, MN: University of Minnesota Press, 2021. 10.5749/9781452967004.

Sicart, Miguel. Play Matters. Cambridge: MIT Press, 2014. 10.7551/mitpress/10042.001.0001.

Simos, Manolis, Konstantinos Konstantis, Konstantinos Sakalis, and Aristotle Tympas. “‘AI Can Be Analogous to Steam Power’ or From the ‘Post-Industrial Society’ to the ‘Fourth Industrial Revolution’: An Intellectual History of Artificial Intelligence.” ICON: Journal of the International Committee for the History of Technology 27:1 (2022), 97–116.

Simos, Manolis. “Uncanny Traces: Villiers de l’Isle-Adam’s Critique of the Metaphysics of Selfhood.” In Routledge International Handbook of Psychoanalysis, Subjectivity, and Technology, edited by D. Goodman and M. Clemente, 302–22. London: Routledge, 2024. 10.4324/9781003195849-30.

Sloterdijk, Peter. “Wounded by Machines.” In Not Saved. Essays After Heidegger, translated by Ian Alexander Moore and Christopher Turner. Cambridge: Polity, 2017 [2001].

Sloterdijk, Peter. Prometheus’s Remorse. From the Gift of Fire to Global Arson, translated by Hunter Bolin. Cambridge: MIT Press, 2024 [2023].

Squire, Kurt. “Video Games in Education.” International Journal of Intelligent Games and Simulation 2:1 (2003), 49–62.

Suits, Bernard. The Grasshopper: Games, Life and Utopia. Toronto, CA: University of Toronto Press, 1978. 10.3138/9781487574338.

Verbeek, Peter-Paul. Moralizing Technology: Understanding and Designing the Morality of Things. Chicago: University of Chicago Press, 2011. 10.7208/chicago/9780226852904.001.0001.

Wallach, Wendell and Colin Allen. Moral Machines: Teaching Robots Right from Wrong. Oxford: Oxford University Press, 2009. 10.1093/acprof:oso/9780195374049.001.0001.

Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Public Affairs, 2019.

Received: 2025-03-31
Revised: 2025-06-03
Accepted: 2025-06-20
Published Online: 2025-07-17

© 2025 the author(s), published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
