
Thinking about Technology

An Introduction to Philosophy of Technology

Published on Feb 15, 2023

1 Introduction

The importance of technology in the modern world can hardly be overstated. Our everyday life has changed immensely due to the advent of technology. Think about it: technology affects almost everything we do and influences most of our plans for the future. We use smartphones to keep in contact with our friends, we rely on laptops to study and work, we travel by car and train, and we use vacuum cleaners and dishwashers to keep our houses clean. Not to mention all the technology that keeps our bodies healthy: we track our steps, heartbeats or fertility using our phones. All these technologies – small or big – have changed our lives and the way we live together in society. How can we best analyze and understand the impacts of technological innovation on our lives? This is the realm of the philosophy of technology.

Utopia and Dystopia: From the Age of Optimism to Technology Critique

The history of thinking about technology in the Western world is long, rich and complicated. It is possible to cluster this thinking into several broad thematic and historic moments. During the Enlightenment, thinking about technology was mostly optimistic. In the 17th and 18th centuries many scientific discoveries were made, and the dominant European intellectual movement – with key figures such as Francis Bacon, René Descartes, John Locke, Gottfried Wilhelm Leibniz, Voltaire and Immanuel Kant – generally understood technology as a force for positive change. Later, even Karl Marx saw technological innovations such as the steam engine or the spinning mill as necessary steps toward a better socialist and communist society (Franssen et al. 2018). These thinkers shared a sense of technological optimism and began to understand technological progress as social progress. They believed that technological solutions could deal with a wide variety of social problems, which led to a widespread admiration for science and all the technology that came with it.

In the 19th century, during the Industrial Revolution, thinking about technology changed. Technologies like the steam engine, telegraph and radio meant that traditional methods and ways of working had to change, with far-reaching consequences. Agricultural workers, for example, lost their jobs and fled to the cities for new opportunities, ending up in unhealthy, overpopulated slums. Critical thinkers such as Samuel Butler (1872) and Ernst Kapp (1877) pointed out the negative sides of technology and began to resist technological change. And when technologies such as atomic bombs, gas chambers and all sorts of weapons were extensively used in the two World Wars, this technological critique gained strength. Philosophers such as Theodor Adorno and Max Horkheimer (1947), Jacques Ellul (1954), Martin Heidegger (1977), Herbert Marcuse (1964) and Lewis Mumford (1934) began to theorize technology as a dangerous force that threatens our lives and societies, the use of which should be restricted as much as possible.

Contemporary Views within Philosophy of Technology

Contemporary philosophers of technology still draw a lot of inspiration from the work of these earlier thinkers. At the same time, they problematize particular aspects of these earlier works. For example, many contemporary philosophers of technology do not understand technology as a monolithic force. They argue that it is difficult to talk about “technology” in general (also known as “Technology-with-a-capital-T”) as there is a wealth of different technologies with often highly situated effects. Windmills, smartphones and brain implants are all very different technological objects with specific consequences and thus need to be studied separately.

In addition, contemporary philosophers of technology do not believe that technological innovations affect human lives and society in a one-directional manner for either the good (like many of the Enlightenment thinkers) or the bad (like many classical philosophers of technology). Instead, they emphasize that technological innovations on the one hand and human lives and society on the other co-shape each other in ways that can be both positive and negative. Many contemporary philosophers of technology work with the concept of technological mediation (Latour 1994) and ‘mediation theory’ (Verbeek 2005) to analyze the specific influence of particular technologies. The central idea is that technologies help shape the relations between us, as humans, and our world. Technologies “mediate” how we experience and act in our world. We will come back to the concept of technological mediation later in this chapter.

As a result, many philosophers of technology now reflect on different technologies and use empirical research (such as interviews and observations) to study the specific impacts of technologies on our lives and our societies (also known as “the empirical turn”). They started borrowing ideas from, and collaborating with, disciplines beyond philosophy: engineers and computer scientists who develop technological objects, and psychologists and sociologists who have more experience with observing and measuring our behavior and interactions. Moreover, they started drawing from many different philosophical traditions, including critical theory, ethics, logic and epistemology, to name a few.

In this chapter, we introduce you to three (partly overlapping) perspectives within contemporary philosophy of technology: (1) technologies and human experience, (2) technologies and human action, and (3) technologies and our society. Each of these perspectives is used by philosophers of technology to reflect on our lives with technologies. In addition, many contemporary philosophers of technology actively contribute to developing better technologies. At the end of the chapter, we introduce two ways in which philosophers of technology contribute to finding solutions to the problems that arise with particular technologies.

It is impossible to discuss all the subfields of this rich and ever-developing field, but we aim to touch on some of the influential ideas within the field to inspire and invite you to think (and continue to read!) about living with all those interesting technologies around us. Since technologies are omnipresent and influence almost every aspect of our lives, we must think critically about how technologies work, how they influence our lives and our societies, and how we can possibly alter their designs. To clarify and illustrate our story, we use FemTech apps as an example throughout this chapter. We start with a brief introduction to this case.

The case of FemTech Apps

The technologies that contemporary philosophers of technology study do not always have to be grand or radically innovative; also seemingly mundane examples are interesting to study further and help us learn how technologies influence our lives and society and vice versa. We illustrate this point in this chapter with the case of FemTech apps that assist users in tracking their periods, ovulations and pregnancies on their smartphones.

Nowadays, there are hundreds of apps to track your menstrual cycle and reproductive health at large. Often called FemTech – short for female technology – these apps assist users in tracking their menstrual cycle, fertility or pregnancy. Users enter personal data in the app, and learning algorithms generate predictions about their menstrual cycles or fertile windows. With these learning algorithms, FemTech apps can predict the start of a user’s next menstrual cycle and their expected dates of ovulation and fertile windows, as well as whether they might experience bloating or a skin rash during their cycles. In recent years, FemTech apps have become the fourth most popular category of health apps amongst adults (Moglia et al. 2016). A popular fertility app like Clue, for example, has 12 million active monthly users. Already, FemTech has proven to be a major disrupter in the global healthcare and technology markets, and it is projected that the FemTech market will reach a size of 50 billion US dollars by 2025 (Taylor 2021).
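To get a feel for what such a prediction involves at its simplest, consider the following minimal sketch. It is purely illustrative and not the algorithm of Clue or any particular app (commercial apps use far richer, learned models over many users’ data); it merely averages the lengths of a user’s recently logged cycles:

```python
from datetime import date, timedelta

def predict_next_period(start_dates, window=6):
    """Naive prediction: average the lengths of the most recent cycles.

    start_dates: chronologically ordered dates on which past periods
    started. Hypothetical example, not any real app's method.
    """
    if len(start_dates) < 2:
        raise ValueError("need at least two logged cycles")
    recent = start_dates[-(window + 1):]
    # Cycle length = days between consecutive logged start dates.
    lengths = [(b - a).days for a, b in zip(recent, recent[1:])]
    avg_cycle = round(sum(lengths) / len(lengths))
    return start_dates[-1] + timedelta(days=avg_cycle)

logged = [date(2023, 1, 3), date(2023, 1, 31), date(2023, 3, 2)]
print(predict_next_period(logged))
```

Even this toy version makes the philosophical points below concrete: the design decides which data are collected (only start dates here, not moods or other bodily changes), and the prediction is only as good as the intimate data the user hands over.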

2 Three perspectives within contemporary philosophy of technology

In what follows, we introduce three perspectives used by contemporary philosophers of technology to reflect on our lives with technologies. The first perspective discusses technologies and human experience, while the second zooms in on technologies and human action and the third on technologies and societies.

Perspective 1. Technologies and human experience

The first perspective that contemporary philosophers of technology use to analyze technologies such as FemTech apps is to ask how particular technologies influence human experience. Focusing on the experience of particular technologies is a tradition in the field of philosophy of technology known as ‘post-phenomenology’, a term coined by the philosopher Don Ihde (1990) and later elaborated by philosophers such as Peter-Paul Verbeek (2005; 2006; 2011; 2015), Robert Rosenberger (2017), and Galit Wellner (2015).

The use of technologies has implications for the way we get to know and understand ourselves. Or as these philosophers also say, technologies mediate human experience and our interpretation of reality and vice versa (cf. Verbeek 2005). We do not perceive our smartphones as separate objects; we experience the world, others, and ourselves through these technologies. Think of a pair of glasses that many of us use to see the world clearly. Philosophers emphasize that these technologies become an extension of ourselves and our bodies (Ihde 1990). In regular day-to-day life you do not experience the glasses themselves anymore; the technological object becomes invisible when living with it. The more ‘frictionless’ the experience, the less we experience objects as separate from us.

More debates on Posthumanism, Transhumanism and Moral Enhancement (1)

Related debates include the discussion on what it means to be ‘human’ or an ‘agent’ in a technological age, such as the debate on posthumanism (see: Haraway 1991). In the ethical debate on transhumanism, philosophers argue that we can become ‘better’ humans by (morally) technologically enhancing ourselves (Persson & Savulescu 2008; Bostrom and Savulescu 2009), while bio-conservatives argue against forms of technological human enhancement (Sandel 2009; Habermas 2003). Other debates focus on artificial intelligence and issues surrounding the singularity, the point where computer intelligence exceeds human intelligence (Bostrom 2014; Kurzweil 1999; 2005).

This also works the other way around. Philosophers emphasize that technologies also transform what we perceive. According to Don Ihde, technologies amplify specific aspects of our reality while reducing others (1990). Tracking your menstruation using a period tracker, in which time registration is central to the design, leads you to focus more on the timing of your menstruation (when it starts, how long it takes). And at the same time, it backgrounds other less prominently tracked aspects such as possible mood swings or other bodily changes. Post-phenomenologists argue that such a technology thus plays an active role in how you perceive your menstruation.

Many philosophers in this tradition study the multiple ways in which technologies influence our experience. The empirical and pragmatic focus on concrete practices of technologies shows that technologies do not have one ‘essence’. Rather, they can be used in multiple ways and are therefore always connected to the human cultural context. Ihde calls this ‘multistability’. He gives the example of a hammer that can be used to drive nails into wood. This is its most dominant form of use; its dominant ‘stability’. However, the hammer is multistable: it could also be used as a murder weapon or art object. These are alternative ‘stabilities’ (Ihde 1999: 46).

This post-phenomenological perspective has important implications. It emphasizes that technologies help determine how reality is presented to and interpreted by people. Technologies help to shape what counts as “real”. This has important ethical consequences because it implies that technologies can actively contribute to the moral decisions that humans make. Peter-Paul Verbeek gives the famous example of the introduction of the ultrasound. This technology mediates a new experience of pregnancy as the ultrasound enables you to look into your womb and see the fetus. The ultrasound also mediates new choices, actions and responsibilities. After all, the ultrasound enables us to know whether a fetus is growing at a normal rate and monitor its movements, breathing and heart rate. But if something is wrong, we now also face the question whether to terminate a pregnancy.

Perspective 2. Technologies and human action

The second perspective contemporary philosophers of technology use to analyze technologies focuses on the relation between particular technologies and human action and behavior. A famous example that is often used in this context is that of gun ownership in the United States. Think about the well-known slogan for gun ownership: 'Guns don't kill people, people kill people.’ It captures the widely believed idea that technologies are value-neutral instruments and subservient to our beliefs and desires. This position has been criticized by many contemporary philosophers of technology, such as Bruno Latour (1992, 1994) and Madeleine Akrich (1992), who argue that technologies are far from neutral and mediate human actions.

More debates on value-neutrality and (moral) agency (2)

In presenting this perspective, we have to make an important distinction between multiple debates. Some debates critique technology’s supposed neutrality by investigating whether technologies can actually be moral agents. This debate tries to judge whether technologies can act autonomously and should be held accountable for their actions. This often requires rethinking notions such as ‘agency’, ‘freedom’, ‘autonomy’ and ‘responsibility’, since these are properties that are usually attributed to human personhood (Latour 1993; Floridi & Sanders 2004; Verbeek 2011). But whether technologies are neutral can also be answered without referring to moral agency (Winner 1980; Johnson 2006; Illies & Meijers 2009; Peterson & Spahn 2011) and this is the line of reasoning that we build upon in this chapter.

The work of Bruno Latour helps to understand how technologies mediate action (cf. Latour 1992, 1994). Latour pointed out that what humans do is often co-shaped by the technologies they use. He declared that things can “authorize, allow, afford, encourage, permit, suggest, influence, block, render possible, forbid and so on” human action (Latour 2005: 72). This means that our actions result not only from individual intentions and the social structures in which we find ourselves, but also from technologies themselves. Technological objects always determine, at least to a certain extent, how and to what end they are to be used. Of course, guns can be used in different ways to accomplish various goals: they can be used to dig through dirt like shovels, mounted on the wall as pieces of art, or tossed around like frisbees. But while all these options are possible, they are not very likely (cf. Selinger 2012). The shape of the gun and its specific characteristics make it much more likely that it will be used in the way it was intended: to shoot.

Material objects prescribe certain actions to us, and this is what Madeleine Akrich and Bruno Latour call a (technological) ‘script’ (1992). According to them, the influence of technology on human actions can be compared to a movie script in the sense that technologies, too, prescribe to their users how to act when using them. These scripts steer and nudge users to interact with technologies in a particular way, by suggesting specific actions and discouraging others. Latour gives the example of a speed bump, which carries the script “slow down when you approach me”, or otherwise you will damage your car. Or take a plastic coffee cup that, by the way it is designed, embodies the prescription to throw it away after use (Latour 1992, 1994; Akrich 1992). The design of health apps, such as FemTech apps, often requires users to track their physical changes and mental moods by selecting an emoticon to represent their current state, thereby steering people to express themselves in this limited range of emoticons.

Debate on the normative ethics of technologies: nudging and manipulation (3)

Many authors have engaged with the question whether technologies undermine or strengthen our autonomy. In recent years, the question whether technologies interfere with our decision-making in unwanted ways has become prominent (see: Lanzing 2019; Sax 2021; Susser et al. 2019).

Sometimes these scripts are actively designed into the technologies, but unexpected scripts often arise when people start to use technological objects in unexpected ways. There is ample evidence of counter-scripts, hacks and “revenge effects”: people start to use technologies in their own ways, adapting to and resisting technological developments. Latour (1992) gives the example of the seat belt in his car, which starts to beep if he does not buckle up before starting the engine; he goes to his garage to have the sensor removed. Such an “anti-program” runs counter to the original script of the technology.

All these insights emphasize that technologies are far from neutral objects; they are embedded with values and scripts that determine how we use them. This is important because it places a moral obligation on the creators, designers and manufacturers of technologies, and turns the design of technology into an object of philosophical study. Philosophers can help to unravel the features and meaning of a design with a critical eye. In addition, they study what kinds of actions are (implicitly) scripted by technologies and what possible counter-scripts can arise.

Perspective 3. Technologies and our societies

The third perspective focuses on the relationship between technologies and our societies. This perspective is even more political and critical, and asks questions such as: How do technologies influence public values that we consider important in our society, such as transparency, privacy and solidarity? What are the societal forces behind a technological object? Who controls technologies? Who is included and who is excluded as a result of technologies?

In 1980, philosopher Langdon Winner wrote that we need to look closely at the properties of technologies, as “the issues that divide or unite people in society are settled not only in the institutions and practices of politics proper but also in tangible arrangements of steel and concrete, wires and transistors, nuts and bolts” (Winner 1980: 128). Technologies embody social relationships. In fact, technologies can even have political properties because they are infused with norms and values about what is considered right and wrong by, for example, the designers that create technologies and the investors that decide to pay for the development of particular technologies. Sometimes these inscriptions are very clear, such as the value of sustainability inscribed into energy-saving light bulbs. But at other times, these inscriptions are less obvious.

This becomes clear from Winner’s example of the Long Island overpasses: a set of low bridges, built in the middle of the twentieth century under the direction of New York’s master builder Robert Moses, spanning the parkways that connect New York City to the Long Island beaches. Moses has been accused of deliberately designing the bridges very low. The effect of these low bridges was that only rich, predominantly white people who could afford a car could access the beaches of Long Island, while the poor, predominantly black population that relied on public transport could not, as public buses could not pass under the low bridges.

Winner explains that racism in this example is embodied by the height of the bridges. Whether Moses’ motives were explicitly racist is controversial. But for the point that Winner wants to make, this does not matter much. Even if Moses’ intentions were not explicitly racist, the design of the technology can still be culpable when it discriminates and excludes. Whatever the motive, the result is similar: technological objects that exacerbate structural discrimination and inequalities.

Take again the example of FemTech apps. Most FemTech apps approach their users as ‘girls’, enforcing stereotypical gender norms by being flowery, purple and pink. They indicate the days when you may be fertile (without the option to turn that feature off) and are usually heteronormative in their depictions of (sexual) relationships. This means that FemTech apps often do not consider people who are, for example, in a non-heteronormative relationship, who cannot or do not want to have children, or who identify as neither girl nor woman – but who do menstruate. Design choices – such as the stereotypical heteronormative depiction of sex with emoticons of an eggplant and a peach, or an overly flowery and pink layout – enforce certain feminine and heteronormative stereotypes. In turn, such design choices result in an app that signals to anyone who does not identify as heterosexual or stereotypically feminine that they are not intended as users of these apps (Hendl & Jansky 2021; Jacobs & Evers 2019).

Debates on the reproduction of existing power relations (4)

Technologies can, consciously or unconsciously, embody systematic social inequalities and political agendas. Technologies are political in the sense that they not only establish certain (dominant) power relations in a certain context, but can also support, strengthen, and even reproduce them – think about the earlier examples of the energy-saving light bulbs and the Long Island overpasses. Technology scholars have argued this from various theoretical perspectives, including feminism (see: D’Ignazio & Klein 2019; Wajcman 2004), critical race studies (see: Benjamin 2020; Noble 2012), decolonial studies (see: Arora 2018), capitalist critique (see: Morozov 2019; Zuboff 2015; 2019) and critical theory (see: Feenberg 2003; Fuchs 2012).

Contemporary philosophers of technology also emphasize that technologies shape our social norms more broadly – that is, the informal, unwritten rules that define what is acceptable and appropriate within a given group or community. For example, although keeping track of your menstrual cycle is certainly not a new practice – evidence has been found of women in antiquity tracking their cycles in stone – the introduction of digital FemTech apps has contributed widely to normalizing the practice of tracking one’s cycle (see also: Foucault 1977). This has contributed to more openness and transparency about female reproductive health amongst many (young) women and has encouraged many to become (more) informed and knowledgeable about their reproductive health.

This might be considered a beneficial change, but FemTech apps also normalize the practice of sharing intimate data with commercial enterprises and third parties. As pointed out, FemTech apps stimulate users to enter personal data of the most intimate sort into an app. Reports by Privacy International show that most FemTech apps sell and share this intimate data with third parties and target their users with personalized advertisements (Privacy International 2020). By appearances, these FemTech apps are “approximating, and perhaps impersonating, healthcare”. Yet, “these apps are not qualified healthcare providers, their only reliable function is to convert individuals’ health needs and bodily data into profit” (Gross et al. 2021). Users’ data are shared and sold to third parties for profit, while the people that generate these data do not share in the profits. This leads to questions such as: What do we want our society to look like, and how can we better govern these technologies and the companies behind them?

Debates on the growing power of technology companies (5)

Nowadays, many technologies are developed and distributed by private actors. At the same time, these technologies are indispensable for many societal domains to function. What would the healthcare field be without its CT scans and other technologies? Can we still teach students without digital educational environments that need to be accessed through a laptop? The result is that technology companies gain increasingly powerful positions within fields such as health, medicine and education. Philosophers critically study and question the powerful role of private actors in such societal domains, especially because a handful of “big tech” companies (such as Amazon, Alphabet, Apple, Microsoft and Meta) occupy particularly important positions in many of them. These companies become so dominant that they can enforce particular policy measures and demand societal changes, while they cannot be held accountable in the ways public organizations can (cf. Lopez Solano et al. 2022; Sharon 2016; 2020; 2022; Taylor 2021).

3 How philosophers contribute to building better technologies

Contemporary philosophers of technology draw from these three perspectives to reflect on our lives and societies with technologies. But many philosophers feel that mere reflection is not enough and develop methodologies to actively contribute to better lives with technologies. These philosophers argue that it is necessary to get their hands dirty and help build better technologies! From the variety of approaches and methodologies that philosophers have developed, we discuss two.

In the first approach, philosophers focus explicitly on the practice of designing technologies with particular societal values in mind. These philosophers realize that thousands of design decisions underpin current technologies and that those decisions matter for the societal impacts of these technologies. For example, whether one chooses to design a pink and flowery layout for a FemTech app affects which people will feel included or excluded by the app’s look, touching upon the value of inclusivity. And whether a smartphone is designed with sustainable materials and, for example, an easily replaceable battery affects the environmental sustainability of the technology. The many decisions that designers make during a design process affect not only the technology’s functionality, usability and aesthetics, but also shape the interactions and constraints a technological object offers its users (Van den Hoven et al. 2015). In relation to the examples above: who feels included and spoken to by the design of the FemTech app? And how do we interact with our smartphone: do we discard the entire phone when the battery breaks down, or can we easily replace certain parts and make long-term, sustainable use of it?

The realization that designers actively influence which values are supported or undermined by their technology designs has led to the development of multiple ‘ethics by design’ approaches. The aim is to provide designers with practical ethical advice on how to consciously and deliberately take (moral and societal) values into account during their design processes. Famous examples include Value Sensitive Design (Friedman et al. 2013; Friedman and Hendry 2019) and Privacy by Design (Hoepman 2018). Despite the diversity of ethics by design approaches, they have three characteristics in common. First, they all share the idea that values can be expressed and embedded in a technological object. Second, they share the claim that it is morally significant to consciously and explicitly think about the values that are embedded in our technological inventions. And third, they hold that such moral deliberation and value considerations should be articulated at the very start of a new design process, when the design and development of the technology can still make a difference (Van den Hoven et al. 2015).

When we look once more at the example of FemTech apps, we can see that ethics by design approaches can assist designers in mitigating some of the ethical concerns currently arising with the design of these apps. These approaches create awareness among designers with regard to the values at stake in their design and possible conflicts between those values. Moreover, they help to guide designers from abstract values to concrete design decisions that support the values central to their design. As we explained earlier, FemTech apps have been harshly criticized for reproducing and reinforcing troubling binary sex-gender norms and sexist stereotypes (Hendl and Jansky 2021), as well as for violating the privacy of users. However, by making designers attentive to moral values such as equality, bodily integrity, fairness, and privacy, designers could redesign these apps to be more inclusive and privacy-friendly, for example by using different color palettes and by ending the use of eggplant and peach emoticons when referring to sexual intercourse. Of course, we should keep in mind that we do not solve moral issues such as injustice by simply tweaking the features of a technological object. Technology is embedded in society. Changing technology for the better also requires fundamental changes in the business models behind technologies and an inclusive environment of designers and users that shape these technologies.

The second approach to building better technologies that we want to highlight is that of philosophers who study technologies ‘in the wild’. With their findings, they try to steer the use of technologies in directions beneficial to society. These philosophers focus on the actual use of technological objects, because technological objects – however well designed they are – often influence our perceptions, actions, and society in ways that are hard to predict upfront. In important ways, this second approach complements the ethics by design approaches described above.

These philosophers use empirical research methods to learn how people live with technologies. For example, they observe technologists and medical professionals as they develop artificial intelligence systems that support medical doctors in arriving at diagnoses (e.g., Stevens et al. 2020). The insights from such observations can be questioned and compared across different settings (cf. Hämäläinen 2016; Mol et al. 2010; Pols 2016) and used to anticipate, identify and address ethical issues that arise. Many philosophers of technology consider it crucial that artificial intelligence systems are transparent and can be properly explained and communicated, and philosophers can study how transparency is achieved in different initiatives. Consequently, these philosophers contribute to better technologies by presenting their results to the people that work with technologies and reflecting with them on the ‘good’ use of technologies. They can also use their findings to advise, for example, ethical committees in healthcare organizations that develop rules for the use of a technology. In a similar way, these philosophers also stimulate (public) discussions about the use of technologies and use these public insights to inform (supra)national policymakers about adequate regulation.

Debates on Risk, Societal Impact and Responsibility (6)

For debates associated with the risks and societal impact of new technologies, see: Hansson 2003. For an example of a current debate on the (distribution of) responsibility, see: Nyholm 2018. For a more radical approach of studying technologies ‘in the wild’, see: Van de Poel on ‘societal experimentation’ (Van de Poel 2011; 2013).

4 Conclusion

Over the years, contemporary philosophers of technology have developed into socially engaged, interdisciplinary thinkers who collaborate with different disciplines within and beyond philosophy. In this chapter, we have introduced three perspectives within contemporary philosophy of technology: (1) technologies and human experience, (2) technologies and human action and (3) technologies and societies. Each of these perspectives is used by philosophers to reflect on our lives with technologies. In addition, we introduced two ways in which philosophers of technology also try to improve our lives with technologies. This chapter is by no means a complete overview of the rich and developing field of philosophy of technology, so we invite you to think (and continue to read) about the many ways in which we are, and should be, living with all those technologies around us.


We are very grateful to Bart Engelen, Esther Keymolen and Tamar Sharon for their editorial comments and very useful suggestions to further improve this chapter.


References

Adorno, T. & Horkheimer, M. (1947 [2002]). Dialectic of Enlightenment. Stanford: Stanford University Press.

Akrich, M. (1992). Beyond Social Construction of Technology: The Shaping of People and Things in the Innovation Process. In: Dierkes, M. & U. Hoffmann (eds.), New Technology at the Outset: Social Forces in the Shaping of Technological Innovations. Westview Press.

Arora, P. (2018). Decolonizing Privacy Studies. Television & New Media. Vol. 20. No.4. pp. 366-378.

Benjamin, R. (2020). Race After Technology. Cambridge: Polity Press.

Butler, S. (1872). Erewhon. London: Trubner and Co.

Bosschaert, M.T. & Blok, V. (2022). The ‘Empirical’ in the Empirical Turn: A Critical Analysis. Foundations of Science.

Bostrom, N. (2005). The Fable of the Dragon Tyrant. Journal of Medical Ethics. Vol. 31. No. 5. pp. 273-277.

Bostrom, N. (2014). Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press.

Bostrom, N. & Savulescu, J. (2008). Human Enhancement. Oxford: Oxford University Press.

D’Ignazio, C. & Klein, L. (2019). Data Feminism. Cambridge: MIT Press.

Ellul, J. (2003). The ‘Autonomy’ of the Technological Phenomenon. In: Scharff, R.C. & Dusek, V. (eds.). Philosophy of Technology: The Technological Condition. Malden: Blackwell Publishing.

Ellul, J. (1954). La Technique ou L'Enjeu du Siècle. Paris: Armand Colin.

Feenberg, A. (2003). Critical Theory of Technology: An Overview. Tailoring Biotechnologies. Vol.1. pp.47-64.

Feenberg, A. & Callon, M. (2010). Between Reason and Experience: Essays in Technology and Modernity. Cambridge: The MIT Press.

Floridi, L. & Sanders, J.W. (2004). On the Morality of Artificial Agents. Minds and Machines. Vol. 14. pp.349-379.

Foucault, M. (1995 [1977]). Discipline and Punish: The Birth of the Prison. New York: Vintage.

Franssen, M., Lokhorst, G., van de Poel., I. (2018). Philosophy of Technology. The Stanford Encyclopedia of Philosophy.

Friedman, B., Kahn, P. & Borning, A. (2006). Value Sensitive Design and Information Systems. In: P. Zhang & D. Galletta (eds.), Human-Computer Interaction in Management Information Systems: Foundations. New York: M.E. Sharpe, Inc.

Friedman, B., Kahn, P. H. Jr., Borning, A., & Huldtgren, A. (2013). Value Sensitive Design and Information Systems. In: N. Doorn, D. Schuurbiers, I. van de Poel, M. E. Gorman (eds.), Early Engagement and New Technologies: Opening Up the Laboratory. Dordrecht: Springer.

Friedman, B. & Hendry, D.G. (2019). Value Sensitive Design: Shaping Technology with Moral Imagination. Cambridge: The MIT Press.

Fuchs, C. (2012). Political Economy and Surveillance Theory. Critical Sociology. Vol. 39. No. 5. pp.671-687.

Gross, M.S., Hood, A. & Corbin, B. (2021). Pay No Attention to That Man Behind the Curtain: An Ethical Analysis of the Monetization of Menstruation App Data. International Journal of Feminist Approaches to Bioethics. Vol 14. No. 2. pp.144-156.

Habermas, J. (2003). The Future of Human Nature. Cambridge: Polity Press.

Hämäläinen, N. (2016). Descriptive Ethics: What Does Moral Philosophy Know About Morality? Dordrecht: Springer.

Hansson, S. (2003). Ethical Criteria of Risk Acceptance. Erkenntnis. Vol. 59. No. 3. pp. 291–309.

Haraway, D. (1991). A Cyborg Manifesto: Science, Technology, and Socialist Feminism in the Late Twentieth Century. In: Haraway, D. (ed.). Simians, Cyborgs and Women: The Reinvention of Nature. New York: Routledge.

Heidegger, M. (1977). The Question Concerning Technology and Other Essays. New York: Garland Publishing.

Hendl, T. & Jansky, B. (2021). Tales of Self-Empowerment Through Digital Health Technologies: A Closer Look at ‘Femtech’. Review of Social Economy. Vol. 80. No. 1. pp. 29-57.

Hosseinpour, F., Vahdani Amoli, P., Plosila, J., Hämäläinen, T., & Tenhunen, H. (2016). An Intrusion Detection System for Fog Computing and IoT-Based Logistic Systems Using a Smart Data Approach. International Journal of Digital Content Technology and its Applications. Vol. 10. No. 5. pp. 34-46.

Hough, A., Bryce, M. & Forrest, S. (2018). Social Media and Advertising Natural Contraception to Young Women: The Case for Clarity and Transparency With Reference to the Example of ‘Natural Cycles.’ BMJ Sexual & Reproductive Health. Vol. 44. pp. 307-309.

Van den Hoven, J., Vermaas, P., & van de Poel, I. (eds.) (2015). Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains. Dordrecht: Springer.

Ihde, D. (1990). Technology and the Lifeworld: From Garden to Earth. Bloomington: Indiana University Press.

Ihde, D. (1999). Technology and Prognostic Predicaments. AI and Society. Vol. 13. pp. 44-51.

Ihde, D. (2009). Postphenomenology and Technoscience: The Peking University Lectures. State University of New York Press.

Illies, C., & Meijers, A. (2009). Artefacts Without Agency. The Monist. Vol. 92. No. 3. pp. 420–440.

Jacobs, N., & Evers, J. (2019). De Kwetsbaarheid van Femtech. Podium voor Bio-ethiek. Vol. 26. No. 4. pp. 13-15.

Jasanoff, S. (2019). Can Science Make Sense of Life? Cambridge: Polity Press.

Johnson, D.G. (2006). Computer Systems: Moral Entities But Not Moral Agents. Ethics and Information Technology. Vol. 8. pp.195-204.

Kapp, E. (1877). Grundlinien Einer Philosophie Der Technik: Zur Entstehungsgeschichte Der Cultur Aus Neuen Gesichtspunkten. Braunschweig: Westermann.

Kurzweil, R. (1999). The Age of Spiritual Machines: When Computers Exceed Human Intelligence. London: Penguin.

Kurzweil, R. (2005). The Singularity is Near: When Humans Transcend Biology. New York: Penguin.

Lanzing, M. (2019). “Strongly Recommended” Revisiting Decisional Privacy to Judge Hypernudging in Self-Tracking Technologies. Philosophy and Technology. Vol. 32. pp.549–568.

Latour, B. (1992). Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts. In: Bijker, W.E. & J. Law (eds.), Shaping Technology/Building Society: Studies in Sociotechnical Change, pp. 225-259. Cambridge: MIT Press.

Latour, B. (1993). We Have Never Been Modern. Cambridge: Harvard University Press.

Latour, B. (1994). On Technical Mediation. Common Knowledge. Vol. 3. No. 2. pp. 29-64.

Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford: Oxford University Press.

Lopez Solano, L., Martin, M., Ohai, F., de Souza, S., & Taylor, L. (2022). Digital Disruption or Crisis Capitalism? Technology, Power and the Pandemic.

Marx, K. (1973 [1858]). Grundrisse: Foundations of the Critique of Political Economy. Harmondsworth: Penguin Books.

Marx, K. (1964). Capital: A Critique of Political Economy. Vol 1. Harmondsworth: Penguin Books.

Marcuse, H. (1964). One-Dimensional Man: Studies in the Ideology of Advanced Industrial Society. Boston: Beacon Press.

Moglia, M. L., Nguyen, H. V., Chyjek, K., Chen, K. T., & Castaño, P. M. (2016). Evaluation of Smartphone Menstrual Cycle Tracking Applications Using an Adapted Applications Scoring System. Obstetrics and Gynecology. Vol. 127. No. 6. pp.1153–1160.

Mol, A., Moser, I., & Pols, J. (2010). Care in Practice. Berlin: Walter de Gruyter.

Morozov, E. (2019). Digital Socialism? New Left Review. Vol.116.

Mumford, L. (1934). Technics & Civilization. New York: Harcourt, Brace & World.

Noble, S. (2012). Missed Connections: What Search Engines Say About Women. At:

Nyholm, S. (2018). Attributing Agency to Automated Systems: Reflections on Human–Robot Collaborations and Responsibility-Loci. Science & Engineering Ethics. Vol. 24. pp. 1201-1219.

Persson, I. & Savulescu, J. (2008). The Perils of Cognitive Enhancement and the Urgent Imperative to Enhance the Moral Character of Humanity. Journal of Applied Philosophy. Vol. 25. No.3. pp. 162-177.

Peterson, M., & Spahn, A. (2011). Can Technological Artifacts Be Moral Agents? Science and engineering ethics. Vol. 17. No. 3. pp.411–424.

Pitt, J.C. (2000). Thinking About Technology: Foundations of the Philosophy of Technology. Seven Bridges Press.

Pols, J. (2016). Towards an Empirical Ethics in Care: Relations With Technologies in Health Care. Medicine, Health Care and Philosophy. Vol. 18. No.1. pp.81-90.

Privacy International (2020). No Body’s Business but Mine: How Menstruation Apps Are Sharing Your Data. At:

Rosenberger, R. & Verbeek, P.P. (2015) (ed.) Postphenomenological Investigations. Essays on Human-Technology Relations. Lexington Books.

Sandel, M. (2009). The Case Against Perfection. Cambridge: Harvard University Press.

Sax, M. (2021). Optimization of What? For-Profit Health Apps as Manipulative Digital Environments. Ethics and Information Technology. Vol. 23. pp.345–361.

Selinger, E. (2012). The Philosophy of the Technology of the Gun. The Atlantic.

Sharon, T. (2016). The Googlization of Health Research: From Disruptive Innovation to Disruptive Ethics. Personalized Medicine. Vol. 13. No. 6. pp.563-574.

Sharon, T. (2020). Blind-Sided by Privacy? Digital Contact Tracing, the Apple/Google API and Big Tech’s Newfound Role as Global Health Policy Makers. Ethics and Information Technology Vol. 23. No. 1. pp.45-57.

Sharon, T. (2021). From Hostile Worlds to Separate Spheres: Towards a Normative Pragmatics of Justice for the Googlization of Health. Medicine, Healthcare & Philosophy. Vol. 24. pp.315-327.

Susser, D., Roessler, B., & Nissenbaum, H. (2019). Online Manipulation: Hidden Influences in a Digital World. Georgetown Law Technology Review. Vol. 4. No. 1. pp.1-45.

Stevens, M., Wehrens, R., & de Bont, A. (2020). Epistemic Virtues and Data-Driven Dreams: On Sameness and Difference in the Epistemic Cultures of Data Science and Psychiatry. Social Science & Medicine. Vol. 258. pp.1-8.

Taylor, L. (2021). Public Actors with Public Values: Legitimacy, Domination and the Regulation of the Technology Sector. Philosophy & Technology. Vol. 34. No. 1. pp.872-933.

Van de Poel, I. (2011). Nuclear Energy as a Social Experiment. Ethics, Policy and Environment. Vol. 14. No. 3. pp. 285-290.

Van de Poel, I. (2013). Why New Technologies Should Be Conceived as Social Experiments. Ethics, Policy & Environment. Vol. 16. No. 3. pp. 352-355.

Verbeek, P.P. (2005). What Things Do: Philosophical Reflections on Technology, Agency, and Design. Penn State University Press.

Verbeek, P.P. (2006). Materializing Morality: Design Ethics and Technological Mediation. Science, Technology and Human Values. Vol. 31. No. 3. pp.361-380.

Verbeek, P.P. (2011). Moralizing technology: Understanding and Designing the Morality of Things. Chicago: University of Chicago Press.

Wajcman, J. (2004). Technofeminism. Cambridge: Polity Press.

Wellner, G. (2015). A Postphenomenological Inquiry of Cell Phones: Genealogies, Meanings, and Becomings. Lexington Books.

Winner, L. (1980). Do Artifacts Have Politics? Daedalus. Vol. 109. No. 1. pp.121–136.

Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology. Vol. 30. pp.75–89.

Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight For a Human Future at the New Frontier of Power. New York: Public Affairs.

Recommended further reading

Akrich, M. (1992). Beyond Social Construction of Technology: The Shaping of People and Things in the Innovation Process. In: Dierkes, M. & U. Hoffmann (eds.), New Technology at the Outset: Social Forces in the Shaping of Technological Innovations. Boulder: Westview Press.

Brey, P. (2010). Philosophy of Technology After the Empirical Turn. Techné: Research in Philosophy and Technology. Vol. 14. No. 1. pp.36-48

Friedman, B. & Hendry, D.G. (2019). Value Sensitive Design: Shaping Technology With Moral Imagination. Cambridge: The MIT Press.

Van den Hoven, J., Vermaas, P., & van de Poel, I. (eds.) (2015). Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains. Dordrecht: Springer.

Ihde, D. (1990). Technology and the Lifeworld: From Garden to Earth. Bloomington: Indiana University Press.

Latour, B. (1994). On Technical Mediation. Common Knowledge. Vol. 3. No. 2. pp.29-64.

Sismondo, S. (2010). An Introduction to Science and Technology Studies. Chichester: John Wiley & Sons.

Swierstra, T., Lemmens, P., Sharon, T. & Vermaas, P. (eds.) (2022). The Technical Condition: The Entanglement of Technology, Culture and Society. Amsterdam: Boom Uitgevers.

Vallor, S. (ed.) (2020). The Oxford Handbook of Philosophy of Technology. Oxford: Oxford University Press.

Verbeek, P.P. (2005). What Things Do: Philosophical Reflections on Technology, Agency, and Design. Penn State University Press.
