by David Roden
In the philosophy of technology, substantivism is a critical position opposed to the common sense philosophy of technology known as “instrumentalism”. Instrumentalists argue that tools have no agency of their own – only tool users. According to instrumentalism, technology is a mass of instruments whose existence has no special normative implications. Substantivists like Martin Heidegger and Jacques Ellul argue that technology is not a collection of neutral instruments but a way of existing and understanding entities which determines how things and other people are experienced by us. If Heidegger is right, we may control individual devices, but our technological mode of being exerts a decisive grip on us: “man does not have control over unconcealment itself, in which at any given time the real shows itself or withdraws” (Heidegger 1978: 299).
For Ellul, likewise, technology is not a collection of devices or methods which serve human ends, but a nonhuman system that adapts humans to its ends. Ellul does not deny human technical agency, but he claims that the norms according to which agency is assessed are fixed by the system rather than by human agents. Modern technique, for Ellul, is thus “autonomous” because the principles of its action are internal to it (Winner 1977: 16). The content of this prescription can be expressed as the injunction to maximise efficiency: a principle that overrides whatever conceptions of the good are adopted by the human users of technical means.
In Chapter 7 of Posthuman Life, I argue that a condition of technical autonomy – self-augmentation – is in fact incompatible with technical autonomy. “Self-augmentation” refers to the propensity of modern technique to catalyse the development of further techniques. Thus while technical autonomy is a normative concept, self-augmentation is a dynamical one.
I claim that technical self-augmentation presupposes the independence of techniques from culture, use and place (technical abstraction). However, technical abstraction is incompatible with the technical autonomy implied by traditional substantivism, because where techniques are relatively abstract they cannot be functionally individuated. Self-augmentation can only operate where techniques do not determine how they are used. Thus substantivists like Ellul and Heidegger are wrong to treat technology as a system that subjects humans to its strictures. Self-Augmenting Technical Systems (SATS) are not in control because they are not subjects or stand-ins for subjects. However, I argue that there are grounds for claiming that such a system may be beyond our capacity to control. This hypothesis is, admittedly, quite speculative, but there are four prima facie grounds for entertaining it:
If enough of 1-4 hold, then technology is not in control of anything, but it is largely out of our control. Yet there remains something right about the substantivist picture, for technology exerts a powerful influence on individuals, society, and culture, though not an “autonomous” one. However, since technology is self-augmenting and thus abstract, it is counter-final – it has no ends and tends to render human ends contingent by altering the material conditions on which our normative practices depend.
References
Ellul, J. 1964. The Technological Society, J. Wilkinson (trans.). New York: Vintage Books.
Esposito, E. 2013. “The Structures of Uncertainty: Performativity and Unpredictability in Economic Operations”. Economy and Society 42(1): 102–129.
Heidegger, M. 1978. “The Question Concerning Technology”. In Basic Writings, D. Farrell Krell (ed.), 283–317. London: Routledge & Kegan Paul.
Roden, D. 2014. Posthuman Life: Philosophy at the Edge of the Human. London: Routledge.
Winner, L. 1977. Autonomous Technology: Technics-out-of-control as a Theme in Political Thought. Cambridge, MA: MIT Press.