continues from Part 1
by Obsolete Capitalism
:: Dromology Archive 6 :: The socialist time machine: Red London and Marxist topology
London, summer 2015. We are looking at ‘The London Bookshop Map – 104 independent bookshops’ in order to find a bookshop specialized in communist and accelerationist publications. Finally, we choose Bookmarks, 1 Bloomsbury Street, which defines itself as a socialist bookshop. It is a renowned shop that constitutes an important junction on the axis of London’s radical and antagonistic thought, being also an honoured and militant publishing house of ‘Red London’. It is located two steps away, northbound, from University College London, where Nick Srnicek is conducting his PhD, in that same Bloomsbury which halfway through the nineteenth century welcomed the epistemic wanderings of Karl Marx, near the British Museum, and in the first half of the twentieth century the ‘Bohème’ of the Bloomsbury Group of Keynes, Woolf and Forster, while nowadays it finds itself dangerously close to that stronghold of turbo-capitalism and class enemy that is the London School of Economics. Books by and on Marx dominate the scene, together with texts that recall workers’ struggles, particularly the coal miners’ strike of the eighties, trade unionism, and the Russian Revolution. In addition, texts on Trotsky, Luxemburg, the Cuban Revolution, Chávez, anti-Fascism, anti-Nazism, anti-Racism. Nothing that could not be found in any English or European militant socialist and communist bookshop, and that in Italy could be found, with some small variation across the peninsula, on the stalls of the newspaper il manifesto or of the political party Rifondazione Comunista. So our quest for cyber-Marxist texts and works of the abstract Communist school of thought ends poorly: the friendly staff manage to retrieve only one volume, Nick Dyer-Witheford’s Cyber-Proletariat, which says a great deal about the range and impact of Marxist accelerationism on the massive and granitic body of classical Marxism and orthodox Communism.
In fact, we find ourselves out of place, as if we had been relocated to a different politico-philosophical time and space. It seems as if we were inside a museum of socialist history; we are aliens to them, or at best winners of the ‘customer of the day’ award for oddness.
A Thousand Marxes: Choruses, non-linear Marxism and speculative Marxism
Which Marx are we talking about? The political and intellectual universe that relates itself to Marx is kaleidoscopic, and rather different positions coexist within the so-called Marx Renaissance. The Marx recalled here is the writer of the Grundrisse and Capital, that is, Marx in his speculative maturity: the text most quoted by the various essayists is the famous ‘Fragment on Machines’, which recurs as an obligatory reference in almost every essay of the book (with the exception of the one by Mercedes Bunz), like a refrain running through the text, essay after essay. It is a sci-fi, quasi-oracular Marx who is evoked to seal the descent of accelerationist thought. ‘It is Marx, along with Land, who remains the paradigmatic accelerationist thinker’ (Williams and Srnicek, Manifesto, 2013). Avanessian and Mackay instead open their own ‘#Accelerate’ with a chronicle of accelerationist thought, starting from the founding Marxian year, 1858, and the passage on the alien power of the machine – the machine as an alien power – drawn from that same ‘Fragment on Machines’ of the Grundrisse. In the anthology edited by Pasquinelli, one of the leading essayists is Nick Dyer-Witheford, author of the essential ‘Cyber-Marx’ (1999). Hence we are dealing with a non-linear Marx, suspended between steam and cyberpunk, read through the Deleuzian lens of machinic surplus, perhaps nearer to William Gibson than to Friedrich Engels. If Pasquinelli rightly evokes a speculative neo-Marxism, and if we welcome it as the withdrawal of Marxian thought from the pretension of orthodoxies to uphold it as true philosophy and ‘scientific thought capable of affirming itself as different from ideologies’, then the enemies of Marxist accelerationism within the communist movement will feel free to brand these positions as a degradation of Marxism itself.
An example is the notorious position of the mature Lukács, who inveighed against ‘the degradation of Marxism to speculative thought’ (Perlini, 1968), a contradiction for orthodox Marxists. Grabbing on to splinters of Marxism blown up and thrown into the air by the collapse of Marxism as a state ideology, the authors of ‘The algorithms of capital’ advance instead a new knowledge. Pasquinelli affirms in his theses that ‘there is no original class to be nostalgic for’, where by ‘class’ he means the working class, which continues to undergo a post-human development. These are indeed fragments of Communism which can be positioned between Marx, Dick and the Nexus-6: does the working class dream of electric sheep?
:: Dromology Archive 7 :: The birth of the Robo Sapiens: automation and the crisis of Taylorism
In 1961 General Motors installed the first industrial robot, Unimate, on the assembly line of its plant in Ewing Township, New Jersey, USA. Patented by George Devol in 1954 and developed with Joe Engelberger, the machine was perfected from that year until its deployment in the factory at the dawn of the sixties. That is how the third industrial revolution began, a development which in a few decades would completely reshape the system of production in the manufacturing sector. Nowadays the first ‘robo sapiens’ installed on an assembly line would strike our eyes as particularly clumsy in its aesthetic dimension: it had a single mechanical arm capable of moving up to two tons of scorching steel and welding it onto the car body, and it rested on a metal box, directed by a computer enclosed in a squared box too, from which it silently guided the mechanical arm. Just as silently, industrial robots have substituted human labour. Even in our collective imaginary, the alienated and enslaved blue-collar workers so empathically described by Fritz Lang in Metropolis (1927) have now been replaced by the cyber-metallized soft shapes of the Sapphic robots of Chris Cunningham (1999). From an environment where the human worker stood beside the machine, caught in the brutal nineteenth-century concatenation of the industrial man-machine process, we move to one, in the second half of the twentieth century, where we find only machines beside other machines, tangled in algorithmic processes of automation between robots. In 1982 Japan was already producing 24,000 industrial robots per year. China in 2014 produced 40,000 of them: they may not seem many, but that is twice as many as it produced in 2011 and nearly three times as many as in 2010. In 2014, worldwide, 228,000 robots were sold.
This year the Robo Sapiens deployed on production lines by world industry number over 1.5 million: they are the product of a fine synthesis of mechanics, electronics, mathematics and software. However, a second robotic revolution has already been triggered: the future of automation lies in ‘service robots’, deployed no longer in industrial production but in the service sector and in residential dromo-tronics. That is the coming challenge. Rustling algorithms, automated solutions and mechanical labour: what would Marx say of the ‘Fragment on Machines’ now? How is it possible to challenge the ‘vast inhuman power’ engineered by capital?
Marxist axiomatics and the empirical tumble of the ‘tendency of the rate of profit to fall’
The bunch of Deleuzian-Guattarian questions which the accelerationists want to answer are questions from the future, that is, questions that come from the future – not least the specific and very controversial question of which process is to be accelerated among those designated by the two Parisian philosophers. The answers to such questions could well be other questions, this time more Derridean, such as ‘where will we go tomorrow? where, for example, is Marxism going? where are we going with it?’ (Spectres of Marx, 1994). In fact, if we exclude the ‘politics of remembrance, heritage and generations’, the future of accelerationist politics, even in the version influenced by post-workerism, could remain uncertain if it persisted in grounding itself in Marxist axiomatics, which show the innate limit of the context in which they were conceived, and which limit in particular its dual prospect. Since such a reconstruction, especially of the tendency of the rate of profit to fall, exceeds the boundaries of our work, we will linger only briefly on the crucial point contained in thesis number six, in which Pasquinelli rightly regards the Marxian equation of the tendency of the rate of profit to fall as the identification, within Marxist thought, of the mechanism principally responsible for the crises of industrial capitalism. A clear mutual understanding from the beginning is crucial if history is actually to be handed over to History, and if the living future is to be carried into the future as a valuable treasure. The inefficacy and contestation of the dogma of the tendency of the rate of profit to fall, including in this inefficacy the Marxian formula conceived as the original trigger of the repeated crises and temporal collapses of capitalism, were already known at the time of the Anti-Oedipus (1972).
In our opinion, it is too ‘mechanical’ a stretch to claim that the interpretation of, and the reference to, the acceleration of the capitalist process in Deleuze and Guattari are grounded in a full restoration of the notorious formula of the tendency of the rate of profit to fall. Such a formula, or at least its epistemic foundation, is contested even by contemporary economists such as Thomas Piketty and by Marxist intellectuals like David Harvey and Michael Heinrich, to name just a few. Marxist economic analyses which adopted the same economic data, but aggregated them differently, have reached diametrically opposed conclusions: either that the Marxian equation is correct and therefore functional, or on the contrary that it is only partially valid and hence inapplicable and dysfunctional from the perspective of the capitalist economy. Then again, classical academic economics had already moved past the Marxian formula in the decades that opened in 1870, from The Theory of Political Economy by William Stanley Jevons onwards.
:: Dromology Archive 8 :: Robo-Trading Era, high-frequency Algorithms and the twinkling of the Crash
Over 5,000 operations per second, totalling an operative value per single operator greater than 500 million dollars; and all that for positions which will in any case be closed by the end of the day. That is how far the HFTs go, the high-frequency traders – often also described as algorithmic traders – who have been developing as a new ‘toxic’ or ‘endemic’ species in the world of global finance since 2009-2010. Such a form of schizophrenic trading has reached alarming proportions: some mature and highly competitive markets, such as the American, English or German stock exchanges, host HFTs who together conduct a third of global daily operations. Facing such a volume of operations, tremendous in number of orders and total aggregate value, the authorities who supervise the markets took action to limit the use of such instruments, still within a neo-liberal and pro-finance perspective, by imposing protocols and guidelines on the various market operators. Nowadays, since IT dominates financial markets, it remains impossible to distinguish between algorithmic low-speed trading and high-frequency trading, because the latter can simulate the former in order to escape the technical mechanisms put in place by the authorities. But how does the ‘silent and rapid beast’ operate? HFT firms work in highly accelerated markets, through extremely sophisticated instruments: ad-hoc algorithms which operate on advanced software and hardware. Here the competitive advantage is given, first of all, by speed of execution – which often amounts to fractions of a second – and of evolution: the products used in algo/HFT trading have to be constantly updated in order to remain dynamic and flexible.
This kind of operational capacity requires, first, liquid markets, that is markets able to ‘support’ such productivity in quantity and quality, and second, the endogenous latency of arbitrage: the algorithmic trader works on the spread that is generated on a single stock, in the ratio between bid and ask – i.e. demand and supply. In any case, the systemic imbalances of such dromo-technology are clear and have already manifested themselves, in the ‘Flash Crash’ of 6 May 2010, when the NYSE lost 10% of its value in only 36 minutes, a collapse triggered by a massive wave of orders on E-mini S&P 500 futures contracts listed on the Chicago futures exchange. The P&G stock at the NYSE suddenly collapsed by 37%, due to the automatic spread of panic from Chicago to New York. Here HFT played a fundamental role in unfolding the abrupt breakdown of the whole NYSE stock list. A typical setting of the Virilian accident: to a new technique corresponds an accident enlarged by the intrinsic characteristics of the technique itself. The quick intra-day falls of financial markets – the Flash Crashes – are related to high-frequency trading, just as cyclical or daily falls are related to low-speed IT trading. Only five years after the terrible and sudden collapse, on 21 April 2015, the US Department of Justice brought 22 charges against an Anglo-Punjabi citizen, Mr. Navinder Singh Sarao, an algorithmic trader based in multiethnic West London who, from his parents’ house, had issued on 6 May 2010 the monstrous shower of orders, for a total value of 200 million dollars, on those futures. The orders were modified or replaced 19,000 times within a few frantic minutes, before being completely cancelled by Sarao himself.
However, the collapse by contamination between different but hyper-connected stocks and markets, amplified by the other HFTs, caused losses amounting to billions of dollars, and demonstrated the systemic instability of global finance. No longer the market’s ‘invisible hand’, as Adam Smith wrote over two hundred years ago, but rather the schizophrenic voracity of highly specialized markets which operate on systemic imbalances mitigated by automated circuit breakers. Acceleration and Collapse in a mortal chain of shrill and merging rings…
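The spread mechanics the archive describes – an algorithm quoting inside the bid-ask spread and pocketing the difference at machine speed – can be illustrated with a deliberately naive sketch. The function, prices and tick size below are our own toy assumptions for illustration, not a model of any real trading system, which would involve order books, queue position and latency far beyond this:

```python
# A naive sketch of spread capture, the core idea behind the algorithmic
# arbitrage described above: quote one tick inside the current bid-ask
# spread and earn the difference when both sides fill. Fees, order books
# and latency are ignored; all numbers are invented.

def spread_capture(bid, ask, tick=0.01):
    """Return the profit per share if both improved quotes fill."""
    if ask - bid <= 2 * tick:      # spread too narrow to improve both sides
        return 0.0
    buy_at = bid + tick            # quote just above the best bid
    sell_at = ask - tick           # quote just below the best ask
    return sell_at - buy_at        # captured spread per share

# On a 5-cent spread the toy algorithm captures roughly 3 cents per share;
# multiplied by thousands of operations per second, the scale becomes clear.
profit = spread_capture(bid=10.00, ask=10.05)
```

The point of the sketch is only that the edge per operation is tiny, which is why volume and speed of execution, not the size of any single trade, are the real competitive weapons.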
‘The gravedigger of capitalism’ (Marx, Grundrisse) and its political use
The value of the tendency of the rate of profit to fall lies then in its political use and, in particular, in being at the heart of the Marxian analysis of the future of capitalism, considered as contingent, a historical moment of transition within the communist vectoriality. Marx, in his drafts of the Grundrisse, considered such a ‘law’ the definitive gravedigger of capitalism; in Capital the law becomes a ‘tendency’, no longer an unavoidable law, thanks to the counter-tendencies triggered during a period of crisis, though still carrying the gravedigger effect, terminal in the long run. Be it an equation, a law, or a tendency, the tendency of the rate of profit to fall remains central. But is this nineteenth-century Marxist heart, elaborated in a context of great industrial development that no longer exists, still helpful in the twenty-first century? And, above all, was it this nineteenth-century Marxist heart that Deleuze and Guattari used as the generating impetus and directional order of the accelerated process of capitalist collapse? On this controversial point, the bright Deleuzian scholar Christian Kerslake (Marxism and Money in Deleuze and Guattari, 2015) suggests that the fulcrum of the analysis contained in the Anti-Oedipus reflects the influence of the theories of De Brunhoff and Schmitt on currency and on the flows of financial capitalism. According to Deleuze and Guattari, currency and capital flows have become the heart of global capitalism, the civilised machine, thanks to their hegemonic role of intersection, control and regulation of the monetary flows of commercial and central banks, with the aim of recovering and stabilising the crises of market economies. For post-1945 capital, in the twentieth and twenty-first centuries, the crucial elements are not production, living labour, and the tendency of the rate of profit to fall, but rather the management of the monetization of the economic system as the fundamental negentropic factor of the system.
The reference to Nietzsche, and not to Marx, should already give the accelerationists and the editors of The Algorithms of Capital pause on this specific point, controversial but at the same time crucial and defining. To end the discussion on the ‘historical substance’ of the tendency of the rate of profit to fall, we must recall that that propensity, defined by Marx as ‘the synthesis of the contradictions present in the capitalist mode of production’, had already triggered a first wave of ‘Marxist catastrophism’ in the 1890s, and the Italian workerist and unionist movement laboured substantially to discredit that fatal mix of economic determinism, messianic thought and oracular glow. How can a reprise of that Marxian equation be useful to accelerationism when Italian Marxists and socialists had already overcome it over a hundred years ago? Do we want to go back to the times of the debate for and against the Marxist axiomatics elaborated by Benedetto Croce for the Pontanian Academy?
:: Dromology Archive 9 :: Hyper-Search and Destroy: Extraction and destruction of the Net’s value
“A frightful giant stalks throughout Europe”: it is second by stock-market capitalization among IT stocks – valued at 465.5 billion USD on the Nasdaq, behind Apple at 660 billion USD, according to August 2015 data; second brand in the world by value – 173.6 billion USD, again behind Apple, according to May 2015 data; the most traded stock in the world, at 2 billion USD a month, according to December 2014 data. Amazing results for a firm that went public in 2004 with shares priced at 85 USD. After inventing the PageRank algorithm in 1997 and founding the firm Google in 1998, within a few months the founders Page and Brin tried to sell it for 1 million USD, in order to have more time for their studies at university and eventually graduate at Stanford. Nobody wanted to buy it. In the early months of 1999, Google was offered at the reduced price of 750,000 USD to its main prospective buyer, George Bell, CEO of Excite, one of the big Internet companies of the time. George Bell asked himself why he should buy a company based on a new model of search engine in a world that already had Altavista, Excite and Yahoo.
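The PageRank algorithm mentioned above can be sketched in miniature. The power-iteration form, the 0.85 damping factor and the three-page toy graph below are our own illustrative assumptions, a sketch of the published 1998 idea rather than Google's production implementation:

```python
# A minimal power-iteration sketch of PageRank: a page's rank is the chance
# that a "random surfer" lands on it, following links with probability
# `damping` and jumping to a random page otherwise.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    nodes = set(links) | {m for outs in links.values() for m in outs}
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}          # uniform start
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outs in links.items():
            if outs:                                   # share rank along links
                share = damping * rank[node] / len(outs)
                for target in outs:
                    new_rank[target] += share
            else:                                      # dangling page: spread evenly
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# Toy web of three pages: C, linked by both A and B, ranks highest.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

The design insight that made the ‘new model of search engine’ different from Altavista or Excite is visible even here: a page's importance is computed from the link structure of the whole graph, not from the page's own content.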
Google’s algorithmic machine and cybernetic acceleration
“The hastening that marked the development of technology up to the revolution in telecommunications and transport has been overtaken by cybernetic acceleration, which goes beyond, and renders obsolete, the very concept of movement. It is then more appropriate to postulate the idea of a virtual space characterised in its nature by the time of acceleration, a time which does not require movement but rather manifests itself.”
(Tiziana Villani, Il Tempo della Simulazione, p. 10)
In the sharp critique of algorithmic governance conducted by Pasquinelli, it is possible to spot between the lines the other great danger that humanity is going to face, a disturbing road that leads straight to technological singularity, up to the limit of an erratic, alien development of machines with an infinite calculating capacity, which in turn accelerate together with other structured systems of machines, whose final users could be human or non-human, as foreshadowed, at an early stage, by the Internet of things. At the end of the millennium (1999) the first truly post-human corporation appeared: Google. It embodies the new stage of meta-linguistic and meta-mathematical capitalism which, following the analysis of Guattari, could be seen as the collapse of the anthropological semiotics that had developed up to the twentieth century. Thanks to the “human knowledge mines” on which its inhuman calculating leverage is based, Google surpasses with mathematical elegance and crystalline firmness every previous axiomatics of nineteenth-century social conflict. Brin & Co. in fact value at zero both the first line of their workforce, that is the producers of content in the infosphere, and the raw material of the product, that is the knowledge of the human race, activating what Bernard Stiegler calls, referencing Simondon, ‘the new stage of the process of capitalistic transindividuation’. To think Google, that is the new task! Pasquinelli has already begun to do so in his essay on the surplus of the Web (Machine capitalism and the surplus of the Web, 2011), but it is crucial to continue inspecting Google now, due to the alien thinking that operates on ‘the process of transindividuation, that is the way of collectively producing ourselves as subjects’ (Stiegler, 2012). To think Google, according to Stiegler, means ‘to render Google a critical space and not just the target of a critique’.
It also entails going beyond Marxist axiomatics, despite their richness, and beyond the conflictual thought of the industrial age, in order to lay the basis for a thought able, in its complexity, to grasp the algorithm that connects an infinite number of soft machines and millions of clouds. This dawn of a new thinking is the task, the key point of the book The algorithms of capital, and particularly of the most solid essay of the anthology, Machine capitalism and the surplus of the Web (Pasquinelli, 2011). In our opinion, however, certain critical elements make the trajectory of the accelerationist momentum epicyclical in relation to Marxism, with Marxism itself as the deferent: which means that, in the face of an apparent present gain, a backward motion will exhaust itself in the future, if the main reference system of the whole accelerationist thought is not adjusted.
:: Dromology Archive 10 :: The future of Communism is Marx or Mach?
The rhetorical and paradoxical question is the following: what if the future of Communism, conceived as the ethical and reformatory nucleus of our society, lay in Mach’s thought and not in the dialectical materialism of Marx and Engels as abrasively advocated by its dogmatic preacher, Lenin? Hence, what if the future of the critique of algorithmic surplus lay in the past, or at least in a critical reconsideration of the speculative fundamentals of socialism? These questions are less abnormal than they may seem at first sight, if we consider that Lenin sent Materialism and Empirio-criticism to the printers in 1909. That is a pamphlet which, in line with the worst Communist tradition, was used as an instrument to strike a political target, rather than as a theoretical book engaging polemically with materialist and antagonistic politics. The chief aim for Lenin – the text had been instigated and suggested by Plekhanov, an orthodox Marxist and scholar of Engels – was to denigrate and intellectually destroy Alexander Bogdanov. Bogdanov was a philosopher and a scientist, the main leader of the Russian Bolsheviks after the failed revolution of 1905, as well as a supporter of a form of materialism profoundly indebted to the theories of the Austrian physicist Ernst Mach, theorist and scholar of the speed of propagation of sound in air and in fluids. Lenin stigmatized the ‘empirio-critical’ theories as reactionary, and, by a certain transitive property of insult, the Marxist-Machist Bogdanov found himself labelled the reincarnated quintessence of the reactionary thinker, when on the contrary the Bogdanovist wing of the party was the most leftist one. In a recent superb publication entitled Molecular Red (Verso, 2015), the New York-based intellectual McKenzie Wark retraces in detail the relationship between Lenin and Bogdanov in light of their 1908-1909 political, economic and philosophical controversy.
The Leninist methods, sadly renowned in all their theoretical, organizational and historical aspects, thus contributed greatly to the tragedy of ‘real socialism’ and to undermining forever the historic opportunity of a Soviet revolutionary power that was less militarized and less totalitarian. Nowadays, all that is left of Leninism is ruins; the least that can be done is to recall Mach and Bogdanov and what is left of Marx. Molecular Red is therefore necessary, because it presents anew and contextualizes on the global stage of ideas, and thus of politics, precisely the character and intellectual stature of Bogdanov, Platonov and the Proletkult movement.
From red wealth to the red “stack”: a proposition for a new balance of powers
We are approaching the end of this analysis of the brilliant anthology The algorithms of capital, which we regard as the most advanced text in Italy on the topics discussed here. Inside it we have found important and necessary considerations, extremely topical, but sometimes combined with extremely old passages – we apologise to the authors for our bluntness. Sometimes the new makes its way even by such methods: acceleration – or, in Deleuze’s terms, the ‘line of flight’ – ballasts itself with the archaic or the neo-primitive, and perhaps it would be odd to find the ‘newest’ already packed and ready for use. In due time the accelerationist thinkers will surely refine and develop their own paths of thought. We finally need to make some remarks concerning single essays in the book. The first concerns ‘Red Plenty Platforms’ by an author, Nick Dyer-Witheford, whom we have long appreciated and whose recently published Cyber-Proletariat (Pluto Press, 2015) we would like to signal. In ‘Red Plenty Platforms’ the author explores the famous Chilean experiment ‘Cybersyn’, conducted at the time of Salvador Allende and conceived as a way to cybernetically optimize socialist planning, intersecting it, with his usual virtuosity, with the sci-fi of Francis Spufford’s Red Plenty. The main theme of the essay could be ‘calculus and Communism’: Dyer-Witheford skilfully combines Soviet cybernetics with the Marx of the Grundrisse, the catallaxy of the liberal economist Friedrich Hayek and the theorists of computerised economic planning, in an evocative and pleasant ride which stimulates both a political reading and a philosophical reflection. The second essay that we would like to signal is by the German Mercedes Bunz, ‘How the Automation of Knowledge Changes Skilled Work’. It is a text less linked to post-workerist thought and closer to the area of the automation of knowledge in the factories of wisdom.
Nowadays, according to Mercedes Bunz, Western universities have structured themselves as ‘knowledge industries’ in which experts – the new class of educated people – are losing their privileged expertise to a new, copious knowledge widely distributed across the social fabric. The overload of information, the Internet and the new soft machines, digitalised knowledge, algorithms and apps: all of these place the role of the expert – and hence the role of intellectuals and of academic scientific specialists – under attack, thereby externalising their specific competence. In her essay Bunz argues for a radical change in the way we approach technology, through an alliance between algorithmic intelligence and an operative humanism inspired by Simondon’s philosophy, an author who is becoming increasingly central to such analyses. Finally, we signal the work of Tiziana Terranova, an Italian philosopher of the latest generation and a skilful examiner of the digital world and of its most heterodox practices since the nineties. Her essay ‘Red Stack Attack’ is, starting from the title, a sort of propositional and combative response to the famous essay by Benjamin Bratton, ‘The Black Stack’ (e-flux, 2014), in which the American theorist investigates the normative status of the ‘unexpected’ mega-structures of the present global system of computation. Terranova regards her essay as the result of a social wisdom built within and by the Net, whose cornerstone of analysis is the relationship between capitalism and algorithms. The algorithm is seen from a political, economic and financial point of view; she deals with topics like Bitcoin and other crypto-currencies, interfaces between the individual, data and the cloud, the algorithm as ‘fixed capital’, and the absorption of excesses of wealth and power into the productive cycle of capital.
Compared to other post-workerist thinkers, Terranova displays and structures a kind of wisdom more deeply rooted in the digital world and in network cultures, which allows her to leave behind the evident expositional and intellectual mannerism peculiar to other authors of that circle, and makes her text the most advanced in its ongoing reflection on the potentials and criticalities of algorithmic reason.
:: Dromology Archive 11 :: Leibniz’s silence at 7:00 PM
“…quando orientur controversiae, non magis disputatione opus erit inter duos philosophos, quam inter duos computistas. Sufficiet enim calamos in manus sumere sedereque ad abacos, et sibi mutuo (accito si placet amico) dicere: calculemus”
“[...] if controversies were to arise, there would be no more need of disputation between two philosophers than between two calculators. For it would suffice for them to take their pencils in their hands and to sit down at the abacus, and say to each other (and if they so wish also to a friend called to help): Let us calculate.”
(Leibniz, Logical Papers, [1689] 1962, p. 237)
Final Cut: Calculemus!
We would like to end this quasi-review on a cheerful note borrowed from Leibniz’s Logical Papers. In this vexata quaestio, the last word paradoxically belongs to the number and to the action that most adheres to reality: calculation. Actualising the image, at this virtual table, in a scattered and anachronistic temporal sequence, we would like to see convened and seated a great number of friends of Accelerationism. Together with Leibniz, we would summon Marx, as well as Deleuze and Bogdanov, and certainly Mach and al-Khwarizmi. To each of them we would assign a pocket calculator capable of working in exponential time. Then, as impartial judges, we would invite all the authors of the anthology The Algorithms of Capital. And Pasquinelli, master of ceremonies before such comrades, ready to settle all mounting controversies, would peremptorily command, with a neat gesture and a febrile look: Let us calculate!
Obsolete Capitalism, August 2015