by Achim Szepanski

Satellite monitoring, enormous computing power on silicon chips, sensors, networks and predictive analytics are the components of the digital systems of surveillance capital that currently track, analyze and capitalize on the lives and behaviors of populations on a vast scale. Under the pressure of the financial markets, for example, Google is forced constantly to increase the effectiveness of the data tracking and analysis generated by its machine intelligence, and for that very reason to combat every user's claim to privacy by the most diverse means. Thanks to a series of devices such as laptops and smartphones, cameras and sensors, computers are today ubiquitous in capitalized everyday life. They are character-reading machines whose algorithms (unambiguously calculable, formally precise procedural instructions) develop their full power only in the context of digital media networking, for which the programmatic design, transformation and reproduction of all media formats is a prerequisite. In this game, the social networks in particular supply an economy that has established a strange new algorithmic governance by extracting personal data, an extraction that leads to the construction of metadata, cookies, tags and other tracking technologies. This development has become known above all as "Big Data," a system based on networking, databases, and high computing performance and capacity. The processes involved are, according to Stiegler, those of "grammatization." In the digital stage, these lead to individuals being guided through a world in which their behavior is grammatized through interaction with computer systems operating in real time. Grammatization, for Stiegler, begins as early as the cave paintings and leads via the media of cuneiform, photography, film and television finally to the computer, the internet and the smartphone.
The result of all this is that the data paths and traces generated by today's computing technologies constitute tertiary, attention-reducing retentions or mnemotechnics that encompass specific processes of temporalization and individuation, that is, "industrial processes of memory" or "political processes of memory," and an industrial economics based on the industrial exploitation of periods of consciousness. With the digitization of data paths and processes, which today are captured by sensors, interfaces and other means and generated as binary numbers and calculable data, there arises, as Stiegler says, an automated social body in which even life is transformed into an agent of the hyper-industrial economy of capital. Deleuze anticipated this development in his famous essay on the societies of control, but control comes fully into its own only when digital calculation integrates Deleuze's modulations of control techniques into an algorithmic governance that automates all existences, ways of life and cognition as well. The power technologies underlying the protocols are a-normative because they are rarely debated widely in public; rather, they appear to be inherent in algorithmic governance. The question now is: do data create digital protocols, or do digital protocols create data? Or, more pointedly: are data digital protocols? In any case, it is their very setting, and not only their results, that has a structuring character.
Like any governance, if we think of it with Foucault, algorithmic governance implements specific technologies of power, which today are no longer based on statistics related to the average and the norm; instead we have an automated, atomizing and probabilistic machine intelligence. The digital data machines that continuously collect and read data traces mobilize an a-normative and a-political rationality, consisting of the automatic analysis and monetary valorization of enormous amounts of data through the modeling, anticipation and influencing of the behavior of populations. Today this is trivializingly called ubiquitous computing, in which, and it must be pointed out again and again, surveillance capital carries out the extraction of user behavior and develops prediction products on this basis through its algorithms, no longer only on the internet but in the real world, and then diversifies these prediction products by special techniques. Everything, whether animate or inanimate, can be networked, connected, made to communicate and calculate. From automobiles, refrigerators, houses, bodies and so on, signals flow constantly through digital devices; the activities of human and non-human actors that take place in the real world appear as data in the digital networks and serve there the transformation into prediction products that are sold to advertisers for precisely targeted advertising (Zuboff 2018: 225). To put it in more detail: the surveillance capital of companies such as Google or Facebook automates the buying behavior of consumers, channels it through the famous feedback loops of their AI machines and binds it purposefully to companies that are advertising customers of surveillance capital.
The advertising-driven behavioral modifications to be achieved in users are based on machine processes and techniques such as tuning (adaptation to a system), herding (conditioning of the mass) and conditioning (training of stimulus-response patterns) that steer the behavior of users, so that the engineered prediction products actually drive users' behavior toward Google's guaranteed outcomes (ibid.). The maximum predictability of users' behavior is now a genuine source of profit: the consumer who uses a fitness app should ideally buy a healthy beverage at the moment of maximum receptivity, say right after jogging, a product previously made palatable by targeted advertising. The sporting goods manufacturer Nike has bought the data analysis company Zodiac and deploys it in its stores in New York. If a customer enters a store with the Nike app on their smartphone, they are immediately recognized and categorized by geofencing software. The homepage of the app changes at once, and instead of online offers new features appear on the screen, including, of course, special offers tailored to the customer and recommendations for what is currently on offer in the store. Surveillance capital long ago ceased to be a matter of advertising alone; it quickly became a model of capital accumulation in Silicon Valley that was adopted by virtually every startup. Today it is not limited to individual companies or the internet sector but has spread to a large number of products, services and economic sectors, including insurance, healthcare, finance, the cultural industries, transportation and so on. Almost every product or service that begins with the word "smart" or "personalized," every internet-connected device, every "digital assistant" is an interface in the corporate supply chain for the invisible flow of behavioral data on its way to predicting the future behavior of populations in a surveillance economy.
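The geofencing recognition described above can be sketched in minimal form: a check of whether a phone's reported coordinates fall within a radius around a store, using the haversine great-circle distance. This is an illustrative sketch only; the coordinates and radius are invented, and Nike/Zodiac's actual system is not public.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude points.
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(phone, store, radius_m=50.0):
    # True if the phone's position lies within the store's virtual fence.
    return haversine_m(phone[0], phone[1], store[0], store[1]) <= radius_m

store = (40.7424, -73.9886)        # illustrative store location in Manhattan
nearby = (40.7425, -73.9887)       # a few meters away: triggers the fence
faraway = (40.7480, -73.9855)      # several hundred meters away: does not

print(in_geofence(nearby, store))   # True
print(in_geofence(faraway, store))  # False
```

Once such a check fires, the app can swap its home screen for store-specific offers, which is the behavior the essay describes.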
Under investor pressure, Google quickly set aside its stated antipathy to advertising, opting instead to increase revenue by using its exclusive access to user data logs (once known as "data exhaust") in combination with substantial analytical capacity and maximal computing power to generate predictions of users' click-through rates, which are taken as a signal of an ad's relevance. Operationally, this means that Google transformed its growing database into a behavioral surplus to be "worked up," while developing new methods of searching aggressively for sources of surplus production. The company developed new methods of seizing this covert surplus by exposing data that users considered private and by extensively personalizing users' information. This data surplus was secretly analyzed for its significance in predicting user click behavior, and it became the basis for new forecasts called "targeted advertising." Here was the source of surveillance capital: behavioral surplus, material infrastructures, computing power, algorithmic systems and automated platforms. As click-through rates shot through the ceiling, advertising became as important for Google as the search engine, perhaps the entry point for a new kind of e-commerce that relied on broad online monitoring. The success of these new mechanisms became apparent when Google went public in 2004. The first surveillance capitalists enacted their claims by declaration, simply treating users' private experience as something that can be taken, translated into data, held as private property and exploited for private gain in knowledge. This was cloaked in rhetorical camouflage and stealthy declarations that no one recognized as such. Google began unilaterally to postulate that the internet was merely a resource for its search engine. With a second declaration, it claimed users' private experience for its own revenues, selling these personal fortunes on to other companies.
The next step was to move surplus operations beyond the online milieu into the real world, where personal behavioral data is treated as free for the taking by Google. This is a familiar story in capitalism: finding things outside the market sphere and turning them into commodities. Once we searched Google; now Google searches us. Once we thought the digital services were free; now the surveillance capitalists think we are fair game. Surveillance capital no longer needs the population primarily in its function as consumers; rather, supply and demand orient the surveillance companies toward transactions based on the anticipation of the behavior of populations, groups and individuals. Surveillance companies have few employees relative to their computing power (unlike the early industrial companies). Surveillance capital depends on the erosion of individual self-determination and autonomy, as well as of the right of free choice, in order to generate an unobserved stream of behavioral data and feed markets that operate not for but against the population. It is no longer enough to automate the streams of information about the population; the goal is rather to automate the behavior of the population itself. These processes are constantly redesigned to deepen the ignorance surrounding individual observability and to eliminate any possibility of self-determination. Surveillance capital shifts its focus from individual users to populations such as cities, or even the economy of a whole country, which is not insignificant for the capital markets as predictions about the behavior of populations gradually approach the certainty of their fulfillment. In the competition for the most efficient prediction products, surveillance capitalists have learned that the more behavioral surplus they acquire, the better the predictions, which spurs capitalization through economies of scale to ever new efforts.
And the more varied the surplus, the higher its predictive value. This new economic drive leads from the desktop via the smartphone into the real world: you drive, run, shop, find a parking space, your blood circulates, you show a face. Everything is to be recorded, localized and marketed. There is a duality to announce in information technology: its capacity to automate, but also to informate, that is, to translate things, processes and behavior into information. New territories of knowledge are thereby produced on the basis of informational capacity, and these may become the object of political conflicts concerning the distribution of knowledge, the decision about knowledge and the power over knowledge. Zuboff writes that the surveillance capitalists claim for themselves alone the right to know, the right to decide who knows, and the right to decide who decides. They dominate the automation of knowledge and its specific division of labor. Zuboff goes on to say that one cannot understand surveillance capital without the digital, but the digital could also exist without surveillance capital: surveillance capital is not pure technology; digital technologies could take many forms. Surveillance capital is based on algorithms and sensors, artificial machines and platforms, but it is not identical with these components. A company such as Google must already possess a certain size and a certain diversification of resources in order to collect data that reflects user behavior and also tracks the behavioral surplus (Google's data exhaust), and then, through its machine intelligence, to convert that data into prediction products of user behavior and sell them, targeted, to advertisers, products that home in on the user like heat-seekers in order to propose to him, for example, at a pulse of 78, just the right fitness product via displayed advertising.
With the diversification that serves to increase the quality of the prediction products, first a broad diversification of observable topics in the virtual world must be achieved, and second the extraction operations must be transferred from the network into the real world. In addition, the algorithmic operations must gain in depth: they must aim at the intimacy of users in order to actuate, control, indeed form their behavior, for example through timed and targeted pay buttons on the smartphone, or by automatically locking a car if its owner has not paid the insurance installments on time. The data pool from which analysts can now draw is almost infinite. They know exactly who returns goods, calls hotlines or complains about a company through online portals. They know the favorite shops, restaurants and bars of many consumers, the number of their "friends" on Facebook, the creators of the ads that social media users have clicked on. They know who in recent days has visited the website of a competitor of an ad's advertiser or has googled certain goods. They know the skin color, the sex, the financial situation of a person, their physical illnesses and emotional complaints. They know the age, the profession, the number of children, the neighborhood, the size of the apartment; after all, it is quite interesting for a company that manufactures mattresses to know whether a customer is single or, in the worst case, orders the same five foam mats for the entire family. Today the group has materialized in Facebook, in its invisible algorithms, and has evoked a largely imaginary group attachment of unimaginable proportions. And here the theory of simulation goes wrong, for there is nothing unreal about the digital networks: they are quite real and create stability for those connected to them simply by expanding things, more inquiries, more friends, and so on.
With the closure of the factories came the opening of the data mines. The accompanying violation of privacy is the systematic result of a pathological division of knowledge, in which surveillance capital knows, decides, and decides who decides. Marcuse wrote that it was one of the boldest plans of National Socialism to wage the fight against the taboo of the private. And privacy today is so freed from any curiosity or secrecy that without any hesitation, indeed almost avidly, you write everything on your timeline so that everyone can read it. We are so happy when a friend comments on anything. And you are always busy managing all the data feeds and updates; at the very least you have to divert a bit of time from your daily routines for this. Your taste, your preferences and your opinions are the market price you pay. But the social media business model will reach its limit and come to an end, even if it is still pushed along by the growth of consumerism. This business model has repeated itself ever since the dotcom boom of the 1990s: if growth stagnates, the project must be wound up. The frictionless growth of customer-centric, decentralized marketing is fueled by a mental pollution of the digital environment that corresponds to the pollution of the natural environment. In a search query, factors such as the search terms, the length of stay, the formulation of the query, spelling and punctuation are among the clues used to spy on users' behavior; even these so-called data fumes are collected in order to target advertising at the user's surplus behavior. Google also assigns the best advertising slots by an algorithmic calculation of probable yield: the price an advertiser pays per click is multiplied by the probability that the advertisement will actually be clicked. In the end, these procedures also reveal what a particular individual thinks in a particular place at a particular time. Singularity indeed.
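The slot-assignment logic described here, price per click multiplied by the probability of a click, amounts to ranking ads by expected revenue per impression. The sketch below is a simplified model of such an auction, not Google's actual implementation; the bids and click probabilities are invented.

```python
def rank_ads(ads):
    # Rank candidate ads by expected revenue per impression:
    # the advertiser's bid (price paid per click) multiplied by
    # the predicted click-through rate (probability of a click).
    return sorted(ads, key=lambda a: a["bid"] * a["pctr"], reverse=True)

ads = [
    {"name": "A", "bid": 2.00, "pctr": 0.01},  # expected value 0.020 per impression
    {"name": "B", "bid": 0.50, "pctr": 0.08},  # expected value 0.040 per impression
    {"name": "C", "bid": 1.00, "pctr": 0.03},  # expected value 0.030 per impression
]

ranking = [a["name"] for a in rank_ads(ads)]
print(ranking)  # ['B', 'C', 'A']
```

Note that the highest bidder (A) does not win the best slot: a cheaper ad with a high predicted click probability yields more expected revenue, which is why predicting user behavior itself becomes the decisive asset.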
Every click on an ad banner displayed by Google is a signal of its relevance and is therefore considered a measure of successful targeting. Google is currently seeing an increase in paid clicks and at the same time a fall in the average cost per click, which amounts to an increase in productivity: the volume of output has increased while costs are falling. Just as the protocols are everywhere, so are the standards. One can speak of environmental standards, safety and health standards, building standards, and digital and industrial standards whose inter-institutional and technical status is made possible by the functioning of the protocols. The capacity of standards depends on the control of protocols, a system of governance whose organizational techniques shape how value is extracted from those integrated into the different modes of production. But there are also standards of the protocols themselves. The TCP/IP model of the internet is a protocol suite that has become the technical standard for internet communication. There is a specific relationship between protocol, implementation and standard in digital processes: a protocol is a description of the precise terms by which two computers can communicate with each other (a dictionary and a handbook for communicating, as it were). An implementation is the creation of software that uses the protocol, i.e., that handles the communication (two implementations that use the same protocol should be able to exchange data with each other). A standard defines which protocol should be used for specific purposes on certain computers; although it does not define the protocol itself, it sets limits on changing the protocol.

translated by Dejan Stojkovski, taken from:
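The distinction between protocol and implementation drawn above can be made concrete with a toy example. The "protocol" is nothing but a written rule (here stated in comments); the functions are one possible implementation of it, and any other program following the same rule, in any language, could exchange data with this one. This is an illustrative sketch, not a real internet protocol.

```python
import struct

# A toy wire protocol, stated as a rule: every message consists of a
# 4-byte big-endian unsigned length prefix, followed by exactly that
# many bytes of UTF-8 text. The protocol is just this description;
# the two functions below are one implementation of it.

def encode(message: str) -> bytes:
    # Produce the on-the-wire form of a message under the rule above.
    payload = message.encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode(wire: bytes) -> str:
    # Recover the message from its on-the-wire form.
    (length,) = struct.unpack(">I", wire[:4])
    return wire[4:4 + length].decode("utf-8")

# Interoperability between implementations of one protocol: a C program
# framing its bytes with htonl() would produce and accept the same wire
# format as this Python code.
print(decode(encode("hello")))  # hello
```

A standard, in the essay's terms, would then be an institutional decision that this framing rule is the one to be used for a given purpose, which constrains how the rule may later be changed.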