The Cognitive Nonconscious and Automated Trading Algorithms
1 Hypermnesia, an excess of memory, has become a cultural condition of the twenty-first century. The more memory storage has increased, the more relational databases are needed to organize, process, search and retrieve data. The flood of information made possible by the Web makes automated cognition a necessity. Stanislaw Lem presciently made this argument in Summa Technologiae (1962/2012). During the 1960s the claim that cognition could be automated was controversial; now it seems common sense, although the work remains of providing a workable definition of cognition broad enough to include operations performed by networked and programmable machines, ambient technologies, mobile devices, and other digital affordances. This essay sketches the argument for nonconscious cognition, indicates its scope and importance, and explores its implications by taking automated trading algorithms as a case study.
2 Consider the cell phone message, “searching for signal.” Beyond your vision (and ken) is a series of protocols (“handshaking”) between a receiving/sending tower and your phone, verifying that your device is authorized to use the networked infrastructure and establishing a connection. This everyday example illustrates the dynamics characteristic of nonconscious cognition. Underlying nonconscious cognition are material processes such as signal transmission, electrical/electronic circuits in your phone, and so on. Nonconscious cognition is distinct from these underlying material processes because it has an “intention toward,” in this case an intention to plug your device into the networked circuits operated by the phone company.
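A minimal sketch in Python may make the sequence concrete. Everything here is hypothetical shorthand for the handshake described above: the names, the subscriber list, and the probe/grant exchange are invented, and none of it corresponds to an actual telecom protocol or API.

```python
# Hypothetical sketch of the "searching for signal" handshake described above.
# Names, the subscriber list, and the probe/grant exchange are all invented;
# real cellular authentication is far more elaborate.

AUTHORIZED_SUBSCRIBERS = {"device-42", "device-77"}   # stand-in for the carrier's database

def tower_grants(tower: str, device_id: str) -> bool:
    """Tower-side check: is this device authorized to use the network?"""
    return device_id in AUTHORIZED_SUBSCRIBERS

def search_for_signal(device_id: str, towers: list[str]) -> str | None:
    """Device-side loop: probe each reachable tower until one grants a connection."""
    for tower in towers:
        if tower_grants(tower, device_id):   # the handshake happens below awareness
            return tower                     # connection established
    return None                              # the phone keeps displaying "searching for signal"

print(search_for_signal("device-42", ["tower-north", "tower-east"]))
```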
3 Nonconscious cognition achieves its “intention toward” through dynamics such as recursive feedback loops (what Andy Clark (2008) calls continuous reciprocal causality), characteristic of systems with the capacity for emergent results. Moreover, systems structured in this way also demonstrate adaptive behaviors, another way in which they differ from the underlying material processes. A final point is that nonconscious cognition often occurs in systems rather than in individuals.
4 Consider, as another example, a termite mound, an architectural structure of considerable complexity. Although each termite has only a few instincts, the collective behaviors that emerge are much more complex than the instincts themselves, expressed in the termite mound as the emergent result. Another example is a beehive, a similarly complex architectural structure. The hive is created when each bee positions itself a certain distance from the others and, moving in a circle, spits wax. The adjacent bees do the same, and the pressure of the adjoining lines results in a hexagon, the polygon with the most efficient packing ratio. Such results are underlain by material processes, but the cognitive nonconscious emerges when these processes interact to create a constraint-driven, complex adaptive system.
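The logic of such constraint-driven emergence can be suggested with a toy model. The sketch below is loosely in the spirit of the termite example rather than a model of any real insect: the grid size, densities, and the two pick-up/drop rules are arbitrary assumptions, chosen only to show collective order (a few large piles) arising from local rules that encode nothing about piles.

```python
import random

# Toy stigmergy model in the spirit of the termite example above: each agent
# follows two trivial rules (pick up a chip when it wanders onto one while
# empty-handed, drop its chip when it wanders onto another), yet chips
# gradually aggregate into a few large piles that no rule describes.
# Grid size, chip count, agent count, and step count are arbitrary choices.

SIZE, CHIPS, AGENTS, STEPS = 20, 80, 10, 100_000

grid = [[0] * SIZE for _ in range(SIZE)]
for _ in range(CHIPS):
    grid[random.randrange(SIZE)][random.randrange(SIZE)] += 1

agents = [{"x": random.randrange(SIZE), "y": random.randrange(SIZE), "carrying": False}
          for _ in range(AGENTS)]

for _ in range(STEPS):
    for a in agents:
        a["x"] = (a["x"] + random.choice((-1, 0, 1))) % SIZE   # random walk on a torus
        a["y"] = (a["y"] + random.choice((-1, 0, 1))) % SIZE
        here = grid[a["x"]][a["y"]]
        if not a["carrying"] and here > 0:        # rule 1: pick up a chip
            grid[a["x"]][a["y"]] -= 1
            a["carrying"] = True
        elif a["carrying"] and here > 0:          # rule 2: drop it where chips already are
            grid[a["x"]][a["y"]] += 1
            a["carrying"] = False

piles = sorted((c for row in grid for c in row if c), reverse=True)
print("largest piles after the run:", piles[:5])
```

Because a pile can vanish but never reappear once emptied, the number of piles can only fall over time; the aggregation is a property of the system, not of any single agent's rule.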
5 The cognitive nonconscious spans a range of lifeforms and technological devices. It underlies human consciousness, as well as the consciousness of many animal species, especially mammals. It is also present in innumerable technological instruments, from laptops to cell phones to digital cameras to many different kinds of sensors and actuators (Hayles 2014). When distinguishing between material processes and the cognitive nonconscious, we can ask such questions as the following. Is this system adaptive? Can it modify its behaviors in response to environmental changes? Does it demonstrate emergent behaviors unpredictable from the individual elements but arising from interactions between elements? Does it process information and respond to the results of this processing with changed behaviors? To illustrate, we may contrast the slow movement of a glacier as it slides downhill with insect swarming behaviors. While the glacier’s movement can be predicted if all the relevant forces are known (and thus is not cognitive), the swarm is a complex adaptive system responsive to its environment and capable of emergent behaviors. Not all cases, however, will fall neatly into one category or another. In this framework, establishing the boundary between material processes and the cognitive nonconscious becomes a site for interrogation and analysis.
6 In extending the range of what counts as cognition, I reserve “thought” for the activities of what neurologists call “core” or “primary” consciousness (present in humans and many mammals), and “higher” consciousness (present in humans and some primates, although more extensively developed in humans than in other species). Together, consciousness and the unconscious constitute what I call “modes of awareness.” The cognitive nonconscious applies to processes below the level of awareness (for humans and animals) or to processes in nonsentient artefacts (such as technical devices) that nevertheless perform cognitive tasks as constraint-driven complex adaptive systems. Thus all thought is cognition, but not all cognition is thought.
7 The computer company Cisco estimates that by 2016, twenty-five billion technical devices will be connected to the Internet; by contrast, the planet’s human population is estimated at 7.2 billion. As nonconscious cognitive devices proliferate, the interpenetration of the cognitive nonconscious with human modes of awareness is increasing exponentially. So interwoven is the (exterior) cognitive nonconscious with human activities that life in developed countries has become absolutely dependent on these interactions. The two could not be disentangled for even five minutes without massive disruptions and catastrophic loss of life: consider the computer systems that control traffic at major airports, the Internet that carries data communications between humans and nonhumans, the intelligent devices that control electrical grids, factory equipment, water purification plants, and so on. This entanglement, already pervasive, will become qualitatively more complex as intelligent and ambient devices are increasingly linked together into networks that comprise much larger nonconscious complex adaptive systems. As a result, contemporary humans are facing a new kind of landscape in which the cognitive nonconscious plays indispensable, critical, and constantly expanding roles.
8 What are the implications for the construction of human knowledge, particularly in the humanities? The humanities, including history, literature, philosophy, religious studies and art history, have always been deeply involved with questions of meaning. Meaning in the human sense, however, literally has no meaning for the cognitive nonconscious, which in technical devices is concerned with sensing and processing signals, generating outputs, interacting with other nonconscious agents, and adapting to changed circumstances in its environment. Among the challenges that the cognitive nonconscious poses to the humanities is that of re-imagining their position within a landscape where nonconscious cognition not only has become essential for human actions but also has agency to act on its own, independent of human control or awareness.
9 These challenges are typified by the rise of autonomous trading algorithms. Trading algorithms have dramatically transformed the mixed human-machine ecology that obtained in the US stock market prior to 1995 into the machine-machine ecology that is now the norm. As digital algorithms increasingly took control of financial transactions, not only did the relation of humans to the global economic system change, but so did the dynamics of the system itself. The result has been a dramatic increase in the fragility of markets and the rise of uncontrollable instabilities that threaten to explode at any (milli)second. What our response as humanists should be to these developments, and how we might contribute positively to global stability, will comprise my conclusion.
Automated Trading Algorithms and the US Stock Market
10 An example will illustrate how trading algorithms engage in continuous reciprocal causality through recursive feedback loops, sometimes with results that defy human reason. A book offered by third-party sellers on Amazon, entitled The Making of a Fly, shot up over ten days from an initial price of $199 to $24 million (to be precise, $23,698,655.93, plus shipping), as reported by CNN on April 25, 2011. How did this absurd price emerge? One seller’s algorithm priced the book at 1.27 times the price set by another seller’s algorithm, which would in turn revise its price to 0.998 times the price set by the first algorithm. For example, if the book was listed at $100 by the second seller, the first seller’s algorithm would price it at $127. Reacting to this revised price, the second algorithm would price it at $126.75. The first algorithm would then price the book at $160.97, and so on, up to millions of dollars. Of course a human, with deeper knowledge of the world and a wider horizon, would have known the figure was ridiculous.
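A short Python sketch makes the runaway loop concrete. The multipliers and the $100 starting price simply follow the example above, and the stopping condition uses the price CNN reported; nothing here is a reconstruction of either seller's actual software.

```python
# Illustrative simulation of the two repricing algorithms described above.
# Multipliers (1.27 and 0.998) and the $100 starting point follow the example
# in the text; the loop stops at the price CNN reported.

TARGET = 23_698_655.93

price_b = 100.00          # second seller's list price
price_a = 0.00
iterations = 0
while price_a < TARGET:
    price_a = round(price_b * 1.27, 2)    # first algorithm prices above the second
    price_b = round(price_a * 0.998, 2)   # second algorithm re-prices just below the first
    iterations += 1

print(f"after {iterations} rounds of mutual repricing: "
      f"A = ${price_a:,.2f}, B = ${price_b:,.2f}")
# Neither algorithm ever asks whether a biology textbook could plausibly cost millions.
```

Each round multiplies the price by roughly 1.27 × 0.998 ≈ 1.267, so the figure grows exponentially until some external check, here a human noticing, intervenes.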
11 When and why did the market revolution happen that moved the action from human traders to automated trading algorithms? Wall Street Journal writer Scott Patterson traces the history of automated trading (Patterson 2012). He recounts the origins of automated trading in the programs Joshua Levine designed for Datek, a firm that traded on Instinet, a private exchange created so that firms could buy and sell stocks directly with one another, avoiding the fees charged by the NYSE. When Datek asked Instinet for lower fees and was turned down, the company went out on its own and Levine created Island, a new kind of electronic pool in which algorithms (or algos, in Wall Street parlance) with names like Dagger, Sniper, Raider and Stealth fought each other, using cutting-edge artificial intelligences to sell and buy with reaction times measured in milliseconds. Levine sent out an email to users of his Watcher program, writing in January 1996, “We want Island to be good and fair and cheap and fast. We care. We are nice. SelectNet is run by Nasdaq. They don’t care. Instinet is run by Reuters. They aren’t nice... Won’t you join us at Island” (qtd. in Patterson 2012, 121).
12 When Levine wrote that Instinet was not “nice,” he was referring to the ways in which high-frequency traders (HFT) ripped off other traders and indeed their own clients, helped by exchanges such as Instinet that cloaked their bids from public view. A trader could, for example, offer to buy a stock for a client while offering to sell the same stock over Instinet at a higher price. The client would not know about this side deal, because he could not see the Instinet side of the offer. In other practices, traders would front-run a stock. For example, if a trader submitted an order to purchase 100,000 shares of Company X, an HFT with a faster algorithm would see the bid (using access to the information sold by the exchange), purchase the stock and relist it at a higher price.
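The timing that makes front-running possible can be sketched schematically. In the fragment below the order sizes, prices, and detection threshold are invented; the point is only the sequence of events that superior speed and privileged data access allow.

```python
# Schematic sketch of the front-running pattern described above. Prices,
# sizes, and the detection threshold are invented for illustration.

from dataclasses import dataclass

@dataclass
class Order:
    sender: str
    side: str     # "buy" or "sell"
    qty: int
    price: float

BEST_ASK = 10.00   # offers currently resting on the book

def fast_trader_reacts(incoming: Order) -> list[Order]:
    """Seeing a large buy order first (via purchased data feeds), the faster
    algorithm buys the resting offers and relists them one tick higher."""
    if incoming.side == "buy" and incoming.qty >= 50_000:
        return [Order("HFT", "buy", incoming.qty, BEST_ASK),
                Order("HFT", "sell", incoming.qty, BEST_ASK + 0.01)]
    return []

institutional = Order("pension fund", "buy", 100_000, BEST_ASK)
for o in fast_trader_reacts(institutional) + [institutional]:
    print(f"{o.sender:12s} {o.side:4s} {o.qty:7d} @ {o.price:.2f}")
# The pension fund ends up filling at 10.01 rather than 10.00;
# the extra penny per share goes to the faster trader.
```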
13 Michael Lewis (2014) documents this practice in his account of Brad Katsuyama, a trader for the Royal Bank of Canada (RBC), placing a bid and watching the market move away from him as the published price evaporated and a higher one appeared. Eventually Katsuyama realized that HFT had effectively rigged the markets so they no longer functioned as unbiased intermediaries, leading him and his colleagues to create a new exchange, Investors Exchange (IEX), designed to be as resistant as possible to HFT predations.
14 Lewis’s research illustrates the complexities of trying to regulate an extremely complex system driven by the capitalist imperative to maximize profits. A case in point is the SEC’s attempt to make the market fairer by introducing Order-Handling Rules that created new entities called electronic communication networks, or ECNs, and by imposing restrictions that forced markets such as Instinet to list their quotes on Nasdaq. Patterson summarizes the effects of ECNs:
“With the Order-Handling Rules, the entire Nasdaq marketplace would shift toward an electronic platform wide open to computer-driven trading... the phone-based system of human dealers would quickly become a screen-based cyberpunk network of computer jockeys born and bred in electronic pools such as Island.” (128)
15 With the floodgates open to computerized trading, speed became the name of the game. Stock trading companies are willing to pay high fees to have their computers located on rack space within the server farms run by the major exchanges (colocation), because the proximity shaves milliseconds off the time it takes to transfer information from the exchange to the trader, time that can be turned into money by taking advantage of the small price differences that occur in that interval. Currently an ultrafast transatlantic cable is being constructed, at a cost of several billion dollars, to connect high-frequency traders in the US with those in Britain. The estimated time acceleration is five milliseconds, which works out to about a billion dollars per millisecond of advantage (Johnson et al. 2012, 4). High-frequency traders, using sophisticated algorithms operating at microtemporalities inaccessible to humans, make thousands of trades in a day, adopting a policy of never holding any stock overnight. As a result, the average time a stock is held has plummeted across all the exchanges. After World War II, it clocked in at four years; by the turn of the millennium, it had dropped to eight months; by 2011, it had, according to some estimates, dived to an astonishing twenty-two seconds (Patterson 2012, 46). The amount of information being exchanged on stock trades has correspondingly grown to gargantuan figures. Nanex, a firm that tracks speed trading, estimates that on all US stocks, options, futures and indexes, one trillion bytes of data are exchanged in a single day (Patterson 2012, 63).
16 As speed trading increased, the duopoly of the NYSE and Nasdaq shattered into a number of private markets such as Instinet as well as “dark pools,” trading sites where quotes are hidden from public view. Attempting to cope with this fragmentation, the SEC introduced in 2007 a new set of regulations called Regulation National Market System, or reg NMS. The idea was to bind together all the electronic marketplaces into a single network so they would operate as a true national market. At the heart of reg NMS was the mandate that any order to buy or sell a stock must be routed to the market that had the best price. If, for example, a stock was offered at a lower price on Nasdaq than on the NYSE, a buy order would automatically be routed to Nasdaq. To facilitate this mandate, an electronic ticker tape was instituted, called the Securities Information Processor, or SIP feed.
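The routing mandate reduces to a simple comparison, sketched below. The venue names and quotes are invented, and real routing must also handle ties, protected quotations, and special order types, but the core logic is this scan across the consolidated feed.

```python
# Bare-bones sketch of best-price routing under the mandate described above:
# a buy order goes to the venue showing the lowest offer, a sell order to the
# venue showing the highest bid. Venues and prices are made up.

quotes = {
    "NYSE":   {"bid": 10.01, "ask": 10.04},
    "Nasdaq": {"bid": 10.02, "ask": 10.03},
    "BATS":   {"bid": 10.00, "ask": 10.05},
}

def route(side: str) -> str:
    if side == "buy":                                           # buyer wants the lowest ask
        return min(quotes, key=lambda venue: quotes[venue]["ask"])
    return max(quotes, key=lambda venue: quotes[venue]["bid"])  # seller wants the highest bid

print(route("buy"))    # -> Nasdaq, the venue with the cheapest offer in this snapshot
print(route("sell"))   # -> Nasdaq, the venue with the highest bid in this snapshot
```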
17 One consequence of reg NMS was that the trading houses now had to monitor the prices in all venues, all the time, which virtually forced them to use cutting-edge algorithms. Moreover, there were also unintended consequences. Orders are executed according to their place in the “queue,” the electronic monitoring system that assigns them a priority according to the time they were placed. But, as Patterson explains, “an order in one exchange queue could be suddenly rejected, routed to another exchange, or kicked to the back of the queue if an order that beat its price” appeared (Patterson 2012, 49). This provided a host of new ways to game the system. High-frequency traders had by this time become the largest customers of the exchanges; by 2009, they were estimated to account for 75% of all trades. In addition, the NYSE had transformed from a non-profit entity into a for-profit corporation when in April 2005 it merged with a private exchange, Archipelago (modeled on Levine’s Island), and then a couple of years later with Euronext, the combined European stock market. For its part, Nasdaq had changed in 2006 from a quotation service to a licensed national exchange, and by 2011 more than two-thirds of its revenue came from HFT (Lewis, 163).
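How a place in line is gained or lost is easiest to see with a toy price-time priority queue, sketched below. The orders, prices, and timestamps are invented; the sketch shows only the ranking rule that the gaming strategies exploit.

```python
# Toy price-time priority queue: resting buy orders are ranked by price
# (higher bids first), then by arrival time. A later order at a better price
# jumps ahead of everything placed earlier at worse prices.

import heapq

book = []   # min-heap keyed on (-price, arrival_time)

def place(price: float, arrival: int, owner: str) -> None:
    heapq.heappush(book, (-price, arrival, owner))

place(10.00, 1, "slow trader A")
place(10.00, 2, "slow trader B")
place(10.01, 3, "fast trader")      # better price, latest arrival

while book:
    neg_price, arrival, owner = heapq.heappop(book)
    print(f"{owner:14s} bid {-neg_price:.2f}  (arrived t={arrival})")
# The fast trader's 10.01 bid stands first in line despite arriving last;
# the earlier 10.00 bids are, in effect, kicked to the back of the queue.
```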
18 Operating as for-profit corporations with their own stockholders to please, the exchanges instituted new order types that went far beyond the old-fashioned limit orders (orders to buy or sell within specified price limits) that had been the bread and butter of exchanges in previous decades. Moreover, competition had become so fierce that high-frequency traders were making less on each deal, and as a consequence they became more dependent on “maker/taker” fees, a structure the exchanges had put into place to maximize liquidity (‘liquidity’ here meaning a market deep enough that a buy or sell order will not cause a change in price).
19 To ensure liquidity, a trader who provides it is typically given a small rebate, while a trader who takes it has to pay a small fee (called the “maker/taker” policy). Patterson summarizes the effects when the new order types were combined with the rebate/fee structure. The new type of orders
“allowed high-frequency traders to post orders that remained hidden at a specific price point at the front of the queue when the market was moving, while at the same time pushing other traders back. Even as the market ticked up and down, the order wouldn’t move... By standing at the front of the queue and hidden as the market shifted, the firm could place orders that, time and again, were paid the [rebate or ‘make’] fee. Other traders had no way of knowing that the orders were there. Over and over again, their orders stepped on the hidden trades, which acted effectively as an invisible trap that made other firms pay the ‘take’ fee.” (Patterson 2012, 50)
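The economics behind this maneuvering can be sketched in a few lines. The per-share rebate and fee below are assumptions (real schedules vary by venue and tier and are quoted in fractions of a cent per share); the sketch only shows why sitting at the front of the queue as the “maker” is worth so much.

```python
# Minimal sketch of maker/taker cashflows. Rebate and fee values are
# hypothetical; real exchange schedules differ by venue and volume tier.

MAKER_REBATE = 0.0025   # per share, paid to the order that rested on the book
TAKER_FEE    = 0.0030   # per share, charged to the order that hit it

def cashflows(shares: int) -> dict:
    return {
        "maker receives": round(shares * MAKER_REBATE, 2),
        "taker pays":     round(shares * TAKER_FEE, 2),
        "exchange keeps": round(shares * (TAKER_FEE - MAKER_REBATE), 2),
    }

# A firm whose hidden orders stay at the front of the queue collects the rebate
# on every fill, while the firms stepping on those orders pay the fee.
print(cashflows(1_000_000))
```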
20 This situation makes clear the emergence of a new type of hyper-capitalism that I call vampiric capitalism. As Mark Neocleous observes (2003), Marx in Capital uses the vampire as a metaphor for capital sucking the lifeblood from the working class; by contrast, vampiric capitalism preys on other capitalistic enterprises. These practices illustrate how far the stock market had drifted from its initial purposes. As Sal Arnuk and Joseph Saluzzi (2012) explain, the stock market was originally set up to enable new companies to attract capital by issuing IPOs, thus spurring innovation and creating diversity in the marketplace. It also provided a way for ordinary people with disposable income to invest in the stock market through mutual funds, options and other investment instruments. High-frequency traders perform neither of these useful services. They justify their existence by claiming they provide liquidity for the marketplace (Perez 2011, 163), and because they trade so often, they have also driven down the spread between buy and sell orders, which they argue benefits everyone. But as we shall see, there is another side to this story. Although the profit on each individual trade is small, because high-frequency traders trade so often, the pennies quickly mount into dollars and, eventually, into billions a year—money that ultimately comes out of the pockets of investors. The deleterious effects on innovation may be seen in the alarming decrease in publicly traded companies: in 1997, there were 8,200 public companies; by 2010, only 4,000 remained (Patterson 2012, 59).
21 The most disastrous effects, however, are the instabilities that high-frequency trading introduces. Nanex, the company that analyzes the algorithms high-frequency traders use, detected an algorithm it called the Disruptor, designed to flood the market with so many orders that it effectively disrupted the market itself (Patterson 2012, 63). These instabilities became shockingly evident on May 6, 2010, when in the space of a few minutes the Dow fell 700 points, and then just as quickly rebounded. How did this happen? In an already down market nervous about possible defaults by Greece and Spain, Waddell & Reed Financial of Kansas City was executing a massive order to sell seventy-five thousand S & P 500 E-mini futures contracts, worth about $4 billion. The algorithm it was using was designed to sell at a pace that would keep it at about 9% of the market’s overall volume, with thirty-second pauses to throw off the shark algorithms hunting for “whales” (large orders) so they could front-run them. These sell orders were bought by high-frequency funds that, within milliseconds, sold them again to other high-frequency traders at slightly lower prices; those algorithms in turn sold them again. As the volume shot up and the market plunged down, the trading algorithm from Waddell & Reed sold more because its 9% limit kept increasing, which prompted an even fiercer feedback cycle. Patterson reports that “within a fourteen second period high-frequency traders bought and sold an astonishing twenty-seven thousand E-mini contracts” (Patterson 2012, 264).
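A crude simulation can convey the shape of this feedback loop. All numbers below (the churn, the price impact, the 1.6 “hot potato” multiplier) are invented; this is a caricature of the dynamic the paragraph describes, not a model of the actual May 6 data.

```python
# Caricature of the volume-pegged feedback loop described above: a sell
# algorithm pegged to 9% of recent volume meets high-frequency traders who
# re-trade the contracts among themselves, inflating volume, which makes the
# sell algorithm sell even faster. All parameters are invented.

contracts_left = 75_000
hft_churn = 10_000        # contracts the HFTs pass among themselves each round
price = 1_160.00          # made-up E-mini level

for minute in range(1, 16):
    volume = hft_churn
    to_sell = min(contracts_left, int(0.09 * volume))   # sell 9% of observed volume
    contracts_left -= to_sell
    volume += to_sell
    hft_churn = int(volume * 1.6)     # "hot potato": each sale gets re-traded repeatedly
    price -= to_sell * 0.002          # invented linear price impact
    print(f"minute {minute:2d}: sold {to_sell:5d}, volume {volume:7d}, price {price:8.2f}")
    if contracts_left == 0:
        break
```

The point of the caricature is the coupling: the more the fast traders churn, the more the pegged algorithm sells, which gives the fast traders more to churn.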
22 As the market plunged, other stocks were affected as well. Accenture, a global consulting company that normally sold for about $50 a share, dropped to an absurd price of a penny a share, a so-called “stub quote” that trading firms place simply to fulfill their obligations as market makers but that they never expect to have filled. Procter & Gamble plunged from its normal price of $60 a share to around $30. On the other end of the scale, some stocks, notably Apple, demonstrated an amazing upward spike, reaching $100,000 per share (likely another stub quote). As the insanity proceeded, many high-frequency traders, alarmed by the volatility and concerned that transactions would be retroactively cancelled by the SEC, simply pulled the plug on their computers. This meant that even fewer buyers were in the market, accelerating the market’s downward plunge. Only when the Chicago Mercantile Exchange paused trading in the E-mini for five seconds was the feedback cycle broken, and at that point, because prices on many stocks were ridiculously low, the algorithms started buying and, within a couple of minutes, the market was back to where it had been before. But not before inflicting real harm on many investors. One person who was trying to sell Procter & Gamble just before its price bottomed out lost $17,000, and a hedge fund in Dallas lost several million dollars when the price of options it was buying spiked from 90 cents to $30 per contract.
23 Subsequently, the SEC revoked or “broke” trades that exceeded a 60% variation from their price before the flash crash. Then the SEC, in coordination with the Commodity Futures Trading Commission (CFTC), set about investigating the flash crash, issuing a report in September 2010 (while the flash crash took only five minutes to plunge and recover, they took a full four months to figure out what happened). The report focused on liquidity, concluding that Waddell & Reed’s sell order was the culprit, initiating a chain of events that “sucked liquidity out of the market and allowed prices to go into freefall” (Buchanan 2011). Mark Buchanan, a theoretical physicist who blogs on financial matters, references the research done by Eric Scott Hunsader, founder of Nanex and a software engineer. Hunsader followed 6,483 trades that Waddell & Reed made that fateful day; Buchanan notes that “the company’s execution broker fed the market throughout the day—a tactic specifically designed to minimize the price impact of a large sale” (Buchanan 2011, 2). Instead of locating the problem in this sale, Hunsader’s analysis
“suggests this plunge was caused by high-frequency traders. They typically act as liquidity providers, standing ready to buy and sell at certain price levels. But the day’s volatility prompted them to dump their holdings to avoid losses... It was this selling, not Waddell & Reed’s passive orders, that caused the liquidity to disappear.” (Buchanan 2011, 2)
24 The comments to Buchanan’s article are revealing. “Guest” called the SEC report a “fairy tale,” and “Fritz Juhnke” commented that
“it is about time someone pointed out that ‘making trades’ is not the equivalent to providing liquidity. The exchanges have hoodwinked the regulators into believing that it is acceptable to pay brokers for making trades. The farce is that the exchanges call it ‘providing liquidity.’ Regulators need to wake up to the fact that sales which trigger due to a price decrease are removing liquidity (i.e. exaggerating price moves) rather than providing it.” (Buchanan 2011, 3)
“SofaCall” commented,
“This is why I have been out of the market for the past six years. It’s no longer possible to handicap value. All you can do is try to catch the trend—which is not much different from casino gambling. Only the gaming commissions do a better job of assuring that casinos are honest than the SEC does with the financial markets.” (Buchanan 2011, 4)
25 Calling the SEC report “a trend in finance industry public relations strategy,” “Matthew” noted that most people “won’t understand they are being ripped off if the con is cloaked in a modest level of complexity” (Buchanan 2011, 5). “H_H_Holmes” succinctly summed up the consensus: “The façade that the industry has anything to do with ‘people’ investing in ‘businesses’ is gone. All algorithms, all the time. Joe Six-Pak, meet Skynet” (Buchanan 2011, 4).
26 The significance of the flash crash was underscored by Nanex’s research into other mini-flash crashes that occurred from 2006 to 2010 but went largely unnoticed because they involved single stocks and unfolded too quickly to attract attention. Flash crashes tend to disappear in statistics using day-by-day figures; a finer-grained temporal metric, such as the one used by Nanex, is necessary to spot them. Nanex’s analysis found 254 mini-crashes in 2006, before reg NMS was initiated, and 2,576 in 2007, when reg NMS was being rolled out over the course of a few months. In 2008, when reg NMS was fully in effect, that number increased to 4,065. Clearly, the unintended effect of reg NMS was to increase dramatically the likelihood of flash crashes.1
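What a “finer-grained temporal metric” means in practice can be suggested with a small scanning routine. The 0.8% / 1.5-second criterion below is in the spirit of the thresholds used in the Nanex and Johnson et al. studies, but the exact definition, and the sample ticks, are assumptions made for illustration.

```python
# Sketch of a fine-grained scan for mini flash crashes: flag any moment at
# which the price has fallen more than 0.8% from its high within the preceding
# 1.5 seconds. Thresholds and sample data are illustrative assumptions.

def mini_crashes(ticks, window=1.5, threshold=0.008):
    """ticks: list of (timestamp_in_seconds, price), sorted by time."""
    events, start = [], 0
    for end in range(len(ticks)):
        while ticks[end][0] - ticks[start][0] > window:
            start += 1
        high = max(price for _, price in ticks[start:end + 1])
        drop = (high - ticks[end][1]) / high
        if drop > threshold:
            events.append((ticks[start][0], ticks[end][0], round(drop * 100, 2)))
    return events

sample = [(0.0, 50.00), (0.4, 49.95), (0.8, 49.70), (1.1, 49.40), (1.3, 49.55)]
print(mini_crashes(sample))
# The 1.2% slide within 1.1 seconds registers here; in daily statistics it would vanish.
```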
27 Neil Johnson, a physicist at the University of Miami, published with his colleagues “Financial Black Swans Driven by Ultrafast Machine Ecology” (Johnson et al. 2012), which confirmed Nanex’s research (“Black Swan” denotes a highly unlikely but highly significant event and alludes to Nassim Nicholas Taleb’s book of that title, 2010). Their abstract notes,
“We provide empirical evidence for, and an accompanying theory of, an abrupt system-wide transition from a mixed human-machine phase to a new all-machine phase characterized by frequent black swan events with ultrafast durations.”
28 Analyzing 18,530 black swan events between 2006 and 2011, they construct a model indicating that black swan events become more likely as the time duration decreases and as the diversity of strategies diminishes. They assume that the algorithms, having to act so fast, must have only a few strategies from which to choose, and moreover that many algorithms will have very similar strategies, exacerbating the possibilities for feedback loops. The upshot is that black swan events are not anomalous; rather, they reflect the dynamics of machine-driven trading in which humans play no part in the actual transactions. Reflecting back on the SEC report about the May 6, 2010 flash crash, we can now see that its conclusion that the Waddell & Reed transaction was the culprit is incorrect. It may have been the initiating event (i.e., the “last straw”), but it is the system dynamics that make such crashes inevitable. Lewis quotes Brad Katsuyama’s reaction when he analyzed the SEC report—what leaped out to him was its “old-fashioned sense of time” (Lewis, 81). Katsuyama concluded, “Once you get a sense of the speed... you realize that explanations like this... are not right” (81).
29 For better and worse, finance capital is so deeply enmeshed with the self-organizing ecology of ultrafast machine algorithms that it has become impossible to think of our global economy without them. Regulations by the SEC and other bodies are crucially important, but reg NMS makes clear that the machine ecology is extraordinarily sensitive to small changes in its environment (i.e., the regulatory framework within which it operates). Every new regulation introduces new ways to game the system, a fact confirmed by historical research (101). Consequently, Lewis’s interlocutors conclude, “There was zero chance that the problem would be solved by financial regulation” (101). Their solution, Lewis explains, was to design an exchange that does not incentivize the predations of HFT, realized in their creation of Investors Exchange (IEX).
30 Here is precisely where the humanities have a crucial role to play. Catalyzed by an idealistic desire to make the markets fairer, Katsuyama and his colleagues put everything on the line to make this vision a reality. Their commitment illustrates that in a large sense, the situation is not strictly a technological problem but one of value and meaning (recall Heidegger’s famous argument that the essence of technology is nothing technological). One of the few humanists to tackle the issue is Bernard Stiegler. Through a series of important texts—Technics and Time 1 and 2 (1994/1998; 1996/2008), Taking Care of Youth and the Generations (2008/2010), and especially For a New Critique of Political Economy (2009/2010)—he undertakes the ambitious project of proposing a general framework through which to understand the co-evolution of humans and technics. Because of its scope, the project has both impressive virtues and limitations that may not be immediately apparent. The test case proposed here, automated trading, may help to refine Stiegler’s framework and also reveal new insights as to where the co-evolution between humans and technics is heading in the contemporary period.
31 In Technics and Time 1 and 2, Stiegler develops the concept of tertiary retention—memories stored in artefacts that allow access to events a person never experienced first-hand. Moreover, he argues that tertiary retention precedes individual cognition. In Taking Care, this formulation is linked with the development of “long circuits” that allow transindividuation to occur, resulting in an individual becoming intellectually mature and taking responsibility. Subverting the development of “long circuits” are what Stiegler calls the programming industries, such as television, video games, and the Web, which seek to capture the attention of young people and convert the tradition of what I have called deep attention into hyper attention (Hayles 2012).
32 The case of automated trading programs allows us to see in Stiegler’s concept of tertiary retention an unrecognized print-centric bias. While tertiary retention works well for books as external storage devices, it covers over the rupture that occurred when technics ceased to be only about storage and instead became about machine agency and systemic machine ecologies. A book can be said to possess agency when a human reads it and the act of reading causes the human’s cognitive system to work in a different way—a dynamic central to the humanities, with their deep tradition of ideas conveyed through writing or what Stiegler calls a “grammatological” process. Note, however, that this agency is converted from a passive possibility into an actuality only because a human is involved in writing and reading. In contrast, the point of automated trading programs is precisely that, once created and set in motion, they do not require any human agency to act. Indeed, humans are deliberately cut out of the circuit to allow the machines access to the microtemporalities essential for high-frequency trading. Stiegler is correct in noting that the creation of a “grammatological” process such as alphabetization breaks a continuous flow of words into discrete units such as letters, phonemes, etc. However, using this same term for digitization glosses over the difference between alphabetization and executable program code: while letters are passive, code executed by a machine can actually make things happen without human intervention. “Long circuits,” as Stiegler uses them, apply to human cognition. But machines have “long circuits” too, and the effects here are quite different from those Stiegler attributes to humans. When the automated computer programs execute trades, they often do so in circuitous routes in order to optimize certain parameters, for example keeping the fees that brokers pay as low as possible. Their effects, in other words, protect and advance vampiric capitalism, as pointed out by Arnuk and Saluzzi as they diagram the convoluted “wandering path from order-generation to execution” (2012, 146).
33 The “long circuits” in intelligent machines are also related to anticipation. This too becomes a machine function as the algorithms search for patterns that will enable them to predict the next micro-movement of stock prices, as well as orders that competing algorithms may be about to execute. In a sense, these algorithms have both too much memory—hypermnesia—as they race through real-time data streams analyzing what all the other algorithms are doing, and too little memory—hypomnesia—as they jettison the information about what happened yesterday (never mind years or centuries ago) to cope with the huge amounts of information streaming into their processing units. Unlike humans who must sleep to process their memories, these machines have no unused computer cycles. Even when the markets are closed, they are still gathering data, analyzing it, and placing orders that affect what the opening stock prices will be.
34 Clearly, then, focusing on memory functions alone is insufficient to understand the complex dynamics of the machine ecologies of automated trading. Also crucial is the agency that the machines possess and their abilities to analyze and predict at lightning speeds, and also to evolve and learn as they compete with other algorithms. Automated trading systems embody evolutionary dynamics that can lead to unpredictable consequences and emergent behaviors. Humans may set up these systems, but they are not in control of how they operate, evolve, and mutate. The issue is not memory alone, but a transformation of global economic systems that increasingly drive us toward vampiric capitalism and away from social responsibility.
35 How might humanists respond to the increasingly precarious position into which these developments are leading us? Here Stiegler makes a crucial contribution, for he insists that the fundamental questions concerning technics have to do with meaning, interpretation, and values. Humanists are unlikely to contribute to debates about regulatory reform of the stock markets, a task that requires intimate knowledge of the systemic dynamics to ensure that reforms will not have deleterious unintended consequences, which in any event may be unavoidable. Humanists can contribute, however, to discussions about the larger social purposes that finance capital is intended to serve. As recent events have vividly demonstrated, the profit motive stripped of all other considerations leads to a disastrous spike in systemic risk and, consequently, in global economic instability. Humanists can help to put finance capital in historical perspective and connect it with values such as social responsibility, fairness and economic justice.
36 Questions of meaning that occupy the humanities, then, are if anything even more important with the rise of the cognitive nonconscious. At the same time, the emphasis on human reason alone as the relevant attribute to explain the world (as in rational actor theory) surely needs modifications in two directions. We must recognize that cognition is a much broader category than thought, and we also need to re-position cognition as a systemic property that occurs in complex adaptive systems and not solely inside the mind of a single human actor (Hayles 2014).
37 What kind of initiatives will further this goal? There is already a growing body of research that undertakes this task, including historical research on finance capital (Poovey, Baucom), ethnographies of Wall Street (Ho, MacKenzie), STS-inflected studies of finance (Callon, MacKenzie), and analyses of automated cognition (Parisi). This emerging field, which lacks a universally acknowledged name, might be called “Critical Studies in Finance,” a term that links economic practices with the recognition that the world has a stake in these practices, which consequently cannot and should not be considered only in terms of how much profit they generate.
38 To be taken seriously in this endeavor, humanists will need to learn the vocabulary, mechanisms, and histories of finance capital. While Stiegler’s work has much to offer in this regard, it is couched in terminology that an economist would find completely opaque. The work of building bridges between finance capital and the rich critical and philosophical traditions of the humanities requires that humanists learn to write and speak in ways legible to the finance community, for only then can there be a successful transmission of ideas across these fields. The price of admission to discussions with economists, business school professors, traders, politicians and other influential actors is steep, but the potential contributions humanists can make more than justify the investment. If there is no way out of the global financial system, then the way forward may require going more deeply into it. “Critical Studies in Finance Capital” should be a project in which humanists claim their stakes and make their arguments, transforming it even as we are also transformed by it.
Notes
1 www.nanex.net/FlashCrashEquities/FlashCrashAnalysis_Equities.html
Author
N. Katherine Hayles, Professor of Literature at Duke University, teaches and writes on the relations of literature, science, and technology in the twentieth and twenty-first centuries. Her book How We Became Posthuman: Virtual Bodies in Literature, Cybernetics and Informatics (1999) won the René Wellek Prize for the best book in literary theory for 1998-99, and her book Writing Machines (2002) won the Suzanne Langer Award for Outstanding Scholarship. She is currently at work on a book, Cognition Everywhere: The Rise of the Cognitive Nonconscious (2014).