Shackling Accidents: Culture and Chance
Steven Connor
A talk given in different versions at the University of Kent, 30 November 2011, and the Royal Academy Schools, London, 23 January 2012.
In what follows I have three purposes. I would like first of all to show that we misunderstand the operations of what we call ‘chance’ if we reify it, separating it from the sphere of the determined. This makes the aims either of leaving nothing to chance or of yielding oneself up to it, especially if this is with the aim of ‘loosening’ or ‘liberating’, equally implausible. Secondly, I want to show the consequences that a reformed view of chance may have for our understanding of history and our relations to it. I will propose in particular that past and present are always intertwined probabilistically, such that ‘what happened’ will always be able to be put into play in new, and not entirely predictable interpretative contexts. There will always be facticity, since the practice of history would be meaningless without it, but that facticity will be cryptic or abstract until the force of the facticity is activated within particular fields of probability. Finally, I will warn that, although the prospect of a more probabilistic climate in the arts and humanities may suggest new forms of judgement and sense-making, it is important not to overestimate the chances of this, since, persistent rumours to the contrary, knowing what it is you are doing does not always conduce to doing it more successfully.
Tristan Tzara offers this well-known recipe for making a chance poem:
TO MAKE A DADAIST POEM
Take a newspaper.
Take some scissors.
Choose from this paper an article of the length you want to make your poem.
Cut out the article.
Next carefully cut out each of the words that makes up this article and put them all in a bag.
Shake gently.
Next take out each cutting one after the other.
Copy conscientiously in the order in which they left the bag.
The poem will resemble you.
And there you are—an infinitely original author of charming sensibility, even though unappreciated by the vulgar herd. (Tzara 1920, 18)
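Tzara’s recipe is, in effect, an algorithm, and can be sketched in a few lines of code (a playful sketch, not part of Tzara’s text: the choice of Python, the treatment of ‘cutting out’ as splitting on whitespace, and a seed standing in for the gentleness of the shake are all my assumptions):

```python
import random

def dadaist_poem(article, seed=None):
    """Tzara's recipe: cut out each word, shake the bag, copy out in bag order."""
    words = article.split()    # 'carefully cut out each of the words'
    bag = random.Random(seed)  # the bag (seeded, so the shake stays 'gentle')
    bag.shuffle(words)         # shake gently
    return " ".join(words)     # 'copy conscientiously in the order in which
                               #  they left the bag'

print(dadaist_poem("Take a newspaper. Take some scissors.", seed=1920))
```

Note how much determination survives even here: the article must be chosen, the procedure followed to the letter; only the order of the words is left to the bag.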
The most immediately striking feature of this text is how much determination – in both senses, of strong intent and the specification of actions and effects – is threaded through its chance operations. To begin with, one must decide, or, rather, have already decided, how long the poem one wants to write must be. Indeed, prior to that decision, one must evidently have already decided to write a poem, and a dadaist poem at that. One must choose one article, and take care to cut round each of the words in the article. The bag in which the words are to be re-ordered is to be shaken ‘gently’ – as though agitating it too vigorously might in some way invalidate the result, or skew its chanciness, perhaps shaking it so far away from orderliness as to risk driving orderliness back in. One must copy it out ‘conscientiously’, and must preserve the order in which the words have come out of the bag. Clearly, these are not the kinds of operation one should undertake in an absent-minded condition, or when there is the possibility of persons from Porlock interrupting the process.
The outcome is that ‘The poem will resemble you.’ Is this supposed to be the triumphant consummation of the aleatory operation, or its hapless collapse? Perhaps Tzara is suggesting that the vigilant suspension of everything that might exert conscious influence over the operation will give access to the unconscious essence of the person doing the selection (this being a common promise made about chance operations in surrealism). But perhaps he is simply pointing to the fact that, given the strict armature of the aleatory ritual, the resulting poem cannot help but end up as a portrait of the chancy artist, or the artist as chancer. That is, the words thus arbitrarily assembled will, mirabile dictu, themselves provide a revelation of the hidden reality of the assembler, which will have been liberated by the process. But perhaps he might also mean that the miraculous benediction of the poem will be the very opposite of a chance occurrence, precisely because it will have been so carefully and rigorously set up, and because the magical procedure mandates, as a matter not at all of chance, but of absolute necessity, that, whatever the result will be, it will be bound, or at least exceedingly likely, to seem like some spooky miracle of aptness.
Tzara’s recipe, which is usually quoted as though it were itself a poem, though, if so, it would seem that it could not, by its own design specifications, be a dadaist poem, is a reflection on the trickiness of achieving chance. Pure chance can only be guaranteed by strict determination, because ‘chance’ cannot be relied upon to happen by chance. So ‘mere’ chance has a good chance of being impure, contaminated by determination. Chance, like death, is hard to avoid, until one resolves to embrace it, at which point, like death again, it has a way of becoming coyly elusive. Furthermore, the recipe seems to recognise that chance does not persist for long. Like ignorance, according to Lady Bracknell, chance is ‘like a delicate exotic fruit; touch it and the bloom is gone’ (Wilde 1971, 266). The recipe intimates how difficult it is to cross over entirely on to the side of chance; seemingly, it is as hard to get chance into one’s poem as it is to keep it out.
Dadaism was only one of the areas of art practice to become interested in trying to exploit the operations of chance. This is something different from simply playing the odds, in the way in which a gambler might, since a gambler only wins if he is lucky. The sort of betting on chance engaged in by Dada is of a kind such that, as long as the chance procedure is carefully-enough constructed and the mechanisms of the aleatory procedure followed to the letter, the player of the game cannot help but get lucky, since they will always and without fail be exposed to the operations of ‘pure’ chance.
The awareness of ‘chance’ has led to a tendency to reify it, even, perhaps, to deify it, in the manner in which Fortuna was, if not worshipped, then acknowledged through medieval times, with corresponding efforts to ensure that it is more and more pure. As well as becoming a substantive, chance has tended to become an adjectival quality, as in the contemporary use of the word ‘random’, or in its intensified form ‘really random’ – the equivalent of being slightly pregnant or reasonably dead. In such a locution, of course, ‘random’ means nothing more than ‘pleasantly unexpected’ or ‘quirkily unpredictable’. But randomness has no specific quality, no defining tone, hue or cast, for randomness is the absence of any detectable determination whatever.
It may help to make it clear to ourselves that the word ‘chance’ does not signal a force of pure randomness. We have a tendency to think of chance as a kind of loosening or dissipation, that scatters coherence and breaks open regularity. But chance is not all on the side of the scattering of coherence. Chance is always plural and never ‘pure’, for it involves an irregular distribution of probabilities between different outcomes. So, where habits and regularities form, these are not opposed to chance, but themselves arise from it. Regularity may be much less probable than irregularity, but it is not in any sense opposed to or on the other side from chance. To say that something happens ‘by chance’ suggests something that happens without a cause or reason. But there is no simple division between things that happen for a clearly determinable reason, or seem to, and those that do not. For most of the things that do have specifiable causes or reasons, those causes or reasons are not absolutely determining, and whether or not they are determining is itself contingent. Whether one thing will turn out to be dependent on another is usually itself dependent on other things still. Only causes or reasons that are absolutely necessary and entirely unaffected by intervening circumstance, and not just highly probable, are immune from chance – but there are very few of these and perhaps none at all. There is, James Clerk Maxwell was reluctantly obliged to accept, no absolute law decreeing that closed energy systems move from a condition of lower to a condition of higher entropy, that is, in which, though exactly the same amount of energy remains in the system, it is distributed in such a way that less of it is able to be converted into work. That this will tend to occur is not necessary, but just very, very probable.
What is known as the Second ‘Law’ of Thermodynamics is therefore no such thing, but rather just a moral certainty, as it might have been called in the eighteenth century, or a racing certainty, as we might call it in ours. Every time entropy does in fact increase, it is dependent on the chance that, once again, the more probable rather than the less will have happened, when it did not absolutely have to. The chances may be a squazillion to one that the randomly moving molecules in a cup of hot coffee will one day all spontaneously assemble themselves up into a hologram of the Sacred Heart shimmering in mid-air, but, since there is nothing to prohibit it absolutely, every time it fails to happen, it is in part a matter of chance – in the sense of the differential distribution of chances – rather than necessity that it has not (but you’ll be starting to see that I think chance is always only ever chance ‘in part’ and never wholly or solely).
This complexity may put a new complexion on the interest in the operations of chance that arises in many different art forms at the beginning of the twentieth century. We may perhaps see this aleatory effervescence as a reversion to the postulation of the purely accidental that had seemingly been in retreat since the 1650s, when Pascal and Fermat engaged in the correspondence about the ‘problem of points’, or how to distribute the stake in an interrupted game, that inaugurated the modern mathematics of probability. Up to that point, God had veritably not played dice, even if the devil (the deuce) did. That is, there was the realm of absolute certainties, and the realm of the purely accidental, with nothing in between. One could count on the reversals of fortune being unaccountable, but no more precise calculus of possibilities could be undertaken. As Ian Hacking has shown in The Emergence of Probability and The Taming of Chance, the next 250 years would see mathematics and statistics making ever deeper inroads into the previously trackless terrain of chance. Physics, economics and government all came to depend upon probability calculus. It was not until the beginning of the twentieth century that, on various fronts, embodied in the work of philosophers like C. S. Peirce and poets like Mallarmé, the idea of pure or ‘objective’ chance began to reassert itself.
In an obvious sense, this looks like the result of recognition both of the unavoidability of chance and an acknowledgement of its generative powers. Artists of many different kinds have seen an openness to chance as one of the most powerful forms of resistance, discovery and renewal in a world characterised increasingly by rational management, and the apparently remorseless reduction of every kind of risk. One of the most dubious of the many ways there are of reifying or in fact personifying chance is by identification of it with the unconscious. Thus, surrealists, following nineteenth-century spiritualists, who themselves followed the history of divinatory practice, claimed to have come up with a number of techniques to suspend the determination or invigilation of the conscious self, chance being thought to be a foolproof way to outflank the dozing sentries and censors of rationality, in the ever more sure and certain hope that what will then be enabled to speak will be ‘the unconscious’. The definite article here is a kind of trick or sleight of tongue. On the one hand, the unconscious is thought to be a personal unconscious, that article than which nothing could in fact be more definite, since it is the secret core of the ore of my personality, its determining engine, its essence. On the other hand, it is the part of me that is thought to be ‘open to’ or ‘traversed by’ all kinds of accidents and randomness.
This is a very different notion from that of the Freudian unconscious, which is, we must recall, not in the least accidental, since it is brought into being by the act of repression, and things are not repressed at random, not even randomness itself. Now, I believe in the existence of the unconscious, by which I mean that I believe that there are many things of which I am not and could not be conscious, for several different kinds of reason; for instance, because they are known, but unfortunately not by me, like whether it is raining in Buenos Aires, or if the 9.25 from Bristol actually arrived on time this morning; or because they are currently unknown by anybody, like the exact population of the earth at this precise moment; or because nobody really knows for sure whether it is possible ever to know them. (For reasons of economy I am of course simply conflating ‘knowing’ here with ‘being aware of’.) Naturally, this is a very large, not to say immense, understanding of the scope of the unconscious, and thus a very weak one, which, because it can do all kinds of work in different contexts, depending, does no very specific work in any particular direction.
Of course, this is not what is meant by ‘the unconscious’, which is supposed to be not just the sum total of all the things of which I happen not to be conscious, or to be able to be conscious. My unconscious is supposed to be a personalised profile of the purely contingent, an anthology of exquisitely individuated accidents. It is a certain subset of all these forms of unconsciousness, consisting of things that, for specific and significant reasons, I have difficulty in summoning to mind or acknowledging. It is this notion, the idea that there is a little pocket of personalised accident that is my unconscious, that I find unpersuasive. For instance, I can never remember the name of the actor who plays Dumbledore in the Harry Potter films. Big chap, rather gravelly voice, you know who I mean. Actually, that is not true, I remember this name all the time, in the sense of actively recalling it, repeatedly winkling it out, blinking, from its, or my, condition of nescience. Since I like this actor’s work a lot and have occasion often to refer to his propensity for playing practical jokes on fellow actors, I really need to have this name accessible to me, and so have developed mnemonic routines for keeping it at hand. For example, when I saw him in Pinter’s No Man’s Land, his legs collapsed wonderfully under him in the scene in which he is required suddenly to pass out, which suggested to me that it would be an infallible ruse to associate his surname with French jambe, leg. But where other such mnemonics prove to be stubbornly unshiftable in proportion to their arbitrariness or absurdity, this one fails to do its job, for it itself wriggles skittishly away from my groping memory, so I have begun to think I will need to tag it with its own mnemonic to enable me to summon it to my service. It is indeed as though I am, for some reason I must apparently keep secret from myself, stubbornly determined to remember to forget the name of Michael Gambon.
Perhaps there is after all some interesting significance in all of this. Perhaps I could and should undertake some auto-analytic investigation, to be entitled perhaps ‘On A Disturbance of Memory in the Duke of York’s Theatre’, and use it as a way to uncover some tediously predictable traumatic imbroglio at the heart of my being. But I do not think so. I do not think I am wrong in saying that this phenomenon takes place mostly on the principle that nothing succeeds like success, that is, that it has established itself as a minimally troublesome little cognitive sub-routine, that tends to keep happening just because it so often has before. In one sense, to be sure, it’s a personal unconscious, in the sense that it is a sort of local impediment to consciousness, and in the sense that it affects me rather than you, like the tendency I have to get styes on my left rather than my right eye, but this is not to say that it has a personal significance (beyond the one I have just been, but quite self-consciously, constructing for it, of course). The chances of there not being some such glitches in each one of us are almost infinitely remote, but this does not make them an unconscious that has a particular, personally significant and defining profile, something that makes it ‘my’ unconscious, in some more interesting way, some way that bears illuminatingly upon the business of my being me.
This is admittedly the classically Freudian idea of the personal unconscious, that tends to get referred to with the distinctly unFreudian term the ‘subconscious’, especially if one wishes to evoke a sort of feral second self, a Caliban, snuffling and scuttling around the cellar of my being, who can be caught only briefly, with matted hair and wild eyes, in the beam of my torch. The problem with this Mr Hyde or Dorian Gray idea of the unconscious is that it is so domestically determinate. It infallibly resembles me, if only as the negative of a photograph resembles the positive. But what the surrealists tended to graft on to this Freudian idea of the personal unconscious is the Jungian idea of the collective unconscious, an idea that Freud himself certainly dallied with, in his more anthropological moments, but which he was much more reluctant than Jung to furnish with specific fixtures and fittings. For the problem with the collective unconscious is that it is actually much more, rather than much less, predictable than my conscious, choosing self, as the many catalogues of its allegedly archetypal contents make clear. Often, the mad are saddled with the job of embodying the possibility of exposure to the unconscious, since this is supposed to be brought about by the dissolution of the bonds of rational thought. The mad (that incarcerating definite article again) are thought to be more open than 9-5 citizens to experiences and connections that are rich and strange, a veritable thesaurus of the unpredictable. The problem is that the mad are on the whole a lot less unpredictable than the sane. Just as there are few more tedious ways of spending an evening than in the company of somebody experiencing pharmacological hallucination, so many psychotic delusions are in fact much less exotic, and much more systematic and logically ordered, than even the run-of-the-mill reveries of a moderately well-adjusted person.
Walk around the gallery of the Prinzhorn collection in Heidelberg, and you will, I think, tire quickly of all that B-grade sexual fetishism, all those bales of newspaper and coils of string, all those alien creatures and influencing machines. The body without organs is a regimented thing indeed. Antonin Artauds are rare, while David Ickes are sadly standard-issue.
The unconscious has in fact been forcibly recruited to the work of rescuing chance as such from the tenacious grip of probability theory. This is a rescuing of the subject from itself, insofar as the modern subject can be thought of as having been contaminated by rationality and prudential calculus. The powers of resistance and renewal embodied in ‘the unconscious’ depend upon a conception of chance as a kind of pure exteriority to reason, or to the reasoning subject. But chance is in fact never available as this kind of absolute exteriority, or in any sort of ‘pure’ form, for something of the same reason that the subject can never fully be available to itself, namely that the subject always is in part the thing it would constitute as an exterior object. Similarly, the art that would make chance an exterior force on which to feed will always be liable to encounter the force of chance as part of its own operations, and intertwined with its most determined purposes.
This is because it is always possible, by chance, that some disappointingly or suspiciously orderly arrangement might arise in any undetermined procedure. None of us would be very convinced if, in response to the request to provide a sequence of 6 numbers at random, a program were to generate the sequence 123456, but there is quite a significant chance of such a sequence arising at random. If it were really random, the shuffle program on the iPod would play the same song twice or even three times in succession often enough for it to have happened at least to somebody you know. But it doesn’t, because the shuffling is in fact loaded in certain ways. As a sometime historian of and speculator on the voice, I have had occasion to enjoy and endure a number of episodes or performances of glossolalia, both in artistic and religious contexts, in which sounds are emitted that are said to be pure nonsensical utterance, or at least to belong to no recognisable language. The interesting feature of such utterances is that, far from being driven by the pure language of the spirit, or of the elemental passions, they always in fact seem to be subject to vigilant internal monitoring, so as to avoid the accidental articulation of meaningful words. Given that many of these words arise from the crystallisation of accident out of the mouths of babes and sucklings in many different times and climes, it is highly improbable that an entirely unfiltered stream of spontaneous utterance would not occasionally contain them, yet I have never heard a glossolalic performer come near to articulating the words ‘mummy’ or ‘plop’ or ‘bugger’ or ‘haddock’. In order to count as entirely open, such speech cannot be open to simply anything and everything. The order of accident must be tacitly defended against the accident of order.
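The iPod point is easy to check. A truly uniform shuffle over, say, a 50-song library would produce an immediate repeat somewhere in a few hundred plays almost every time: the chance of avoiding one altogether is (49/50)^299, well under one per cent. A quick simulation makes the point (the library size and play count are illustrative assumptions of mine, not Apple’s figures):

```python
import random

def has_back_to_back_repeat(plays):
    """True if any song is immediately followed by itself."""
    return any(a == b for a, b in zip(plays, plays[1:]))

rng = random.Random(42)
n_songs, n_plays, sessions = 50, 300, 1000

# Count how many truly random 300-play sessions contain an immediate repeat.
hits = sum(
    has_back_to_back_repeat([rng.randrange(n_songs) for _ in range(n_plays)])
    for _ in range(sessions)
)
print(f"{hits} of {sessions} truly random sessions contained an immediate repeat")
```

That repeats are so rarely heard in practice is the giveaway that the shuffle is loaded against them.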
Seen in these terms, the ideology of chance may be seen as the effort to disavow this intermingling of the determinate and the indeterminate – an intermingling that can never itself be fully determinate. What we may as well call the aleator, or artist of chance, is therefore the mirror image of the determinist; where the latter strives to leave nothing to chance, the former is at pains to have absolutely nothing go to plan.
These arguments are intended to act as a prelude to some reflections about the possibility of acknowledging the work of chance or perhaps, since this word is increasingly used to mean ‘pure’ chance, or randomness, the conditions of probability at work in the fields of art and culture. Works on the operations of chance in different art forms tend to focus on the ways in which such forms might or might not succeed in ‘opening up’ to a principle that is held to be alien or inimical to its nature. Such a perspective allows one to rest safe in the assumption that ordinarily chance has no part in the constitution of the art or culture in question, for there could be no question of voluntarily opening up to something to which one is already in the nature of things exposed, any more than one can decide to exclude it absolutely.
The enthusiast for the art of chance procedures, convinced that such procedures produce a different kind of art from other kinds of compositional procedure, can easily be imagined as objecting in the following terms. ‘Yes, we can concede that there is an element of chance in all works, indeed in all actions of all kinds. But surely what matters is the degree of indetermination to which a work may be subject? Surely strongly designed or intended works are much more likely to exhibit a determinate and predictable form, over which chance has much less chance to exert an effect, than aleatory works, which are much less likely to be predictable?’ To take this line would be to abandon the absolute distinction between absolutely determinate and absolutely indeterminate, but to retain the difference nevertheless in a logic of approximation in which works are more or less, but to all intents and purposes mostly, determinate, or more or less, but pretty much mostly, indeterminate. Of course, you can easily increase the chances of certain things happening – the chances of predictable things happening (follow a routine) or the chances of unpredictable things happening (employ a randomiser, like a die or a rugby ball), but, as long as you are operating in a zone in which probability is somewhere between 0 and 1, you are not going to be able to increase or decrease ‘chance’ itself, precisely because that is not a known or given quantity. How much chance will in fact operate will always be in part a matter of chance. (But how large a part, exactly? Well, that will depend…)
The approximative way of thinking tends to indemnify a misunderstanding about the operations of chance and probability, a misunderstanding that comes from the defensive desire to reify chance, or sequester it entirely from determination. Call it the ‘aggregative fallacy’. It is nicely illustrated by an argument developed by Stanley Fish in response to a metaphorical scenario projected by Ronald Dworkin as a way of explaining how it is that judges make decisions based on the history of legal precedents. Dworkin asks us to imagine a chain novel being written by a sequence of authors, each of whom reads what has come before and then contributes a chapter of their own. The first writer, says Dworkin, will be free, because they will operate in a field of unconstrained choice – the chapter they write can be about anybody, in any setting, and be written from any point of view and in any style that this frolicking fons-et-origo may decide. As the narrative is handed on and the collective, consecutive plot thickens, Dworkin reasons, the choices available will diminish, as successive authors have to take more and more account of the chapters that have already accumulated, until finally, for the last writer in the chain, it may seem as though there is only one possible concluding chapter that can be written.
Fish responds, counterintuitively, that in fact there is no difference, or at least none in terms of the degree of their freedom, between the first and the last in the chain. The first novelist will be free to write what they like, but they will be constrained by everything that is involved in the decision to write a novel, in terms of their understanding of what a novel is and can do. Nor will the putative last or penultimate in the chain have any essential advantage or disadvantage over those who have come before, or not, at least, in terms of the ratio between their freedom and their constraint. For the accumulated pages on top of which they will be sitting will not in fact be self-interpreting, but will themselves need to be construed. The last in the chain will need to interpret what has come before, and will always have the option of radically redefining his and therefore his reader’s understanding of what the foregoing novel is taken to be. Indeed, one might very well say that, given the kind of thing a novel is, that is, given the fact that sudden swerves of plot direction, or frame-switching and rug-pulling manoeuvres (it was all a dream, the detective himself did it) are so much part of the horizon of expectation of a novel, such radical reinterpretations may actually start to get more likely the longer the novel goes on. This means that the Johnny-come-lately in the chain is precisely as free and precisely as constrained as its Prime Mover – that is, his freedom and his constraint are locked indissolubly together: ‘He is constrained in that he can only continue in ways that are recognizable novel ways (and the same must be said of the first novelist’s act of “beginning”), and he is free in that no amount of textual accumulation can make his choice of one of those ways inescapable’ (Fish 1989, 91).
We can substitute without significant loss the concepts of determination and chance for the terms constraint and freedom. Whether determination grows or diminishes is itself not a given, but will all, always, depend – depend upon the conditions of making out to which the work is subject.
On this estimate, or in this way of conceiving what is involved in the act of estimation, it might sometimes be that the strongly intended or determined work, while being in a straightforward sense more defended against chance, is for that reason more at risk from it. The more set in its ways a novel or artwork may seem to be, the higher the possible yield of innovation or surprise, for both writer and reader. A text that may seem to have settled for a place in a comfortable and unchallenging minority niche, giving a modest but regular revenue of pleasure to its fans – Lady Audley’s Secret as an example of Victorian sensation fiction, say – can be reconstrued by a feminist readership as a searching investigation of the politics of the body.
What is often seen as a desirable dividend of innovation in artworks – largely because of the horizons of interpretation within which the things picked out as artworks tend to operate, in which sudden changes of meaning and value are themselves a premium source of value – may be seen as an undesirable, even catastrophic cost if one is talking about a bank or an immune system. It is commonly suggested nowadays, for example, that the immune system of somebody brought up under conditions of strictly controlled hygiene may be unable to cope with the unexpected infectious or pathogenic agents they may later encounter. By contrast, the immune system of a toddler who has consumed his mandatory peck of dirt and which has therefore been primed by the bacterial noise of random exposures may be much better defended against unpredictable contingencies. We may say that the strongly determined work can have the first kind of immunity. Precisely because it seems so strong, it may in fact be weak at certain crucial points, and in proportion to its strength. The strongly or programmatically undetermined work, by contrast, can come to seem almost immune to accident or the unexpected. In this respect, systematically randomised or aleatory works may be a little like the 1980s TV series based on the work of Roald Dahl that was called Tales of the Unexpected – in which the only thing that might unsettle the viewer would be the failure of an episode to furnish the tediously requisite twist or quirk.
There is another respect in which a strongly determined work may be regarded as more exposed to unpredictability than an undetermined one. For a strongly determined work is very likely to conjoin many different kinds of determination, operating with different degrees of force at different points in the work. The characters in a novel might be conventional, but the language obscure and highly-wrought; the setting might be stable and unvarying, but the plot subject to lurching time-shifts, and so on. The probabilities in such a work are differentially distributed, in something of the way in which, within a given volume of gas said to be at a certain temperature, there is in fact a distribution of different temperatures, of which the apparent or advertised temperature is a statistical average. Precisely because it is determined in so many different ways, and to such different degrees, the strongly-determined work is riddled with entry points for chance fluctuations to do their work, sometimes prompting local adjustments to restore equilibrium, sometimes propagating uncontrollably through the system. Such a work constitutes a stochastic landscape, full of chasms and outcrops, slopes, potholes and wrinkles, in which chance fluctuations might get a toehold.
But this local differentiation is much less likely to be there in a programmatically undetermined work, of the kind that might emerge from the procedure recommended by Tzara, for example. Here, the probabilities, and the improbabilities, are spread out much more uniformly. Since everything is as unpredictable as everything else, there is no landscape or profile of probabilities, no faultlines, no hot spots or shaded valleys, no map of mattering. There is more unpredictability on average in such a work, but because the unpredictability goes uniformly all the way down, it is much more predictable. Where the strongly determined work has many entry points for indetermination, the strongly-undetermined work only has one entry point for a difference that would make a difference, which is at the level of the initiating intention to make an aleatory work. (This may actually be one of the reasons why aleatory works are so routinely accompanied by a justifying framework explaining the precise procedures employed to produce indeterminate outcomes – not to guarantee the paradoxically broken integrity of the work, but actually to make available some point of leverage for the work, since, without the possibility of a difference that would make a difference, the information quotient of the work would be immaculately null.) As a result, the apparently stable work is in fact unstable, or at least at risk from instability; whereas the giddily unguessable work is in fact metastable, given stability, that is, by the very uniformity of its fluctuations. You can count on an absolutely aleatory work to do what it is told.
The system of the completely aleatory work is like the thermodynamic system that is approaching maximum entropy. In thermodynamic terms, entropy is a measure of the energy in a given closed system that is unavailable to do work: the higher the entropy, the less work can be extracted. In thermodynamic systems, the capacity to do work is a function of the amount of organised difference in the system – typically, for example, the separation of hot from cold molecules. The more disorder in the system, the less work can be got from it – you can make a heat engine with a volume of hot gas and a volume of cold gas, but you cannot make an engine when those two energy states have been shuffled together. Perhaps this is why the Lord God warns the lukewarm believer that he will be spewed out of His mouth (Rev. 3.16). Order here is not entirely subjective or observer-dependent, for it can be given a mathematical description. An ordered system is one which can be reduced to and generated by a formula that is more economical than the system itself; a chaotic system is one of which the description would offer no possibility of such compression, and would have to match the system exactly, like Borges’s imaginary map of a territory on the scale of 1:1. Things drift from order to disorder because, in a given system, the number of ways of being ordered (compressible by formula) will always be much smaller than the number of different ways in which it can be disordered (incompressible). In moving from order to disorder, therefore, systems move from the less to the more probable, and maximum entropy equates both to maximum disorder and maximum probability. This may at first seem curious, given our tendency to think that disorder ought to be characterised by improbability. The traditional example of a pack of cards can help us over this difficulty.
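This notion of compressibility can be given a rough and ready demonstration with an ordinary compression routine (Python's zlib here, standing in, very approximately, for the formal measure):

```python
import random
import zlib

# A string generated by a short formula compresses drastically;
# a random string over the same two-letter alphabet barely does.
ordered = "ab" * 500
random.seed(1)  # fixed seed, so the illustration is repeatable
disordered = "".join(random.choice("ab") for _ in range(1000))

compressed_ordered = zlib.compress(ordered.encode())
compressed_disordered = zlib.compress(disordered.encode())

print(len(compressed_ordered), len(compressed_disordered))
```

The ordered string shrinks to a few bytes, because 'repeat "ab" 500 times' is a formula far more economical than the string itself; the shuffled string resists any such reduction.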
There is only one way in which a pack of cards can be ordered such that the four suits are grouped together and the cards run from Ace through to King within each suit. The number of ways in which the cards can fail to achieve this state (52!-1) is huge by contrast and therefore much more likely to occur. This helps to explain why highly disordered states also tend to exhibit what looks like equilibrium; the most likely state for a pack of cards (or, we might just as well say, a bag of letters) that is subject to a series of shufflings is one in which the unpredictability is, so to speak, evenly distributed through the pack.
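The arithmetic here can be checked directly:

```python
import math
import random

orderings = math.factorial(52)   # total number of distinct arrangements
print(orderings)                 # roughly 8 x 10**67

# One ordered arrangement against (52! - 1) disordered ones:
# a shuffled deck is all but certain to come out disordered.
deck = list(range(52))
random.shuffle(deck)
print(deck == sorted(deck))      # almost certainly False
```

One arrangement in some eight followed by sixty-seven zeroes: this is why shuffling reliably produces, and preserves, disorder.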
This can also explain why so many aleatory works are, frankly, so tedious, since they offer so few genuine surprises, or, better perhaps, since their surprises are so unfailingly and unsurprisingly ground out. After the first death, there is no other, as Dylan Thomas unreassuringly writes, since the prospect of dying repeatedly is not for most of us the biggest worry (Thomas 1985, 192). This might seem to contradict Mallarmé, who declared that ‘Un coup de dés jamais n’abolira le hasard’. After the first roll of the dice, the one that decrees that the rest of the work will be generated by rolls of the dice, the scope for chance will be much reduced, precisely because the map of mattering will be so smooth and flat. The maximally randomising act is like the supreme, action-ending act that Cleopatra contemplates:
’Tis paltry to be Caesar:
Not being Fortune, he’s but Fortune’s knave,
A minister of her will: and it is great
To do that thing that ends all other deeds,
Which shackles accidents, and bolts up change (Antony and Cleopatra, 5.2.2-6, Shakespeare 1975, 194)
And yet there is perhaps another sense in which Mallarmé is in fact still right. For the very fact that the pleasures of aleatory works tend to be so flat and insipid is not a feature of the works themselves, nor a matter simply of the quotient of unpredictability they contain. It is also a matter of the way in which they function within particular fields of reception, and of how they work out in the different kinds of field in which they are put to work. That is, it is an exposure to relative unpredictability. We are not dealing with closed systems here, in other words, but chained or interlocking systems, in which one system of probabilities is subjected to the force of another.
I’d like to show this by considering what can happen in practice to such claims to have initiated or increased the amount of play in a system. One of the commonest ways in which chance is reified is as a force of liberation or at least of loosening, that can be employed to create new possibilities in a world thought to be otherwise cabined, cribbed and confined by the iron cage of determination and predictability. As I have already intimated, the dream of such a determinate world and the idea of the liquefying or animating force of ‘pure’ chance dance cheek to cheek. Examples of this are not hard to find, but I have been struck by what seemed to me a representative and instructive one in a recent essay by Natasha Lushetich. The essay describes some of the events that took place in the Fluxus exhibition mounted by Tate Modern in May 2008. Fluxus names a group of artists working in the 1960s and 1970s, whose work is characterised by the devising of various kinds of events and performative procedures. Lushetich writes in particular about the Flux Olympiad, a series of adapted, hampered and otherwise tampered-with games and sports devised by Larry Miller. One of the best of these seems to have been Beci Hendricks’s Stilt Soccer, which, as its title suggests, requires its players to play soccer while on stilts. The result is a series of improvised methods for trying to retain balance while also pursuing the goals of the game – and, of course, the game will only achieve the desired level of agreeable daftness if the players take it seriously, that is, pretend not to be simply pretending to play football.
This leads Lushetich to the suggestion that the game liberates a ‘fundamental undecidability’ (Lushetich 2011, 33), which, parodying more formalised games and sports, ‘restores playfulness to sport and subverts its objectification’ (Lushetich 2011, 34). In this, it is said to be representative of a number of such aleatory procedures which dissolve the ‘structurality of structure’, thus providing ‘a nonhegemonic socio-aesthetic practice’ (Lushetich 2011, 29, 35). Even allowing for the unhelpful smearing of senses in the term ‘nonhegemonic’, which could mean either ‘non-mainstream’ or ‘non-authoritarian’, though only the first of these is really accurate, this judgement seems unexceptionable. Plainly Stilt Soccer is a much looser, much less earnest kind of proceeding than actual soccer. However, I cannot make much sense of the claim that any kind of ‘fundamental undecidability’ is involved in this witty proceeding. First of all, it is governed by rules, just as ordinary soccer is. Indeed it is governed by exactly the same rules that govern soccer played in contact with the ground, albeit combined with another rule, the one requiring the players to walk and run on stilts, that makes all the other rules harder than ever to follow.
In fact, the intriguing thing about Stilt Soccer is that it is a perfectly plausible and possibly in time rather a good game, as well as being a witty send-up of one. If the results are unlike soccer as usually played, one has only to observe children who have only just been introduced to the arbitrary restriction of not being able to use any part of their bodies other than their feet to recognise that the way in which Stilt Soccer interferes with soccer is a pretty exact recapitulation of the way in which soccer itself interferes with the ordinary ways of carrying and projecting a ball – that is, by imposing a restriction that warps the field of probabilities.
But here is what seems to me to be the salient point. That field of probabilities (the differentially distributed likelihood of being able to control the ball with hands, elbows, feet and head) will itself always operate within other fields of probabilities, that determine (but only partially) the ways in which the activity of soccer will be understood to work. These are often spoken of as defining contexts, but I think they are much better thought of as probabilities, that may strongly predispose certain ways of understanding as opposed to others, but do not absolutely determine them. A determining context is not one that rules out chance but rather one in which there appears to be a very strong chance that unpredictable things will not occur.
Seen in this way, Stilt Soccer could only ‘restore playfulness to sport’ (Lushetich 2011, 34) if it were itself taken to be a sport, or a way of playing it. But what are the chances of this? How many people look to the Tate Modern website for details of soccer fixtures? This is the reason that the event is not a game of stilt soccer, but rather the instantiation of an agreeably prankish art-proceeding called Stilt Soccer. Typography is here a reliable indicator of typology; one does not go to the Emirates Stadium to see a work with the title Soccer Match Between Arsenal and Bolton – except, perhaps, implicitly in those games that are tellingly called ‘exhibition matches’. If it caught on, Stilt Soccer would have some chance (though even then a smallish one, I’d say) of restoring playfulness to sports. But Stilt Soccer is, on my estimate, vanishingly unlikely to have any such effect, since, in general, and so far, multi-billion pound sports industries are not much affected by developments in the fields of art practice and aesthetic theory. There is obviously a certain kind of playfulness in Stilt Soccer, but that playfulness is quite strongly determined. How far it can restore playfulness to anything will depend upon how that playfulness is itself put into play, or, as we say, played out, in different fields of expectation or probability.
This leaves – though one might as well say ‘preserves’ – the possibility of writing about chance not as topic (novels about gamblers), nor as strategy (the work of art employing chance procedures) but as universal, yet universally variable condition, or, in fact, the condition of universal variability. Chance would then be regarded, neither as something internal to the work, that is part of its theme or content, nor something outside it to which it is thought to be exposed, but as threaded through the very working of the work itself, as it is put into play. Chance is not on the other side from determination; it is the very process whereby determination and chance are distributed. Determination and chance are not to be put into separate piles and simply totted up, since the force of determination that a work will seem to exercise will itself be a function of chance.
So we might do well to avoid the bipolar mood swings of absolute necessity on the one hand and absolute chance on the other, and learn to inhabit the field of probability or what Gary Saul Morson has followed Aristotle in calling ‘causality for the most part’; as Morson tellingly observes, ‘Books may be called Chance and Necessity, as Jacques Monod’s famous one is, but I have never seen one called Chance, Necessity, and For-the-Most-Part Causality’ (Morson 1998, 295). Morson is right to condemn the Leibnizian or Laplacian determinism that governs ways of seeing society and history. The name of Laplace is unfairly attached to determinism, given his great importance in the history of probability theory. But, even if Laplace is absolved of the blame for determinist thinking, Morson is right to call this way of thinking ‘crypto-theological’ (Morson 1998, 300). Oddly enough, the humanities and social sciences, though recoiling from the forms of quantitative thinking characteristic of the exact sciences, and proclaiming their difference from them in their embrace of the undetermined and indeterminable, in fact assume and inhabit a world of absolutes that has been laughably unlikely since at least the work of Laplace. It is often suggested that human affairs are not to be understood with the reductive models developed for understanding processes in the physical world, because the former involve a multi-parameter calculus that is too large and intricate to undertake with any hope of success. Yet this claim cohabits comfortably with the explicit or implicit dependence on models that, while they are nearly always derived from the analysis of the natural world, are in fact far cruder and more approximate than any of the models that have been developed to deal with physical processes over the last couple of centuries.
These metaphor-models are, for example, geological (‘strains’, ‘faultlines’, ‘eruptions’), or hydraulic (‘currents of influence’), or meteorological (‘prevailing climates of opinion’) or crystallographic (complex symmetrical structures of every kind). But such models would scarcely suffice to describe and predict the bobbings of a rubber duck in the bath, let alone the movements and tendencies of human affairs. If these affairs are really as complex as we say, why do we in the arts and humanities cling to such clunky, clanking machineries of mind to model them? The most charitable explanation is because we have no others available to us, or none that we are prepared to try to understand and adapt.
This dependence on the most reductive kinds of models permits us to permit ourselves to confuse precision and clarity. Precision in fact requires indeterminacy: it makes fuzziness unavoidable. Absolute clarity, by contrast, depends upon approximation. Strangely, then, it is the inexact sciences that breed absolutes, and the exact sciences that have long recognised the need to operate without them. The less you care about exactness, the more absoluteness you will allow yourself; the more exact you are, the less absolute you can afford to be.
By staking its prestige on chance, art is of course giving up its traditional claim to distinctiveness, namely that it was part of the human endeavour to create order, or, not precisely the same thing, reduce entropy in a noisily chaotic and unpredictable world. Going over to the side of what might seem most toxic to its endeavours, art in fact predictably seeks to secure itself distinctiveness, with the suggestion that it has unique powers to open up life-giving and generative zones of exception in a remorselessly second-guessed and calculated world. But this elective identification between art and chance is in fact understandable as another way of taming chance, in Ian Hacking’s phrase, of gaining indefinitely from it, of banking upon it. The desire to embrace chance is always an illusory dream, since chance always has you partly in its grip.
Still, I do think that, if art is mistaken in trying magnanimously to stand aside in favour of chance, it does at least have the possibility of manufacturing interesting transactions with it. A work like Tom Johnson’s Failing: A Very Difficult Piece for String Bass, for instance, seems to me to show that, if the idea is to get chance to show its hand, you are much better off starting out by devising demandingly determinate procedures than trying to reduce the force of determination, on the principle that if you want an interestingly stochastic scattering, plot the points at which a tennis ball lands when you try to serve the ball so that it hits a coke can in the service box. Rather than trying to make determinacy over into contingency, Failing plaits the two chiasmically together. Chance is not on the other side of intention; rather, it is in the unclosable gap between intention and action that the variations of chance will (almost) invariably be found.
I have begun by speaking about the analysis of individual works, but this is in order to broach a way of thinking that would apply on a much larger scale, to the forms of organisation we call cultures, particularly as they may be conceived historically. If the oscillations of a planet’s magnetic field, the periodicity of a dripping tap, the formation of a snowflake, are to be understood as stochastic processes, why should we expect human history to line up with the bunglingly deterministic models we deploy upon it?
Cultures are sometimes represented as organised sets of principles, articles of faith which can be plainly articulated, along with the systems of behaviour to which they give rise – ‘Protestant cultures value individuality’. But such beliefs and behaviours are never in fact uniformly adhered to, or even universally mandated, in a particular culture, though there may be strong pressures towards them. Cultures are best thought of as climates – climates not only of opinion but also of feeling, belief and action. To be French or female or fin-de-siècle is to inhabit and to contribute to such a climate. That climate is not a given, however, but a set of potentials or probabilities, values towards which things in that setting will tend. This means that cultures, like climates, are unlikely to have anything uniquely and finally distinctive about them. It may be that, as has often been said, a text like Hamlet marks the beginning of a particular style of intense self-consciousness that had not previously been part of the way in which individual human beings saw and thought of themselves, but that from the beginning of the seventeenth century onwards, would begin to be more common. But it is implausible and unhelpful to imagine that nobody could ever have felt anything like Hamlet’s self-relation before that, or that ‘the subject’ was born at that moment. This is not just because the transitions from one era, or prevailing structure of feeling, to another are slow and irregular. It is also because there is no absolute reason that a Hamlet-like self-relation might not have arisen in any other period whatsoever. It may have been quite unlikely, but it could have occurred, indeed, given the numbers of persons and occasions involved, it must have, and often. We might think of the possibility of such events as we would think of the possibility of unlikely climatic events, say, snow in Sydney.
The lowest temperature ever recorded in Sydney was 2.1°C, and the last recorded snowfall was in 1836 (though there are sticklers who insist that this can have been no more than airborne slush). But this is not enough to justify the assertion that it never snows in Sydney, only that it is very, very rare for it to do so, and pretty unlikely that it will do so in the next, say, hundred years. Cultures are like spread bets, complex probability profiles.
Indeed, just as weather systems are not simply independently occurring events, but tend to feed back on and amplify themselves – a high pressure system may prove to be very stable, making it temporarily much more likely that the weather tomorrow will be the same as today than it usually is – so cultures will not only make certain events more likely, but will iterate those events. Indeed, what we mean by a culture may best be thought of, not as a field of likelihood of certain events happening – the appearance of Hamlet – but as fields of selective attention, that are much more likely to pick out certain kinds of event as significant than others.
Still, one might say, as with the weather, these fields of probability, though they are always in operation, are necessarily always prospective. Nobody bets on a race that has already been run. Time runs from the soft to the hard, from the indefinite to the definite, the virtual into the actual. Time continually dissolves probability into positivity, possibles into givens, fractions into integers. I may have a 10% chance of a heart attack in the next ten years, but I have no chance at all of having 10% of a heart attack; the number of heart attacks I will have had in that time will be either zero or one or more. The process of time means that things that did not have to happen keep happening, and however unlikely they were, they thereafter will always have happened, though without having had to. This process itself has something like the force of a necessity; it has to happen that things happen that do not have to. Non-necessity is necessary.
This may appear to mean that probability considerations actually have no place in the retrospective constructions that we call history. Of course, it may certainly be that what we find of interest in a given historical field is strongly influenced by the probability gradient of our attention, as we selectively pick out and amplify certain kinds of feature in the field of givens. But it is hard not to believe, even though we are exceedingly unlikely ever to be able to access and know it, that there was a ‘fact of the matter’ about everything that has already happened. The relations we establish with the past may be variable, but that with which we seek to establish a relation is surely not. The race has already been run, and, no matter how we reinterpret the outcome, it can never be rerun and the outcome decided differently.
History gives every appearance therefore of going from the soft to the hard. We think of facts – dates, data – as hard, which is to say finite and unchangeable, and relations as potential, infinite. There is no limit to what may be made of the Battle of Trafalgar, and no definitive way of estimating the odds on any particular way of making sense of it, since these will depend on preoccupations, and significance-amplifying conditions that have yet to arise. But we are probably mistaken to think that the facts with which relations are established are in fact givens, intervals of hardness between soft potentials.
It is sometimes said that quantum physics allows for the possibility that, at every moment, the alternative possibilities that seem to be cancelled by the fact of something turning out in one way rather than another, may in fact in some sense actually take place. There is therefore a universe in which I missed the train for my interview at Birkbeck College 32 years ago and decided instead to become a bricklayer. The difficulty with this way of thinking is that it is not self-evident what a fact actually is. There is the fact of my missing or not missing the train from Notting Hill, but this is a compound fact, one made up of many subsidiary ones, just as the Battle of Trafalgar is, each one of which might have gone one way or another, or indeed might have gone in an infinity of possible ways. It is hard to know where the ‘fact-horizon’ ought to be placed, beneath which no fact could exist, that is, no way of coming about that did not absolutely have to come about in the way it did could occur. Wondering about what the smallest fact or event could be, and therefore the smallest degree of change that could have been different from what it is, seems equivalent to wondering about what the smallest possible stretch of time could be, and wondering if there can be atoms of chance is the same as wondering if there are atoms of time.
Adherents of the multiverse theory seem committed either 1) to the view that, in the apparent absence of atoms, or finally indivisible units, of time, there is an infinity of universes branching out from every variable possibility contained within every stretch of time, however inconceivably small that may be, or 2) to the view that time disposes itself naturally into lines of fracture, producing break points, where the alternatives seem significant enough for alternative universes to bud off from them. At which point, one is forced to wonder how this significance is determined, and for whom.
This suggests that facts can never be natural events. What’s done is done, and can never be undone. But the question of precisely what it is that has in fact been done is not one that can easily be done with. So, in a certain sense, it is facts that are soft, not relations, or rather it is relations that make occurrences into hard facts. Nothing that has simply happened, that is to say happened without having entered into a relation of significance, has really happened, for there is nothing for or in terms of which it is a fact. Things that have happened only once have not yet happened at all – only things that have happened twice have, once in the mode of occurrence, and then again in the mode of recurrence. This is not to deny that the events of history have taken place, but it is to say that these events are potentially infinite, and therefore without significance.
This means that there can really be no such things as the ‘events’ that Alain Badiou evokes. However rare and precious these atoms of incident may be, they can only be events insofar as they have already been put into play, retrospectively constituted as events, in the fields of probability to which they are subjected. Like everything else, events must take their chances. We think we are on the determinist side of events that have moved from the virtual to the actual, but we are always in fact between the virtual and the actual, the determined and the undetermined. We are like the player who, having just gone down 3-2, suggests ‘best of 7?’ Our situation, and the situation outlined by Stanley Fish, resembles the ‘problem of points’ investigated by Pascal and Fermat, in the correspondence which inaugurated modern probability calculus – that is, the problem of assessing the likely outcome of a game which one is in the middle of playing. What Pascal and Fermat could not take into account was that their way of estimating the just outcome of the game might in fact be part of the game.
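The Pascal-Fermat solution to the problem of points can be stated as a short recursion: a player needing so many more points wins either the next point or loses it, and the two continuations can be weighed accordingly. A minimal sketch (the even-odds assumption, and the reading of ‘down 3-2’ as a first-to-4 contest, are mine):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def p_win(a: int, b: int, p: float = 0.5) -> float:
    """Probability that a player needing `a` more points beats an
    opponent needing `b`, winning each point with probability p."""
    if a == 0:
        return 1.0   # our player has already won
    if b == 0:
        return 0.0   # the opponent has already won
    # win the next point, or lose it, and carry on from there
    return p * p_win(a - 1, b, p) + (1 - p) * p_win(a, b - 1, p)

# Down 3-2 in a first-to-4 contest: we need 2 more points, the
# opponent needs 1, so the just division gives us a quarter stake.
print(p_win(2, 1))   # 0.25
```

Which is precisely why the trailing player proposes ‘best of 7?’: redrawing the frame of the game is worth more than any roll within it.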
So, I have tried to show that it is unhelpful to think of chance as outside or beyond determination. I have also suggested that this makes for a complex interlacing of before and after, anticipation and retrospection, in history, which articulates time, in both its senses, dividing and connecting it.
Certain kinds of quantitative method are beginning to become available, as digitisation and new methods of data analysis are providing us with more forms of evidence and ways of investigating it. We are likely to develop much more fine-grained and verifiable accounts of fields of meaning and the conflict and circulation of concepts as a result.
Nevertheless, probabilism is likely to affect the reading of texts rather less than the reading of the reading of texts, likely to lead to the making of history in different ways rather less than the making out of what it means to make history. It is hard not to feel that an enriched, or at least more particularised self-awareness of this kind is bound to lead to better, more informed practice. Yet such theoretical knowledge does not invariably produce better results, since investigating the rules and rationale of a game is not necessarily the same thing as playing it.
References
Fish, Stanley (1989). Doing What Comes Naturally: Change, Rhetoric, and the Practice of Theory in Literary and Legal Studies. Oxford: Clarendon.
Lushetich, Natasha (2011). ‘Ludus Populi: The Practice of Nonsense.’ Theatre Journal, 63, 23-41.
Morson, Gary Saul (1998). ‘Contingency and Poetics.’ Philosophy and Literature, 22, 286-308.
Shakespeare, William (1975). Antony and Cleopatra. Ed. M.R. Ridley. London: Methuen.
Thomas, Dylan (1985). Poems. Ed. Daniel Jones. London: Dent.
Tzara, Tristan (1920). ‘Pour faire un poème dadaïste.’ Littérature, 15 (July-August), 18.
Wilde, Oscar (1971). Plays. Harmondsworth: Penguin.