Mild Peril: An Academical Enterlude
Steven Connor
It is an awful thing to be in an awful job. It is not as awful as being physically tortured, but that is not because it is not itself a minor kind of torture. Awful jobs can be awful in different ways: they can be laborious, exhausting, painful, dangerous, demeaning, monotonous and, to use an expressive phrase that is much less current than it used to be, ‘soul-destroying’. The phrase first appears in print in the early seventeenth century and was very quickly doing a roaring trade, mostly in assisting various kinds of moral roaring against sin, or other kinds of aberration, held to be what puts one at most risk of destroying one’s soul. Its earliest printed occurrence is in William Prynne’s feverishly voluble denunciation of stage-plays (in which Prynne is careful to include ‘academical enterludes’), Histrio-Mastix, which rails against ‘the dangerous quality of these effeminating soule-destroying sinnes, which are more pernicious to a Common-weale, than pestilence or warre it self; more fatall to men’s soules and bodies, than any Circean charmes’ (Prynne 1633, 509).
It is not at all clear whether the destruction of the soul is logically or theologically feasible. Many definitions of the soul insist that it is the immortal part of one’s being, such that the notion of a mortal soul would be as oxymoronic as that of a high valley, a solid gas, or a flourishing desert (all of them child’s play to imagine, of course). David Hume offered in his Treatise of Human Nature the calm estimate that his existence consisted of an endlessly varying succession of experiences without any continuous underlying something that had or did the experiencing:
For my part, when I enter most intimately into what I call myself, I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure. I never can catch myself at any time without a perception, and never can observe any thing but the perception. When my perceptions are remov’d for any time, as by sound-sleep; so long am I insensible of myself, and may truly be said not to exist. And were all my perceptions remov’d by death, and cou’d I neither think, nor feel, nor see, nor love, nor hate after the dissolution of my body, I shou’d be entirely annihilated, nor do I conceive what is farther requisite to make me a perfect non-entity. (Hume 1985, 300)
This passage, in its action and intent literally soul-destroying, as it was clearly understood to be by horrified readers across Europe, still has a certain power to unnerve. It unnerved me, anyway, when it was paraphrased, out of the blue and con brio and with no warning at all, by a philosophy tutor in Wadham College in 1974.
Many of the things that make work awful are things that make it seem to approach the condition of slavery. Strangely (but how strange is it really?), there is one feature of slavery which protects it from being as utterly ‘soul-destroying’ as it might be. This is the fact that slaves are traditionally exploited, that is, reduced to the condition of a mere instrument. This is a very bad thing, yet Kantians are mistaken if they think it is the worst thing of all. In my Dreamwork, I permitted myself the excitable claim, overstatement being a familiar danger in the making of books, that the real atrocity of slavery is that it forces the slave to pretend to work (Connor 2023a, 18-21). I still get the point I was making there, which is that one can only really work if one has agreed, however grudgingly, and however disadvantageous the terms, to subject oneself to the work one is doing, and so not only perform the task but take it, as we offhandedly say, ‘in hand’. But I here need to retreat a little from that view, in conceding that, despite the undoubted atrocity of existential imposture, in almost all cases slaves do tend to be put to the performing of useful work. It may not be useful at all to the slave, but the point of having slaves is to force them to perform work that produces desirable things for other people – sugar, gold, trainers, bomb-casings, more money and so forth – work which it would be prohibitively hard or expensive to persuade humans to undertake on similar terms if they had a free choice. What slave-owners hardly ever do (I dare not say never) is keep slaves whimsically on hand in order to have them perform divertingly pointless pretend-work, work that is therefore not work at all, in the same way as Miss Havisham requires Pip to ‘play’ for her entertainment. It may sometimes occur, but the enterprise will tend quickly to be consumed by its own fatuity.
Yet fatuity is the feature of modern work that goes beyond slavery, as identified in the title of David Graeber’s puissantly pulse-taking book Bullshit Jobs (2018). The worst thing, Graeber maintains, about many forms of modern work – though this is still very far from being the worst kind of work there can be – is not that they are laborious, exhausting, painful, dangerous, demeaning or monotonous. Bullshit jobs assuredly can be all those things in some measure, but what is most ontologically depleting about them – soul-destroying, in modern parlance – is that they are also so often meaningless, in a way that is, like so many modern experiences, at once horrifying and hilarious. The difference between the chain-gang and the desk-work that is replacing it is that modern workers do not seem to feel they know what the essentially bureaucratic functions they are increasingly employed to perform are actually for, with many feeling that they may not be for anything at all beyond self-perpetuation, the ugly sister of sustainability. The fact that the phrase ‘soul-destroying’ is applied much more regularly to forms of labour than forms of behaviour may be a hint as to what we nowadays think a soul might consist of, or what it might really mean to be ‘a perfect non-entity’.
Now, this is going to be a bit of a wrench, but it’s my essay, so I mean to use this somewhat melodramatic preamble as a way into thinking about the institutional performativity played out in the academical interlude of trigger-warnings or, to use the terms in which they tend to be more cooingly (less triggeringly) encountered nowadays, ‘content warnings’, or ‘content notes’.
Content warnings are used across many disciplines, though overwhelmingly, and understandably, in the disciplines that used to be known as the ‘arts’, and are now increasingly known as the ‘humanities’. I have hinted somewhere else (Connor 2024) at why one should take warning from this sinister shift. And undoubtedly the most prominent of these humanities disciplines for many years has been English Literature, which happens to be the subject which I have professed, or professed to (haha), for most of my career. I do not mean to glory in the queenly preeminence of English, merely to observe that it has often been gloried in. And, while content warnings are encountered across humanities disciplines, my experience is that they are used with that particular mixture of grimness and relish I have described as the ‘zealotic style’ (Connor 2023b, 146) in literature departments, especially English Literature. I have heard of departments in which it is strongly recommended that every single course module in an undergraduate programme, from medieval literature (descriptions of mêlée combat in The Knight’s Tale) to the present day, should have its customised roster of warnings.
Despite the opportunities for amused fuming such things provide to the popular press, and whatever else they will turn out to have been, for all things must pass, content warnings are an effort, undoubtedly well-intentioned on the part of many of their proponents, to assert the continuing seriousness and probity of the academic enterprise in the face of its soul-eroding inundation by bureaucratic operations of administration and management. University education is not alone in this, but the practices of content warning that have been elevated (but therefore also debased) into a kind of institutional ritual in universities are a distinctively procedural response to the inundation by procedure that Michael Herzfeld unforgettably called ‘the social production of indifference’ (Herzfeld 1993). This makes content warnings the fluttering bureaucratic pulse of a heartlessly bureaucratised world.
One of the important things to recognise about content warnings in English departments in particular is that they rest on a huge overestimation of the powers of literary texts. In this, literary texts can stand as an epitome of the high-minded but quietly crazed belief, which seems to have escaped from the academic laboratory while the world was under house arrest, in the absolute power of cultural representations of all kinds – films, paintings, statues, symphonies, sitcoms and so on – to shape and determine the social world for good or ill, though largely, as we will momently see, the latter. The study of English literature inherited from F.R. Leavis and his followers a sense that it had a leading role to play in rescuing moral sensibility from the cretinising effects of mass culture, a matter thought, at least in English departments, to be of huge social urgency and import. From our present observation point this mission seems both latently and blatantly religious in style, even if it was often militantly anti-religious in manifest content. Somehow, this gospel sense of the spiritual mission of English Studies passed across intact into what became known as Theory, despite the latter’s generational contempt for Leavisism and all its works. But where the mission of Leavisism was to use the development of literary-critical sensibility to wage mental fight against the soul-destroying vacuity of modern technologico-Benthamism, the mission of Theory, as it persists in its lowest-common-denominator keyword forms, stripped of all their tiresome circumstance but retaining all their pomp, is increasingly to shield vulnerable minds from literature. In essence, the conviction that literature and literary study provided what John Keats, recognising ‘how necessary a World of Pains and troubles is to school an Intelligence and make it a soul’, called a ‘System of Soul Making’ (Keats 1958, 2.102, 2.103) has been replaced by an anxiously zealous (that is, anxious to appear anxious) rearguard defence against various kinds of soul-destroying sin, injury and error. These two alternatives, of the spiritualising exaltation and the juridical self-arraignment of Literature, have in common a grandiose overestimation of literature’s powers, licensing in turn a pneumatic inflation (and this might be the principal point) of the powers and responsibilities of literature’s earthly exponents.
At the heart of content warnings is a more particular kind of exaggeration, of the magical powers of mimesis, as again applied much more to the proliferation of ills than to the contagion of good example. This is essentially a theory (suicidally self-belittling, one might think, in the case of an educational institution) of the impotence of reason in the face of vileness and malevolence, for which the only conceivable remedy is avoidance. It is not unusual for students to be warned, for example, of the presence in assigned texts of insulting language, especially racist words like ‘nigger’. The thinking here seems to be that there is no amount of education, historical, philosophical, psychological, religious or linguistic, about the cruelties and humiliations imparted by such language that could ever countervail against its venomous potency when encountered in the raw and without personal protective equipment.
An important ingredient in and consequence of this is the setting aside, amateurish in the extreme in a Department of English, of the linguistic-philosophical distinction between use and mention. The magical power of certain kinds of expression seems to mean that to mention them must always and automatically be to put them to fell use, the puny quarantine of inverted commas affording no protection at all from their radioactive malignity. Let us not niggle too long over the fact that certain contextual factors, such as the use of the N-word by indemnified kinds of person, can wash it as miraculously white as the driven snow. One could easily imagine, however, that anti-warnings might sometimes be needed to make it clear that the impressively florid uses of the N-word in certain texts do not in fact merit any warning, auto-injected as they are with the antidote of their assumed authorship.
To edge a little further out on this perilous ledge, though, the fact that the N-word is believed to have the immediately triggering force it does only under certain, not always immediately self-evident, conditions might lead to a paradox we can call Schrödinger’s Shock. According to the thought-experiment which Erwin Schrödinger himself in 1935 gently described as ‘burlesque’ (Schrödinger 1935, 807), a cat in a closed and unobserved box, doomed to death by poisoning should a random sub-atomic event occur, must, under the Copenhagen interpretation of quantum mechanics, be regarded as undecidably both alive and extinct until the box is opened and it is discovered whether or not the lethal event has taken place. In a similar way, the allegedly irresistible force of the psychic kryptonite indwelling in the N-word either will or will not be (have been) unleashed, depending on what is known (or might subsequently come to be known) about its provenance. The resulting trauma of exposure to it would then have to be regarded as in a state of superposition, both immediate and suspended. It is as though one could be both struck and not struck by lightning, pending verification of whether thunderstorms were predicted in that morning’s weather forecast. Curiously enough, Freud’s principle of Nachträglichkeit, deferred action, or, more etymologically, after-triggering, proposes that trauma might sometimes be seen in a similar way, as both a determinate point of origin in the past (what happened to cause all this trouble) and the retrospective effect of back-projection (the causative force of what I cannot but maintain must causatively have happened).
To be sure, the use-mention distinction, as formalised by Willard Quine (Quine 1940, 23-6), on the basis provided perhaps by Gottlob Frege’s 1892 distinction between Sinn and Bedeutung, sense and reference (Frege 1980, 56-78), and, earlier still, by the White Knight’s nice denominative siftings regarding the song ‘A-sitting On A Gate’ in Lewis Carroll’s Through the Looking-Glass (Carroll 1982, 218), is very far from being a closed case. It is a topic in philosophy, for heaven’s sake, and philosophical linguistics continues to provide amusing instruction and instructive amusement on the devious ways in which things may be done with and to words. Popular culture plays its part too, in the Ploddish pantomimes of hate-crime investigation, proving that every Dogberry has his day; or in giggly uses (mentions?) of the use/mention distinction like Natalie’s remarks to the Prime Minister in Love Actually: ‘Shit, I can’t believe I’ve just said that. And now I’ve gone and said “shit” – twice.’ But behind the failure to recognise or credit the difference between use and mention, and the grotesque victory of literalism to which it leads, is an ignominious retreat from the cognitive achievement of almost all four-year-olds known to psychology as the attainment of ‘theory of mind’. To one still possessed of this godlike power, though, what must really count are not content warnings but context warnings, though they must imperil all semantic contentment.
One paradoxical effect of decreeing that certain terms and turns of phrase cannot but be lethal weapons is to make it simultaneously imperative and impossible to forbid them, except by dangerous allusion or periphrasis. How do you answer someone, an infant, say, who seeks confirmation of which particular N-word might be referred to by ‘the N-word’? If their tender soul has not already been corrupted by knowledge of the word, how are they ever to learn what word to avoid saying, or how prevented from wasting their lives superstitiously shunning the word ‘nightingale’ or ‘negligée’? Someone will have to use the word of power, on them, if merely advisory mention is ruled out. And what if, over time, ‘the N-word’ (the phrase, not the word) were to be used so often to conjure up forfending awareness of the offending N-word as to be contaminated by its toxicity? The history of religious prohibition, still alive and viciously kicking, confirms that the wiliness of sacred contagion hath no bottom.
It is sometimes conceded that the majority of young readers (not all that young, actually, for it is a singular fact that content warnings are much more common in universities, that is to say, among adults, than in schools) are probably unlikely to do more than shrug at finding that such words were once actually spoken and were even able to be printed. However, we are darkly warned, one can never entirely rule out the possibility of there being some readers who have personally experienced hurt and humiliation as a result of such racial insults, on whom the effect of encountering them anew may be to trigger a repetition of the painful feelings, exactly as they were originally experienced, or even in intensified form. So best to issue a general warning to everyone. Such phobic responses to particular words have sometimes been asserted, but I know of no evidence that they are regularly or systematically observed. It may be that there is a bleed-through, itself in confirmation of a logic of malignant mimesis, from the very different phenomenon of intensifying allergic reaction to insect bites or certain foodstuffs.
By contrast, the widely-attested view that repeated exposure to unpleasant experiences can dull their force, while systematic avoidance of them intensifies their traumatic potency, seems to have little influence on the practice of content warning. Such dulling was, Freud decided in Beyond the Pleasure Principle, the entire point of the mysterious compulsion to repeat painful experiences in ritual and game, in order to effect the ‘psychical binding of traumatic impressions’ (Freud 1955, 33). Nor is much credence given to the possibility that the study of nasty things might provide an arena for the kind of controlled exposure therapy that seems to have material benefit in the case of victims of extreme trauma. This is despite the obvious fact that the predominating pleasure provided by almost all imaginative or fictional writing is the immunological one of reading about nasty things happening to other people, or nice things accruing to them as a reward for the nastiness they have experienced in earlier chapters. At the very least, the claims of subjects like English Literature to develop emotional maturity and critical awareness in their students, which have been a very large part of their justification, seem likely to be substantially diluted by systematic practices of warning against things against which there is no protection to be had beyond primitive run-for-your-life or duck-and-cover.
One might add here that many forms of anxiety disorder are not nearly so closely tied to specific triggers as the view encouraged by specific and targeted avoidance might seem to imply. In conditions of generalised anxiety, though there may well be occasioning conditions of stress or pressure, the anxiety is latent, and so in a strange sense objectless, or even, as it seems, on the lookout for objects that can serve as something to be anxious about. The over-generalisation in popular culture of the traumatic model of mental disorder probably encourages a belief that there must always be some external anxiogenic cause which, once removed or avoided, will automatically diminish the anxiety: but this is not the case with what used to be known as endogenous forms of anxiety, which, though they are themselves independent of external occasion, are nevertheless, and perhaps for that very reason, ready and able to attach themselves to many different kinds of thing seized on by the sufferer, for whom finding some kind of cause for their suffering can seem to promise relief from it. In such cases, the object of the anxiety can really feel like the condition of anxiety itself, as an anxiety about not being able to help being made anxious by and about almost anything. A widespread culture of content warnings, which may seem to confirm the sense that there are reasons for anxiety wherever one looks, does not seem likely to be helpful to such persons.
One would not wish on anyone the job of working through all the available evidence to try to see if content warnings actually do any good. So we must be grateful to the authors of a recent wide-ranging meta-analysis in Clinical Psychological Science of the effects of content warnings. It concludes unceremoniously that:
Existing research on content warnings, content notes, and trigger warnings suggests that they are fruitless, although they do reliably induce a period of uncomfortable anticipation. Although many questions warrant further investigation, trigger warnings should not be used as a mental-health tool. (Bridgland, Jones and Bellet 2024, 768)
I doubt that this article will have much circulation among academics and administrators committed to practices and procedures of content warning, and am confident that its conclusions will be ignored or shrugged off by pious precautionists with quotas to fulfil. The possible danger of inducing anaphylactic oversensitivity through the encouragement and rewarding of avoidance behaviours is not often rehearsed. But it is, I imagine, not inconceivable that departments striving ever more officiously to deploy content warnings in the face of the finding that the empirical literature surveyed ‘consistently demonstrates that viewing a trigger warning appears to increase anticipatory anxiety’ (Bridgland, Jones and Bellet 2024, 753) may be doing real, systematic harm to students suffering from the conditions of latent or generalised anxiety just mentioned. They might even lay themselves open to expensive litigation, should word of the finding get around among opportunist affliction-hounds.
In general, however, I think that considerations of what content warnings are supposed to be for, or the effects that they are likely to have, are actually supremely immaterial. Though such considerations are constitutive parts of the workings of content warnings, they do not really seem to have much to do with their larger function. For the apotropaic logic of admonition is essentially a matter of ritual rather than rationale, making it a Wittgensteinian ‘form of life’ rather than a theory of it (Wittgenstein 2009, §19, §23, §241). The piety of monitory performance is what I recklessly saw fit a couple of years ago to call a ‘style of seriousness’ (Connor 2023b, 178-214). By their encouragement in students not, as once, of the need for careful reading but rather of the need to be careful of what they read, such warnings exhibit their own assiduously absolute form of paternal care, and thereby strive to prove their commitment to seriousness and responsibility. By demonstrating that it is taking the potential dangers of literature seriously, even, and perhaps especially, in the act of exaggerating those dangers to the point of absurdity, monitory ceremonial is a collective obeisance to the importance of being earnest.
But it is an earnestness that cuts its own throat by being only of a certain kind and attentive only to certain, comically limited kinds of thing. Looking at the evidence of the areas typically covered by content warnings, it does seem as though they are less concerned with material that is likely to disturb than with material of which students are assumed, and, let us be honest, also strongly advised, to disapprove. Among the topics to which attention is routinely drawn are: child abuse, paedophilia and incest; eating disorders and body hatred; mental illness and ableism; hateful language directed at religious groups; misogyny, homophobia and transphobia; pornographic content; racist slurs; violence (including sexual assault, self-harm and suicide). All horrible and disreputable enough, to be sure, but very far, alas, from exhausting the field of human accident and emergency. There is a host of undoubtedly painful experiences which must be regarded as perfectly capable of causing upset in certain students, but which do not feature in content warnings, including, deep breath now: loss of religious faith, industrial accident, loveless marriage, bombardment, debt, divorce, restless leg syndrome, famine, amnesia, claustrophobia, unemployment, imprisonment, floods, typhoons, volcanic eruptions and spontaneous human combustion. To help make a start on listing forms of wrongdoing rather than suffering, of which we might find equally abundant evidence in literary texts, but of which again there is scarcely a whisper in content warnings, let us borrow Leopold Bloom’s admirably even-handed survey of the things he decides might turn out to be even more reprehensible than adultery:
theft, highway robbery, cruelty to children and animals, obtaining money under false pretences, forgery, embezzlement, misappropriation of public money, betrayal of public trust, malingering, mayhem, corruption of minors, criminal libel, blackmail, contempt of court, arson, treason, felony, mutiny on the high seas, trespass, burglary, jailbreaking, practice of unnatural vice, desertion from armed forces in the field, perjury, poaching, usury, intelligence with the king’s enemies, impersonation, criminal assault, manslaughter, wilful and premeditated murder (Joyce 1993, 685)
One might be tempted to see the addiction to being appalled as a recent development in academic life in particular, but it is in reality of reassuringly long standing. And there needs no ghost come from the grave to remind us how soothing rather than anxiogenic shared disapproval tends to be. The syrupy balm of revulsion in good company is evidenced in the almost affectionate relish with which the Daily Telegraph reports the views of Angela Rayner, and the Guardian those of Nigel Farage, with no hint to readers that they should sit down before reading them, or offer of helplines to help them get over the shock. In such cases it is hard not to feel that content warnings are really functioning as a form of moral vade mecum or picture-book conduct manual.
As George Orwell suggested, precisely because power is vulnerable to absurdity, it can also recruit absurdity to its ends, the menacing force of the bearskin or the goose-step being of a laugh-if-you-feel-lucky kind (Orwell 1968, 61–2). The pretence – in the Old Pretender sense of a claim seriously maintained – to be paying responsible attention to serious matters and present dangers is a way for the disciplines of the humanities in general, and of their erstwhile flagship discipline of English in particular, to underwrite Harvey C. Mansfield’s ‘importance of importance’, and thereby, as mirror on mirror mirrored, to lease from it the special importance of their importance (Mansfield 2007).
I am very likely myself exaggerating the significance of all this exaggeration, especially if the dangers warned against in content warnings are as pleasantly negligible as those specified by the British Board of Film Classification, which, I was recently reminded by James Purdon, informed cinema-goers in 2005 that the superb nature documentary March of the Penguins contained scenes of ‘mild peril’. This is more especially the case since there seem to be much more substantial perils ahead for English Literature departments in the UK, with rates of application continuing the death spiral maintained over the last ten years. I suspect that hard-up Vice-Chancellors looking at targets for departmental closure over which few tears will be shed are less likely in practice to defend academic subjects which render themselves so opulently open to guffawing mockery than they might wish to appear in principle and in public.
Perhaps I should be grown-up enough by now to be less exercised by the esteem in which the study of literature, or of the humanities more generally, is held; and the record will show with what expense of spirit as a young academic I once courted admiration for my strutting disdain of it. But the fact that I have also invested so many years that cannot come again in thinking about the idea and history of literature and what it means to study it, makes it hard to care nothing at all about its street value, or not to feel an occasional hunger on its behalf to be more serious.
Literary academics are certainly not alone in wishing to be taken to be on the side of the angels. It may not matter in the end, whether or not that is the end of English, or, in undue course, of the humanities, what views they cherish or relinquish in relation to the political issues which currently preoccupy them and others. It may however matter much more what part the keeping up of the good work in English departments, of warning students off what was once meant to draw them in, might play in the deepening of the administered society. At my back I feel an apprehension that in the long run it will be more perilously soul-destroying for an academic subject to have been thought silly than sinful.
References
Bridgland, Victoria M.E., Payton J. Jones and Benjamin W. Bellet (2024). ‘A Meta-Analysis of the Efficacy of Trigger Warnings, Content Warnings, and Content Notes.’ Clinical Psychological Science, 12, 751-71.
Carroll, Lewis (1982). Alice’s Adventures in Wonderland and Through the Looking-Glass and What Alice Found There. Ed. Roger Lancelyn Green. Oxford: Oxford University Press.
Connor, Steven (2023a). Dreamwork: Why All Work is Imaginary. London: Reaktion.
—————— (2023b). Styles of Seriousness. Stanford: Stanford University Press.
—————— (2024). ‘Laura and the Lost Cause of Psychoanalysis.’ stevenconnor.com/lostcause.html
Frege, Gottlob (1980). Translations from the Philosophical Writings of Gottlob Frege. 3rd edn. Ed. and trans. Peter Geach and Max Black. Oxford: Blackwell.
Freud, Sigmund (1955). Beyond the Pleasure Principle. In The Standard Edition of the Complete Psychological Works of Sigmund Freud. Vol. XVIII: Beyond the Pleasure Principle, Group Psychology and Other Works. Trans. James Strachey et al. London: Hogarth Press, 3-64.
Graeber, David (2018). Bullshit Jobs: A Theory. New York: Simon and Schuster.
Herzfeld, Michael (1993). The Social Production of Indifference: Exploring the Symbolic Roots of Western Bureaucracy. Chicago: University of Chicago Press.
Hume, David (1985). A Treatise of Human Nature. Ed. Ernest C. Mossner. London: Penguin.
Joyce, James (1993). Ulysses. Ed. Jeri Johnson. Oxford: Oxford University Press.
Keats, John (1958). The Letters of John Keats 1814-1821. 2 Vols. Ed. Hyder Edward Rollins. Cambridge, MA: Harvard University Press.
Mansfield, Harvey C. (2007). ‘How to Understand Politics: What the Humanities Can Say to Science.’ First Things, August–September 2007, 41–47.
Orwell, George (1968). The Collected Essays, Journalism and Letters of George Orwell: My Country Right or Left 1940–1943. Ed. Sonia Orwell and Ian Angus. London: Secker and Warburg.
Prynne, William (1633). Histrio-Mastix: The Players Scourge, or, Actors Tragædie. London: for Michael Sparke.
Quine, Willard Van Orman (1940). Mathematical Logic. Cambridge, MA: Harvard University Press.
Schrödinger, Erwin (1935). ‘Die gegenwärtige Situation in der Quantenmechanik.’ Naturwissenschaften, 23, 807–812.
Wittgenstein, Ludwig (2009). Philosophische Untersuchungen/Philosophical Investigations. 4th edn. Trans. G.E.M. Anscombe, P.M.S. Hacker and Joachim Schulte. Ed. P.M.S. Hacker and Joachim Schulte. Oxford: Wiley-Blackwell.