Tuesday, April 04, 2023

The Zombie Storyteller (Der Erzähler als Wiedergänger)

The germ of the present essay dates back to my encounter, at least ten years ago and possibly even more than fifteen, with a book called Walter Benjamin at the Dairy Queen by Larry McMurtry. The book then caught my eye not only on account of its admittedly eye-catching title (a title guaranteed to keep a particularly tenacious hold on my eye in particular in the light of the metaphysical aura of primality accruing to the proprietary name Dairy Queen chez moi, an aura descanted on in “Proprietary Names: the Name/Proprietary Names: the Place”), but also on account of its author. For despite having never read a single line ever written by him, I had a very firm sense that McMurtry was a sort of upmarket airport novelist, meaning a novelist not quite worthy of being described as literary in the fullest latter-day sense (which in a sense is the only sense that it has ever had, as in earlier days—i.e., before ca. [and that is a very big “ca.” as it is based on my as-yet-unsubstantiated sense that the distinction between literary and non-literary writing began with the founding of magazines like The Smart Set and The New Yorker] 1920—all writing was regarded as literary as a matter of course).
The distinction between literary and non (or sub)-literary writers, and novelists in particular, has always been if not a significant then at least a meaningful one for me (meaningfulness itself being a more downmarket version of significance, naturally), which is to say that while few literary novelists—few of even the most critically acclaimed ones—interest me even ever so slightly, I can with a bit of effort manage to get into the spirit of most literary novelists’ modi operandi; I can approvingly view what they are at least trying to do and bring myself to take some measure of pleasure in it, because I can see that what they are at least trying to do is not completely wrongheaded or dishonorable, even if they are going about it in an entirely wrong way or even if the very goal they are aiming for is a transparently unattainable one. In short, I do not generally think of literary novelists as complete frauds or imbeciles, which of course is simply to say in negative that I do generally think of non (or sub)-literary novelists as frauds or imbeciles. I realize that in saying this I am at odds with scads of apparently intelligent and cultivated people, who are prone to saying of their favorite non (or sub)-literary novelist, Sure, he’s no Shakespeare [(sic) in their defense on the citation of a non-novelist, Shakespeare, in preference to, say, Proust, for they are generally defending their favorite non (or sub)-literary novelist qua writer tout court rather than qua novelist], but you’ve got to admit he’s enormously entertaining. 
But I have not got to admit any such thing, or at any rate (for I may very well someday [and indeed sooner rather than later] be forced to tell a lie to such an effect) I am incapable of honestly assenting to any such thing, for the modi operandi shared by non (or sub)-literary writers, and seemingly completely indispensable thereto, are seemingly tailor (or bungler)-made to forestall or short-circuit any possibility of entertaining me because they ineluctably strike me as intrinsically imbecilic or fraudulent, which is to say that I can’t read a sentence of a non (or sub)-literary novel without getting the sense that in that sentence the writer has knowingly or unknowingly written something that ought not to have been written because it is untrue (put down that hand that you are waving at me with Horshackesque impatience, kiddo, before I chop it off; for I know that in a novel one is always dealing with something that is known as fiction, something that is at least held to be professing itself to be untrue from beginning to end, and I shall, Lord willing, specify why that plea does not let novelists off the hook of veraciousness) or has already been written. If the writer has unknowingly written that should-never-have-been-written thing, he is an imbecile (or conceivably an ignoramus, although, for reasons that will, LW, be explained in a more appropriate context, in this case-class if few or any others, the difference between the two dummy-types is nugatory, whence its relegation to the present parenthesis); if he has knowingly written it, he is a fraud, and either way he is of course not a person (or more than likely, creature or wanton, for of all the conceivable types of writers behind the scriptum horribile the only one who qualifies for personhood in the strong, Frankfurterian sense is the non-casual non-truth teller) fit for decent company, even (or perhaps, rather, especially) the virtual decent company of print.
I suppose the locus classicus de nos jours of a non (or sub-)literary writer who is supposed to be enormously entertaining despite being no Shakespeare is J.K. Rowling, which makes for quite a nifty coincidence inasmuch as as (sic on the repetition of as) luck or Providence would have it, the very last non (or sub-)literary novel on or at which I was foolish enough to bestow a glance by way of giving an apparently intelligent recommender of the book the benefit of the doubt was the very first Harry Potter novel. I suppose I gave it a bit more than a sentence, but certainly well before reaching the end of the first chapter I had resolved not to read another word of it. Why? Because the gosh-damn(ed) thing read exactly like a transcript of the opening of one of C.S. Lewis’s Narnia novels—not the most famous one, The Lion, the Witch, and the Wardrobe, but one of the more obscure later ones, specifically the one that’s set in the Victorian period; I believe it’s called The Magician’s Nephew (and no, I’m not going to look it up because the sort of pedantic solicitousness about essentially interchangeable turds that would be instanced by such a fact-checking expedition is destined to be one of the principal bugbears of this essay). But Harry Potter and the Iridescent Hemorrhoid isn’t even set in the Victorian period! Speak parenthetically of the devil qua bugbear (or bugbear qua devil)! I know HPIH isn’t set in the Victorian period. I also know that HPIH, unlike The Magician’s Nephew, is centered on a character called Harry Potter, and that some grammar- or boarding-school called Mugwarts or whatever figures in it but doesn’t figure in The Magician’s Nephew.
I know that HPIH diverges from The Magician’s Nephew in all those particulars, and I am willing to concede sight unseen that it diverges from The Magician’s Nephew in at least a trillion other particulars, but I maintain that not one of those at-minimum-1,000,000,000,003 divergences distinguishes HPIH from The Magician’s Nephew in any essential way, because all the essential elements of The Magician’s Nephew are present in those first few pages of HPIH—and by these elements I mean the stuffy, wood-paneled material environs, the orphan-like situation of the hero, and the promise of magic as a means of liberation from both of the preceding. And if J.K. Rowling didn’t realize she was copying those essential elements of The Magician’s Nephew she was a downright imbecilic ignoramus, and if she did know she was copying them she was a fraud.  Supposing (which I shall do here if only for the sake of argument) that the essential elements of HPIH are the essential ones of The Magician’s Nephew, Rowling is self-evidently a fraud if she knowingly copied them, but supposing she unknowingly copied them is she really to be regarded as a downright imbecilic ignoramus? After all, C.S. Lewis himself is no Shakespeare in quite a different sense from the one you have employed, which is to say he does not stand at the dead-center of the Anglophone literary canon, such that ignorance of him is perforce less reprehensible than ignorance of Shakespeare. It is true that Lewis is less central to the Anglophone literary canon insgesamt-cum-tout court, but he is very close to the dead-center of a certain sub-canon of that canon, namely, the sub-canon of Anglophone children’s literature; indeed, apart from Lewis’s pseudo-semi-namesake Mr. Carroll-stroke-Dodgson it is difficult to think of any more eligible candidate for nomination as the Shakespeare of Anglophone children’s literature than CSL, and such being the case, Mrs.
Rowling must have been a criminally ignorant children’s author indeed if she was less than intimately acquainted with any volume in the Narnia series (and so implausible is the notion that she was less than intimately acquainted therewith that I cannot but unhesitatingly peg her as a fraud).  And what about your friend the apparently intelligent admirer of HPIH? What if he simply hadn’t read The Magician’s Nephew? I trust that he was not a children’s writer. No, indeed, and given that he wasn’t even a Brit (or even much of an Anglophile), I suppose it’s not extremely improbable that he had never read The Magician’s Nephew. But supposing he hadn’t, in that case he should have smelled a fraudulent or imbecilically ignorant rat in or from the opening pages’ echoes of Dickens, for minus the magic, the opening scenario is a complete knockoff of the Dickens novels with male protagonists who at least start out as helpless boys—Oliver Twist, David Copperfield, and Great Expectations. (And of course the ineluctable inference that Lewis himself pilfered his scenario from Dickens goes to show why The Magician’s Nephew is itself at best a second pressing [and hence perhaps a non (or sub-) literary novel in its own unright] and HPIH at best a third pressing.) Yes, but supposing he had never read Dickens… Why, then, he was an imbecilic idiot in his own right (or, rather, wrong). But do people really read Dickens as much as they used to?… I suppose not. So perhaps your beef is preeminently with the c******l competence of the airport novelists’ readers rather than with the derivativeness of the airport writers themselves… Perhaps. I think we shall find that I have beeves with both of them but that the two beeves are much of a muchness. 
In any case, for the n***e, having established that I think of airport novelists as writers who either knowingly or unknowingly parrot much-earlier-written writings written by people other than themselves, I think that it is clear enough why my eye was caught by the title of a book both name-checking Walter Benjamin and penned by a presumptive airport novelist. It’s clearish, to be sure, but not quite clear enough…Well, then, to clarify: it is or should be odd to find a person whose stock in trade is either knowingly or unknowingly regurgitating boilerplate taking an interest in anything remotely genuinely highfalutin. Genuinely highfalutin sounds or smells to my ears or nose like an oxymoron, and I think I’ve twigged exactly why: isn’t highfalutin-ness a kind of un-genuineness, a kind of phoniness? Indeed it is or can be, when considered from within, when considered with regard to the highfalutin author’s or text’s attitude or object-of-preoccupation (which is to say that being “absurdly pompous or pretentious” [so the 1990 COD’s definition of highfalutin] is generally unwarranted [Yes, yes, yes: at this very moment I see a milliard hands {or rather, admittedly-for-the-n***e-inexplicably, a milliard sick-green cartoon paws like those prominent in the oeuvre of Dr. Seuss} handing me a milliard hand-mirrors]), but I am thinking of it in terms of a sociological category, in terms of a quasi-synonym of highbrow, and in those terms there is such a thing as a difference between the genuinely and un-genuinely highfalutin. 
The genuinely highfalutin is the domain of c*****e comprising all the sorts of c*****l artifacts that embody or treat of material that is genuinely vulnerable to being construed (whether accurately or inaccurately) as absurdly pretentious, and sociologically speaking, one would expect to find an airport novelist shunning all things highfalutin like the pox (i.e., shunning them as if they were the pox, not shunning the pox itself qua highfalutin thing, altho’ of course the pox itself, qua faux ami for its diminutive and pulline semi-namesakes-cum-archaic synonym of syphilis, is the very quintessence of highfalutin-ness) because it is transparently in an airport novelist’s interest thus to shun such things (or at least to conceal his non-shunning of them) because it is in turn in his interest to represent the boilerplate he generates as a paragon of the virtue of unpretentiousness. (The un-genuinely highfalutin by contrast is the domain of c*****e comprising all the sorts of c*****l artifacts that embody or treat of materials that merely wish to give the impression of being serious or moving rather than entertaining [e.g., virtually every winner of the best-picture Oscar in the entire history of the Motion Picture Academy], and it is therefore by no means reflexively shunned by airport novelists; indeed many an airport novel is itself an instantiation of the un-genuinely highfalutin.) Now I believe that every empirical reader of this essay who has taken in and, if only provisionally, for the purposes of the present lecture, seconded my definition of the highfalutin, will concur with me in regarding Walter Benjamin as canonically highfalutin. 
Perhaps many such readers will maintain that WB is not the pinnacle of highfalutin-ness, perhaps even that he is but a foothill in the mountain range of highfalutin-ness by comparison with, say, Maurice Blanchot, who may in turn be but a foothill in that selfsame range according to the lights of such readers as are acquainted with far more recherché conceivably-absurdly-pretentious c*****l produce than I am. But at minimum he is completely untarrable (or perhaps, rather, in the context of an Alpine metaphor, un-Prussian blue-able) by the brush of midfalutin-ness, let alone by that of lowfalutin-ness; at worst a connoisseur of genuine highfalutin-ness will depreciate the checker of the name Walter Benjamin for checking WB a tad too often, for not substituting another name from the highfalutin canon for every third or fourth checking of WB. (As far back as the early 1990s, I heard a professor of German complain that for a certain longish spell in the then-recent past, it had been impossible to get an article that did not cite Walter Benjamin published in any reputable journal of German studies.) Accordingly, I felt the temptation to read Walter Benjamin at the Dairy Queen irresistible on sociological grounds alone; I found that I had to read it if only to make sense of an apparent convergence of two realms that I had regarded in combination as a locus classicus of never the twain shall meet—the realm of highfalutin literary criticism, philosophy, or what have you, and the realm of airport novel-dom; or, to recast the metaphor in more meta-geographically satisfactory terms, the visitation of the one realm, Highfalutinia, by a visitor, a citizen of the other, Airportnovelania, McMurtry, who would have been as unlikely both by inclination and prohibition to visit the other as a Goldwater Republican to visit the U.S.S.R. in the 1960s. (Naturally certain perverse shags will maintain that the U.S.S.R.
should rather be cast as Airportnovelania, and McMurtry as a Stakhanovite Communist, and I naturally encourage them to devote a screed of their own to their shaggage.) Would it reveal that airport novelists weren’t as intrinsically obtuse or fraudulent as I had always regarded them as being? Or would it simply and more mundanely reveal that McMurtry wasn’t an airport novelist after all? Would it reveal that highfalutin-ness was not as ineluctably high-flown as I had always regarded it? Or would it simply and more mundanely reveal that Benjamin’s reputation as a purveyor of highfalutin-ness was undeserved? Unsurprisingly if admittedly conveniently for my present purposes, what the book turned out to reveal—or rather, suggest, for none of the disclosures was definitive—was the proverbial a-little-bit-of-all-the-above. McMurtry at least initially turned out to be by no means a lout or even a churl but quite a reflective soul with more than a dash of highfalutin-ness in his own constitution: I particularly approvingly noted (and to this day recall with relish) his employment of the mid-top-shelf adjective Wilhelmine to characterize Benjamin’s ambition to produce a literary work of prodigious length (viz., his Arcades Project, left thousands of pages long yet far-from-finished at his death).
Obviously no proud or unregenerate airport novelist could be so refined, and when I reflected that McMurtry’s novels had served as the basis not only of one of the most famous of all television miniseries, Lonesome Dove (the miniseries, like the airport novel, being an intrinsically execrable genre), but also of one of the greatest films of all time, The Last Picture Show, a movie that, for all its instant and lasting popularity, was in decidedly highfalutin fashion composed of long black-and-white takes accompanied by little if any non-diegetic music (this in decidedly unflattering contrast with its near-exact contemporary, the merely supposedly great [The] Godfather, a heavily edited color film liberally slathered with incidental music [and based on an airport novel stocked one shelf above the bodice-rippers at the newsstands]), I was willing to hazard that McMurtry was a properly literary novelist of at least middle stature, perhaps a sort of John Updike or Cheever of the upper-mid-southwest. Moreover, his use of Benjamin was at least initially transparently ethical and based on what struck me as a sound and cogent interpretation of Benjamin’s essay “The Storyteller” [“Der Erzähler” in the original German, tho’ McMurtry understandably refers exclusively to the English translation of the essay]—for essentially, despite a few episodes of broader bibliographical-cum-biographical scope such as the one including the above instance of Wilhelmine, it was exclusively qua author of that essay as which Benjamin figured qua subject-cum-object of interest for McMurtry. Benjamin, McMurtry initially maintained, maintained (sic on the repetition of maintained) that the age of the storyteller was over and done with, and he, McMurtry, concluded that all the evidence of his own senses seemed to corroborate the storyteller’s kaputdom.
After all (so McMurtry, as envisioned by my memory’s ear, from here onwards, until I explicitly signal otherwise) the so-called frontier, the classic fuel-source of American storytelling, had long since closed, and its closing had swiftly been followed by the gradual extinction of the traditional occupations and activities associated with its open-ended days. Take, for instance, cowpokin’—not cowpokin’ as in coitin’ with cows (’cause some occupations and activities never go out of style, however drastically the conditions and relations of production may change), but cowpokin’ as in herdin’ cows, brandin’ ’em, throwin’ a lasso(o) ’round ’em, exetera (sic). Well, in the old open-ended frontier days it took a hundred horseback-ridin’ cowboys to manage fifty cows. Now a mere pair of cowboys can round up a thousand cows from the safety of a helicopter cockpit. Where’s the excitement and adventure in that? Why, today’s cowboys don’t even convene around a classic campfire for a classic cowboy meal of cowboy coffee and cowboy beanie-weenies. No, they go to the stinkin’ Dairy Queen right next to the ranch for frankfurters and chocolate sundaes, just like their bean-countin’ and pencil-pushin’ contemporaries in the big cities. Now how’s a feller goin’ to tell a story ’bout that, ’bout Joe and Jane Anybody chowin’ down on frankfurters and chocolate sundaes at a gosh-damn Dairy Queen in Anywhere, U.S.A.? But then (here I explicitly signal that I am switching to my own voice, such as it is) about two-and-a-half fifths of the way through the book, McMurtry lost me, as they say.
At that point he started arguing, quite against the grain of his gist in the previous two-and-a-half fifths, and on the basis of a seemingly complete misreading of “The Storyteller,” that the time was just about right for the revival of the art or craft of storytelling, that somehow the mere fact that the obsolescence of the storyteller was in a certain way or to a certain extent something to be sad about meant that Americans really collectively needed to put the storytelling pedal to the metal and set about telling new stories like gangbusters (and possibly even like “Gangbusters,” if “Gangbusters” be the name of a particularly good good-old-fashioned story, as it really should be even if it isn’t), etc. The gist of this second two-and-a-half-fifths of the book seemed to be that because storytelling had died out as a consequence of the extinction of certain old-fashioned ways of living and doing, these old-fashioned ways of living and doing could be talked back into life via storytelling, or at any rate, storytelling could engender new ways of living and doing that were every bit as authentic as the old ways thereof. This gist would have been quite supportable on Benjaminian grounds had Benjamin at any point in his essay stated or intimated that the changes in ways of being and doing that had caused the death of storytelling were at least conceivably temporary, had he stated or intimated that they were at least in principle reversible. But as he had, to the best of my recollection, stated or intimated no such thing therein, I could not but regard this second gist as superstructed on an elementary meta-logical blunder, the blunder of regarding cause and effect as mutually interchangeable. Accordingly, I could not but conclude that McMurtry was an accursed airport novelist—or at any rate, a person too obtuse to write anything less contemptible than airport novels—after all. 
And so I wrote off Walter Benjamin at the Dairy Queen as a one-off misencounter between the abovementioned two realms, and consequently the germ of this essay would have appeared to have fallen dead-born from the seed-packet (or have died while still sealed therein), if I had ever thought to think about it, as I never did (for why would I have done, for who would write an essay attacking the one-off commission of an elementary meta-logical blunder?).

The germ proved to have instead been a live one, and began germinating, not many years later, meaning about ten years ago, when I started noticing quite a few other people quoting “The Storyteller,” and noticing that each and every one of these quoters was drawing the same erroneous conclusion therefrom that McMurtry had drawn from it. These quoters-cum-errers (sic on e for o, natch) hailed from most if not quite all walks of intellectual life; some of them were airport novelists or persons of airport-novelistic falutine altitude; some of them made your average airport novelist look like Alfred (sic) Einstein or Sergei Eisenstein; some of them, at the other extreme, made even Alfred Einstein and Sergei Eisenstein look like the stupidest famous person bearing a -stein terminating surname (Harvey Weinstein, perchance? I said the most stupid, not the most foolish [if HW even be that]). And by and by, as the various versions of hysterical nihilism now collectively (if transiently) known as wokeness acquired ever-greater prominence in and influence over the Occidental Lebenswelt over the course of the 20-teens, I found that nearly all the scads of resistors to, nearly all the resilers from, this tendency, were parroting the quoters-cum-errers’ party line on storytelling even if most of them did not quote Benjamin by name; I found, in other words, that nearly all of them seemed to think that a revival of storytelling might very well help to extirpate the scourge of wokeness from the abovementioned Lebenswelt (cf.
these selfsame resistors’-cum-resilers’ ascription—as described in my previous essay, the one on the current hyper-efflorescence of Riesmanian inner-direction—of a comparable soteriological valence to tradition and faith); and finally, perhaps no more recently than about six months ago as of this writing (January 22, 2022), one began to hear these selfsame resistors, or at least the proportion of them given to holding forth on current events, banging on polemically, and generally pejoratively, about story’s slightly higherfalutin synonym, narrative, often apparently specifically against a specific narrative (sic on repetition of specific), because one tended to hear this word narrative preceded by the definite article (i.e., to hear references to something called the narrative). By this point, the aforementioned germ was now “developed, overgrown, and smothered,” as Henry James wrote apropos of a piece-of-writing-germ of his own (although he designated this germ by germ’s salutarily less pathologically laden but unhappily less precise synonym grain) in his introduction to the novel as which that germ had eventuated, The Ambassadors (an introduction that I felicitously or providentially began reading only two days ago [i.e., two days before this writing, still January 22, 2022]). Somehow everyone seemed to be talking about stories, and somehow everyone seemed to think either that they would save us or that they would destroy us, or that they would either save us or destroy us, or that they would both save us and destroy us, and somehow or other Walter Benjamin was to blame or to credit for some substantial proportion of the entire story-centric hullaballoo-cum-kerfuffle.
Obviously, the time was long-past high (at least to the highly debatable extent that the time for the present writer to do anything is high, or even mid-altitude) for the present writer first to revisit “The Storyteller,” to take a good if desultory jeweler’s eyeglass-aided look at that essay, and to determine to what extent the essay held together both on its own terms, vis-à-vis phenomena up to and including those of the historical moment of its composition, and qua quasi-Delphic prognostication vis-à-vis phenomena arising since that moment; and then to scrutinize the most salient recently propounded assertions about storytelling by whatever light was furnished by the essay, to determine to what extent the validity of these assertions was reinforced or undermined thereby. In contrast to most self-or-other-styled essayists (if also in conformity with the inventor of the essay, Montaigne, and also, to some more equivocal extent, with Walter Benjamin), I don’t ordinarily perform the cat-unbagging exercise of tendering what is known as a thesis, because it seems to me that to the extent that a thesis is merely supported by the material that follows it, it is superfluous, and to the extent that it is not merely supported but supplemented by that material, it is perfidious, but on the present occasion that tendering seems not-un-called-for if not quite exigent, and in any case (pseudo-spoiler alert! [a.k.a. phantom kitten-unbagging!]) the revelation enrobed by it is of a sufficiently general, abstract, and fragmentary character that it cannot stand on its own, that revelation being that while today’s storytelling boosters are indeed missing the explicit point of Benjamin’s essay, he left them plenty of opportunities to miss that point and indeed to infer a point pointing in the opposite direction of the explicit one, which is to say in Yogi Berran terms that Benjamin didn’t really say what he said.

And now that I’ve opened this can of potentially Benjamin-devouring worms, I might as well proceed by lobbing the most conspicuously voracious of these worms right into WB’s eye (or into one of his two Freudian southern-situated eye-counterparts) as follows: “The Storyteller” is subtitled “Reflections on the Works of Nikolai Leskov,” meaning that the essay is at least ostensibly an essay on storytellers in general only in virtue of arguing that Leskov himself is and was an exemplary storyteller. The present writer knows precious little about Leskov; indeed knows him almost exclusively as the author of a story that formed the source of the libretto of Dmitri Shostakovich’s opera Lady Macbeth of the Mtsensk District and consequently is familiar with Leskov’s work almost exclusively as refracted through the pen-prisms of Shostakovich and his literary collaborator, Alexander Preys. But one of the very few things I do know about Nikolai Leskov is that he was born in 1831 and died in 1895, a thing that is of more than trivial importance vis-à-vis the historical timeline of Benjamin’s argument, because of course Benjamin argues that the decline of storytelling corresponds to “the rise of the novel at the beginning of modern times” because unlike the story, “the novel neither comes from oral tradition nor goes into it,” is “essential[ly] dependent on the book,” and “became possible only with the invention of printing” (Section V), and Leskov entered the world long after the beginning of modern times, and indeed later than many a novelist, and indeed later than all three of the three great classic novelists who like him hailed from Russia—Turgenev, Dostoyevsky, and Tolstoy—such that at first blush either the aforementioned timeline must be erroneous or Leskov must not have been a genuine storyteller (the latter disjuncture mechanically garnering some credence from the fact that Leskov’s published oeuvre contains seven novels, nearly as many as or slightly more than each of the
Big Three’s). Now of course a prospectively plausible timeline-and-Leskov-saving counterclaim to both of these two disjunctures immediately presents itself, namely that Leskov was an anachronism, a holdover, one of the last of the dying breed of storytellers, or what have you, and it is indeed exactly as such a figure that Benjamin presents him. But short of some sort of un-frozen-caveman-lawyeresque scenario, a scenario wherein a certain person lives in complete organic isolation (as opposed to self-imposed isolation) from contemporary material and intellectual currents, no human being has ever been a genuine anachronism, and every human being has always and axiomatically been a human being of his own time. Benjamin does his best to convince us that Leskov was an un-frozen-caveman-esque figure, but the evidence he adduces towards this end is rather feeble. Leskov, he says, was “a member of the Greek Orthodox Church” and “a man with genuine religious interests” (Section III), but what Russian of the nineteenth century was not both of these? And in particular which of the big three Russian novelists was not? Turgenev perhaps, but only just perhaps. B. adds that L. spent the bulk of his extra-literary working life as a kind of traveling salesman, the “Russian representative of a big English firm” for which “he traveled through Russia” (ibid.).
Apart from the stationary used car salesman, it is hard to think of a figure more emblematic of commerce-jaded modernity than the traveling salesman, and although his obligation to deliver a spiel or pitch certainly gives the traveling salesman plenty of opportunities to hold forth orally in a not-un-storyteller-like way, that selfsame spiel or pitch is signally and unfavorably distinguished from the good old-fashioned storyteller-told story by its nakedly instrumental character, by the fact that it is principally intended to get the listener to do something for the teller and is therefore radically incompatible with something that Benjamin regards as “the nature of every real story,” namely that it “contains, openly or covertly, something useful,” that it “has counsel” (Section IV). Finally B. makes much of the fact that L. had a penchant for ancient literary sources; he states that L. “was grounded in the classics,” immediately thereafter asserts that Herodotus was the “first storyteller of the Greeks” (without, however, even suggesting that L. was a jot fonder of Herodotus than of the least storyteller-like classical author [say, Cicero {although of course not even C.’s works are devoid of story-tellage (consider, for example, his relating of the legend of the people who live near the source of the Nile in The Dream of Scipio)}]), and then quotes the episode from H.’s Histories wherein it is related that the Egyptian king Psammenitus impassively witnessed his son and daughter being led to execution only to burst into tears on seeing a servant of his being led to the same fate.
For Benjamin, what makes the episode such a cracker-jack story, and indeed an embodier of “the nature of true storytelling” (But I thought the nature of storytelling consisted in ‘having counsel’ [Shutcho carping carp’s mouth!]) is that Herodotus forbears from offering any explanation of the king’s belated breakdown, that he leaves it to the reader to conjecture why the king remained unmoved until beholding his servant (Section VII). In the eyes of the present writer, though, Benjamin’s entire treatment of Herodotus here eo ipso is of far greater illustrative value than the Herodotus-episode itself, as it, the treatment, is a truly cracker-jack example of something both quite characteristic of Benjamin and nearly akin to storytelling as defined and prized by Benjamin but not identical to it, something that in apparent aim if not technique is indeed more nearly akin to the abovementioned spiel or pitch, something we might term strategic ellipticality. I term it this because it relies on ellipsis, on omission, to persuade the reader of the actuality of a state of affairs that is at the very least debatable and often actually chimerical. In this case, Benjamin omits to avail himself of a sufficiently representative sample of Herodotus. From Benjamin’s confinement of his citation of Herodotus to the episode of Psammenitus’s breakdown, one—or at any rate, the reader who has never read the Histories (and the present writer confesses to having been such a never-reader until about a year ago as of this writing [January 26, 2022]), as whom Benjamin may very well have been knowingly [and consequently strategically] entitled to regard his typical reader even back in the 1920s—cannot help gathering that Herodotus never offers explanations of the events that he recounts, that he never attempts to influence his readers’ interpretation of them. 
In point of fact, Herodotus is a compulsive explainer; in point of fact, he rarely concludes a narrative episode before throwing in his two drachmas’ worth on why he thinks the participants in it acted or reacted as they did. Just on browsing through a handful of pages from Book VIII, I find him writing apropos of the Persian general Mardonius’s canvassing of the Greek oracles before leading his army out of Thessaly, “I suppose that he sent to enquire the business which he had in hand [i.e., the just-mentioned prospective departure from Thessaly], and not for any other purpose” (Chapter 133) and then speculating that Mardonius sought an alliance with Athens against the other Greeks because the oracles “counselled to make Athens his friend: so that it may have been in obedience to them that he sent an embassy” to the Athenians (Chapter 136). Now if even the “first storyteller of the Greeks” was so keenly addicted to explanatory in-butting as this, one cannot but reasonably infer that explanatory in-butting is at the very worst not inimical to storytelling. Moreover, the logically implied complement of the assertion that refraining from explaining is uniquely characteristic of storytellers, namely, the assertion that such refraining is inimical to novel-writing, crumbles into the dust of utter untenability after the most cursory consultation of the most incontestably literary, the least oral-esque, of novels. Why, even the work of Proust, the expounder of notoriously un-aloud-readably involute sentences and explainer extraordinaire (as Samuel Beckett sagely pointed out in his monograph on Proust, Proust’s method of characterization consists in “explaining away” his characters, in describing them in such microscopic detail and with such a multitude of qualifications that the reader eventually loses all sense of them as mutually discrete entities) is not devoid of moments in which he tantalizingly leaves explanations to the reader’s imagination. 
A particularly tantalizing such moment, and one that IHOP easily vies with the episode of King Psammenitus in point of the richesses of speculation that it invites, occurs in the course of the narrator’s last encounter with the Baron de Charlus, almost exactly halfway through Le Temps retrouvé, the final volume of the Recherche. Having remarked on the Baron’s retention of remarkable powers of memory despite having suffered a stroke severe enough to impair his gait and pronunciation, on his pride in this retention, and on his desire to flaunt it to the narrator with an example, the narrator switches to the quotation of direct speech for the communication of that example:

Without moving his head or his eyes or altering his delivery by the slightest inflection, he said to me by way of example: “Here’s a pole with a poster on it just like one I was standing in front of the first time I saw you at Avranches—no, that’s not right—at Balbec.” And it was indeed an advertisement for the same product.        

And having recorded that ultra-dry statement, “And it was indeed an advertisement for the same product,” the narrator moves on to a new paragraph and an entirely different topic, but one likewise concentrated in storyteller-like fashion on the there and then of the encounter, namely his gradual adjustment to the Baron’s newly acquired “pianissimo” dysphasia. Now Proust need not have moved on so expeditiously from the advertising poster, and in point of fact, any enthusiastic and attentive reader of the preceding volumes of the Recherche cannot help at least initially finding him well-nigh criminal for not lingering over it in the light of its evocativeness of so many significant moments in those volumes. Of course it most transparently and quite overtly evokes the episode of Marcel’s first encounter with Charlus, an episode memorable not only in the light of the gargantuan role Charlus is destined to assume in the fabulae personae but also as a kind of self-contained vignette in virtue of its frankly comical depiction of homosexual love at first sight from the point of view of the unwitting object of that love. (Oddly, if perhaps not accidentally (q.v. below, LW), the advertisement itself, though present, is a very minor element of that episode and indeed referred to merely as a generic poster [affiche] implausibly employed by Charlus as a pretended object of his attention, and without being described at all.) 
Charlus’s reference to that meeting in connection with his re-sighting of the poster would be a perfect point of embarkation for a meditation on the complete transformation of Marcel’s relationship with the Baron over the intervening decades, on his transformation in the Baron’s eyes from an anonymous prospective bit of young male-tail to a fellow-crapulous vieillard of no erotic interest to him whatsoever; or perhaps for a more general meditation on the waning and eventual disappearance of Charlus’s libido, or the libido of all homosexuals, or all men, or all human beings. But Proust declines to use the reference as such an embarkation-point. Then the reference to the poster specifically as a poster first seen at a seaside town cannot but put the reader in mind of the vehicle of Proust’s most productive metaphysical engine next to that of involuntary memory (the phenomenon exemplified by the famous Madeleine-tasting episode), that of the seeming uniqueness of named entities in contrast to entities designated merely by words, namely “one of those posters, entirely blue or entirely red, in which, because of the process employed or a whim of the designer, not only the sky or the sea but also the boats, the churches, and the people in the street are blue or red,” to which Proust likens names in contrast to words. Is it not, or ought it not to be, significant that despite likening names to seaside posters, Proust never specifies the visual composition of the advertisement twice-sighted by Charlus, let alone the name of the product advertised thereon, especially given that being specifically an advertisement for a product, it presumably prominently displayed at least one certain name, the name of the product’s manufacturer? 
(A footnote to the Temps retrouvé episode supplied in the folio classique edition, in acuminating this presumably to a certainly—“Notebook 51 [of his drafts for the Recherche] specifies: ‘It was the same Liebig advertisement {presumably an advertisement for soup manufactured by the Liebig corporation [founded in 1847]}’”—makes Proust’s inklessness on the contents of the poster in the finished version of the work even more exasperating. Could it be, I only now [January 29, 2022] think to ask myself, that, foreseeing the supreme sovereignty of proprietary names that I would lachrymosely decry in “Proprietary Names: the Name/Proprietary Names,” Proust here preemptively engaged in the sort of whitewashing of proprietary names by novelists, filmmakers, et al. of recent decades that I would decry no less lachrymosely in that essay?) Yes, it is or ought to be, and indeed, Proust’s complete severance of the advertisement at Balbec from his discussion of names is so glaringly counterintuitive that one cannot but suspect that he deliberately effected this severance for the sake of attaining a more exigent goal even than the elaboration of his metaphysics of names, and as it so happens, I can think of a plausible example of such a goal; at any rate, I think of it as a worthy goal, however plausible Proust’s desiring of its attaining may be—viz., the goal of adding Charlus’s signature to the Recherche in a manner consubstantial with the manner in which he had added Swann’s signature to the Recherche at its other end by including his name in the title of the first volume. By a signature, I mean simply an indication that the character in question possesses an outlook or cast of mind with which the author (or the narrator, “the text itself,” or whichever other entity to which one ascribes the generation of the work) is more or less wholeheartedly sympathetic. 
From the Recherche’s treatment of Swann—not only in its presentation of the bulk of its first volume from his point of view but also in its reflection on that presentation in subsequent volumes—we gather that he is a kind of alter ego of the narrator, that the narrator finds Swann both susceptible to many of the same psychic injuries to which he otherwise alone seems to be susceptible and capable of forms of knowing of which he alone is also capable. Not to put too fine a point on it, throughout the first six volumes we get the impression that almost every character apart from Swann is an asshole and an idiot by comparison with the narrator, and the semi-exceptions—at the moment I can think of only three of these, the narrator’s grandmother, Bergotte, and Elstir—are too patently different in constitution from the narrator or on view too briefly to become centers of dramaturgical gravity. And throughout these first six volumes, Charlus, despite being on view almost as much as Swann, comes across as but the most prominent member of the idiot-and-asshole gallery, because for all his conspicuously towering intelligence he is more flagrantly assholish than the most assholish of the others. Throughout one’s reading of those volumes one gets the impression that Charlus is too addicted to concocting intrigues and making scenes to stop and smell the mind’s roses (or hawthorns) after the manner of the narrator. The Temps retrouvé episode centering on him manifestly casts doubt on that impression via the gravitas and magnanimity it shows Charlus evincing in, for example, bowing abjectly to a woman he has always professed to despise. 
But that impression may receive its coup de grâce, may even be obliterated completely, by the sub-episode of the re-sighting of the advertisement, which may very well show that Charlus was never first and foremost an intriguer and scene-maker, that in point of fact he was always and to the contrary preeminently an observer and reflector, just like the narrator, that for all his then-competing erotic preoccupation his attention to the poster on that distant day in Balbec was not feigned after all, that, indeed, he may have descried in that poster lines and points of contact unsurmised by the narrator himself (whence, conceivably—and along entirely different lines than those of the interpretation tendered in the footnote-centered parenthesis above—Proust’s reticence about the contents of the advertisement: to have mentioned the name of the product advertised therein would have made the narrator seem more observant and thereby undercut Charlus’s pleniscence)—in short that the world may have been even more thoroughly understood and felt by Charlus than by Swann or Marcel. But of course this is an awfully long string of “may”s, and because Proust is as silent on the significance of Charlus’s recollection of the advertisement as Herodotus is on the significance of King Psammenitus’s belated breakdown, not one of these “may”s can ever be transformed into an “is.” Which simply and apodictically—if devastatingly (i.e., official literary history-debunkingly)—goes to show that Proust was not a novelist but a storyteller. To the contrary, what it no less simply and apodictically—and not at all devastatingly (i.e., official literary history’s bunk-preservingly)—goes to show is that storytellers and novelists alike sometimes do not explain things that might be enlighteningly explained. And how could it be otherwise? 
After all, however nearly exhaustively one describes something—any sort of something—there is always more about that thing to be described when one has finished one’s description of it. And such being the case, every chunk of narrative discourse, be that chunk a hundred-word joke or anecdote or a million-word novel like the Recherche, cannot but present oodles of opportunities for its reader or listener to provide his own explanation, his own interpretation, of some entity or state of affairs referenced therein. And complementarily, as I have already shewn, apropos of Herodotus, even the work of a storyteller par excellence may be chock full of explanatory episodes, episodes in which the writer (or teller) does his best to preempt the reader or listener’s explanation with an explanation of his own.  But it is very easy to maintain otherwise, to maintain that not-explaining is uniquely characteristic of storytellers and every novel is exhaustively explanatory, when one avails oneself, as Benjamin does, of strategic ellipticality, when one conveniently confines oneself to adducing examples that support one’s assertions.  But how can one forbear from availing oneself of a technique so brazenly devious when one’s assertions are so…how do you say…wrong? I should hope that that is not how you do say, for when one is levelling such a charge at a writer as highfalutin as Walter Benjamin, a word like wrong is the most inflammatory of fightin’ words. I agree, but what more conciliatory word is there for an assertion that is instantly disprovable?  
I own that an assertion such as “A storyteller never explains the events of his story” is not wrong in as flagrantly falsifiable a way as is a statement such as “The present king of France is bald”; I own that it does not make reference to an entity that obviously does not exist, but it is as wrong as, and wrong in the same way as, many an assertion whose wrongness is very nearly as uncontested as that of “The present king of France is bald,” an assertion like, for example, “An Irishman never drinks coffee without whiskey”; it is wrong, that is, because it is readily refuted by plentiful counterexamples. I admit that time was, not so long ago, I would have shied away from being so blunt in my objections to WB; but then eventually my recollection of a criticism WB’s friend and quasi-protégé (but also closeted archenemy) Theodor Adorno had repeatedly lodged against WB’s writings intersected with something Adorno had once lodged against a certain writing of another writer, Jean-Paul Sartre, and produced the revelation that wrong was indeed the only mot juste for a fair proportion of WB’s pronouncements. A’s criticism of WB’s writings was that they were too often undialectical, and I thought I sort of knew what that meant without being able to put my finger on it and indeed without feeling any urgency about putting my finger on it, given that undialectical, like all other words derived from dialectic, was (as it still is) a word of unassailable highfalutin status and therefore entitled to a goodly portion of inscrutability. But then I read the just-mentioned criticism of the bit of Sartre and concluded that the notion of the undialectical was really a much easier notional fish or fowl to get hold of. 
Apropos of Sartre’s famous apothegm from his play No Exit, “Hell is other people,” Adorno exasperatedly fumed, “One might as readily say that Hell is oneself.” Whereupon I invoked some unnamed power to strike me dead or pink (or dead and pink) if Adorno wasn’t right about that, right about what one might readily say about Hell. To be sure, I reflected, it would be pretty darned hellish to be trapped for all eternity in a room with four or five other people as in the scenario of Sartre’s play, but it would be equally darned hellish to be trapped in a room by myself for the selfsame entirety of that selfsame eternity. And then it dawned on me that the sort of wrongness exemplified by “Hell is other people” was the very essence of the undialectical, that this wrongness was quintessentially undialectical because, as I had learned from my reading of not only Adorno but also Plato, the essence of the dialectical consisted in the ruthless consideration of statements alongside their antitheses and the rejection of any statement whose antithesis was not manifestly untrue. And to say that Walter Benjamin’s work is undialectical is merely, but also devastatingly, to assert that it is chock-full of statements whose antitheses might just as plausibly be maintained as the statements themselves. And “The Storyteller” positively (in perhaps two or more senses) pullulates with such assertions, and the assertion that a storyteller never explains the events of his story is by no means necessarily the most egregiously undialectical of such assertions. Indeed, for my admittedly exiguous money at least, the most egregiously undialectical of them are the two that immediately follow this colon (the second of which immediately follows the first in the original text [whichistersay I’m not about to be tactically anti-elliptical in the following quotation]): “Actually there is no story for which the question as to how it continued would not be legitimate. 
The novelist, on the other hand, cannot hope to take the smallest step beyond that limit at which he invites the reader to a divinatory realization of the meaning of life by writing ‘Finis’” (“Storyteller,” Section XIV). The first of these statements would seem to be instantly disproved by every fairy tale, as every fairy tale ends “And they all lived happily ever after,” as Benjamin himself reminds us later, in Section XVI, immediately before asserting something (“The fairy tale, which to this day is the first tutor of children because it was once the first tutor of mankind, secretly lives on in the story.”) that sheds no light on the reminder. It would seem that the reminder, which reads in full “‘And they lived happily ever after,’ says the fairy tale.” and which starts the final paragraph of the section, is meant instead to provide reinforcement to the sentences that immediately precede it, the sentences, that is, that round out the penultimate paragraph: “All great storytellers have in common the freedom with which they move up and down the rungs of their experience. A ladder extending downward to the interior of the earth and disappearing into the clouds is the image for a collective experience to which even the deepest shock of every individual experience, death, constitutes no impediment or barrier.” OK, fine, one gets and takes Benjamin’s point here: “And they lived happily ever after” implies that they, the protagonists of the tale that has just concluded with this formula, are immortal, that they live for ever after, for all eternity (although it must be noted that the mere past-tenseness of lived cuts against the grain of this insinuation).  
But the biological immortality of a character in a story is no impediment to the finitude of the story he inhabits, and indeed the “happily” in the formula apodictically indicates that the story of the ever-livers is over; a life with happiness is a life without incident—or at any rate without incident of narrative interest, inasmuch as a tale, unlike a sitcom, cannot be “about nothing”; it cannot center on endlessly iterated quotidiana because its audience is at least affecting to follow an uninterrupted narrative rather than to drop in on a group of characters every so often in the course of a generally vaguely defined timeline. Accordingly, it is simply not true that “there is no story for which the question as to how it continued would not be legitimate,” for it is entirely illegitimate to entertain the question of how a “They lived happily ever after”-ending story continued; once one starts asking what the prince did after slaying the monster or the princess did after being married to the prince one has already exited the realm of storytelling proper and entered the realm of meta-storytelling, the realm of parody, satire or burlesque (a realm whose hitherto known confines are exhausted by a now all-but-abandoned mine containing nothing but by-now-all-but-exhausted veins of leftist polemics). And even supposing the novelist’s “Finis” did indeed constitute “a limit beyond which one cannot take the smallest step,” one must acknowledge that an undoubted at-least-whelming majority of these novel-ending “Finis”es are effectively merely echoes of “and they lived happily ever after,” inasmuch as an undoubted majority of novels end with the prospectively happy marriage of two young people. But of course the truth is that the novelist’s “Finis” constitutes no such limit, that there are scads of novels in which the contingent placement of the “Finis” is self-evident and a fair number of them in which that contingent-ness is conspicuously foregrounded. 
The most obvious instances of this are the great novelistic sequels—although the first sequel may actually have been a play rather than a novel, depending on whether the second part of Don Quixote counts as a sequel and antedates The Merry Wives of Windsor, a vehicle for Shakespeare’s revival of the character of Falstaff at the request of Queen Elizabeth—for example, Arthur Conan Doyle’s resumption by popular demand of the Sherlock Holmes series after killing off his hero. At the end of “The Final Problem,” Holmes falls to his death, as far as Doyle himself is concerned, in the clutches of his arch-enemy Professor Moriarty, and in “The Adventure of the Empty House” he steps forward very much alive and explains that he escaped those clutches at the last possible moment. If to my adducement of this post-finisial switcheroo as an example it be objected that it hails not from the domain of the novel proper but from a late-flowering manifestation of the story (an objection admittedly sustainable on the grounds that both parties to the switcheroo are of sub-novel length but undeniably underminable on the grounds that the Sherlock Holmes narratives, being detective narratives par excellence, are obsessively preoccupied with information, an interest in which, as we shall see [LW], WB regards as intrinsically and fatally corrosive to any interest in telling or listening to stories), one need only turn to that pinnacle of the Victorian novel, Middlemarch, wherein after tying up her narrative in the tightest and neatest bow imaginable in her penultimate chapter by uniting two couples in happy matrimony and leaving an already-married third couple no-less-decisively united in unhappy matrimony, George Eliot spends another five or so pages escorting these characters from the early 1830s to the time of the novel’s composition, the late 1860s, and informing us of which of them are enjoying a happy old age, which of them a less happy one, and which of them have died off along the 
way. And a fortiori, who can forget the brazenly arbitrary, abruptly-arrested-phonograph-needle-like endings of Laurence Sterne’s two novels, Tristram Shandy and A Sentimental Journey, the former concluding plumb in the middle of a character’s relation of an anecdote (i.e., his telling of a story) whose denouement we consequently never learn, and the latter indeed concluding with the literal Anglo-Saxon equivalent of “Finis,” END, but only in service of a bawdy pun whose supervention via that vocable shows that the real action (i.e., an act of coition between the hero and a chambermaid) hasn’t even started with that END; that that END constitutes a “limit” merely of the manners-mandated sort constituted by a “DO NOT DISTURB” sign. And if it be objected that Tristram Shandy and A Sentimental Journey are outliers or marginal cases or exceptions that prove the rule, I shall shout the objector down with an apothegm of one of the all-time-great theorists of the novel, Viktor Shklovsky: “Tristram Shandy is the most typical novel in world literature.” And if the objector tries to shout me down in turn with the counter-objection that Shklovsky was merely being provocative in terming Tristram Shandy ultra-typical, that he termed TS ultra-typical precisely because he knew that it was a highly atypical novel, I shall shout the counter-objector all the way down with the unassailable counter-objection that Shklovsky would not have felt obliged to promulgate such a provocation if he had not had sound reasons for believing that preceding theorists of the novel both lay and professional (among whom for aught I know he might have classed Benjamin) had been missing the novelistic wood for the novelistic trees; if he had not had sound reasons for believing that while Tristram Shandy undoubtedly did not typify what novels were ostensibly all about—namely, the propounding of a lengthy and more or less complicated narrative—it did in fact typify those attributes and qualities that novels most copiously 
abounded in and that constituted novels’ actual main draw for their readers—namely, all the attributes and qualities that tended to participate in digressions from such a narrative.  And inasmuch as narrative is merely a highishfalutin word for story, have you not just scored a game-ending goal against your own side by effectively conceding that novels are fundamentally about everything but propounding a narrative? Indeed I have not; indeed, I have to the contrary scored two goals for my own side and indeed goals that I would not hesitate to term game-ending, because everything needful to the decisive trouncing of my opponent is indeed implied in or by them, were it not for the fact that unlike a tie-breaking goal they contain implications that need to be explicated (a full-fledged proof of the general fatuity of sports-derived metaphors and analogies is contained in nuce in this particular breakdown of a soccer-derived metaphor or analogy; but here, having the aforementioned implication-explication to attend to, I shall behave like Benjamin’s idea of an exemplary storyteller and let the reader crack or water this nuce); I have scored two goals for my own side inasmuch as 1) As we have already seen in the case of Herodotus, storytellers themselves are interested in a great many things other than propounding a narrative, and 2) Benjamin himself does not believe that the central and essential purpose of storytelling is to propound a narrative; he believes, rather, that that central and essential purpose is the dispensation of counsel, of advice: “[E]very real story…contains something useful. The usefulness may, in one case, consist in a moral; in another, in some practical advice; in a third, in a proverb or maxim. In every case the storyteller is a man who has counsel for his readers” (Section IV).  Surely you’re not going to argue that the novelist is a man who in every case has counsel for his readers. 
Here for once is a time when I sincerely repine with Benjamin and countless others at the defectiveness of the written word by comparison with the spoken; for in having just uttered that sentence textually rather than viva voce, you have deprived me of an opportunity to tender the ritualistic protest that is My name’s not Shirley. But to comment on and overturn your cocksure prediction at one go: while I am surely not arguing that novelists are commonly in the business of purveying verbiage that takes the form of counsel, I am surely arguing that they are probably always in the business of purveying verbiage that assumes the substance of counsel and cannot help being received as such by readers. They are perforce in that business inasmuch as the verbiage contained in their novels describes or presents people behaving in a certain way and presents or describes that behavior in a favorable or unfavorable light, whether by making it intrinsically appealing or unappealing or by showing that it results in a happy or unhappy fate for the person engaging in it; and the reader cannot help taking these descriptions, presentations, and demonstrations as recommendations of or warnings against engagement in that behavior to the extent that his inalterable constitution and position in the world enable him to do so. Thus a young man (and perhaps also many a not-so-young man) cannot but take Fielding’s Tom Jones to be tendering him the piece of counsel eventually cheesily crystallized by Crosby, Stills, and Nash (and possibly also Young) as If you can’t be with the one you love, love the one you’re with, inasmuch as the novel depicts its hero-and-eponym discovering his true love near its beginning then having a one-night stand with another woman and being kept by a third during its middle, a middle during which he has been separated from that true love, before being to-all-appearances permanently reunited with that true love at its end. 
And complementarily and consequently, a young woman (but perhaps only a specifically young woman, inasmuch as the two flings of the middle section are both substantially older than the hero’s true love) cannot help taking Tom Jones to be tendering to her the counsel eventually maudlinly crystallized by Tammy Wynette as Stand by your man, or perhaps (because Sophia Western never learns the full extent of Tom’s involvement with either of the other women) the more general, and presumably as-yet-untrademarked advice, Don’t ask your man any questions about his previous amorous history. And of course because the conformity of these pieces of advice with Christian morals was, to say the least, highly debatable, shortly after Tom Jones’s publication and wildly popular reception, Samuel Johnson felt obliged to write an essay cautioning novelists against writing about people with attractive manners and dubious morals, as he would not have done had he been confident that readers of his day were taking in Tom Jones as a narrative completely abstracted from matters of practical concern to them. So at bottom you believe, super-contra Benjamin, that there’s nothing whatsoever that essentially distinguishes storytelling from novel-writing or story-reading from novel-reading. At bottom, yes. But what about the essential rootedness of storytelling in the oral tradition that Benjamin makes so much of? What about his postulation of the essential connectedness of storytelling with manual handicrafts (if “manual handicrafts” be not a pleonasm), and the notion that the art of storytelling is “lost, because there is no more weaving and spinning to go on while they [i.e., stories] are being listened to?” (Section VIII). The about of this phenomenon or these phenomena, as far as I am concerned, is that it or they simply does or do not seem to merit the much-making that Benjamin makes of it or them. 
Granted, in preliterate or non-literate societies (if a genuinely non-literate post-literate society be not an oxymoron), people presumably exclusively told stories in a strong sense—i.e. spoke them—and conceivably did so mainly as an accompaniment to work. But given that from Herodotus onwards stories were routinely written down, it seems untenable to regard the story as an essentially oral form, for even the most unlettered post-Herodotusian yarn-spinner must have taken inspiration from some written Ur-source—if only indirectly, via an Ur-reader thereof. And complementarily, given that even today [i.e., February 8, 2022], in the age of too many proprietary and non-proprietary text-actuated media even to dream of comprehensively cataloging, people still find themselves routinely obliged and inclined to communicate with each other viva voce in narrative form, given, in other words, that storytelling in a strong sense still thrives if only mainly in the form of such utterly uninteresting and substantially interchangeable quotidiana as micro-anecdotes about the asshole who cut one off at the stoplight or onramp on one’s way to work this morning, it seems equally untenable to regard oral storytelling as a vanished form. 
And as for this notion that there is no longer any weaving and spinning—well, the present writer is ordinarily at earliest the antepenultimate person in the world to lambast eggheads for being out of touch with normal people, but I can’t see how Benjamin would have entertained the just-mentioned notion unless he had assumed that everyone else in the world of the 1920s was, like him, engaged wholly in non-repetitive, non-routinizable brainwork, a kind of labor so absorbing in its own right that it precluded the degree-cum-kind of out-zoning requisite to attendance to a narrative in synchrony with its (the labor-kind’s) performance; for a micro-moment’s reflexive reflection on the state of labor in the early twentieth century, a single bout of free-association actuated by the phrase early twentieth-century workplace, will elicit a conveyor belt-driven cascade of mental images of the scene of a kind of labor sewing-machine-made for such a degree-cum-kind of out-zoning, the scene constituted by the large-scale factory. One of my grandmothers worked at such a factory, specifically a cigar factory in Tampa, back in that selfsame early twentieth century, and conceivably even in the very micro-micro-epoch of “The Storyteller’s” composition (for being born in 1916, she must have attained the legal working age in about 1930), and in each and every production-room of each and every one of these cigar factories, in addition to the dozens of workers attending to the highly repetitive labor of rolling minced tobacco into tight cylinders, there sat another worker fulfilling an entirely different function, a worker known as a lector, whose job was to read aloud printed matter—a combination of newspapers and classic novels like Don Quixote—to the rollers as they rolled. 
And of course if one fast-forwards, as they say, to the 2020s, one finds a world crowded with virtual lectors in the form of the readers, both human and robotic, of audio books, which in turn have proved popular because even people whose official lines of work exact Walter Benjamin-worthy levels of non-repetitive concentration lead quotidian off-the-clock lives chock-full of episodes that are as repetitive and quasi-mechanical as weaving and spinning (their stinking treadmill workouts, stinking school runs, stinking colonic cleansing sessions, etc.). And finally, to this day [February 11, 2022] even the most intensely graphocentric genres, even the genres of writing most flagrantly alienated from the oral tradition or quasi-tradition or pseudo-tradition, frequently have recourse to evocations of oral discourse–e.g., the dialogues with the imaginary reader in the present essay.  (Here, as vis-à-vis Sterne’s digressiveness, one finds eloquent seconding of one’s demurral in a theorist of the novel—Bakhtin, who posited the dialogic imagination as one of the novelist’s chief attributes.) So in point of fact everything is and ever has been hunky-dory, right as rain, and copacetic in the realm of storytelling, and Benjamin is as heartily worthy of pillorying in a pamphlet entitled “The False Alarm” as the non-Booth-bearing John Wilkes ever was?  
To the contrary, everything is unprecedentedly un-hunky-dory, wrong as rain’s antithesis (gravel flying up from the ground, perchance?), and awful in that realm, and Benjamin was very much in the right to ring alarm bells about the bad state of affairs therein, but at the same time I think Benjamin is heartily worthy of a pillorying in a pamphlet entitled “The Red Herring,” and not just because he is or was “in a certain very real sense” a malodorously dirty Commie, but also and mainly because to identify storytelling eo ipso as the principal locus of the calamity, and to distinguish the story from the novel categorically along the way, as Benjamin does or did, was and remains highly misleading, and indeed misleading in such a way and to such a degree that one cannot jolly well wholeheartedly blame McMurtry & co. for concluding from a reading of Benjamin that more and better storytelling is the panacea for what ails us (whoever us may have ever been or may subsequently turn out to be). Can then nothing be salvaged from “The Storyteller”? Can it not, like one of the shaggy dog-type stories that Benjamin prizes so highly, at least offer us a modicum of the counsel that Benjamin also prizes so highly? Indeed, I think a modicum—or specifically two modica—of highly valuable counsel is to be gleaned from “The Storyteller”’s mound of dog hair, and gleaned specifically from the two phenomena of the decline in value of experience, whose actuality Benjamin asserts at the beginning of the second paragraph of Section I, and the emergence of information as a new form of communication, whose actuality he asserts at the end of the first paragraph of Section VI. 
I shall tackle—or perhaps, rather, to employ a more apt metaphorical vehicle, curry-comb, or nitpick—the second of these first because of the two it is the more approachable via concrete illustration, Benjamin himself illustrating it via a pair of properly concrete examples (namely the printing press and the newspaper) rather than via a succession of ghostly pseudo-concrete examples like the entire First World War and the disappearance of horse-drawn streetcars:

…[W]ith the full control of the middle class, which has the press as one of its most important instruments in fully developed capitalism, there emerges a new form of communication…[Here interveneth a fair amount of the kind of to-and-fro-ing about genres that I have already discussed] This new form of communication is information. [Here interveneth a paragraph break] Villemessant, the founder of Le Figaro, characterized the nature of information in a famous formulation. “To my readers,” he used to say, “an attic fire in the Latin Quarter is more important than a revolution in Madrid.” This makes strikingly clear that it is no longer intelligence coming from afar, but the information which supplies a handle for what is nearest that gets the readiest hearing.

There is, as they say, a lot to unpack here—a veritable Benjamin-sized library to unpack, as they don’t say. And the bit of that lot that craves to be unpacked most exigently isn’t even a proper part of the contents of the package, which is to say that while it coincides exactly in appearance with a certain part thereof, it actually forms part of a package that is “in a certain very real sense” an entirely different package. By this non-proper part I mean the occurrence, in the original German text of the essay, of the word information (or rather, to be hyperpedantic, Information [for however “experimental” Benjamin may be construable as a prose-stylist, he was not at all experimental in his capitalization, and common nouns as well as proper ones are capitalized in German]). While information, for all its Franco-Latin roots, is a completely naturalized word in modern standard German, it is far from the most commonly used German word typically translated into English as information. In the entry for it in the English-German section of Cassell’s dictionary, it appears in sixth place, three places behind Mitteilung, the word Benjamin’s translator Englishes as communication, and there only in plural form (Informationen). This seemingly deliberate choice of information in preference to more popular alternatives authorizes us Anglophones to hold Benjamin’s feet to our own tongue’s fire in appraising the justness or unjustness of his employment of the word. 
Had he used the first-place alternative, Nachricht, which also happens to be Cassell’s first-place translation of news, the obdurate Benjamin fan would always be plausibly within his rights to demur, “Well, you see, he’s really just treating of that variety of information that we call news.” But because he has opted to use information, we Anglophones have the right to confute his argument about information via recourse to dispositive examples of information that do not behave in the ways in which he stipulates that information must behave. So, for instance (“Why fetch water from Timbuktu when you can fetch it from Poughkeepsie?” as a Benjaminian storyteller might gnomically query), when Herodotus takes great pains to specify the precise number of men from each nation, tribe, city-state, etc. participating in a battle and to tabulate the total number of participants in the battle from these numbers, he is self-evidently purveying information, because such enumeration-cum-tabulation is the very warp and weft of data, and data is or are self-evidently a form of information, else we would not, inter multissima alia, subsume everything having to do with the technologically enabled treatment of data under the heading of information technology. Accordingly, information eo ipso cannot, or at least should not, be regarded as a new form of communication. Well then, so much for your contention that valuable counsel can be gleaned from this-here section, right? Wrong, because I think there is something distinctively new (and so something perforce unknowable to Herodotus and the other oldsters) about the sort of information that Benjamin is describing in this section. 
Ah, yes, I see: that something is proximity or perhaps rather propinquity, for he writes that it is “the information which supplies a handle for what is nearest that gets the readiest hearing.” He certainly seems to believe that propinquity is what is distinctive about it, but his example belies such a belief, or at any rate but half betruths it. It is true that the editor of the Figaro whom he quotes seems to posit propinquity as his principal criterion of selection for material to cover in his newspaper: he presumes the fire in the Latin Quarter, a neighborhood in the city in which his newspaper is published, to be of greater interest to his readers than a revolution in a different city, Madrid, in a different country, Spain. Still, one cannot fail of noticing that he does not say that the revolution in Madrid is of no interest to his readers whatsoever, and one cannot fail of presuming that the Figaro of Villemessant’s mini-epoch, the mid-to-late nineteenth century, featured coverage of many a foreign revolution as well as many a local fire. Foreign revolutions are after all a prominent part of the stock in trade-cum-bread and butter of all newspapers, or at least of all newspapers published in cities at least as big as, say, Poughkeepsie (again, “Why go fetch water from Timbuktu,” etc.). In Alban Berg’s opera Lulu (or, if you prefer, Frank Wedekind’s Earth-Spirit, one of the two plays from which the opera’s libretto is taken nearly verbatim), the son of a newspaper editor in an unnamed German city (presumably much bigger than Poughkeepsie, because it has its own stock market) barges in on his father at a particularly awkward private moment to announce, “Revolution has broken out in Paris! Nobody [i.e., presumably, nobody back at the newspaper’s editorial office] knows what to print about it,” as he presumably would not do if the revolution were not destined to be covered on the front page of the next edition. 
To be sure, in a truly tiny town, coverage of international events is usually left to the daily of the nearest big city. And it is indeed probably only in truly tiny towns that coverage of ultra-nearby events has ever commanded the preeminent degree of interest that Benjamin ascribes to information in virtue of its supposedly inalienable propinquity. In Blue Ash, Lord’s Valley, or Applebachsville (“Oftimes when the well at Poughkeepsie has run dry, not even Timbuktu will supply fresh water,” quoth our conjectural Benjaminian storyteller), a fire in the Little Latin Quarter will be of interest to every citizen because in such a town every citizen lives almost more than figuratively within smelling distance of virtually every other citizen. And in such a town, a newspaper article about such a fire will command interest preeminently for the reason that it contains a kind of communication that, despite being delivered in an impassive tone by an impersonal anonymous authorial voice, answers to all the essential functions of storytellerly counsel inasmuch as it makes it clear to the reader that he must go out of town for groceries if the fire has destroyed the town’s only grocery store, that he must count on a diminution of custom if he is the owner of the town’s only grocery store and the fire has instead destroyed the town’s only large apartment building, autc. In Paris, the proportion of citizens who are materially impinged upon by a fire in the Latin Quarter cannot be very large; even in the LQ itself the majority of readers of an article about a fire within its precincts will be complete strangers to its immediate victims and may even be unable to recall its site from memory. Pace Benjamin, they will still be more absorbed by such an article than by an article about a revolution in Madrid, but their interest in it will be of the same quasi-abstract, largely disinterested character that they bring to bear on the foreignly sourced article. 
Both articles will merely elicit a fleeting acceleration of the reader’s pulse and an interjection of “There but for the grace of God go I!” even though the acceleration-cum-interjection elicited by the locally sourced article will admittedly be a smidge less fleeting and a smidge more heartfelt in virtue of the fact that the “there” in its case is a “there” that the reader could get to much more quickly (at least as shanks’s mare trots, for of course the state of local transit may make the trip to the “local” there temporally lengthier than the one to the foreign one). So basically, at least in a “certain very real sense,” information of the kind one gets from a newspaper is characteristically and unprecedentedly valued for the very opposite reason that Benjamin asserts that it is valued; it is valued, in other words, for its conveyance of a sense of distance rather than of proximity to the events to which it refers; altho’ a less geographically laden word than distance is probably called for to denote the sought-after quality–perhaps that pretentious Brechtian term Verfremdung, a term that is typically inadequately translated as alienation (not that the inadequacy of the translation detracts a jot from the pretentiousness of the original), is most apt.  When reported on in a big-city newspaper, a fire a stone’s throw away takes on the aura ([sic] on the Benjaminian aura [sic2] of this word) of an event that might as well have taken place in a faraway city, or even on a faraway planet, for all its material relevance to the reader. But of course the only reason this kind of information exerts such a form of fascination regardless of the degree of spatial distance of its referent is that it is itself always temporally proximate—that it is itself always hot off the press, as they used to say. 
That is after all why it is generally called news, on account of which Benjamin really should have used Nachrichten (the synonym of Information that most often appears in the names of newspapers) instead of information. [Switch from italics-for-emphasis to italics-for-indication of DGRly interjection:] By “it” do you mean the information itself or the events reported on in it? You tell me, as they used to say. Certainly common sense would dictate that only information on events that were new in themselves would attract interest, that information that effectively conveyed that nothing new had happened would attract no interest whatsoever even if it was brand-spanking new in itself, and yet there are abundant indications that news that has nothing new to report is routinely consumed with at least half-hearted interest. I presume you are thinking of the so-called 24-hour news cycle that legendarily began with CNN’s uninterrupted coverage of the so-called First Gulf War back in 1991. I am indeed thinking of that cycle, but only inter alia (after all, I did write indications), for whilst it is admittedly true that more than figuratively uninterrupted news coverage is only just over 30 years old, the phenomenon of news media compelled to generate coverage even in the absence of new material to cover is quite an old one indeed, and it must date back at least to the middle of the eighteenth century because Samuel Johnson complained of it back then, specifically on May 27, 1758, in No. 7 of his newspaper column The Idler:

…[J]ournals are daily multiplied without increase of knowledge. The tale of the morning paper is told again in the evening, and the narratives of the evening are bought again in the morning. These repetitions, indeed, waste time, but they do not shorten it. The most eager peruser of news is tired before he has completed his labour; and many a man, who enters the coffee-house in his nightgown and slippers, is called away to his shop, or his dinner, before he has well considered the state of Europe.         

Johnson was writing at a moment when the “state of Europe” centered on the conflict that would come to be known as the Seven Years’ War, when the redundancy of newspaper coverage resulted in the news that a French warship was captured by an English one’s being “echoed from day to day and from week to week,” such that it ceased to be news despite being represented as such. This redundancy also resulted in an epiphenomenon that Johnson touched on only very fleetingly and in very general terms in No. 7—that of people talking about facts in which they merely “think they have an interest”—but that he had occasion to instance in more specific terms in a later column, No. 48, which satirically comments on a Monsieur Le Noir, so dubbed despite his English nationality because, despite having no “property or importance in any corner of the earth,” he “has, in the present confusion of the world, declared himself a steady adherent to the French” in the war, to the extent of being “made miserable by a wind that keeps back the packet boat”—i.e., that detains letters containing news of English defeats from reaching the press via the safe arrival of the boat in port—“and still more miserable by every account of a Malouin privateer caught in his cruise”—i.e., by every newspaper article reporting that a French pirate preying on English trading ships has been captured by the British navy before reaching France. Le Noir obviously has no personal interest in French military success; to the contrary, from the perspective of self-interest he ought to be cheering every English victory. But because Le Noir has none of the usual material means of distinguishing himself from his fellow Britons, he can only distinguish himself from them by consuming news in an idiosyncratic fashion, as he is able to do only because of the newspapers’ presentation of all events as interchangeably both extraordinarily important and immediately materially irrelevant. 
Le Noir is a prototype of the figure that David Riesman nearly two hundred years later would label the “inside-dopester,” the mid-twentieth-century American who prided himself on knowing what was supposedly really going on in the corridors of power, in superior contrast to his supposedly benighted contemporaries, the “moralizers,” who took politics at face value and also participated in politics in traditional ways, by joining their local Democratic or Republican machine. That Le Noir and his ilk had not yet become fully ascendant even by the age of noir, despite the intervenient expansion of news into the media of film and radio (and presumptive consequent lesser ignorability), doubtless says something about something that I may very well have occasion to address before the end of this essay. For the moment, before moving on to the topic of the decline in value of experience, I shall recapitulate the four main characteristics of information as objectively adumbrated by the phenomena that Benjamin would analytically affiliate it with and distinguish it from: 1) Information is not a new form of communication or one that is inimical to storytelling. It has been with us at least since the writings of the supposed arch-Greek storyteller, Herodotus, and contentedly coexists with storytelling therein. 2) Information is not categorically objectively distinguishable from counsel because every piece of information is conceivably receivable as counsel by a certain sort of person in a certain sort of situation. 3) The new form of communication is not information eo ipso but news, which is indeed a form of information but one that is categorically distinguishable from other forms of information in virtue of its novelty, which is an analytic attribute of it. A report on an event that took place two weeks ago is not news if present in a newspaper published two weeks ago; but a report on that same two-week-old event is news if present in a newspaper published this morning. 
4) The immediate effect of news is to make the events reported on in it seem more distant, not more proximate, but a quasi-paradoxical knock-on effect of it is to make its consumers take a personal interest in even the most distant events. It can have this effect because it is a commodity, and like every commodity its exchange value can both rise and fall. (N.B. The “it” here is not to be understood as news tout court—although to be sure-ish, news tout court is presumably less or more valued in relation to pork bellies, gold, orange juice, etc. from day to day—but one piece of news in relation to others; the sought-after-ness of one so-called scoop [or, in Riesmanian terms, piece of “inside dope”] in relation to others.)

Now as to the decline in value of experience: the first thing to be noted about it in “The Storyteller” is that in the original, “Der Erzähler,” of the two—and here, in contrast to information vis-à-vis its German quasi-equivalents, there are really only two—words usually translated as “experience,” Erfahrung and Erlebnis, Benjamin opts for Erfahrung. I have no wish to add even my two cents of bloviation to the acres thereof that have been devoted to the distinction between these two words, so let me try to say everything I have to say about it before I get to the end of the present sentence and make that sentence at longest a medium-length sentence (at least by my standards): in the lowest-common-denominator terms, Erfahrung is mostly objective and Erlebnis mostly subjective; and in second-lowest-common-denominator terms, Erfahrung is mostly about what one has learned by experiencing something and Erlebnis is mostly about what it felt like to experience that something. I have not verified with empirical certitude that our microepoch’s most horrendous canteme lived experience is a direct translation of Erlebnis, but I cannot but presume that it is because Erlebnis includes a contracted form of the German word for life, Leben. (So much for confining yourself to one sentence. Not at all: for I am now no longer discoursing specifically on the distinction between Erlebnis and Erfahrung, at least not explicitly.) The notion of Erlebnis is by no means as ineluctably bumptious as “lived experience”; it is not actually coextensive with the act of standing with arms akimbo and chin-and-lower lip thrust upward and saying (in words approximating Douglas Murray’s): “You can never understand me, but you must understand me,” but it does share with “lived experience” the inconvenience of being unfalsifiable, which is why it is singularly convenient for us (or at least for me) that Benjamin opted to focus on Erfahrung rather than Erlebnis. 
Because he opts for Erfahrung, we can be sure that experience in “The Storyteller” always refers to something that is supposed to be locatable in the external world rather than in some person’s internal world; that in “The Storyteller’s” assertions about experience we are always dealing with something along the lines of an assertion that the average life expectancy has increased or decreased and never dealing with anything along the lines of an assertion that life somehow just isn’t as fun (or exciting, vivid, scary, etc.) as it used to be. So when Benjamin writes that experience passed “from mouth to mouth” declined in value specifically in the interval between the beginning and the end of the First World War, we are entitled to ask, “What was it that could be passed from mouth to mouth before the First World War that after that war either could no longer be passed from mouth to mouth or whose passage from mouth was no longer as highly valued as it had been before the war?” and to consider Benjamin either largely vindicated or largely refuted depending on the quantity and salience of the answers that spring to mind (or perhaps more aptly, because we are after all concerned with things that can be pointed at, finger).  
The transition from horse-drawn streetcars to electrically powered ones presumably occasioned a more-than-negligible devaluation of certain kinds of experience; presumably the presumably complicated skill of managing a horse in an urban setting was no longer as highly valued after the transition as it had been before it; presumably even the routine of being a passenger on horse-drawn streetcars exacted certain kinds and quanta of know-how—perhaps boarding a moving car without stepping on horse droppings in the adjoining gutter required a peculiar form of corporeal finesse; or perhaps the quasi-proverbial young man running late for a rendezvous with his sweetheart could soft-soap the driver or grease his palm in a peculiarly artful fashion by way of getting him to apply the whip with a trifle more alacrity and thereby convey him (the young man) to his destination a few minutes ahead of the timetable-designated time. And presumably it must have come as a disappointment to such a young man to find such savoir-faire (and indeed savoir-vivre) suddenly utterly useless just when he would have had the opportunity to pass it on to the next generation—that is to say, when offering counsel to his own younker of a son on how to comport himself on a date-night. So the example of the rider of the disappearing horse-drawn streetcar would appear to be classifiable with all other examples of technology-induced human obsolescence, from the scythe-wielding mower displaced from the wheat-fields by the combine-harvester to the expert COBOL or FORTRAN programmer unfortunate enough to survive into the age of C++. It is a bit strange that Benjamin chose to pair an example of this phenomenon with the First World War because it certainly did not originate during the First World War and was at least arguably more pronounced, at least in civilian life, in certain pre-1914 epochs. 
Still, it was sweet and commendable of him to consider it as Erfahrung, as a phenomenon that was objectively measurable and capable of rising and falling in value, rather than as Erlebnis, as most highfalutin authors before him (e.g., the Thomas Hardy of Tess of the D’Urbervilles) had done: to dwell not on how miserable it had felt to be displaced by a combine harvester etc. but rather on the quantum of transmissible know-how devalued by such displacements. The salience of World War I itself qua Erfahrung-devaluer is obviously much starker. In this connection one recalls, for starters, that the men who fought in that war had for the most part been born in the 1880s and 1890s, that its most plausible predecessor—i.e., the most recent conflict immediately embroiling the major European powers—the Franco-Prussian War, had ended in 1871 (and had not directly involved any of those powers apart from France and the principalities then coalescing into unified Germany), that accordingly their fathers had in general been too young to fight in that war, and that whatever meager experience of warfare their grandfathers might have communicated to them must have proved useless to them given that the First World War turned out to be radically different from, and immeasurably more cataclysmic than, the Franco-Prussian War; then one reflects that the generals and heads of state who initiated it had in fact been expecting it to play out as a sort of larger-scale sped-up instant replay of the Franco-Prussian War, a war that would consist in a speedy victory of one side over the other, and to be over indeed in a matter of months if not weeks rather than a year in the light of intervening technical improvements in weaponry; such that it was not only the doughboys, poilus, et al. in the trenches but also the éminences grises in the capital cities and staff headquarters who were blindsided by its destructive duration. 
In the light of these considerations, it seems clear that the First World War must have been a massive devaluer of experience at multiple levels; that it must have nullified not only the knowledge of war that fathers passed on to sons but also the knowledge thereof that national governments passed on to their successor-governments—hence perhaps, at least in the second case, knowledge not entirely passed down literally “mouth to mouth” (although many a strategy-orientated pre-1914 chinwag may have taken place between, say, Pat MacMahon and Georges Clemenceau or Art Balfour and Hank Asquith), knowledge partially passed on by statesmen and military leaders in their memoirs and letters. By the end of the war it must have seemed that any generalization or prediction about war that anyone, regardless of his credentials and service-record, might make was vulnerable to instant discrediting. But of course the First World War was not (eventually) called the First World War for nothing; of course, a few years after the composition of “The Storyteller” it was followed by a Second World War, and given that Benjamin did not survive that war and indeed lived barely a year into it, such that he never had an opportunity to weigh in on the value of experience in the conduct of that war, it is an entirely fair question to ask whether military experience proved to be more valuable in the course of that war than it had proved to be in the course of WWI. I thought that the historical consensus had long since shewn the definitive answer to this question to be a firm “No.” Indeed not: the historical consensus hath if not quite conclusively shewn then at least unanimously claimed that pre-1939 political experience was either insufficient—i.e., of too little value—to prevent the outbreak of a Second World War or that pre-1939 political experience, although of adequate value to prevent the outbreak of such a war, was too little or poorly heeded to prevent that outbreak. 
Hardly a textbook example of a firm answer, what-what? But as to the question whether the generals and statesmen behind the Second World War found their military experience of the First World War valuable in their conduct of that Second—well, the present writer is certainly no military historian, but his limited-but-by-no-means-negligible knowledge of the history of that Second suggests to him that that question merits at least a tentative yes for an answer. I would think that the doubling of the casualty total from about 40 million to 80 million would suggest a very firm answer even to that question. Yes-esque, yes-esque, yes-esque, I can see where that doubling would suggest as much, but it would seem that, as in your previous rejoinder, you are still inappropriately measuring the value of experience in terms of its efficaciousness in reaching the best or better outcome in absolute terms—viz., the outright avoidance of war, or the minimization of its casualties, respectively. But when one considers the value of experience vis-à-vis the aims of war itself from the point of view of each of the belligerents, when one considers experience’s contribution to the aim of defeating the other side, it cannot be easily denied that military experience increased in value between the First and Second World Wars, that each of the powers involved in the Second World War reflected on their mistakes in the First and consequently managed to avoid many of those mistakes, that in particular by the beginning of the Second World War they had acquired some sense of the extent to which the evolution of military technology needed to impinge on military strategy, and of how developments in that technology since 1918 necessitated a different approach even from one that would have been successful in 1918. And you maintain this in the teeth of the fact that the Second World War lasted nearly two years longer than the First? 
Indeed I do, for if the value of experience fruitfully brought to bear on a conflict increased in direct proportion to the brevity of that conflict one would have to regard the experience brought to bear on the Anglo-Argentine Falklands War or the U.S.’s invasion of Grenada as more valuable than that brought to bear on either of the World Wars. The Second World War was from the outset a much more complicated conflict than the First, such that had it been fought in the absence of any experience acquired from and since the First, there is almost more than figuratively no telling (or writing) how much longer than the First it would have lasted because it would have suffered from the long stretches of futile (if fatal) activity, of virtual if lethal stasis, that typified and dominated the First, but in a greater number of settings involving a greater number of participants.  As it turned out, the only portion of WWII that even remotely approached WWI in point of stasis was the so-called Phony War or Sitzkrieg of the autumn of 1939 through the spring of 1940. For the remaining ca. five-sixths of WWII, there was almost daily significant movement on some front or in some theater, and almost every such instance of motion was expedited by valuable experience. Because this essay is at least not an essay on military history (and because I remain no military historian) I shall not itemize even a handful, or even quite a half-handful of such experience-expedited instances, but lest I leave myself open to a challenge from an akimbo-armed and chin-and-lower-lip-upward-thrusting churl peremptorily ejaculating, “Name one!” instead of “You can never understand me etc.,” I shall itemize two of them—the Allies’ seamless mutual integration of naval power and counterespionage at the invasion of Normandy, and the U.S.’s obviation of a protracted and bloody conventional pre-occupation battle by the dropping of the atomic bombs on Hiroshima and Nagasaki. 
(And no, I’m not going to engage with a demurral Was the dropping of those bombs really worth it, given that it resulted in hundreds of thousands of civilian casualties and inaugurated a form of warfare that threatens to annihilate the entire human race?, because here again we are—or would be—dealing with applied experience vis-à-vis the best outcome in absolute terms, or perhaps in this case relative but long-term terms, and in either case a frame of reference exceeding the scope of the war itself.) So it would seem that Benjamin’s assertion about the devaluation of experience did not hold good at what one might (however reluctantly) term the macro level of the Second World War, at the level of the stratagems and strategies of statesmen and generals; but what about at what one is now obliged (however reluctantly [and that reluctantly is a very big reluctantly indeed, a much bigger one than attended the preceding use of macro]) to term the micro level thereof, the level of the experience of the average GI, or the average poilu or whatever French soldiers were called during WWII (or perhaps, rather, in the light of France’s early surrender to the Germans, the average French resistance fighter, the average maquisard)? Did they learn anything useful about soldiering or sailoring from tales told to them by their fathers, or more likely [what with the average doughboy or poilu’s being just barely old enough on returning from WWI to sire a son old enough to serve in WWII] uncles, or even from boning up on books and movies about soldiering or sailoring in the First World War (notably All Quiet on the Western Front and The Grand Illusion)? On the whole one gets the impression that they didn’t. 
The present writer is perhaps, at least by a criterion of admittedly dubious value, able to speak on this matter with greater authority than on the matter of the value of such experience in World War I—the authority, namely, of having personally known two combatants in the Second World War, namely, his grandfathers, both of whom served as seamen in the U.S. Navy for virtually the entire duration of the U.S.’s involvement in the conflict. Ah, how wonderful (you interject while rubbing your hands together in eager anticipation): you must have hours of wartime anecdotage to regale us with. Ah, but that’s just it, you see: I haven’t even a single minute of such anecdotage with which to regale you because neither of my grandfathers, both of whom died in the mid-1990s, ever told me a gosh-damned thing about his wartime service. Ah, but surely your parents must have heard from their respective fathers many a wartime anecdote that they related in turn to you. They most assuredly didn’t hear many such an anecdote, because their expression of regret at having obtained almost no such anecdotage was one of their shared pet self-flagellation texts in the years following their respective fathers’ deaths. They always maintained that it had not entirely been for want of trying that they had not succeeded at obtaining more of such anecdotage, that they had from time to time tried to get the old gentlemen (or perhaps rather old ex-young salts) to open up about their respective wartime experiences, but that on each of these occasions the dad-cum-granddad at hand-cum-ear had always brusquely changed the subject, directed his interlocutor’s attention to some random object in the room and immediately disappeared, stuck out the palm of his hand as if to block an imaginary paparazzo’s camera, autc. 
And my parents always presumed that the reason the old men had refused to talk about the war was that such talk would have revived unpleasant memories that they had preferred to keep dormant; and doubtless that had been at least one reason for their close-mouthedness, but perhaps—at least conceivably perhaps—the main reason had been that they had not themselves found their wartime experiences sufficiently interesting to devote to them the amount of attention requisite to relating them to other people. My mother’s father spent almost the entirety of his war service well beyond even the threat of action, at a base in Australia at which ships (and perhaps also submarines) were repaired in between engagements; essentially he was not involved in the war in any capacity that would have differed dramatically from a peacetime one—altho’, to be sure, having always been something of an Australophile, I doubtless would have found captivating anything he had had to say about his doubtless thousands of hours spent ashore in Melbourne (or perchance Sydney) during intervals of leave.  And in point of fact during one of these intervals he did find time to generate a document of that shore-time that probably survives to this day, as I heard it as recently as the 1980s—viz., a gramophone record whereon a female voice (presumably the manageress of whatever mini studio-cum-pressing plant at which the record was made [altho’ my grandmother may for all I know have interpellated her as one of the conjectural wide-heeled sheilas she perhaps only half-jokingly fancied had borne my mother and her two fellow-legitimate little Pullaras a succession of half-siblings]) greeted his wife, my grandmother, “as Mrs. Pullara in Florida,” then handed the mike to Mr. 
Pullara, who then sang my grandmother a song that he claimed to have made up on the spot (and whose words I have since forgotten, altho’ as I remember the tune I do not despair of ascertaining them via some sort of tune-recognition robuht, as my grandfather could not avoid betraying that it was in fact one of the hit tunes of the war years when he sang it in my presence several years after my audition of the record), and wound up by expressing great delight at the recent birth of the couple’s first daughter, my aunt Bernice.  It was all very sweet, but not very enlightening even about Australian civilian life. To be sure, my paternal grandfather saw action in two separate theaters, and indeed at perhaps the two most famous battles of those theaters, Normandy and Okinawa. I know nothing of his duties at the latter battle, but according to my father (who apparently did get him to open up at least for the quarter-minute required to divulge the following datum), at Normandy he accompanied transport-boatloads of soldiers from ship to shore and was required to lower the barrier at the front of the boat and thereby enable them to land and storm the beach. Now, however complicated the mechanism for lowering that barrier may have been, it could not have required a great deal of skill to actuate, and so the knowledge of its actuation could not have amounted to much in the way of an experience conveyable to another person by way of mouth. 
To be sure, as my grandfather presumably did not accompany only one boatload of soldiers in the course of the invasion, as he presumably was obliged to travel to and from the ship and the beach a minimum of several times in that course, he must have come to find the actuation of the mechanism a bit tedious, a bit boring, but the boredom would not likely have been productive of the crystallization of a good story, because unlike the reliably ho-hummish boredom associated with spinning and weaving it was presumably constantly attended by the fear of imminently being shot to death.  I should think the phenomenon of having to perform such a mechanical action repeatedly while being all the while subject to the threat of imminent death would better be described as traumatic rather than boring. Why, I suppose it would, and what is trauma but the most prized commodity of the fetishizers of Erlebnis, of the sort of experience that cannot be communicated from mouth to mouth and hence that he would not even have bothered trying to communicate via an anecdote? And in any case, one must presume that even my paternal grandfather spent many long months—namely, the ones in between battles—that were boring in quite a strong yet unspinnerly-cum-unweaverly way, months that were devoid of the fear of imminent death and involved very little mechanically repetitive labor; months that were probably even more devoid of anecdote-worthy incident than my other grandfather’s time in Australia, inasmuch as they must have been spent on the de-facto boring-beyond-belief high seas. And I recall happening to hear over the radio not very long ago—but long enough ago, of course, that I can’t remember the name of the dash-blamed thing (altho’ I do recall that Leonard Bernstein had something to do with it)—a sort of round-robin musical about the early months of the U.S. war effort composed and performed at lightning speed during those selfsame months. 
One number of it described or recounted the incredibly elaborate and dilatory preparations involved in that effort and the bemusement it was occasioning chez the GIs who had naively enlisted in the hopes of immediately impaling the asses of Japs and Krauts with their bayonets; its chorus—delivered to an all-male chorus in the other sense—consisted almost entirely of the querulously iterated observation that from the GI’s point-of-view (and hearing, touch, etc.) the whole super-affair was a matter of “hurry[ing] up and wait[ing].” It was as if by then the U.S. soldiery were resigned to their experiencing of the entire war as an experience-free experience. And the smattering of descriptions and representations of the Daseins of veterans of the war in its immediate aftermath suggest that the conflict was complementarily retrospectively regarded by them in those very terms. To be sure, the film The Best Years of Our Lives did its best to ascribe a dose of pathos to this retrospection by depicting—in proto-First Blood-ian fashion—the veterans’ fellow-citizens as scornfully indifferent to the suffering they had endured (most graphically in a scene centering on a young vet who had lost his hands in battle being heckled for his claw-like prosthetics by a passel of local hobbledehoys too young or cowardly to have served themselves), but the over-the-top-ness of the spectacle did not ring true; it had (and still has) the feel of something concocted in advance, perhaps many months before VJ Day, by Hollywood script-doctors who took it for granted that this was how the veterans would or should be received at home–perhaps based on their (the script doctors’) reading and viewing experience of post-WWI-malaise-saturated novels and films like A Farewell to Arms and The Petrified Forest. 
David Riesman’s reporting of his findings of his interviews with actual veterans in The Lonely Crowd indicates that indifference was to the contrary the prevailing attitude of the veterans themselves: “The veterans of World War II bring scarcely a trace of moral righteousness into their scant political participation. These men ‘ain’t mad at nobody’” (179)—i.e., presumably, in a specifically political context, not angry at their representatives in Congress and other governmental institutions for not affirming the validity of their experience via concrete policies (although, to be sure, part of their lack of resentment might have been ascribable to the U.S. government’s forestalling of grievances via such policies—e.g., the GI Bill). And imbuing all of this like the smell of a pre-smoking-ban bar at six o’clock on a Saturday or Sunday morning is one’s (i.e., my) awareness of all the off-the-scale partying that went on in the American armed forces throughout the war—all the benders, whoremongering sprees, Glenn Miller-powered dances, Tom Mix-powered cinema visits, Marlene Dietrich-powered live cabaret shows—at which the average GI never failed to be present whenever Uncle Sam dozed off or lowered his index finger; one’s awareness, effectively, that for all the substantially greater danger to these men’s corporeal well-being, in a certain very real sense the everyday phenomenal lives of U.S. servicemen overseas were not radically different from those of U.S. civilians in Pittsburgh or even Peoria. 
Gore Vidal, I believe in his second memoir, Point to Point Navigation, which as its name suggests contains a great deal of material on his WWII navy service, clinches this historical typicality in his blasé observation, “Most of us were just trying to get laid.” Of course, even after making allowances for the contribution of Vidal’s specific sexual proclivities to the synthesizing of this resolution (making allowances, in fine, for the fact that as an unabashed if still officially closeted homosexual, he must have found the all-maleness of shipboard life more libidinously stimulating than his heterosexual shipmates must have done), one is loath to reduce the Erfahrungschaft to a half-decade-long partouze, but it is doubtless true that the average U.S. World War II veteran’s memory is dominated by the sexual and perisexual paraphernalia of the 1940s—nylon stockings, pop songs like the one quoted by my grandfather on the abovementioned record, pin-ups of Betty Grable et al., etc.—rather than by anything inalienably associated with any aspect of the military Dasein. And—not to stray too far from the 1940s, for I aim to return to them very shortly—it seems to me that from all that I have learned of the experience of the average U.S. soldier or sailor (not to mention the average U.S. airman or marine) in subsequent conflicts—an all on which in certain admittedly equivocal respects I am perhaps more qualified to comment than on the all thereof vis-à-vis WWII, inasmuch as, despite my lack of close personal acquaintance with any of the combatants in these conflicts (for altho’ my father spent a few months of 1971 in the U.S. 
Coast Guard, in which he enlisted to avoid his imminent drafting into the army and presumptive subsequent out-shipment to Vietnam, the farthest he ever got from home was New Jersey, at a boot camp in which he was billeted until being discharged for medical reasons [viz., scoliosis]) I have learned it at a smaller temporal remove, and in the case of every conflict from the so-called Gulf War (which may be termed the St. Petersburg [Florida, not Russia] of wars, inasmuch as it was simply known as the Gulf War from even before its inception [one must recall that the parrain of hyperreality, Jean Baudrillard, wrote and published a newspaper article called “The Gulf War Will Not Take Place” during the so-called run-up to the war in the last three or four months of 1990] until the inception of the war that eventuated in the occupation of Iraq, when that war was christened Gulf War II and the earlier war consequently retroactively christened Gulf War I, as it consequently remained known until at some mysterious unpinpointable point in the mid-to-late 20-teens Gulf War II gradually stopped being referred to as Gulf War II and began being referred to as the Iraq War, leaving the I-suffix of GWI to wither and finally fall off like an under-irrigated tree-limb), I have acquired the knowledge in so-called real time, from accounts rendered by veterans newly returned from the battlefield or even in between tours of duty. 
Over and over again one hears from these people accounts of unimaginable horror—of unimaginably painful and disfiguring depredations visited on the human body, whether on the body of the combatant himself or of the bodies of his fellow-combatants or even civilians—and much more often than not these accounts originate from the very thick of battle; and yet only very seldom do they impart a sense of the specifically military nature of that thickness, a sense of exactly why from a strategist’s or tactician’s point of view the sufferers of the damage were at the particular place where they were when they suffered it. One of course hears the relators of these accounts described as sufferers of trauma, or more clinically, of post-traumatic stress disorder, but they might no less accurately be termed experiencers of negative Erfahrung-free Erlebnis, as experiencers of a painful kind of experience whose painfulness is augmented by the fact that it is utterly incommunicable to other people as useful knowledge, as either expertise or counsel. In short, at the resolution of the individual soldier or sailor in the trenches, on the high seas, in the jungle, on the dunes, autc., Benjamin’s assertion about the devaluation of wartime experience appears to have been consistently borne out by all the wars that have taken place since the composition of “The Storyteller.” “That said,” and as mentioned before, the experience of soldiers et al. is not confined to their time in the trenches, autc., but as not mentioned before, that extra-trenchial experience is not necessarily exhausted by the experience they share with their civilian contemporaries—by all the pop-songs, pin-ups, movies, etc. At least in principle, a soldier aut al. is capable of experiencing during his wartime certain things that he is capable of carrying back to his civilian Lebenswelt and communicating to others in that Lebenswelt; he may, in a word, acquire certain skills that are transferable to civilian life. 
And this observation occasions the above-promised return to the 1940s, and specifically to my two grandfathers during the war. Admittedly, the two of them do not furnish examples of such skills-acquisition of equal salience. My paternal grandfather seems not to have learned much of any lasting use in the navy, at least to judge by his line of work in my childhood, when he worked at a bakery, and not some mom’n’pop bakery whereat he presided as the pop, but rather some massive industrial bakery where he served as perhaps one of dozens of dough-slingers, and where the end-result of the dough-slinging was not some sort of lovingly hand-misshaped artisanal loaf but rather a loaf of the sort of perfectly shaped, plastic-sleeved sliced bread that one bought at the supermarket (of the mass-produced character of the finished article I am certain because he would occasionally bring me peculiarly tiny sample loavelets of two or three slices packaged in the livery of one of the bakery’s brands, Roman Meal, a livery consisting of the image of a red-caped centurion standing against a yellow background). It would have been more than a bit of a stretch to term his job a trade because it had presumably required a none-too-lengthy-or-complicated course of training and because it could be dropped at the literal drop of a literal hat (specifically a paper one) without more than minimally disrupting the operations of the operation at which it was practiced (at least at the resolution of an individual hat-drop, for of course a massive strike or walkout of all baker’s-dozen dough-slingers presumably would have brought operations at the bakery to a halt). 
My maternal grandfather, on the other hand, spent his war years picking up the rudiments of an occupation that has a long and illustrious history of being regarded as a trade, a history that, indeed, extends at least as far back as Plato’s Republic, namely cobbling or shoe-repairing, and upon exiting the navy he secured his cobblering credentials at a trade school with the help of the GI Bill and opened a shoe-repairing shop that he kept in operation as its sole proprietor for nearly half a century. To be sure, there were differences between the shoe-repairing practiced by my grandfather and the shoe-repairing practiced in Plato’s day, and not all of these differences were merely ascribable to differences in the style of footwear; to be sure, his manner of mending shoes was dependent to some extent on technology that would have been unavailable to a shoe-repairman of Plato’s day or even of (to stick to philosophers) Nietzsche’s, such that he—along with every other practicing shoe-repairman of the mid-to-late twentieth century—could scarcely be regarded as the umpteenth-great vocational grandson of Plato’s shoe-repairman, or even the vocational grandson of Nietzsche’s. I recall with particular vividness a machine that was like a pair of exceptionally powerful electro-mechanical teeth within which my grandfather would pivot sheets and squares of vulcanized (?) rubber until they assumed the rough shape and size of brogue-soles, stiletto heel-tips, autc. I doubt very much that Plato could have guessed the ultimate aim or final cause of this machine even if he had seen it in operation. 
Nonetheless, the finer shaping of the heels and soles had required the use of hand-tools whose basic construction had presumably not changed in millennia, the heels had had to be hammered on with an ordinary hammer, the soles to be sewn on by a hand-operated sewing machine (or perhaps even directly by hand with a needle and thread [for I remember the sewing machine in use only in connection with my grandfather’s sideline of repairing the protective padding worn by members of our local professional football franchise, the Tampa Bay Buccaneers]), and throughout my grandfather’s workday he had constant recourse to a singularly old-fangled metal object in the shape of an inverted foot atop a waist-high pole, along with dollop after dollop of noisome brown glue slathered on with a sort of giant Q-tip that was replaced after each use in a sort of giant inkwell encrusted all over volcano-esquely in layers of petrified glue slurry that seemed to be of genuinely geologically primeval antiquity. In short, I do not doubt for an instant that if Plato had been vouchsafed a time-traveler’s tour of the shop he would have been in no doubt about its ultimate aim or final cause despite its mystifying bits of apparatus unknown in fourth-century-B.C. Athens. And such absence of doubt being the case chez moi, I find the notion that orally transmitted experience insgesamt was moribund some four decades before my birth rather hard to swallow or otherwise assimilate. Understandably. But is your grandfather’s shop still up and running and doing a booming business under the proprietorship of his successor and former apprentice? [The present writer, looking you steadily in the eye, and not even a wee bit ruefully]: No. My grandfather never had a proper apprentice, although he did retain an assistant at certain intervals. One of these assistants was his son, my uncle, and for a time at least he cherished a hope that the latter would take over the business. 
But the latter showed no serious interest in such an overtaking. And was no other enterprising shoe-repairman, perchance your grandfather’s principal rival in the catchment area, the Cunningham to his George Jefferson, so to speak, interested in purchasing the business? Evidently not, because about a year before his death, my grandfather shut down the shop and moved all its equipment into a storage shed next to his and my grandmother’s house, whence I know not whither it ever went or when. Very well, then; I rest my case: your Platonic shoe-repairman’s shop may have survived into the ninth decade of the twentieth century, but it was obviously already moribund by then, or else your grandfather would have had no trouble getting someone to take over the business in some fashion or other. [I, still 100% rue-free, and my gaze still locked in lock-step with the reader’s:] And I freely concede the validity of your newly rested case. Oh, I see, I see: you fancy you’ve got a t***p card up your shirtysleeve, the t***p card of the new thriving artisanal cottage shoe-repair industry, thanks to which for a trifling sum of a half-grand plus shipping amounting to a trifling additional cee-note or two one can have one’s shoes resoled while one waits (specifically waits thrumming one’s fingers on one’s breakfast or dinner table while sitting thereat barefoot for two months). Indeed I have got that very card but not up my sleeve qua t***p card for this stage of the argument (or, rather, for metaphorical consistency’s sake, game) but rather ready to hand in an unspecific way qua ordinary court card for a later stage thereof. At the moment all I wish to do is show that at least at the resolution of the individual, the sharp and permanent devaluation of experience in one domain—that of war—did not proceed in historical lockstep with the devaluation of experience in another, the plying of trades. 
And my grandfather’s successful—or at least functional—plying of the trade of shoe-repairing until within spitting distance of the dawn of the third millennium has quite serviceably allowed me to show just that. So Benjamin was off by a measly seven or eight decades. Big deal. Perhaps it would indeed be no grate sheiks if shoe-repairing had for all those decades been a single hand in the wilderness (or above the deluge); but a moment’s reflection in conjunction with a month’s fact-checking reveals that despite wave upon wave of automation, standardization, etc., the world since ca. 1930 has pullulated with anciently established occupations to whose subsistence the transmission of knowledge from hand to hand and mouth to mouth is absolutely essential. Consider the occupation of undertaker, for instance. Doubtless the process of preparing a corpse for inhumation has changed over the millennia and is continuing to change with each passing year; doubtless the American and International Associations of Undertakers’ annual conventions bristle with commercial travelers handing out “literature” on the latest undertaking gadgetry that is eagerly snatched up by the rising generation of undertakers and scornfully spurned by the retiring generation (who in turn doubtless swear by certain gadgetry scorned by the generation they put six feet under); but it is no less doubtless that the fundamental process of embalmment in the twenty-first-century Occident retains links to the process practiced by undertakers since time nearly immemorial—since, indeed, the processes of the Egyptian mummifiers described with such detailed gusto by our old pal Herodotus. The dependence of the study of the law on a body of knowledge extending back at minimum hundreds of years (if one takes the U.S. 
Constitution as one’s starting point) and “arguably,” “in a very real sense,” several thousands of them (if one starts with the code of Hammurabi) should go without saying despite the lunatic race-Marxists’ zealous efforts to discard the entirety of property law.  Even such an upmarket and supposedly technology-driven profession as medicine is heavily reliant on ancient trade-lore. I remember not many years ago hearing an interview with a forensic physician wherein she extolled the pricelessness of certain maxims of Hippocrates—among them the refreshingly frank “You’re not dead until you’re warm and dead”—to the everyday practice of her métier. So in short, you maintain contra Benjamin that the God of experience is in his heaven, and all’s right with the world. No, I maintain nothing of that sort, as a certain gaping gap in my account of the recent fortunes of experience should make plain. The gap, namely…The gap, namely, implied by my latest employment of the phrase “at least at the resolution of the individual.” That gap amounting to an implication that experience has indeed declined in value at the level of so-called society? Yes, after a fashion, which is to say that I do at least believe that at the level of so-called society, the value of experience has contingently become underappreciated, which is certainly not at all the same thing as to say what Benjamin appears to have said, namely that the objective tendencies of the development of society have rendered experience objectively and essentially less valuable. Let us for non-political economy’s sake return to the case of the applicability of experience to war at the macro-level. As I have shewn, experience at this level undoubtedly increased in value between World War I and World War II. But what about between World War II and the wars that have been fought in the intervening three-quarters of a century? 
Here again the present writer fancies that he possesses some extra modicum of insight in virtue of having coexisted with a goodly proportion of these wars, although here again (not that he thought to specify this last time) that extra modicum is present largely if not quite exclusively vis-à-vis the wars in which his native polity, the United States, was involved. To cite a counterexample: less than a week ago, I was listening to a five-or-ten-minute-long top-of-the-hour news broadcast that I had taped from the BBC World Service on November 2, 1986 and that I had revisited only three or four times since. During this audition of a few days ago I noticed that the presenter mentioned an outbreak of civil unrest in Karachi, Pakistan, specifically of violence against the city’s Pathan population, or perhaps violence exerted by the city’s Pathan population against its larger non-Pathan population. The mention implied that the outbreak was connected to some conflict of longer duration without elaborating on the nature or origin of that conflict, as was perhaps not surprising in the light of the brevity of the entire broadcast. Until that audition, for all my previous acquaintance with the broadcast, I had thought of the Pathans as exclusively indigenous to Afghanistan, and I might not ever even have thought of Pathans as any sort of people at all had not a certain fellow-Baltimorean—the manager of a local bar whose accent bespoke his non-North American provenance—been identified to me as a Pathan by another fellow-Baltimorean, an Indian Punjabi (who concurrently somewhat disturbingly described the Pathans as the real Aryans), in ca. 2003—i.e., when the so-called War on Terror had been up and running for ca. two years—and had I not then looked up Pathan at the online reference of first resort (or perhaps some more established online or offline alternative [for in ca. 
2003, W*******a was still so unrespectable that the present writer was both obliged and painlessly anonymously allowed to correct some quite glaring errors in an entry on one of his favorite composers]) and learned that Pathan was but an alternative designator of the ethnicity known as Pashtoon, the ethnicity as which all the usual official news sources, presumably including the BBC, had identified the portion of the Afghan population from whom both the majority of the Taliban and the country’s first post-Taliban U.S.-installed president, Hamid Karzai (himself the brother of one of Baltimore’s preeminent restaurateurs), hailed. The very belated discovery that some sort of Pashtoon-involving mini-civil war was taking place in Pakistan 36 years ago, and hence exactly contemporaneously with the Russian occupation of (or war in) Pakistan’s next-door neighbor, Afghanistan, an occupation (or war) in which certain Pashtoons, incipient or prospective members of “our” future enemies, the Taliban, were on “our” side, leads me to wonder if I have not all along been missing some vital or crucial piece in the puzzle of recent peri-subcontinental-cum-peri-Middle-Eastern military history and hence, conceivably, of recent military history insgesamt. All of which—the “which” being the present counterexample—is meant, essentially, to inject a certain admittedly minute quantum of humbleness into the present phase of my argument, a humbleness arising from my awareness that my generalizations about recent military history insgesamt vis-à-vis-macro-level experience are founded on knowledge of a presumably infinitesimal—and only infinitesimally increasing—proportion of the wars that have been raging and simmering over the past half-century. 
“That said,” given that the United States has been one of the world’s great military powers throughout that half-century, such generalizations cannot but enjoy a certain degree of irrefragable validity because in virtue of being a great military power the United States has always been in a position to impose the fruits of its experience—or of its lack or disregard thereof—on a substantial proportion of the rest of the world. So anyway(s), to specify the first of the major occasions of this imposition during that half-century: this is obviously the Vietnam War, which ended in 1975, when I was three years old, such that—what with my very first memories of so-called major news events hailing from 1976, and specifically from the U.S. presidential campaign (and specifically from the moment when I laid my wee tyke’s hand on the jacket-sleeved forearm of then-mere-candidate Carter during a local flesh-pressing session and was immediately pounced on by a member of his Secret Service mini-detail, who feared that I might have been being used by my parents as a bomb-conveyor)—I have nothing at all to say about in-real-time reportage on the war itself, but of course that war cast a very long and very fat shadow in the form of hundreds or perhaps even thousands of references to it in movies, television dramas, sitcoms, and pop songs throughout the remainder of the 1970s plus the first ca. three-fifths of the 1980s, so well into my adolescence. And for all the heterogeneity of the sources of these references, each of them tended to be distillable into a single assertion, viz., The Vietnam War was the first war the United States ever lost, an assertion that quickly ossified into an idée reçue in virtue of its unanimity and virtual uncontestedness. 
I beg at least the right to beg to differ with the assertion embedded in the preceding dependent clause as follows: mightn’t the notion that the Vietnam War was the first war the United States ever lost have become an idée reçue at least partly because the Vietnam War actually was the first war the United States ever lost? Well, I don’t know. It depends on what one means by lost. If the Vietnam War was indeed actually the first war the U.S. ever lost it would seem logically to follow that the U.S. won the war in which it was engaged immediately prior to the Vietnam War, namely (at least as far as I now know, altho’ I don’t doubt that as with the Pathan uprising of 1986, some petty skirmish of the mid-to-late 1950s is quasi-officially known as one of the U.S.’s wars) the Korean War of the early 1950s. But the continued partition of the Korean peninsula into two territories-cum-polities and presence of a substantial U.S. military force at the line of that partition in 2022 would seem to give the lie to the notion that the U.S. won that war. I believe most military historians would maintain that that partition-cum-presence shews that the Korean War ended in a draw or stalemate, an outcome that, inasmuch as it is categorically distinguishable from a loss, would seem not to give the weest ghost of a suspicion of a lie to the notion that the Vietnam War was the first war the U.S. lost. I don’t know one way or another whether most military historians would maintain that, and in any case, even if every man Jack of them ([sic] on the absence of the appending of “or woman Jill,” an absence enjoined by the statistical nullity of female military historians) maintained that, I would be inclined to ascribe their maintenance of it prevailingly to the universality and obduracy of the idée reçue.  
To be sure, it is certainly plausible to regard the outcome of the Korean War as a win if one ex-post-facto defines the U.S.’s (or, technically, the UN’s, but of course, like the {First} Gulf War, the Korean War was basically the US’s baby [even if one mustn’t write off the UK’s contribution to the UN force, a contribution perhaps most movingly memorialized in Basil Fawlty’s offhand reference to his commission of several kills during the war]) objective in the war as the confinement of Chinese Communist hegemony to the northern half of the Korean peninsula. But considered vis-à-vis the U.S.’s actual objective in the war—viz., the complete exclusion of Chinese Communist influence from the peninsula—the war must be regarded as at best a light loss. And by the same or a nearly similar token, vis-à-vis the U.S.’s actual objectives in the Vietnam War—viz., the confinement of Soviet Communist hegemony in the form of the North Vietnamese Communist government to the northern half of the formerly unified polity of Vietnam—a proper, true, and total loss would have had to take the form of the killing of the very last living GI in Vietnam at the tip of the Cà Mau Peninsula at the end of a complete, unavoidable retreat of the combined U.S. forces to that peninsula. But of course the Vietnam War actually concluded with a methodical, voluntary withdrawal of living U.S. forces over a period of two or three years, such that the U.S. did not so much lose the war itself as lose interest in continuing to fight it.  So the notion that the Vietnam War was the first war the U.S. ever lost is evidently doing an awful lot of work, as the literary critics say (or used to say); it is trying to make the U.S.’s fortunes at the end of that war seem much direr than they actually were, and indeed to lump them by default in with the lowest fortunes of, say, Germany at the end of either of the World Wars. 
The notion came bundled, as it were, with a sort of quasi-Sophoclean meditation to the effect of All great powers eventually get their comeuppance, their moment when they are forced to admit that they are great no more and indeed so ungreat as to be obliged to bum a smoke off dirt’s kid sister, and in Vietnam it was at last the United States’s turn to get that comeuppance after nearly two centuries of strutting and fretting furiously about the world stage like a rabid tumescent gamecock with its head (!) cut off. And the present writer, for one of milliards, accepted this meditation-bundling employment of loss as gospel truth; its efficacy on him was almost exactly consubstantial with the contemporaneous efficacy of the word poverty—which, as I have recounted in another essay, I had grown up regarding as denoting a state of famine-induced malnourishment even when applied to living conditions that patently did not include such malnourishment, notably those of the so-called American inner city. Throughout my childhood and early adolescence (and possibly even the majority of my late adolescence as well) I really did believe that the U.S. was as assuredly a terminally defeated power as Germany, a power whose sidereal hour on the map of the world stage had long since passed. Of course, a moment’s reflection on the big international news events of those years—the news events that were not the matter of a single top-of-the-hour newsbreak but of several-to-many entire evening news hours—e.g., the U.S.’s boycotting of the 1980 Olympics, the bombing of Libya, and the Reagan-Gorbachev summit in Reykjavik—would have shown up the utterly preposterous untenability of this belief; or, rather, the thousands if not millions of moments of reflection that I actually did devote to them should have shown it up. 
But there was a fairly if not quite very good reason that those thousands if not millions of moments did not show up that untenability, namely, because the reportage on these events was being supplied to me by a constellation of individuals and corporate bodies that overlapped extensively with the constellation of individuals and corporate bodies that were propounding the idée reçue that the Vietnam War was the first war that the United States had ever lost. And lo and very belatedly behold, only about a year ago as of this writing (March 12, 2022), it turned out that this idée reçue had already been in the process of being cooked up by the news-event-propounding portion of that constellation years and years before the war itself ended. Here I refer to the moment at which I learned that a personage no less revered or supposedly trustworthy than the avuncular Walter Cronkite, the CBS news anchor who had closed out each of his broadcasts by declaring with officially sub-humble understatement (i.e., actual overweening hyperbole) “That’s the way it is,” had misrepresented the so-called Tet Offensive of 1968, in reality a victory for the U.S. military, as a resounding defeat and thereby laid the groundwork for the official historical interpretation of that offensive as the turning point in the war, the moment from which America’s supposed loss of that war had supposedly been inevitable. In short, the Vietnam War was not so much lost as sabotaged by a conspiracy of treasonous crypto-commies. 
No—meaning not that the Vietnam War wasn’t at least “arguably” sabotaged by a bunch of treasonous crypto-commies but that that is not the principal inference to be drawn from the immediately just-remarked-upon characteristics of that war; that principal inference is, rather, that with and in the Vietnam War the experience of war at the macro level went from being a commodity that could rise or fall in value in the established manner, in response to so-called market forces, to a commodity whose value could be preemptively regulated, and regulated by the art or craft of (Whodathunkit?-cum-You guessed it) storytelling. If one doubts that this is that principal inference or rather and especially that this is any inference at all that ought to be drawn from the characteristics in point, one need only consider the Vietnam War as the right side of a diptych, a diptych whose left side is the Korean War, along the same meta-macro-experiential lines as I (or, if you, DGR, prefer, we) considered the diptych comprised by the First and Second World Wars. Under the auspices of that consideration one cannot simply say, on the basis of the evidence that in the Vietnam War the U.S. fell further short of its objectives than it had in the Korean War, that the experience garnered in the Korean War simply proved to have fallen in value by the time the Vietnam War rolled around. It is true that both the preconditions and conditions of the Vietnam War differed from those of the Korean War in certain substantial ways that a few downright cartoonish strokes suffice to adumbrate—in Vietnam, the U.S. 
had taken over prospectively permanently for another imperial power rather than parachuted in temporarily to forestall the intrusion of such a power; was fighting prevailingly in rain forests rather than in and on mixed terrain; was confronted by small bands of guerillas rather than large organized armies—but basically the two wars were identical in official terms and aims and the attainability of those aims: they were both wars of containment that neither admitted of total victory nor entailed total defeat; basically the U.S. could have kept the North Vietnamese at bay indefinitely just as it has kept the Korean Communists at bay indefinitely if it had really wanted to. Very well, then: the clinching difference between the two wars is simply that the U.S. didn’t want to keep the to-be-contained enemy at bay in the later one, and corollarily that the experience acquired in Korea fell in value by Vietnam simply because nobody had any interest in putting it to use; that it was a falling in value comparable to the fall in value suffered by, say, rollerblades in the 20-oughties rather than that suffered by dial-up interweb access in the same decade. To the semi-contrary, that would have been the clinching difference had the abandonment of the war been actuated by a conclusion on the part of the generals and their masters in the executive branch that Vietnam was not an important enough country to be kept non-Communist or more broadly that containment of Communist expansion was no longer a goal worth attaining.  But the abandonment was not actuated by such a conclusion; it was actuated, rather, by a sense that the war was morally wrong simply in virtue of resulting in the deaths of people on both sides. But again, even putting it this way does not quite seem to get at why the war wound down as it did. 
Why of course it doesn’t, because the sense that the war was morally wrong came not from the generals and their masters in the executive branch but rather from the American public; and such being the case, its wind-down does not seem to be substantially any different from that of presumably thousands of unpopular-on-the-home-front wars throughout the millennia. I agree that the just-twice-mentioned sense emanated in part from the American public, but I do not concede that that means the wind-down was unremarkable. For one remarkable thing, the opposition did not come entirely from the public, unless one wishes to include in the public certain multi-milliard-strong corporate entities, for as mentioned before, the intrinsically anti-war-like notion that the Tet Offensive was a U.S. defeat rather than a U.S. victory came from the newsroom of the Columbia Broadcasting System. And for another remarkable thing, the opponents of the war could seldom stop at representing it as an isolated event that had to cease and after whose cessation Americans could resume the collective pursuit of worthier goals (e.g., the Apollo moon landings); they could seldom resist representing it as somehow indicative of the entire American polity’s deep-seated and pervasive moral turpitude, as something that most certainly would not go away once the war had ended, something that Americans would have to continue wrestling or grappling with for many decades if not eons to come.  
And for yet another thing, the opponents could seldom content themselves with evincing their opposition by marching down the street and waving angry handwritten signs like ordinary politically disgruntled people; no, they had to write so-called protest songs and perform them at giant festivals where they took off their clothes and coited and consumed vast quantities of mind-altering drugs; such that they obviously were enjoying their opposition to the war a bit too much and probably would have been quite happy for it to continue so that they could continue inordinately to savor their opposition to it. Finally but not finically, these opponents fetishized the distinctiveness of the conditions of the war in a peculiar way, albeit not by specific contrast with the Korean War; they believed or would have others believe that neither the U.S. nor any other wielder of conventional military force had ever contended with warfare in heavy untended foliage or involving guerilla forces or against people who were fighting to protect their own Heimat (to cast this fetishization in my adapted Benjaminian terms, they believed or would have others believe that in Vietnam the value of macro-experience had fallen as close to zero as it had ever fallen, hence lower even than during the First World War). 
In reality, of course, many such wielders had coped with such warfare, and indeed, for striking examples of such copage they need not have looked beyond the U.S.’s shores albeit rather far back into American history—viz., to the so-called French and Indian War, the American theatre of the Seven Years War of 1756-63, during which a certain dashing young British army officer by the name of George Washington successfully participated in the decisive trouncing of a guerilla-dominated enemy force of (surprise) Frenchmen and Indians in the pine-needle-infested and occluded wilderness beyond the then-western frontier of the British colonies; and the so-called Revolutionary War or War for American Independence, wherein the colonists-cum-future United Statesians themselves had engaged in a fair amount of sylvan guerilla activity. Ah but in adducing that second example you have just been hoist by your own petard (or perhaps, even more germanely, because we are after all dealing here with political as well as military history, in this example “the latter half of your commonwealth forgets its beginning”), for the British did after all lose the War for American Independence (whence its present sportage of that name instead of, say, French and Indian War II [i.e., inasmuch as the colonists were assisted by both the French and the Indians therein]).  
Ah, but in adducing my adducing of this example you are merely shewing how abjectly benightedly beholden you are to the idée reçue of Vietnam as the first-ever U.S.-lost war-cum-mighty castrator-cum-decapitator of the U.S.’s geopolitical ambitions, for whilst it is true that the British eventually stopped resisting the American colonies’ desire for independence, they were hardly compelled to do so by military inadequacy, and they were thereafter so far from limping back to Blighty to breathe their terminal geopolitically oriented breath without so much as a self-preservative wound-lick, that barely twenty years later they fended off a threatened invasion of their shores by Napoleon, barely thirty years later they had kicked his diminutive ass all the way to Elba, by a hundred years later they had established that unprecedentedly huge empire on which the sun famously-cum-notoriously never set, which they still enjoyed full possession of a full hundred-and-fifty years later. Very fudging well: I take your fudging point. But what is the fudging point of this selfsame point? The fudging point of it is that by the micro-epoch of the Vietnam War, we reached a point by which it simply no longer made sense to speak (or, rather, write!) of war as a delimited process engaged in for certain ends by certain people, namely statesmen and soldiers (and sailors), as a process describable as “the continuation of politics by other means” in Clausewitz’s (or are they Shih-Tsu’s?) words, when war became perhaps not solely but “arguably” mainly a process engaged in for the sake of gratifying a nebulous but massive (and therefore ineluctable) congeries of sorts of people’s cravings for certain sorts of stories, stories that in contrast to their Benjaminian antecedents never offer counsel but always offer affirmation of a certain Weltansicht. 
Ah, I’ve got you sussed, Dougie Boy; I twig what you’re doing—viz., disingenuously serving up a rewarmed helping of Jack Baudrillard’s “The Gulf War Ain’t Gonna Happen/Ain’t Happenin’/Didn’t Happen” in microwaveable Frankfurter packaging. I am in fact doing nothing of the sort, which isn’t to say that my argument can be altogether disentangled from that propounded in Baudrillard’s parvum opus; indeed, it would be quite accurate to say that that argument occupies a moment, in the quasi-Hegelian sense, of my argument. The Gulf War of 1991, the future Gulf War I and even-further-future First Gulf War, was indeed a war of a sort that never really happened, or what comes to very nearly the same thing, a war that might as well never have happened as far as a genuinely overwhelming majority of the people alive at the time should have been concerned—or, indeed, even the totality of those people, very much including the U.S. airpeople guiding those so-called smart bombs onto their targets on the Iraqi surface and the Iraqi soldiers and civilians killed or wounded by those selfsame so-called smart bombs; not because the bombs did not exist or the people impinged on by their explosion were not really wounded or killed thereby but because the war’s outcome was already foreseen and determined long before its official outbreak in January of 1991, viz., at some perhaps unspecifiable point in September or October of 1990, the point by which the United States had ascertained that their attack on Iraq would be met with opposition from no party apart from Saddam Hussein’s military, that every one of the earth’s nations apart from the U.S. and Iraq themselves had either pledged to contribute to the attack (whether by supplying troops à la the UK and France or billeting and launching-space à la Saudi Arabia) or refrained from protesting it. 
“Ironically,” given that at least back in the 90s JB was the arch-continental philosopher’s arch-continental philosopher, the statement “The Gulf War didn’t happen” is untrue precisely and exclusively in the sense prioritized by analytic philosophers, the sense in which “The present King of France is bald” has been false since Bertrand Russell propounded it (all snarky or waggish Louisquatorzification of Messieurs de Gaulle and Macron aside): it was not really a war because it posed absolutely no danger to anyone involved on the attacking side and effectively resulted in no casualties to that side (apart from a semi-literal handful of those resulting from so-called friendly fire). All right, Monsieur Malin-Cul, if it wasn’t a war, what exactly was it? That is a fairly (albeit not very) good question, Monsieur Con-Cul; I suppose the usual word for a one-sided use of lethally violent force is massacre, but with all due retrospective outrage at the presumptively gratuitous carnage inflicted on the Iraqi side, one tends to think of a massacre as more personal and thorough and at least tactically orientated towards the carnage as an end in itself, and it is very probably (albeit not indisputably) the truth that whatever the Gulf War actually ended up being probably wouldn’t have happened had Mr. Hussein (now that all the reasonable and unreasonable motives for getting one’s knickers into a twist over his partial namesakehood with a certain more our-son-of-a-bitchy monarch [not to mention with the 44th U.S. president] have at last evaporated, I am going to take the liberty of referring to the Iraqi dictator in the proper New York Times style-guide-heeding manner in which he should have been referred to all along) withdrawn from Kuwait, that the U.S. 
would not have pointed and detonated any of those so-called smart bombs on Hussein’s cities and troops for the undisguised pure heck of it; perhaps, taking a leaf from Benjamin’s buddy Teddie Adorno, specifically his description of certain military engagements in World War II, we might term it a demolition operation. At any rate, vis-à-vis the pertinence of my argument to Baudrillard’s and vice-versa, it must be observed that whilst the Gulf War was very much a classic manifestation of war qua broad-based gratification of a craving for stories, it was also quite atypical of post-Vietnam wars in the purity with which it embodied (or perhaps rather ensouled) the quality about it that Baudrillard regarded as not only singular but prospectively seminal, viz. that of hyperreality. By this I mean that the representations—or, in proper Baudrillardian parlance, simulacra—of the Gulf War effectively exhausted the content of that war (or non-war) as the simulacra of subsequent wars tended not to exhaust those wars even ineffectively. Even in the midst of its very brief duration the Gulf War was famously or notoriously—i.e., not by Baudrillard alone or indeed by the Occidental intelligentsia alone but by the entire Occidental commentariat down to the lowliest scandal-sheet—known as a war fought entirely on CNN, and whilst it was certainly unfair of the knowers to neglect the portions of the war fought on the big three U.S. commercial broadcasting networks, the BBC, etc., they were essentially right in maintaining that as far as each and everybody in the world apart from the residents of Kuwait and Iraq was concerned, the war was solely a television broadcast. And of course in everybody I very much include the abovementioned droppers of so-called smart bombs. 
Get this, Wolf, one CNN commentator would breathlessly adjure another, millions of people at home just watched that building blow up on our television screens, and we and they also just watched the guy who blew it up watching it blow up on his television screen; why, the sheer hall-of-mirrorsesque hyperreality of the whole dashed-blamed thing is enough to make your head spin (whence the word “vertiginous”)! Of course the hyperreality of the war itself did not prevent it from demonstrably entailing certain significant very (or should that rather be merely?) real consequences, notably a certain little attention-catcher now (but not immediately) known as 9/11.  You see, the sight of all those U.S. soldiers exuberantly popping wheelies in tanks on the dunes of Saudi Arabia like the Apollo astronauts on the lunar surface wasn’t exactly a sweet ocular savor to certain anti-American elements in that country and indeed caused certain sub-elements of those elements to plot some means of making the Americans feel the sense of desecration they felt at that sight.  Whence the total destruction of the Twin Towers, the partial destruction of the Pentagon, and the abortive attempt at the destruction of the White House, an event or combination of events that, although it received a not-un-Baudrillardian piece of commentary in the form of Karlheinz Stockhausen’s description of it as the greatest work of art of all time (more on aesthetics in relation to war and warlike events anon), was decidedly very real to the thousands of people who suffered death or injury in the course of it, not to mention the several millions who lived or worked close enough to the World Trade Center or the Pentagon to watch the attacks on them directly, through un-electronically aided eyes. 
But as of the Gulf War’s official conclusion on February 28, 1991, 9/11 was perhaps not even a twinkle in the eye of Bin Laden & co., and it was not even widely known to have been such a twinkle until many months after 9/11—i.e., whenever Bin Laden & co.’s motives first began to be widely and extensively explored in the Occidental broadcast media. As far as every non-Iraqi and non-Kuwaiti alive on February 28, 1991 was concerned the Gulf War had been a completely hyperreal and completely successful success. Two further peculiarities about the Gulf War ought to be noted (and so are about to be noted). First(ly), its very hyperreality, its very nullity as a war, testified to significant increases in the value of macro-experience in the conduct of military affairs since the Vietnam War. Had military technology and the understanding of its application not improved in certain demonstrably practically applicable ways since 1975, the Gulf War never could have been as thoroughly hyperreal as it was; had the U.S. and its coalition partners not had those celebrated laser-guided so-called smart bombs at their disposal, they would doubtless have had to resort to much messier and perforce deadlier carpet-bombing, and they would most likely also have had to resort to quite a lengthy and bloody ground campaign (even if, in the light of Arabia’s much thinner vegetation, they probably could have forborne from the use of napalm and Agent Orange). Second(ly), on the home front—i.e., in such a global meta-hyperreal context, in virtually (!) 
every non-military setting in virtually every extra-Middle Eastern country (I would have written “in every country but Iraq and Kuwait” had I not suddenly recalled Iraq’s ham-fisted lobbage of a few so-called scud rockets into Israel in the opening days of the non-conflict)—while there was considerable excitement about the war from the very beginning of the run-up to it in August 1990 onwards, one would have been hard-pressed to describe the favorable aspect of this excitement as anything like jingoism or war-fever or the unfavorable aspects as animated by the spirit of protest or outrage; and there was certainly nothing substantial or spectacular in the way of clashing between the two aspects. The positive half was typified by the near-universal retro-Tony Orlandan tying of yellow ribbons in “support of our troops” around car radio-antennas and reached its climax with the singing of the vapid pop hymn “From a Distance” by the 70,000-or-so souls assembled at the Tampa Stadium during the Super Bowl (I call the hymn vapid because its semantic essence was effectively exhausted by its refrain, what with the purport of the lyrics being “While there’s some war taking place halfway across the world, we are sitting here, and ‘from a distance’ God sees both us sitting here and the war taking place over there; but not even He knows why either it or us are worth watching.”). As to the negative aspect, well, for the present writer if no one else (but he expects for at least several others) it was typified by a brief vox pops-type story in the arts section of his home region’s newspaper, in which under headshots of themselves a handful of local college students, the majority (i.e., three) of them hailing from the liberal arts college I happened to attend—a notoriously hippie-dominated school—were quoted as saying something to the shared effect of, “Would it be the worst idea in the world to stop for a second and reflect if this war really is the best thing since sliced bread? 
I mean, I’m not saying it isn’t, but I mean, just to cover all our bases, mightn’t we just stop and reflect on that for no more than an actual only-barely-figurative second? I’m really just floating that suggestion, running it up the flagpole (and doing so without the tiniest scintilla of prejudice to Old Glory, mind you), etc.”  And then at the hippie-dominated school itself we had something that was probably called a teach-in at the time but that actually bore little or no resemblance to the teach-ins of Vietnam-microepochal yore. It consisted of an official administration-organized one-or-two-hour gathering at the only large lecture hall on campus (the room doubled as the site of the film club’s screenings), a gathering during which over liberal lashings of free pizza, a member of the faculty regaled us with anecdotes about his life in Iraq back in the late 1950s, when he had been in his late single digits and Saddam Hussein in his late teens, and our school chaplain reminded us that he had been the school chaplain at Kent State University in 1970, and even during that year’s fatality-entailing clashes between student demonstrators and National Guardsmen. An insouciant remark by a fellow-student who had not been interviewed by the newspaper summed up the aura or spirit of the entire thing: “I don’t give a shit about the sandn****rs; I’m just here for the pizza.” In hindsight, the entire Gulf War reminds me of, of all things, the first movement of Schubert’s so-called Great C major symphony, his Symphony No. 
9, or, to be more precise, of the scenario that has always been painted by that movement in my mind, the scenario of a bunch of armies trepidatiously converging from all points of the compass on a battlefield and finally arriving there only in the final resoundingly triumphal measures, after which there is obviously no alternative but to move on to something completely different (the clodhopping pastoral dance-cum-march that is the Andante con moto second movement in the case of the symphony, and the mini-recession of 1991-1992 in the case of the American Zeitgeist-cum-Volksgeist).  It’s certainly endearing and even rousing but at the same time a bit comical, the spectacle of a war brimming over with pomp and pageantry and utterly gutted of all strife and carnage. The next major Occidental military operation, the post-9/11 bombardment-cum-invasion of Afghanistan, was in certain respects an exactly identical war-animal, and in other respects an altogether different one. It was exactly identical in enjoying the support of almost every polity in the world apart from Iraq (true though it seems to be that the Iraqis were not involved at all, even tangentially, in the attacks of September 11, 2001, it is documented that the Iraqi government’s official statement on that day’s catastrophes included the unambiguously acrimonious sentence “We’re glad the towers fell.”) and in being presumed to be entirely one-sided even before its outbreak. Nobody expected the Taliban to offer any serious resistance to the Occidental forces or to occasion any significant loss of life among them.  At the same time it was not a genuinely (!) 
hyperreal event, inasmuch as although it enjoyed the rapt spectation of milliards of television viewers, who from the very beginning of the non-conflict onwards were treated to pyrotechnics induced by far smarter smart bombs than the ones of ’91, it was not taken in in quite a spirit-of-’91-style attitude of disinterested exhilaration by those viewers—this, partly if not necessarily, because, as mentioned above, 9/11 itself had not been an entirely hyperreal event owing to its immediate material impingement on many millions of people; partly (and corollarily) because those viewers were animated by a genuinely sanguinary desire for revenge that had been entirely absent in ’91; and finally because, because (sic on the second because) the attacks of 9/11 had come so literally out of the blue (literally because the sky had been virtually cloudless on the day), people were not immediately disposed to assume that what they were witnessing on their screens would not have not-merely-hyperreal repercussions in their own lifeworlds. Throughout the remainder of September and perhaps well into October one repeatedly heard both media pundits and people in one’s immediate Umwelt nervously speak of waiting for the other shoe to fall, that other shoe of course being a terrorist attack either comparable to or greater than those of 9/11 in scale and lethality; and of course for a few suspenseful days in that interval we worried that we had encountered the thin edge of the wedge of that attack when several people in the United States died of anthrax that had been deliberately sent in powdered form through the U.S. mail (deliberately sent, as it turned out, not by confederates of the 9/11 terrorists but by a disgruntled scientist at a bioweapons laboratory in Frederick, Maryland—but that was discovered only many months later).
In the light of this one could not but wonder if the destruction that one was witnessing onscreen might either precipitate the falling of that second shoe or forestall it.  And then once the war moved to the ground and various factions of Afghans began aiding the U.S. in its campaign against the Taliban, one could not help wondering whether, provided one of these factions proved victorious, it would concurrently and subsequently prove to be a true U.S. ally, or whether it might prove no less supportive of anti-U.S. terrorism than the Taliban had been; indeed, out of sheer naked self-interest one found oneself researching the various tribal, linguistic, political, and ethnic histories of the various factions; one learned, for instance, of the role that the Taliban had played in the insurgency against our old arch-enemy the U.S.S.R.; one learned, further, of the abovementioned Pashtoon ethnic identity of the Taliban, and of the extent to which this had fatally militated against their forming a lasting political union with their prevailingly Turkic fellow anti-Soviet rebels despite their shared religious identity as Muslims; in short one found oneself learning what a horribly complicated and unstable place Afghanistan was and how its complicatedness and instability had impinged on the well-being of other parts of the world over the decades, centuries, and indeed millennia. On the evidence of this quasi-involuntary acquisition of insight one might have been tempted to think that old-fashioned micro-experience vis-à-vis the phenomenon of war was making something of a comeback, that it might indeed have at long last been on its way to recovering pre-WWI levels of value, that it might soon once again begin to be possible for an ordinary person—even an ordinary civilian—authentically to regard war as something in which he was somehow participating, however slightly or humbly, rather than as something that was being imposed on him like the proverbial sixteen-ton weight.  
One might have been tempted to think this, but in point of fact one was not tempted thereto, and in any case, if one had been tempted thereto and had actually ended up thinking this, one would have been wrong, because while the war in Afghanistan did not actually end, at least not in the definitive clear-cut fashion in which the Gulf War had ended—this because while the Taliban were fairly quickly dislodged from the Afghan capital, they were by no means eradicated, and it was generally assumed that in the absence of a continued U.S. military presence they would swiftly return to power; and because the (at least then-) presumptive mastermind of the 9/11 attacks, Osama bin Laden, whose assassination or capture had been at least initially posited as the principal goal of the war, remained unfound after the Taliban’s deposal—it did disappear from the news headlines at some difficult-to-pinpoint moment in 2002, and within a half-year of that disappearance, we all found ourselves in the midst of a full-fledged run-up to another war in the Middle East, and that war was immediately signalized by the absolute nullification of the value of experience at all levels and resolutions, on every front and all sides. First(ly), and perhaps most notoriously (now if not then), despite their presentation of the conflict as but the second battle in a War on Terror of which the Afghan conflict had been the first, the architects (if it be not an insult to architects to compare the shamelessly slapdash daubery in point to architecture [and if it indeed be an insult thereto, don’t they basically deserve it in the light of the shamelessly slapdash average character of their own recent work?]) of this new war, the Bush II administration, seemed impervious to ascribing any casus belli-castrating relevance whatsoever to the abovementioned patent lack of any evidence that Saddam Hussein had been involved in any capacity or to any extent in the planning or funding of the 9/11 attacks.
Second(ly), and perhaps nearly as notoriously (now if not then), those architects not only seemed but proved impervious to ascribing any casus belli-castrating relevance whatsoever to the patent lack of any evidence that Saddam Hussein possessed a single chemical weapon more powerful than a household bleach-and-ammonia pipe bomb.  Third(ly), and not nearly as notoriously (then or now) but even more damningly, the U.S. populace proved enthusiastically willing to accept the architects’ casus belli as a (trigger warning: those intolerant of the mixture of metaphors should now avert their eyes) genuine bull with bowling ball-sized cojones rather than as the blister-scrotumed steer that it actually was. But surely not literally every (ahem) member of the U.S. populace accepted the chicanerous misrepresentation of the livestock. Surely a far from insubstantial proportion of that populace insisted on inspecting the animal from snout to tail and consequently could not help noticing that the aforementioned blister-scrotum contained no cojones and consequently withholding their stamp of approval from the animal’s flank. Or to put it in wholly literal terms, surely a goodly proportion of Americans made it clear that in the absence of dispositive evidence of Hussein’s involvement in the 9/11 attacks or possession of weapons of mass destruction they were not about to support an attack on Iraq—not, of course, that they could do anything to stop such an attack in the light of Mr. Bush’s wielding of supreme executive power, but that they made it patently clear via the exercise of their right to protest that in the event that the attack took place, they would be showing Mr. Bush and his flunkies in the Congress at the ballot-box that they had not been taken in by the sub-dispositive case for the war come November of ’04.  I’m afraid this surely-prefixed scenario of yours is merely yet another instance of what one might have thought would have happened.
Not, of course, that a substantial proportion of Americans did not oppose the war or that that proportion failed to express their opposition via their right to protest both in the run-up to it and at the ballot-box in November of 2004, but that their protests appear not to have been prevailingly founded on the grounds of scepticism and certainly did not express themselves in prevailingly sceptical terms. They did not, as I recall, march down the street carrying signs that read, “HEY, DUBYAH: WHERE’S THE BEEF?* *APOLOGIES FOR THE DATEDNESS OF THE REFERENCE, BUT SO FAR THE NEW MILLENNIUM HASN’T EXACTLY BRIMMED OVER WITH CATCHY CATCHPHRASES, HAS IT?” or “HEY, DUBYAH: I MAY NOT BE FROM MISSOURI, BUT YOU’VE STILL GOT TO SHOW ME” or “HEY, DUBYAH: UNLIKE THAT OTHER GUY, I AM FROM MISSOURI, SO YOU’VE DEFINITELY GOT TO SHOW ME.” Very well, then: what did the signs that they actually carried while marching actually read? I don’t know because I didn’t attend any of the protests against the inception of the war (or its initiation or whatever other –tion-terminating word has become the standard since wars seemingly permanently stopped being started by declarations) or watch any of them on television, as I was then living in a so-called group house in which the only properly functioning television set was always being used by one or more of my roommates, none of whom I cared ever to socialize with.
But as I did then maintain ties to a fair number of academicians in the Humanities and so-called underground rock musicians, to a fair number of people, in other words, who participated in subcultures that were more or less united in their opposition to the war, I do (now) fancy I got a fairly good sense of the tone and tenor of the protestors’ objections—a sense, namely, that they objected to the U.S.’s going to war with Iraq because they presumed that it would occasion the deaths of many Iraqis; they objected, in other words, to a prospective consequence of the war that was exactly identical to the only unambiguously negative actual consequence of the previous war against Iraq notwithstanding the fact that that consequence had not elicited so much as a whimper of protest from any measurable proportion of the American populace (as my above account shews, the very few Americans who did demur at the 1991 war were actuated by the very sort of question that was not in evidence chez the protestors of ’03, viz., What is the prospective indispensable use of this prospective conflict?). Mightn’t one almost perforce infer from this that they had learned from the 1991 war that, contrary to the expectations in the run-up to that war, that war had in fact inflicted substantial casualties on the Iraqi populace? By no means, because the casualties inflicted on the Iraqis during the 1991 war had elicited zero outrage from any substantial proportion of the American populace of that year; and indeed, the no less substantial casualties inflicted on the Afghan populace during the much more recent (and indeed still-ongoing) post-9/11 war in Afghanistan had not elicited any outrage therefrom (and indeed were still not yet eliciting any therefrom). Indeed, to alight on a parallel degree of popular morally based Occidental outrage to a war, one had to go nearly half a century back in time—all the way back to the microepoch of the Vietnam War.
And indeed at the time, in ’03, whenever I was in the presence of my anti-war acquaintances, I got the impression that they were enjoying their outrage just a bit too much, that they were indeed above all delighted to be finally getting a Vietnam of their own to protest against.  And in this Spirit-of-’03 delight we are in turn finally arrived at the real limit-case, the nadir, the anti-apotheosis, the event horizon of the revival of storytelling at the expense of experience, the moment at which experience—even if and perhaps especially when it is of objectively highest value—must be not only marginalized but banished and to the greatest possible extent obliterated for the sake of storytelling as an end in itself. But is this necessarily a bad thing? Indeed, should it not, from a Benjaminian point of view, be regarded as a salutary renaissance of a noble lost art? Indeed it is necessarily a bad thing; and indeed, from a Benjaminian point of view it cannot be represented as a renaissance but rather as the death of the noble lost art of storytelling, inasmuch as—let it be remembered—according to Benjamin’s lights, a great story is inalienable from the counsel it contains, and the kind of storytelling that began to take hold during the Vietnam War and that reached its anti-apotheosis in the run-up to the Iraq War has no counsel to offer and indeed prides itself on its lack of counsel. 
If one walks up to the purveyor of such a story and asks him or her, “Is it wise or prudent or—dare I even say it—cost-effective to engage in this war?” he or she is duty-bound to thwack one with his or her “STOP (OR BETTER YET DON’T START) THE BLOODSHED!” sign and scream, “How can you talk of wisdom or prudence or—horribile dictu—cost-effectiveness when there are innocent people dying over there?!” The thoroughly morally grounded story is ineluctably committed to such a response to any request for counsel because it is not a jot interested in the well-being, health, safety, or prosperity of the agent to whom it appeals; because it can only appeal to that agent as the ower of a moral debt to someone else. Complementarily—if perhaps not-unparadoxically—the thoroughly morally grounded story is incapable of absorbing counsel either from older stories or from any phenomena it encounters in the world. Once one has interpellated this or that latter-day war as “our Vietnam” there is nothing for one to do but to see to it that the analogues to the Vietnamese people in that war are done right by until one has exhausted all one’s resources for doing right by them, until, that is, one has more than figuratively spent one’s and one’s fellow-countrymen’s very last cents and spilled one’s and one’s fellow-countrymen’s very last drops of blood. But this narrative ineluctability by no means further entails the ineluctability of the enactment of the narrative down to the actual spending of those last cents and blood-drops, for the human organism and the financial substrates thereof are after all wholly subordinate to the dictates of the human will, which is in turn all too prone to buckle long before it is subjected to Dr. Johnson’s half-a-guinea or ounce-of-meat test; long before, in other words, the supporter of the moral cause is forced to do without a sufficiently large amount of food or money to occasion even minor disruptions of his daily routine.
But surely this case-in-point illustration of Matthew 26:41 is entirely salutary inasmuch as it provides useful counsel for the next conflict, vis-à-vis which the would-be abettors of the supposed good guys will be less eager to beat the drum and pass the cup around, knowing as they will by then that the stamina of their own charitable impulses is embarrassingly limited.  Yet again you are deriving inferences founded in a no-longer-existent functional state of affairs, a state of affairs in which stories were still capable of fulfilling their classic function. In the recent-to-present dysfunctional meta-narrative (NB: this adjective is not to be confused with the noun-cum-academic argoteme metanarrative) state of affairs, one thoroughly morally grounded story simply falls away into oblivion the moment it begins to pinch the will, and it is immediately succeeded by another morally grounded story, generally one hailing from a completely different geographical and social domain than that of the previous one; thus in the case of the Iraq War, once it became clear that the conflict or occupation or whatever it was most appropriately called by the mid-oughties was not going to go anywhere, so to speak, except at the cost of a considerable number of American or Iraqi lives—once it became clear, in other words, that a definitive U.S. victory could be attained only via the replacement of the system of serial tours by a military draft that would have resulted in most Americans’ immediate or only slightly mediate personal involvement in the conflict (i.e., as a soldier or as the spouse, parent, aut al.
of one) and that the U.S.’s complete withdrawal of its forces would plunge the country into internecine civil war [not a pleonasm, inasmuch as internecine means not internal but mutually destructive {so the 1990 COD}]—the collective imagination-monopolizing thoroughly morally grounded story became that of the Occupy Wall Street movement, wherein the bankers were styled the bad guys simply in virtue of having more money than most non-bankers. And of course (albeit again quasi-paradoxically) the quasi-infinite portability of the thoroughly morally grounded story only militates against the coalescence of anything in the way of the most important units of morality, namely, principles, inasmuch as a principle by its very nature requires a reference to an objectively unportable entity or phenomenon; thus, for instance, the principle of pacifism requires engagement with the objectively unportable phenomenon of war, with a phenomenon that has occurred many times in the past and that bids fair to occur many times in the future, and consequently requires one to deal with more than one war and to bear the consequences of taking a stand against one war by taking an identical stand against any other war of prominence that subsequently breaks out. The opposition to the Vietnam War itself was at least nominally a pacifist movement, a war against a war qua war (even if that war qua target of protest was largely a stalking horse for normal wholesome living within well-defined libidinal bounds), but the opposition to the Iraq War, being a war against a war qua “our Vietnam” qua object of protest, eo ipso had no impingement whatsoever on Occidental attitudes to subsequent wars.
Thus, contra if pace Peter Hitchens (with whom I agree about virtually everything), it is not at all surprising that the so-called Left, formerly the anti-war faction in Occidental politics, has been zealous to the point of madness in favor of American and N.A.T.O. intervention on the Ukrainians’ behalf in the now (i.e., April 4, 2023)-current conflict in Ukraine, because they are not opposed to the war qua war but qua manifestation of aggression by Vladimir Putin qua “our Hitler.” The war in Ukraine is indeed but the latest installment in “The Story of Our Hitler” that began way back in 2015 when Donald Trump came to the fore as a U.S. presidential candidate and became the original “our Hitler” for reasons that, while not exactly obscure, are somewhat mystifying—to be sure, he was running against the anointed successor of Barack Obama, the first black president; but then so would have been Jeb Bush, Ted Cruz, autc. To be sure, he made tighter security along the U.S.-Mexican border his central campaign platform-plank, but of course so had many preceding presidential candidates from both political parties, and it is of course a far cry from wanting to keep people out of one’s polity to seeking Lebensraum for it. In a way, though, the mystification was all par for the course of the thoroughly morally grounded story; it is very likely that if Donald Trump had not come along, some other figure would have been anointed “our Hitler” in his place; that “our Hitler” was a non-idea whose time had simply come by default in the middle of the second decade of the 21st century because all the less spectacular thoroughly morally grounded stories had already been tried out by then.
And the fact that the entire story was really about Donald Trump specifically qua “our Hitler,” i.e., a specific individual who was thought to embody all the targeted evil and whose disappearance was destined to spirit away all the evil from the institutions associated with him, became starkly evident in the behavior of the quasi-organization styling itself Antifa—a quasi-organization whose name is short for Anti-Fascist and that pretends to be organically inseparable from the communist paramilitary bands that clashed with Nazis in 1930s Germany—between Donald Trump’s election in 2016 and the immediate aftermath of the election of his successor, Joe Biden, in 2020. From the very day after the former election onwards the quasi-organization directed a succession of unspeakably brutish assaults on governmental buildings—courthouses, police stations, and the like—on the ostensible grounds that all of them, regardless of their level (federal, state, or local) or branch (executive, legislative, or judicial), were radically and irredeemably fascist in purpose and function; after Biden’s election, these assaults all but entirely ceased. Why? After all, the buildings and the institutions in which they participated would be operating as vigorously under Biden’s presidency as they had under Trump’s. Why, simply because Biden, unlike Trump, was no Hitler, indeed could never be a Hitler because he was an avowed enemy of Donald Trump. And those who were cheering on the Antifa attacks on local police stations over which Donald Trump had no jurisdiction most enthusiastically two years ago are cheering on the legal proscription of everyone and everything Russian under the sun (and not only today’s sun, but the sun of a century-and-a-half ago, the sun under which Tchaikovsky and Dostoyevsky lived) simply because Vladimir Putin has become “Our New New Hitler” because Donald Trump once said he “could do business with him.” But by the very logic of the narrative, Mr.
Putin’s days as “Our New New Hitler” have got to be numbered: he will have to be deposed, assassinated, or killed in a military assault (or forced to kill himself to avoid being killed in such an assault once it reaches his quasi-literal doorstep).  And once that happens, won’t the “Our New Hitler” narrative finally have exhausted itself? The “Our New Hitler” narrative will finally have exhausted itself if and only if exactly one sub-sub-that of the sub-thats of that that happens—namely, if Putin takes the rest of us with him in his efforts to stave off a military assault against him; if, that is, he retaliates against an attack on Russia—or Moscow in particular—with a nuclear counter-attack of sufficient force and volume to kill off the entire Occidental population, and with it every last exponent of the “Our New Hitler” narrative. But in the admittedly not super-probable event that this sub-sub-that does not happen, regardless of what happens to Putin in the next year or so, the “Our New Hitler” narrative will either be given a Never Say Never Again-style treatment as Donald Trump steps back into the role in his capacity as the 2024 Republican presidential candidate, or whoever the Republican nominee turns out to be will be newly anointed or minted as the New New Hitler (indeed, one is already hearing a fair amount of talk of the most likely contender, Ron DeSantis, as being “worse than Trump” on account of his consistent opposition as governor of the state of Florida to the implementation of authoritarian policies in the name of COVID mitigation [because COVID is “Our Holocaust” because it caused the deaths of millions of people, such that anyone wielding supreme executive power within a given polity is responsible for deaths occasioned by that cause therein under his watch {unless, of course, that wielder is a Democrat, in which case he is seen to have done everything he possibly could have done to suppress the death toll}]).
And if either of these thats happens, why, then, the “Our New Hitler” narrative will exhaust itself when and only when some atrocity incontestably even more appalling than the Holocaust supervenes—provided, of course, that enough people survive that event to fashion it into the New Worst Thing Ever and are sufficiently detached from the immediate hand-to-mouth aspect of survival to derive any sort or amount of pleasure or utility from ruminating on that selfsame NWTE.  Isn’t there a rather ordinary word for one of these stories, like the “Our New Hitler” story, that people ascribe a ponderous amount of metaphysical heft to and that just won’t go away, however little it corresponds to reality…? What, legend? No, no, not legend…What, tall tale? No, no, no not tall tale…What, ripping yarn? Definitely not. What, myth? Ding-ding, as they used to say, back when the “Our New Hitler” story was presumably but a twinkle in the Coke bottle-lensed eye of some Jolt cola-swilling Usenet user in ca. 1983. Yes, myth! Well, what of it? Why should it come as such a relief to learn that there is such a word and that it applies to stories like “Our New Hitler”? Why, because, presumably on account of the just-touched-on tendency of myths to correspond poorly to reality, in addition to denoting those ancient stories about Zeus and Oedipus et al., the word “myth” enjoys or suffers from a very well-known secondary sense of “any story that happens not to be at all true,” a sense in which it is indeed virtually a synonym of “fib” if not of “lie” itself.  Indeed it, myth, does enjoy or suffer from that secondary sense. But what of that? Pray explain its pertinence to the Chinese (!) tea-price that is “Our New Hitler”?  Its relevance consists in the tendency of people not to believe in things that are manifestly not true.
Such that if enough people came to see “Our New Hitler” as a myth, those same enough people might consequently come to disbelieve it and consequently stop looking for new Hitlers in the world of the late-early twenty-first century?  Why, yes. So you essentially see our problem as fundamentally being one of what is so loutishly known as branding? Well, not wishing to think of myself as a lout, I blush to brand the problem one of branding—but you’ve certainly twigged the gist of my diagnosis of the problem. Well, I’m afraid I can’t quite agree with you that your diagnosis is entirely correct—this because myth in the pejorative secondary sense overlaps in its precise denotative particulars with another word, namely, fiction, that (yes, yes, yes, despite the currency of the adage “Truth is sometimes stranger than fiction” [an adage that itself comes as close to making truth into a synonym of lie as it is possible to do without nullifying it entirely]) is by no means generally taken in a pejorative sense in any of its senses (however strongly it may deserve to be taken in a pejorative sense in all thereof [more on these just deserts of fiction anon-ish, Lord willing])—such that whatever it is about myths that renders them not only intrinsically but uniquely pernicious will have to be sought elsewhere than in their untruthfulness. And in pursuit of this search we may just find at least a modicum, a crumb, of counsel in Benjamin’s essay, for therein he does animadvert—admittedly very fleetingly—on myths, or rather, as he terms it, “the myth,” thereby positing myths as collectively embodying a genre like that of the story and of the specific sub-genre of story with which he polemically contrasts them-stroke-it, viz., the fairy tale:

Whenever good counsel was at a premium, the fairy tale had it, and where the need was greatest, its aid was nearest. This need was the need created by the myth. The fairy tale tells us of the earliest arrangements that mankind made to shake off the nightmare which the myth had placed upon its chest. In the figure of the fool it shows us how mankind “acts dumb” towards the myth; in the figure of the youngest brother it shows us how one’s chances increase as the mythical primitive times are left behind; in the figure of the man who sets out to learn what fear is it shows us that the questions posed by the myth are simple-minded, like the riddle of the Sphinx; in the shape of the animals which come to the aid of the child in the fairy tale it shows that nature not only is subservient to the myth, but much prefers to be aligned with man. (“The Storyteller,” Chapter XVII)

In referencing the riddle of the Sphinx and in positing the fairy tale as a genre devised expressly for the purpose of demystifying the myth—in other words, inter alia, as a genre relatively modern in relation to the myth—Benjamin seems to intimate that his conception of the myth is relatively narrow, that in his view the only true myths are those super-ancient ones like those of the ancient Greeks, and hence (given that he understands Herodotus to be the first Greek storyteller), those that antedate the genesis of storytelling itself. Given that myths themselves always delineate a narrative arc, one might reasonably ask why they do not qualify as the earliest stories, but to dwell on this lacuna in Benjamin’s argument now would be like pointing out the umpteenth sprung leak on a sailboat that, while noticeably listing, is already within sight of land and moving towards it with a strong tail wind. But despite the two just-mentioned favorable conditions, the ship is not absolutely guaranteed to make it ashore in one piece and with its most precious cargo intact, for here, as with all and even the most fecund parts of Benjamin’s argument qua sailing-vessel analogues, a goodly proportion of the cargo consists of veritable pacotille or packthread that must be separated from the essential goods and jettisoned to keep the list from turning into a prow-upward perpendicular, and as the packthread and the essential goods are packed together in the same crates, the separation-operation will occasion a great deal of crate-opening, unpacking and re-crating.
Speaking of unpacking, Cap’n or, rather, matey (because at the moment it’s I who is or am giving the orders and you who will be obliged to follow them), you are going to have to do a lot of that yourself to make me understand this wee gecko of a metaphor now mushroomed into a Godzilla of a conceit ([sic] on the only seeming mixture of metaphors, for as everyone knows, Godzilla was himself merely a metaphor for the mushroom cloud produced by an atomic bomb). Avec plaisir, Cap’n (if it be not mutinous for a first mate to accept an order via an avec plaisir rather than via an aye-aye): the packthread is Benjamin’s whole…erm…thread (apologies for the partial defiguration of the conceit) about the fairytale qua superior alternative genre to the myth. You see, even if one grants that the fairy tale once was such a superior alternative (and I am quite strongly disposed to grant that [albeit with some apprehension occasioned by my nonplussedness on how to absolve fairy tales of participation in the cookie cutter-exquisite corpse-like {and hence myth-like} schematic quality plausibly attributed to their close cousins, folktales, by Benjamin’s contemporary Vladimir Propp]), one must refrain from embarking on any fairytale-like quest for such superior alternativeness in fairytales now, for such a quest will ineluctably lead us straight into the gaping maw of the Big Bad Wolf that is the horrifyingly irredeemable nearly century-old filmography of Disney fairytale adaptations; not to mention the perhaps even more capacious gullet of the Lord of the Rings and, a peiori, the god-awful Harry Potter franchise—in short, an entire Alexandrian library-sized collection of texts that collectively embody the exact antipode of the sort of artifacts and practices to which we should be turning as an alternative to bad storytelling, as I shall (LW) shortly attempt to shew but cannot yet shew because the immediately to-hand explication of myth qua version of bad storytelling is an
indispensable preliminary to understanding the just-mentioned exact antipode. And so for now we must try to consider “the need for counsel created by the myth” without accepting Benjamin’s postulation of the fairy tale as an infungible meeter of that need. Having done this, having separated out the packthread that is fairytale-prizing, we must try to understand in what sense the myth creates the need for counsel, with an emphasis on the try, because Benjamin most certainly does not spell it out for us. To be sure, the various “figures”—or to be specific, three “figures” and one “shape” [check the German original to see if “Gestalt” is used in all four cases]—allegedly provided by the fairy tale seem intended to outline in negative this need—i.e., the “nightmare” imposed on mankind by the myth—but of these four the youngest brother alone unequivocally points to a real human institution of historically unbounded solidity. The figure of the fool has numerous real-world correlates from the court jester to the stand-up comedian, but none of these figures overlaps exactly with any of the others in social function, and in no case is that function particularly easy to specify, especially in terms of the exigencies of world-maintenance. We are told that, for example, the court jester was necessary because somebody had to be allowed to get away with telling the truth to the king, with “speaking truth to power,” and we nod with feigned understanding of this assertion as a supposedly self-evident truth, but in (self-evident) truth it is hard to see why a king couldn’t do without the fulfiller of such a function, and plenty of kings in plenty of kingdoms in plenty of historical periods seem to have done without one, or to have opted to have the truth spoken to them by conservatively dressed po-faced chamberlains, privy councilors, masters of the stole, autc.
And of course the recent metamorphosis of oodles of formerly supposedly hyper-subversive stand-up comics into toadying mouthpieces of the most imperious powers that have ever be’d munificently gives the lie to any notion that they are inalienably latter-day truth-to-power-speakers. “The man who sets out to learn what fear is” seems to apply either to every hero of every story ever told or written or to none that at least the present writer has ever heard of, depending on how tightly or loosely one defines “learning what fear is.” I suppose anyone who leaves home in search of adventure of any sort is at least partly impelled by the sense of suffering from a deficit of knowledge of fear, but I have yet to hear tell of anyone in the actual world or in any counterfactual one who left home expressly in search of such knowledge. And are the “questions posed by the myth” actually “simple-minded”? Perhaps they generally are, but if so, the riddle of the Sphinx seems a rather poorly chosen example of their simple-mindedness. Perhaps to so complicated-minded a cove as Walter Benjamin it seemed especially simple-minded, but even if it did, he should have reflected that he did not enjoy the benefit of a still ocularly endowed Oedipus-eyed view of the riddle, the benefit of ignorance of its answer, a benefit that presumably would have made it seem just a wee bit more complicated. The present writer at least unflatters himself that he would have had a Sphinx’s sphincter of a difficult time deriving the answer “a man” from the Sphinx’s riddle on account of his stubborn predisposition to think of adult male human beings as prevailingly permanently bipedal creatures. And tantalizingly appealing though the notion that nature “prefers to be on the side of man” despite its subservience to myth is, the “animals which come to the aid of the child” take us straight back into the Disneyian fairy-tale inferno. 
(Incidentally, Benjamin was no fan of Mickey Mouse qua partially anthropomorphized animal and wrote an essay laying into Mickey with a ferocity with which Tom was only ever suffered to imagine laying into Jerry; an essay whose applicability to “The Storyteller” doubtless deserves a separate essay that I may eventually get around to writing provided I ever get around to rereading Benjamin’s essay.) But as for the figure of “the youngest brother” who “shows us how one’s chances increase as the mythical primitive times are left behind”—well, I reckon that chap has got some legs, not in the sense in which a nice-looking dame has got a nice pair of gams or pins, but in the sense in which an idea or practice has got legs in the cant of the demimonde of journalism (or at least so I presume, for it was from a journalist that I learned the expression); in this case legs extending all the way back (or down) into presumptive prehistory and all the way forward (or up) into the present. What I meantersay (or, rather—skewed me!—meanterwrite) is that this chap, the youngest brother, seems to offer a critique of primogeniture, an institution that seems both to antedate myth and to survive at least vestigially even now, such that the aforesaid critique has always been timely and has yet to become entirely obsolete or superfluous. In the myths, primogeniture seems so often to be taken for granted that more often than not the father to be toppled or succeeded by his progeny has only one son: thus Oedipus is never in any danger of sharing top tragic-heroic billing by killing his father only to arrive in Thebes to find his brother Tex (see P.D.Q. 
Bach’s Oedipus Tex for the counterfactual genealogical details) already in bed with his mother, and thus Ulysses, having dispatched Penelope’s suitors with the help of his only son, can enjoy an easy proto-fairy tale (i.e., “happily-ever-after”) retirement in the certain knowledge that after his death a second bloody dispute over his estate will not break out among his heirs. At most two brothers of almost identical age—notably Romulus and Remus—will duke it out for the top-dog position, and the very fact that there is a top-dog position implies that one male per generation per family is the ideal state of affairs. But of course, as a matter of biological quasi-inevitability, firstborn sons have always been outnumbered by their younger brothers, and the world has always somehow managed to accommodate these younger brothers while at least nominally granting first-born sons their legatial pride of place. One thinks in this connection—to cite an example that ultimately hails from a genre-cum-mode {i.e., the proto-realistic mode-cum-genre of romance, the source genre-cum-mode of all of Shakespeare’s comedies} that comes perilously close to the fairy tale but is ultimately unassimilable to it—of Orlando in As You Like It, who, on escaping the clutches of his miserly older brother, wins a wrestling contest and consequently the heart of the daughter of the rightful ruler of the realm, an older brother who has been exiled by his usurping younger brother and who on reclaiming his throne bestows his daughter’s hand in marriage on the new realm-wide wrestling champion. Primogeniture ultimately underwrites the system of life imagined by the play, but that system is liberal and rational enough to reward the exercise of certain talents by those not fortunate enough to be born first. 
And not only does Orlando’s story reflect a state of affairs that was already actual to a certain extent in Shakespeare’s time, it also anticipates the golden age of younger and youngest sons in eighteenth-century Britain, an age dominated by nouveau-riche manufacturers and merchants who had been obliged to go into trade as youngsters because they had known that they would inherit nothing from their parents unless they chanced to outlive their older or oldest brothers. The British nineteenth century, by contrast, was the golden age of the self-made man, who was obliged to go into trade even if he was a first-born son because he stood to inherit nothing from his parents because they had nothing to bequeath to him. Primogeniture as a system-of-life underwriter has effectively been dead in the Anglosphere since the early twentieth century because the titled nobility who still depend on it for the assignment of their Baronetcies and Earldoms no longer collectively call the shots of the system (which is not to say that families do not still call the shots thereof, but merely that these families [e.g., the Rockefellers], in virtue of being untitled, are not obliged to make their eldest sons their standard-bearers). And yet the first-born do continue to enjoy certain advantages simply in virtue of having been born first, advantages that are probably not eradicable as long as children continue to be raised prevailingly by their biological parents. And of course even in conceding as much I am (or would be, if anyone of any influence—including any so-called influencer—were ever to read this essay) adding yet another feather or ammunition-clip to the already-brimful quiver or munitions-magazine of the would-be abolishers of the traditional family, but of no course but no less undeniably I am (or would be, if etc.) 
suggesting an entire armory or fortress-sized means of more or less painlessly fending off that abolition—whichistersay a means of coping not only with the particular inequities attending residual primogeniture but also with all other traces of myths (or echoes thereof, depending on the answer to the probably ultimately unanswerable question whether the states of affairs in question would have come into being to begin with or survived thereafter in the absence of the initial telling or subsequent transmittal of the myths) in the present-day world—viz., that of simply looking to the historical archive of qualifications of and exceptions to the states of affairs postulated by the myths. Simply knowing that lots of younger and youngest siblings have been famous and successful (and indeed in some cases more famous than their older siblings) despite their inferior rank in the birthing-order ought to be enough to quell the ressentiment of latter-day younger and youngest siblings tempted to kick up their heels and bawl out their eyes for the entire duration of their naturals simply because unlike their oldest sibling they have never enjoyed Mummy and Daddy’s undivided attention; and this knowledge should likewise put paid to the impulsion of utopians (or dystopians masquerading as utopians) to preempt such ressentiment by taking all the newborn siblings from the firstborn onwards away from Mummy and Daddy and consigning their upbringing to the State. 
And one could easily imagine a similar course of correction’s being applied to other myths: the myth of Icarus would inculcate the lesson that one must be wary of soaring too high, but the true stories of the Wright brothers, the Apollo moon landings, and so on, teach us that “man’s reach should” at least sometimes “exceed his grasp.” Fine, this is all well and good vis-à-vis myths officially and traditionally so called—myths like that of Oedipus and the Sphinx (and Laius and Jocasta) and Icarus, but what of the myths of the sort that seem to beset our age most perniciously and on which we have dwelt at greatest length, myths like “Our New Hitler”? Surely these myths signally differ from the ones traditionally and officially so called in virtue of being based palpably, demonstrably, and from the outset on historically actual phenomena (for even when myths have been traced to some historical substrate, as the various myths associated with the Trojan War have been to the site in Asia Minor unearthed by Schliemann, this substrate is always destined to remain an afterthought in the mind of the learner of the myth), and often on events of quite recent historical provenance, such that the material that might undermine or qualify the myth must be drawn from material that is just next door, so to speak, to the material that has engendered the myth itself. Can one really debunk the notion that Donald Trump or Vladimir Putin is our new Hitler by simply pointing out that, say, Franco or Salazar or Pinochet was not exactly a Hitler despite being a right-wing twentieth-century dictator? Up to a point I think one can indeed if not quite debunk such a notion thereby then at least thereby problematize it (as the Lefty milquetoasts would say); at minimum, knowing that there was no Holocaust in Spain, Portugal, or Chile proves that for all their other undeniably serious faults right-wing dictators are not ineluctably bound by nature to commit genocide. 
But at bottom (or, rather, top, because “up” is the operative direction here) I take your point rather than the one I just mentioned: one needs something rather more robust than a single problematizing counter-narrative to break the hold on the collective psyche of our modern myths. Fortunately, you and I are not the first people to recognize this, even if Benjamin himself seems to have failed to recognize it despite being born some years after the death of that first recognizer, Karl Marx, who expressed his recognition of it in the following passage of The Eighteenth Brumaire of Louis Bonaparte:

Hegel remarks somewhere that all great world-historic facts and personages appear, so to speak, twice. He forgot to add: the first time as tragedy, the second time as farce. Caussidière for Danton, Louis Blanc for Robespierre, the Montagne of 1848 to 1851 for the Montagne of 1793 to 1795, the nephew for the uncle. And the same caricature occurs in the circumstances of the second edition of the Eighteenth Brumaire.

Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past. The tradition of all dead generations weighs like a nightmare on the brains of the living. And just as they seem to be occupied with revolutionizing themselves and things, creating something that did not exist before, precisely in such epochs of revolutionary crisis they anxiously conjure up the spirits of the past to their service, borrowing from them names, battle slogans, and costumes in order to present this new scene in world history in time-honored disguise and borrowed language. Thus Luther put on the mask of the Apostle Paul, the Revolution of 1789-1814 draped itself alternately in the guise of the Roman Republic and the Roman Empire, and the Revolution of 1848 knew nothing better to do than to parody, now 1789, now the revolutionary tradition of 1793-95. In like manner, the beginner who has learned a new language always translates it back into his mother tongue, but he assimilates the spirit of the new language and expresses himself freely in it only when he moves in it without recalling the old and when he forgets his native tongue.

Marx does not use the word “myth” here, but it is apparent that he ascribes exactly the same degree and kind of force to history—or at least history as manifested in and by “circumstances existing already, given and transmitted from the past,” by the “tradition of all dead generations”—that Benjamin ascribes to the myth, because he uses the very same word to articulate this force, namely, “nightmare,” albeit that he does describe this “nightmare” as weighing on mankind’s “brain” rather than on its “chest,” a difference from which we may draw the salutary reminder that nightmares are after all not purely psychological phenomena, that they are oppressive and constrictive of the body as well as of the mind and the will. I must also draw attention to the fact that while Benjamin posits storytelling as the chief somatic reaction to the nightmare and also posits this reaction as intrinsically liberating, Marx posits dramaturgy, play-acting, as the chief somatic reaction to the nightmare and also posits this reaction as perhaps only incidentally liberating and at least initially enchaining inasmuch as it at least initially only perpetuates the nightmare, or perhaps even intensifies it because it is a process of “conjuration,” of calling forth the illusion that the historical figures of the past live again because people are wearing their costumes and talking like them again. Marx’s examples of these dramaturgical-cum-conjurational set pieces-cum-rites certainly do read like prolepses of the modern myths of our own day, but they are manifestly not all of the same proleptic intensity. 
Luther costuming himself as Paul and the French revolutionaries of 1789-1814 costuming themselves as ancient Romans quite manifestly differ from “Our Own Hitler” in the size of the chronological gap between the original figures and the figures donning their costumes: Luther lived nearly 1500 years after Paul, and the French Revolutionaries lived at least 1300 years after the least ancient of the ancient Romans. And this chronological distance is attended by a host of significant manifest knock-on discrepancies—for example, those of nationality and political identity: the inhabitants of Lutetia, the city that was to become Paris, in the Roman Empire, were Gauls by nationality but Romans by citizenship (unless, of course, they were slaves or other persons of lower civil status, a condition that was only vaguely commensurable with that of being a member of the third estate in pre-Revolutionary France [here we see yet another knock-on discrepancy]); the inhabitants of France were Frenchmen and hence subjects of a kingdom (i.e., neither a republic nor an empire) that was called France because it had been founded by the Franks, a sub-tribe of the Germans, the least-ancient Romans’ chief adversaries and most nightmarish embodiment of barbarism. Paul was a Jew by nationality, a citizen of the Roman Empire, a Tarsan and an Asian (what with Asia’s merely being the name of the Roman province roughly coextensive with today’s Turkey back then) by nativity-cum-residence; Luther was a German by nationality, a Wittenbergian by citizenship, and hence at least nominally a subject of the Holy Roman Empire, a loose confederation of mostly German-inhabited states that was called Roman largely owing to an earlier episode of play-acting, that of Charlemagne’s donning of the Caesarean laurels with the help of Pope Leo III. 
In short, in the case of these two examples the differences between the play-actors and their predecessors-cum-models are so numerous and so great that however closely the play-actors themselves may have thought they were re-embodying the people they were impersonating, it is impossible for any rational exegete to misread the play-acting as merely repeating earlier events or as flowing from them by any causal chain whose individual links can be itemized; and complementarily (and yet again, not un-quasi-paradoxically) no rational exegete would be inclined to try to deflate the play-acting as mere play-acting by pointing to the most obvious discrepancies between the play-actors and their predecessors-cum-models: to say to Luther, “Face it, pal: you’re no Paul, ’cos this obviously ain’t the Roman Empire we’re living in; the Pope obviously ain’t Tibby Caesar; you were born a Christian, not a Jew, and in Central Europe, not Asia Minor,” etc.; or to Danton, Robespierre, aut al., “Face it, pal: you’re no Brutus, ’cos this obviously ain’t the Roman Republic we’re living in; Louis XVI obviously ain’t no Julie Caesar; you were born in France (i.e., on a piece of land that only became properly Roman as part of the Roman Empire), not Italy, etc.” The case is quite different with the last example, that of the revolutionaries of 1848 parodying the French Revolutions of 1789 and 1793. 
Here the chronological, geographical, and political gaps are quite small indeed: 1848 is separated even from 1789 by less than a single human lifetime; and indeed a goodly proportion of people who witnessed the revolutions of 1848 (and indeed presumably a badly proportion thereof who participated in them) must have retained memories of the earlier revolutions; one of 1848’s revolutions—and “arguably” the most important of them—took place on the exact same territory as the earlier ones had taken place on, namely, France—and the remainder had taken place no farther away than a three-day coach ride (or, in 1848, a one-day train ride) therefrom; and despite the substantial political changes that were wrought in the course of them, they by and large took place in polities that (yes, yes, yes, the best-laid plans of Metternich and some doubtlessly actual fellow Congress of Vienna-attendee with a surname starting with “M” notwithstanding) retained the same names and borders as they had possessed in 1789. This is, in other words, a case whose spatio-historical coordinates may be fairly snugly mapped onto ours vis-à-vis the “Our New Hitler” narrative: the Second World War took place only about three-quarters of a century ago, and the principal belligerent nation-states of that war are still extant, even if the principal nation-states on the defeated side thereof are all governed under the auspices of constitutions essentially imposed on them by the victors, such that they may in a certain “very real” sense be regarded as entirely different countries. The play-acting of the revolutions of 1848 had at least a superficial edge of verisimilitude over the play-acting of the present micro-epoch inasmuch as one of the principal political figures of those revolutions, Louis-Napoleon Bonaparte, was the nephew of one of the principal political figures of the French Revolution(s), Napoleon Bonaparte. Ah, I see. 
So in other words, the Hitler figure of the revolution of 1848 really was their new Hitler-equivalent! Not quite, or at least not immediately quite: for you see, in France the revolution of 1848 could not but initially mimic the external forms of the revolution of 1789 inasmuch as like that revolution it had to contend with an established monarchy, the monarchy of King Louis-Philippe (which itself was in more than a manner of speaking the nephew of the monarchy of 1789 inasmuch as it was based on a cadet dynasty founded by the younger brother of an earlier monarch, Louis XIV), which meant that Louis-Napoleon initially had no choice but to opt for the relatively humble role of a semi-anonymous civilian revolutionary rather than that of the role created by his uncle, that of the military hero stepping in to quell a welter of post-revolutionary anarchy. Only once the new revolution had succeeded, a post-revolutionary government been established, and finally he himself had been elected its president, was Louis-Napoleon in a position to establish himself as a permanent head of state, a more-than-nominal successor of his uncle who lost little time in having himself crowned Napoleon III (the would-have-been Napoleon II, Napoleon Bonaparte’s son, having died many years earlier). So then Louis-Napoleon only eventually got to become the reincarnation-cum-successor of the original Napoleon. What difference does the delay make, or, rather, what greater difference does it make than, say, the delay that temporarily prevented Lizzie Windsor—who, let us not forget, until her uncle Ed abdicated was slated to round out her life as a mere duchess—from becoming the reincarnation-cum-successor of Elizabeth the First? 
The difference—at least if we are to regard Marx’s analysis of the earlier of the two comparanda as accurate (as for the purposes of the present argument we cannot help doing [not that at the level of generality at which we are considering that analysis any historian of any political persuasion is likely to take exception to it])—is far greater inasmuch as the reestablishment of the Bonaparte dynasty with Louis-Napoleon’s coronation did not in any sense mark a return to some Bonapartean status quo ante that had ended with either of Napoleon I’s dethronings (not that Elizabeth II’s coronation in any sense marked a restoration of some Elizabethan status quo ante that had ended with Elizabeth I’s death [because Lizzie Windsor merely happened to be the second Elizabeth to fall into the British line of royal succession {not that certain pundits have not done their sophistical damnedest to proclaim the present reign a “second Elizabethan Age”}] but that it did mark the continuation of the Hanoverian status quo established with the Act of Settlement way back in 1701) but rather a consolidation of the changes in the balance of political and economic power that had taken place in France in the years since Napoleon’s second dethroning, i.e., since 1815. The first French-Revolutionary period, the one lasting from the storming of the Bastille in 1789 to Napoleon I’s seizure of supreme executive power in 1799 via the original 18th-Brumaire coup, marked the transformation of France from a feudal into a post-feudal society; in other words, it marked the transfer of the preponderance of France’s political and economic power from the nobility to the bourgeoisie in the most capacious sense, to the entire stratum of society separating the nobility from the peasantry. 
This transformative-cum-transferrative characteristic of the period was typified by Napoleon’s own ascent from low-ranking army officer to general to consul to emperor, for although he technically hailed from the nobility, his father had made his living as a pettifogging lawyer, petty governmental official, and petty Danny Partridge-esque money-making schemer in the petty province of Corsica, an island whose inhabitants were culturally and linguistically more Italian than French and that formally became a part of France only in the year of Napoleon’s birth; and being a younger son of this perennial loser, he was obliged to put up with a rather hardscrabble, almost proto-Dickensian childhood that obliged him to put up his dukes day after day to avoid being pummeled into a tiny greasy spot by bullies on the playgrounds of the boarding school and military academy he attended from the age of nine onwards; and it (the transformative-cum-transferrative characteristic) survived into Napoleon I’s reign, when one of his generals found himself invited by the Swedes to become their king (and founder of a Swedish ruling dynasty that survives to this day) merely on the basis of his prowess on the battlefield. In short, the entire first revolutionary-cum-first imperial epoch was a meritocratic epoch, an epoch of the self-made man, the sort of man whose ambition in an earlier period of French history never could have led him to any higher position than that of temporary kingmaker or king’s right-hand man, à la Richelieu or Colbert, because the deepest deep structure of French society had placed insuperable obstacles in the way of his advancement to the highest and most powerful stratum of it. 
And the simultaneous rise of thousands of self-made men attended the rise of the entire bourgeois class—the class of lawyers, manufacturers, and shopkeepers—to the level of that uppermost stratum, to its displacement of the nobility-cum-aristocracy from that stratum. The Napoleonic empire’s retention of titles of nobility consequently belied its fundamentally anti-aristocratic character; the titles bestowed on its nobles, right on up to that of the emperor himself, merely marked them as primissimi inter the pares of the bourgeoisie. The post-first imperial epoch, in contrast to the post-first revolutionary period, marked the consolidation of group or collective or corporate power within the bourgeoisie itself (or themselves), the consolidation of small maman-et-papa businesses into big industrial firms and big banks funding these firms’ expansion. Corollarily, in contrast to the late first-revolutionary period, the sub-period leading to the original 18th Brumaire of 1799, the period of the run-up to the 1848 revolution and the second imperial period inter alia marked an increase rather than a decrease in social order; the 1848 revolution was but a pseudo-revolution; and Louis-Napoleon’s 1852 assumption of the emperorship via the presidency of the first revolutionary government was but a pseudo-coup—whence the scornfully ironic title of Marx’s tract. The thesis of Marx’s polemic in The Eighteenth Brumaire of Louis Bonaparte is that despite his biological kinship with Napoleon Bonaparte (or at least official biological kinship therewith [for there were plenty of people at the time, including Marx himself, who believed that Louis-Napoleon was the product of cuckoldry on the part of Napoleon’s sister-in-law]) Louis-Napoleon is but a sort of Dan Quayle to Napoleon I’s John F. 
Kennedy, that “He’s no Napoleon I” not only because he is a man of inferior abilities but also because he serves no authentic function in relation to the social forces that are actualizing themselves in his assumption of nominal exclusive sovereignty, that he is but a figurehead for those forces—and worse, very much the wrong sort of figurehead, inasmuch as the mid-nineteenth century was marking the eclipse and defeat of the bourgeois individual by the bourgeois collective. But surely if Louis-Napoleon crowned himself emperor with the assent of the bourgeois collective, with the assent of the big manufacturers and bankers, he must have basically been a man of their own kidney. Indeed he mustn’t—or at any rate needn’t—have been; and indeed he was actually a man of a kidney that could not but have been violently and scornfully rejected by the organism of any big manufacturer or banker into which it had been transplanted; for as Marx writes later on in the Eighteenth Brumaire:

Along with ruined roués of questionable means of support and questionable antecedents, along with the foul and adventures-seeking dregs of the bourgeoisie, there were vagabonds, dismissed soldiers, discharged convicts, runaway galley slaves, sharpers, jugglers, lazzaroni, pickpockets, sleight-of-hand performers, gamblers, procurers, keepers of disorderly houses, porters, literati, organ grinders, rag pickers, scissors grinders, tinkers, beggars—in short, that whole undefined, dissolute, kicked-about mass that the Frenchmen style “la Bohème.” With this kindred element, Bonaparte formed the stock of the “Society of December 10,” a “benevolent association” in so far as, like Bonaparte himself, all its members felt the need of being benevolent to themselves at the expense of the toiling nation. 

It’s interesting that “La Bohème” here seems to include a great many types that we don’t associate with the bohemian lifestyle from Murger-Puccini onwards and to exclude a certain type that we tend to think of as the bohemian par excellence—viz., the so-called starving artist. Why, yes, it is, although I suppose at a stretch it’s just possible to subsume that certain type—or at least a specific sub-type thereof, that of the starving poet—under the heading of “literati.” And perhaps before at long last the end of this essay is reached, I shall find an opportune moment for discussing at some length the concept of “La Bohème”’s volatility and factitiousness as highlighted by this passage—a moment, in other words, in which “the story that is told” about bohemians whether of the old-Murgerian-Puccinian or the new-Brickellian variety has failed precisely to correspond with actual bohemians’ material social function. But at the moment—the present moment—we must consider what that passage seems to say about Louis-Napoleon Bonaparte, namely…namely, what you’ve effectively already said, namely that his followers were at the opposite pole of society from bankers and manufacturers, that the very loftiest of them were but the “dregs” of the bourgeoisie, but this discovery just seems to mystify the trajectory of the 1848 revolution even further, for if Louis-Napoleon was organically a man of the terminally-down-and-out and derived the preponderance of his support from his fellow terminally-down-and-outers, why did the society over which he presided and eventually ruled not end up being a helotocracy rather than a plutocracy, a society dominated by its poorest members rather than by its richest ones? Why, simply because that society’s official political institutions did not competently mediate its power relations. 
For the revolution of 1848 led to the establishment of a Second French Republic, and the governing principle of a republic is after all that each and every citizen has an equal share in the governance of the polity as a whole, and in one respect the Second French Republic was perhaps unprecedentedly democratic, inasmuch as it allowed for the election of its president by universal suffrage (sic on the absence of whinging boilerplate about the exclusion of women from the political process), which meant that every lazzarone, pickpocket, runaway galley slave, scissors-grinder, et al. got as much of a say in who became president as every banker and factory-owner. But in another, and perhaps even more salient, respect, the Second Republic was extremely anti-democratic inasmuch as once elected the president wielded essentially unlimited power (tedious qualifying passage on the very slightly more than theoretical limitations on presidential power imposed by the quasi-democratic Legislative Assembly omitted for the benefit of the writer’s mental hygiene [but presumably also to the benefit of the reader’s]), which quasi-paradoxically gave him every incentive to kowtow to the bankers and manufacturers, inasmuch as it was they and they alone who had the financial wherewithal to allow him to be “benevolent to himself,” which in his case meant to live in the style to which he felt entitled to become accustomed, what with being the nephew of a bona fide emperor and all. So then Louis-Napoleon’s self-coronation as emperor effectively merely put the seal of permanence on the political patronage he had already long been enjoying as president. “You got it,” as Glenn Gould would have said (of course milliards of other people have also said “You got it,” but only Glenn Gould would have said it just now and in exactly the tone Glenn Gould alone could have said it in). 
Now that is quite farcical—a kleptocratic-cum-plutocratic empire founded on a borderline-helotocratic democratic republic. Indeed it is farcical, and Marx himself pointed out this farcicality in the very first paragraph of The Eighteenth Brumaire, which I hereupon quote in its entirety:

Hegel remarks somewhere that all great world-historic facts and personages appear, so to speak, twice. He forgot to add: the first time as tragedy, the second time as farce. Caussidière for Danton, Louis Blanc for Robespierre, the Montagne of 1848 to 1851 for the Montagne of 1793 to 1795, the nephew for the uncle. And the same caricature occurs in the circumstances of the second edition of the Eighteenth Brumaire.

But of course, from the paragraph that follows this opening paragraph, the paragraph I quoted earlier, we may gather that in Marx’s view this farcicality was all just part of the historical process, inasmuch as he reckoned that eventually the French of the mid-nineteenth century would stop translating the events they were living through “into their mother tongue” of First-Empirism and start speaking a “new language” that adequately conveyed the sense of those events. In other words, that the French proletariat would realize that they weren’t living in a true monarchy inasmuch as their real rulers were the bankers and manufacturers, and that they’d then bring about yet another revolution; this time a revolution in which the proletariat replaced the bourgeoisie insgesamt as the ruling social principle. Yes, that’s right, Jo. But of course we know that didn’t actually happen. Well, yes, we do. But-- --I know, I know, ‘But hindsight is always 20/20.’ Don’t try to get your comrade Karl off the hook with that excuse, Commie, ’cos after all, he was propounding a theory of history that was supposedly scientific, meaning that its prognoses were of events that were supposedly inevitable, such that their non-realization could never be redeemed by anything as folksily commonsensical as hindsight.
I wasn’t about to say “Hindsight is 20/20” or to try to exculpate Marx by means of the apologia articulated by it; I was about to say, rather, “But I’m not sure that such an outcome is what is really implied by the first two paragraphs of the Eighteenth Brumaire, or at any rate, everything in those paragraphs up to that concluding sentence about expressing oneself in a new language and forgetting one’s mother tongue.” I was about to say, in other words, that in these paragraphs perhaps “the end of Marx’s commonwealth forgets its beginning” and to add that although no genuine proletarian revolution ever eventuated in France, several farcical approximations of such a revolution did take place in France and elsewhere after 1853—for example, the Paris Commune (which Marx himself “ironically” took for the real revolutionary deal, even if it was actually closer to an Old Testamental reversion to a state of homo homini lupus than to a dictatorship of the proletariat), and even more notably the second Russian Revolution of 1917. And then if one moves still further ahead, one finds still more farcical iterations of revolution—e.g., the gratuitously and wantonly destructive 1968 student uprisings in France and the even more gratuitously and wantonly destructive so-called BLM riots in 2020 in the United States. Perhaps there’s another way to look at these paragraphs. Perhaps the generality of Marx’s statement that “history repeats the second time as farce” belies the limitation of its applicability—even in his own view—to events that had taken place since the beginning of the revolution of 1848. After all, in pairing the principal figures of 1848 onwards—Caussidière et al.—with their antecedents in the first French Revolution--Danton et al.--in the sentence that follows this statement, he is perforce implicitly averring that the antecedents acted out a tragedy rather than a farce.
And by extension, because in the second paragraph he mentions Luther’s impersonation of St. Paul before the French Revolution of 1848, we may infer that he regarded the Reformation as intrinsically tragic rather than farcical. And as Marx cites no other examples of farcical historical reenactments, we may at least conjecture that even in Marx’s own view, the revolution of 1848 was the first historical farce rather than merely the most recent in a long series of historical farces. And if this was so in his view, perhaps it did not occur to him that people would behave any differently in the aftermath of an historical farce than they had in the aftermath of an historical tragedy; and in truth he had no reason to presume that they would, to expect that the Revolution of 1848 would inaugurate a golden age of farce (golden in the sense that urine is golden, natch) in which nobody learned a dad-blamed thing about anything or ever picked up a single dad-blamed word of a foreign language. I find your alternative view of the opening paragraphs entirely plausible up until just before the very end, up until the point just before you infer that Marx had no reason to presume that a post-farcical age would play out any differently than a pre-farcical one. For after all, there are certain differences of epistemological complexion inherent in the very fiber, the very woof and weft, the very D*A, of a tragedy vis-à-vis that of a farce. I for one find these differences a tad difficult to specify with precision, but I believe it is safe to say that they center on the hero’s degree of awareness of the objective situation in which he finds himself at the start of the drama.
The heroes of tragedy are never omniscient, and indeed their lack of knowledge of certain things is quite often instrumental in their downfall—here one conveniently immediately thinks of the tragedy that incorporates the abovementioned riddle of the sphinx into its dramaturgical gearbox, Oedipus Rex, wherein it is solely the hero’s ignorance of the circumstances of his birth that separates him from irreparable ruination—but they always at least have a very good idea of where they objectively stand in the world of the drama at its outset, and they always at least initially proceed in a manner befitting that position. At the outset of Oedipus Rex, a plague is plaguing Thebes, and as the King of Thebes, Oedipus is expected to do something to drive the plague away; it is therefore entirely fitting of him to act as he does in reaction to it—to seek counsel (q.v.issimo!) from the wisest sources thereof available to him, the oracle at Delphi and Tiresias the prophet. Of course, everything he says and does once he has subsequently learned from the oracle that the plague will never go away until the murderer of his predecessor on the throne has been apprehended is foolish, betokens a lack of judgment inasmuch as it brings him ever closer to the discovery of his horrific personal history (even if it may paradoxically be regarded as wise vis-à-vis the fate of the city inasmuch as it brings him ever closer to the discovery of the identity of the regicide), but it is never ridiculous because it never shows him behaving in a manner that suggests that he has forgotten his position as king, that he has suddenly come mistakenly to believe that he occupies a lowlier position in the world of the drama than he actually does.
In a farce, the hero—or, perhaps, rather, protagonist, to employ an appropriately more neutral term of poetics (i.e., inasmuch as it is probably inappropriate to refer to anyone intrinsically incapable of heroic actions as a hero)—is defective in his knowledge at the outset, defective in his knowledge of his situation in the world in which he finds himself. Consequently from the outset every action he takes in that world is bound to make him look ridiculous. Consider in this regard Basil Fawlty—a petty hotelier who can never be brought to stop thinking of himself as a king. Every episode of the sitcom of which he is the protagonist centers on his effort to attain or achieve something—winning a bet on a racehorse, attracting a posh clientele to his dining room, receiving the patronage of a nobleman—that he should know better than to aim for, and the harder he strives to attain or achieve his aim in the course of the episode, the more he succeeds only in increasing chaos in his environment and thereby only in looking more ridiculous. His machinations and the attendant chaos are brought to an end only by a restoration of the prosaic, ignoble status quo by a dea ex machina in the person of his wife Sybil (! [i.e., seemingly providentially {or perhaps, rather, “nightmarishly,” in the light of the present essay’s figuration of myth as a nightmare}]), and only at the cost of a humiliation of Basil that fails to be attended by the impartment to him of any knowledge (or, rather, specifically of our most favored form of knowledge, counsel; it teaches him that he is bound to be caught but fails to teach him that he deserves to be caught and therefore ought not to try anything similar in future) and therefore leaves him looking as ridiculous at the episode’s conclusion as at its beginning.
So if Marx had taken these epistemological-complexional differences between tragedy and farce to heart… He would have realized that Napoleon III was far closer to Basil Fawlty than to Oedipus (or Antigone, Hamlet, Lear, or any other proper hero of a proper tragedy). I believe that realization is an historical impossibility because…Yes, yes, yes, because the first episode of Fawlty Towers was produced over ninety years after Marx’s death. Well, I don’t think we need to be too concerned about that anachronism; for after all, Fawlty Towers is generally regarded as an apotheosis of farce, and even if we need be concerned about it, there isn’t a thing I can do to address that need inasmuch as Fawlty Towers is the only farce with which I am well acquainted. Fine, fine: so then your point is that contra Marx—or contra what Marx explicitly seems to be saying—with the supervention of farce on the historical scene in the person of Louis-Napoleon, we reach a point at which history is enacted by people who have no sense of their objective place in the world and who never learn from their failures. Correct, with the qualification that because history in contrast to drama is open-ended, the failure of its farce-protagonists to learn from their experience is not merely a personal one, that their ignorance is inherited by the people who succeed them at the historical center-stage, such that they get their own turn at being the owner-manager of Fawlty Towers—altho’ of course, owing to the fact that the circumstances that have brought them to center-stage have also often substantially changed the scenery, they, unlike Basil Fawlty, will often find themselves managing an entirely different sort of operation than the one of the previous episode, and consequently bringing to bear an entirely different set of delusions on its mismanagement.
Thus, once Napoleon III was deposed and exiled, the people who found themselves in charge by default in Paris in the ensuing administrative vacuum had no interest in having themselves crowned emperor and instead turned to a model of dominion essentially invented by Marx and his sidekick-cum-Hintermann Engels—the Commune—not, so one surmises, because they individually craved power any less ardently than Napoleon had done but because—as is hardly surprising chez the recent capital of a decadent pseudo-empire—not one of them was a person of sufficient cachet or brand-recognition to don the requisite costume with even a whiff of vraisemblance. Mightn’t this business of turning to as-yet-essentially fictive forms of government, to forms of government as yet extant only in the imaginations of their devisers, be another novel facet of history-as-farce? Yes, I think it very well might be. And this presumably unprecedented reliance on fiction brings us back to Marx’s pre-Louis Napoleonic examples qua de facto instances of tragedy in history. For surely if Luther, as he was promulgating his 95 theses and getting the Reformation going, was knowingly “putting on the mask of Paul” as the hero of an authentic tragedy, it must have been the case that “in a very real sense” he was worthy of reenacting the role; for a Luther who had knowingly but unfittingly donned that mask would have been the star of a farce. And to conclude that he was worthy of reenacting the role is perforce to conclude that he understood Paul’s place in history, which is perforce to conclude that there was something real to be understood about Paul’s objective place in history, that his promulgation of the gospel and establishment of the early church were not mere figments of some crackpot projector’s imagination like Communism.
And of course, for all his detractors (Roman Catholics and heathens alike) have said to the contrary over the past half-millennium, Luther had not uncompelling objective reasons for regarding himself as a latter-day St. Paul, for the sole authority on which he founded his proposed reforms was that of Scripture, meaning, essentially, the Gospels plus St. Paul’s interpretation of them in his numerous epistles. Why, then, given that the unfolding of a tragedy depends, if not on an absolute lack of knowledge, then at any rate on a relative imperfection of knowledge—not (to revert yet again to the example of Oedipus) on not knowing one’s duties as a king but on not knowing the natural limits to one’s power as a king—and given that unlike Paul, Luther did not end up being executed for his proselytizing, in what sense was his participation in history even a tragedy? And don’t tell me, “Why, in the sense that he didn’t know that he wasn’t going to be executed,” because that would make him yet another Basil Fawlty avant la lettre. I’m certainly not going to tell you anything of that kind. What I am going to tell you is something that at first blush may seem to be something of that exact kind even if it is something entirely different, namely, “Why, in the same sense as Napoleon III’s participation in history was a farce at the end as well as at the beginning despite the fact that he died a sufficiently dignified death in his own bed, which was a farce at the end because the people who took over the management of the hotel of history after his forced retirement from the business, the Paris Communards, were as farcically mismatched with their starring role as he had been with his.” So then, the tragic hero of the Reformation at the moments of his tragic misstep and subsequent downfall was not Luther himself—who himself died a sufficiently dignified death in his own bed—but the leaders of the Reformation who took over for him, Calvin, Knox, et al.
Yes, yes, yes: and their fellow Protestants among the mighty post-Lutheran laity—Henry VIII, Elizabeth I, et al.—and all the principal figures, both religious and secular, of the Counter-Reformation—Pope Paul III, “Bloody” Queen Mary et al.; people, that is, whose adherence to or opposition to the principles of the Reformation impelled or at least encouraged them to promulgate laws and issue orders that led only slightly mediately to the murder of thousands of people and the destruction of thousands of buildings, statues, etc. So the tragedy of the Reformation is simply that hoary old shopworn fossil of a chestnut of a cliché, “the tragedy of unintended consequences”? In a word, yes, but the fact that it is a cliché does not make it any less (of) a tragedy. People take certain actions for certain reasons that are perfectly rational (or just) and whose rationality (or justice) is transparent to them at the moment at which they are taking them, but these actions have irrational (or unjust) repercussions that they cannot foresee, and it is irrational and unjust to expect that they will ever reach a point at which they will always be able to foresee such repercussions and consequently always be held unequivocally to blame for them—and hence to be regarded as either laughable or wicked in every situation that bids fair to eventuate in such repercussions, and hence to be instantly pigeonhol(e)able as the protagonist of a farce or (to name the as-yet unnamed downbeat counterpart to farce) a melodrama. OK, but in the light of the aforementioned open-endedness of history, and in particular the light of the fantastic (and largely bloodless) success Protestantism enjoyed after the Reformation and Counter-Reformation, notably in Britain and America, is it not possible to see the drama initiated by Luther as something other than a tragedy or a farce or a melodrama, namely, a, how do you say…?
The word I believe you have on the tip of your brain’s tongue is “comedy.” In point of fact I don’t have it on the tip of my brain’s tongue because I genuinely don’t know what the word is, and “comedy” can’t be the word because there wasn’t anything particularly funny about the Great Awakening, the Plymouth Brethren, etc. et al. Nor is there anything particularly funny about many a comedy—e.g., The Divine Comedy of Dante, the Human Comedy of Balzac, etc. You see, the essential thing that distinguishes a comedy from a tragedy is not the tone of its dialogue and stage directions but the emotional valence of its outcome (or in strictly dramaturgical terms, the fortunes attained by its protagonist or protagonists); the essential difference between a comedy and a tragedy is not that it is humorous rather than serious but that it has a happy ending and not a sad one. I see, but then again, in the light of the death of God in the late nineteenth century at the very latest, mightn’t one ultimately see the drama of Lutheranism—and indeed not only of Lutheranism or even of Protestantism as a whole and indeed not even of Christianity as a whole but of theism as a whole—as a tragedy after all? Why, I suppose that all rather depends on whether one believes God really died in the late nineteenth century at the latest or that he ([sic] on the lowercase aitch for the personal pronoun designating the deity [for the uppercase aitch is perhaps not accidentally a nineteenth-century innovation, and possibly even a late nineteenth-century one, and I don’t see how any Anglophone with a respect for his own literary tradition can in good faith conform to a usage unknown to the translators of the Authorized Version, Sir Thomas Browne, Samuel Johnson, et al.]) never died or that he died and came back to life (!). And then of course one must give some consideration to the brute demographic fortunes of Protestantism.
To be sure, these fortunes have not been looking good in the hyperoccident for some decades, with Britain having been a country with “more Muslims than Methodists” since at least 1993 (the publication-year of a New Left Review article on Philip Larkin in which that phrase appears) and church attendance having been in freefall even in the most religious-minded of all hyperoccidental polities, the United States, since at least that far back. But Evangelicalism has all the while been thriving in Latin America and Episcopalianism in Africa, and inasmuch as both the cohort of Latin American Evangelicals and the cohort of African Episcopalians have reliably shown themselves to be hearty breeders, there is good reason to suppose that if, in the demographic projector Eric Kaufmann’s words, “the religious shall inherit the earth,” a goodly proportion of those Terran heirs will be card-carrying-cum-Bible-thumping espousers of some version of the creed established by Luther. In short, in the quasi-apocryphal words of that Chinese-Communist statesman apropos of Marx’s other big pre-farcical example (I call the words “quasi-apocryphal” because although they [or presumably their Chinese counterparts] were apparently actually spoken, they were apparently spoken not apropos of the original French Revolution but of the abovementioned farcical students’ revolt of 1968), “It’s too early to determine” if the drama of Protestantism is a tragedy or a comedy. And I suppose the same remark might plausibly be made about most of the other big world-historical phenomena that originated—or at least are now held to have originated—a quarter-to-three quarters of a millennium ago—the Renaissance, the Enlightenment, the Industrial Revolution, etc. And I suppose I agree with you on that point. But where does this cozy agreement between the two of us on our double-lonesome get the two of us-plus-the rest of the world vis-à-vis the off-casting of the burden of the nightmare of myth?
Is a single layer of skin exfoliated from the nose of that nightmare by the conceptual substance of this agreement—viz., the banal augmentation of the banally nominalistic adage that “History is one damn[ed] thing after another” (an adage whose apparent unattributability [by which I mean not that it can’t be traced to a specific identifiable person, or at least quite a small number of specific identifiable persons, but that from its very earliest appearances in print, in the 1950s, it consistently serves as a summation of a concept of history ascribed to an adversary of the writer rather than of the writer’s own concept of history] incidentally eloquently testifies to a perduring pan-Occidental aversion to nominalism in the teeth of nominalism’s anciently quasi-official charter as the philosophical school of unexfoliatably hard-nosed common sense) by the observation, “Sometimes these damn(ed) things are happy, and sometimes they’re sad”? I am inclined at least to hope that it—meaning not our agreement eo ipso but this conception of history specifically as drama, as the mode of literature common to tragedy and comedy [though not farce, which is but a parasitic epiphenomenon of drama]—will allow us to cast off the burden of myth by retro-figuring the actions of human beings in history as a phenomenon that is intrinsically less enchaining, less amenable to devolving into myth, than the story and storytelling.
Before you go any further, I’ve got to raise, if not quite an objection, then at least a suspicion, namely the suspicion that drama tends to be more rather than less enchaining than storytelling; namely, that at least in the world of the long today, the world of all persons now living plus the world of all deceased persons whom those persons have known personally, when people find that other people are making too much of a fuss about something, they are prone to disparaging the excessive fuss-making as “drama” and not at all prone to disparaging it as a “story.” Why, that is certainly true, but unless I am greatly mistaken, the disparagement there is always occasioned by the fuss-maker’s misapprehension of the scale of the particular fussed-about matter rather than by fuss-making insgesamt and eo ipso; such that, say, a general’s wife who ridicules her spouse for making a “drama” out of his own hangnail in shedding tears over it would by no means ridicule him for making a “drama” out of an imminent thermonuclear missile attack in letting out a profane interjection on receiving news of the attack from HQ. In other words, actions and events that are genuinely important are generally and approvingly regarded as genuine drama. And the ought-to-be-proverbial moment’s reflection ought to make it clear why such actions and events are thus regarded—namely, because (notwithstanding the supersession of stage performances by cinema as the primary dramatic medium [a supersession notably marked by latter-day Francophones’ preference of cinéma to drame as the pejorative term for things about which people are making too much of a fuss]) the act of acting corresponds exactly to the mode and manner in which human beings present themselves to each other in the world.
Wherever we are and whatever we happen to be doing, we are always engaged in doing something that is instantly amenable to being mistaken for the performance of a play by anyone who happens to behold it—to be sure, that play may not be one that is likely to captivate a spectator; it may be virtually indistinguishable from a stage adaptation of Andy Warhol’s Sleep, or one of the dialogues in an elementary foreign-language textbook (“How much are these strawberries?” “Fifty cents a pound.” “I would like two-and-a-half pounds of them, please.” “Certainly, sir. That will be one dollar and twenty-five cents.”). Still, it requires no interposition by anyone not a party to it to be rendered indistinguishable from a play-performance. The same action, conversation, autc., can never be mistaken for a story; at most it can serve as the subject-matter of a story, and it can serve as such subject-matter only via the mediation of someone not party to it. What, even in the case of first-person narratives; i.e., even when the teller or writer of a story was a party to the action, conversation, autc., that it recounts? Even so, as your employment of the past tense, of was, shews; for one can turn an action, conversation, etc. that one is a party to into a story only once one is no longer a party to it, when one is reflecting on it as something that has already happened and hence already vulnerable to vagaries of memory and interpretation (yes, yes, yes, even if one improbably rigorously eschews interpretative commentary à la Benjamin’s mischaracterization of Herodotus).
For the first three or four centuries of the modern era—centuries in which, incidentally, all manner of literary rivals to drama, all manner of narrative genres from the fairy tale to the romance to the early novel, flourished—the preeminence of drama as the literary mode most closely approximate to reality (or nature, as the aggregate of the real tended to be called then) was taken for granted, as can notably be seen in Shakespeare’s numerous famously splendiferously extensive employments of the stage as a metaphorical vehicle—e.g., in Jaques’s “Seven Ages of Man” speech in As You Like It, the eponym’s “Out, Out, Brief Candle” speech in Macbeth, and the chorus’s prologue to Henry V. (By witheringly unfavorable contrast, the only invocation of storytelling as such a vehicle is Lewis from King John’s piddling figuration of life as “tedious as a twice-told tale/Vexing the dull ear of a drowsy man,” which is so underwhelming that I have just managed to quote it in its entirety.) This last example in particular is notable in that it figures the succession of historical events on which it is based as a literal drama that happens to have been regrettably confined to “the wooden O” of an Elizabethan playhouse merely because the playwright lacks the “muse of fire” that would allow him to bring it to life in its true dimensions, with “a kingdom for a stage” and its eponym comporting himself not like a mere king but like the god of war, Mars.
And separate and singular mention must be made of the moment in Julius Caesar when the conspirators, having just killed Caesar and bathed their hands in his blood, speculate about “How many ages hence/Shall this our lofty scene be acted over/In states unborn and accents yet unknown!,” about “How many times shall Caesar bleed in sport,/That now on Pompey’s basis lies along/No worthier than the dust!,” and conclude that “So oft as that shall be,/So often shall the knot of us be call’d/The men that gave their country liberty.” Oh, I see! Here Shakespeare’s breaking down the so-called fourth wall; he’s having his characters reference the very play that the audience is seeing performed at that very moment. It’s something straight out of Pirandello (or perhaps Beckett or Artaud or Brecht; I’m no student of the history of theater). It’s amazing how ahead of his time the Bard was. To the contrary, here Shakespeare is keeping the fourth wall firmly in place and intact. True, Julius Caesar is itself a play written in “accents unknown” to the Romans of the first century B.C., i.e., in a language, English, that then existed only in embryo, but this does not detract a jot from the “realism” of Shakespeare’s portrayal of the conspirators at that moment. After all, the Romans of the first century B.C. had grown up seeing performances of the plays of the great Greek tragedians and comedians and their Roman imitators and successors, such that it would have been entirely natural of the conspirators to imagine plays dramatizing their conspiracy and its repercussions in languages other than Latin. This is all fine and plausible vis-à-vis the first two clusters of lines, the ones ending with the mention of Caesar lying on Pompey’s basis.
But in the concluding couplet, the one that promises that with every performance of the just-mentioned conspiracy-and-its-repercussions-dramatizing play the conspirators will be called “the men that gave their country liberty,” is Shakespeare not ultimately breaking down the fourth wall, or, what seems to come to more or less the same thing, having the last laugh, and having it at the conspirators’ expense? For after all, as everyone who has lived since 27 B.C. (i.e., the year of the official end of the Roman Republic and the beginning of the Roman Empire) has known, the conspirators’ assassination of Julius Caesar ultimately served only to subordinate the Romans to unabashedly full-fledged tyranny. Why, I grant you that history has dumped a heaping helping of irony onto the conspirators’ prediction that the play in which they will figure as characters will be a dramatization of the liberation of the Romans rather than of the preparation for their enslavement, but again this prediction is as “realistic” as the assertion made in the previous two line-clusters, because in the moments immediately after the assassination, the conspirators would still have had every reason to believe that they were in the midst of liberating their country and hence would have been fully entitled to imagine themselves as being portrayed as its liberators in some play yet to be written, in a version of Julius Caesar whose last two acts would in no way have resembled the last two acts of our Julius Caesar (and that indeed probably would have been a comedy entitled, say, Brutus & Co.).
Such moments of such irony are utterly typical of drama (whence, natchissimo, the designation of such irony as dramatic irony in Lit-Crit 101 textbooks) because drama perforce always presents people even at their most knowledgeable as knowing only as much as they can have gathered from the preceding events of the drama (or, to be more precise, those events plus their explicitly or implicitly known antecedent events [Oedipus’s encounters with Laius, the Sphinx, and Jocasta; Caesar’s struggle for supremacy with Pompey and Crassus; Hamlet Sr.’s military struggle with the Poles, etc.]). OK, I now finally take all your points—or at least all your recent points—regarding history as drama, which is to say that I now understand why it is entirely natural and reasonable for people involved in major historical events—or even in events that enjoy the prospect of being major historical events—to regard themselves as characters in a drama, and I now fully understand why in so regarding themselves these people are utterly unfettered by myth, but I so far fail to see why they would be fettered by myth if they instead regarded themselves as characters in a story, and I further fail to see how those of us not so fortunate (or unfortunate) as to be involved in events that stand even the faintest ghost of a chance of being major historical ones can ever benefit from the unfettering qualities of drama. To serve as your optometrist vis-à-vis your first unespied point: one is still more vulnerable to being fettered by myth in figuring oneself as a character in a story than in figuring oneself as a character in a drama for reasons that flow ineluctably from two attributes of the story that I have already mentioned—viz., its dependence on events that have already happened and its dependence on the mediation of a person who was not party to those events, namely, of course, the storyteller.
In figuring oneself as a character in a story one at least tends to fetter oneself as straitly as one would in figuring oneself as a character in a myth (or, rather, as the effective reincarnation of a character in a myth, as I suppose such figurers actually figure themselves) because events that have already happened are by their nature irreversible and because in conceding the necessity of the storyteller’s mediation one is perforce conceding that he is epistemologically more enlightened than any of the characters in the story—nay, one is conceding that those characters are no authorities whatsoever on the events of the story, inasmuch as whatever commentary they might make on these events is perforce filtered through the definitively authoritative perspective of the storyteller (yes, as mentioned before, even in the case of stories in the first-person narrative mode, for in such a case one’s former self is a separate figure from one’s present self). And once that authority has been ceded to the storyteller, a phenomenon that one might term knowledge creep begins to occur: the storyteller begins to report in greater and greater detail and in greater and greater depth on the events that he is recounting; for having been granted absolute control over the materials of his story, which are always de facto lifeless and inert objects, he has every incentive to treat these objects as if he understands them completely, at every conceivable resolution of their existence; in other words, he asymptotically approaches an attitude of complete omniscience towards them. He moves from recounting the outward and immediately incontrovertible surface of events—to reporting on them in the manner of an eyewitness of an incident or spectator of a play, who simply notes what people have said and what they have done in manifest reaction to what others have said (“Take that, you bastard!” said Bob Fockcuck as he delivered a right hook to the jaw of Cob Cuckfock.
Whereupon Cob Cuckfock said, “Who are you calling a bastard, you bastard?” and delivered a sucker-punch to Bob Fockcuck’s puss.)—to reporting on genuine phenomena to which he has no immediate access, and which hence will always be conjectural, such as motives (Doubtless in an outburst of jealousy occasioned by Cob Cuckfock’s liaison with his wife, Bob Fockcuck said, “Take that, you bastard!” and delivered a right hook to the jaw of Cob Cuckfock.)—to reporting on pseudo-phenomena to which no-one has immediate access, such as bits of internal monologue (“It’s not so much because he’s having an affair with my wife that I can’t stand the sight of him,” said Bob Fockcuck just before delivering a right hook to the jaw of Cob Cuckfock, “as because his last name is ‘Cuckfock.’” [I call such a phenomenon a pseudo-phenomenon because while it is doubtless true that many a man has been at least partly impelled to hit another man by his aversion to that man’s name, it is probably unanswerably debatable whether any man has ever articulated and quantified this aversion verbatim immediately before hitting that other man]). Theodor W. Adorno remarked on this epistemological presumptuousness on the part of the storyteller in an extremely frustrating (or, in more positive lingo, tantalizing) letter to Benjamin written immediately after Adorno’s first reading of “The Storyteller.” The letter is frustrating (or tantalizing) because it starts out by praising the essay unqualifiedly and concurring with its thesis that it is no longer possible to tell authentic stories, then tenders the just-mentioned epistemological presumptuousness as the only example of the just-mentioned impossibility, and finally in illustration of this presumptuousness exclusively cites works not mentioned at all in the essay—some short stories by a childhood friend of his, a short story by Arthur Schnitzler, and (negatively, as a counterexample to the other two) Goethe’s Elective Affinities.
He criticizes Schnitzler for opening his story with a sentence like [here and in all other quasi-quotations from this letter I am obliged to rely on my broadly reliable but not lexically punctilious memory], “At ten o’clock on the morning of July 25 Heinrich Stuckenschmidt was sitting at a table in his kitchen” by exasperatingly querying, “How could he ever presume to suppose that he knew enough about another person to write a sentence like that?” And then without glossing his exasperation any further he writes that this sentence from Schnitzler reminds him of these stories of his old buddy, wherein dialogue was assigned to characters in the traditional “he said, she said” format, only with the particularity of the characters being underscored by the German prerogative of prefixing definite articles to proper names—e.g., “Then the Johann said, ‘Blah-blah-blah’; to which the Franz replied, ‘Blah-blah-blah,’ etc.” He contrasts this practice unfavorably with Goethe’s in Elective Affinities, wherein Goethe introduces his characters by a formula to the effect of “So there’s this guy. Let’s call him Johann [in all candor I cannot deny that Goethe actually wrote Eduard, albeit that I owe this candor-actuated imposition entirely to my consultation of the reference work of default first resort].” And in the course of articulating the contrast, he implies that Goethe was, as it were, laudably ahead of the curve in not naming his characters as a matter of course, in signaling the arbitrariness of the names he assigned to them; TWA implies, in other words, that by Goethe’s time, the late eighteenth century-stroke-early nineteenth century (or more precisely, by the sub-time of that time when he began writing Elective Affinities, i.e. 
the late eighteen-oughties at the latest), it had somehow already become somehow (sic on the iteration) wrong for an author of a narrative composition to refer to his characters by name without editorial comment, to bandy their names about as reflexively as if they were names that had been imposed on him rather than freely chosen by him. Frustratingly (or tantalizingly) Adorno does not explain, even implicitly, what might have changed by ca. 1800 to make such about-bandying unprecedentedly wrong. But from certain writings in his official corpus, we can perhaps at least tentatively glean what that potential what (sic on the iteration) might have been. Here I am once again apparently hampered by lack of access to original sources, for I am specifically recalling a passage from an interpretation of Sartre’s novel La Nausée (Nausea) that mystifyingly does not seem to have been included in Adorno’s Gesammelte Schriften (Collected Writings). Anyhow, this passage focuses on a passage in the novel wherein the narrator, the historian Roquentin, undergoes an attack of the book’s eponymous gastric condition on catching sight of the root of a chestnut tree while sitting on a park-bench. Fortunately I do have access to La Nausée (albeit only in the form of a crummy electronic bootleg chock-full of ASCII-induced typos) and so can roughly English the passage in point thus:

The root of the tree was burrowing into the earth directly underneath my bench. I don’t even remember anything anymore other than that it was a root. Words had vanished, and with them had vanished the signification of things, the instructions for their use, the feeble points of reference that human beings have traced on their surface. I had sat down on my own, a bit stooped, my head lowered, opposite this black and gnarled entirely brute mass that scared me. And then I experienced the following flash of insight. It took the breath out of me. I had never, before the previous few days, had an inkling of what it meant to “exist.” I was like other people, like the people who walk along the seashore in their springtime attire. Like them, I said “the sea is green; that white spot up there is a seagull,” but I didn’t feel that it existed, that the seagull was an “existing seagull”; ordinarily existence is hidden. It is here, around us, inside us; it is us; one cannot say two words without speaking of it, and in the end, one doesn’t touch it. When I believed I was thinking about it, I had to believe I was thinking of nothing; I had an empty mind, or, strictly speaking, a single word in my mind, the word “being.”  

This passage articulates what in Negative Dialectics Adorno terms Sartre’s “extreme nominalism,” a philosophical position that maintains that the world is made up of nothing but mutually incomparable particulars. In the world as conceived by Sartre there are no trees or even any roots of trees but merely certain “brute masses” that have been mislabeled trees or roots of trees by people (not that there are even really any people either, inasmuch as each person is himself ultimately a “brute mass”) and that in fact do not even deserve to be termed “brute masses” and that in truth are but unnamable constituents of an infinitely differentiatable, and therefore ultimately undifferentiated, single non-something only provisionally nameable as “being.” While not an “extreme nominalist” himself, Adorno admired this passage in Nausea on account of what he saw as its revelation of a certain truth about the historical moment in which it had been written—a historical moment coinciding, incidentally, exactly with that of “The Storyteller” and his letter to Benjamin regarding it: the truth that as of that moment the self-evidence of the connection between words and things had at least temporarily vanished because a certain system that had underwritten an understanding of the relation of things to one another had collapsed.
At the same moment—i.e., in 1946, less than a decade after the publication of Nausea and “The Storyteller”—the young Philip Larkin articulated this same sense of a loss of referentiality when he queried in his poem “Going,” “Where has the tree gone, that locked / Earth to the sky?” In querying this he of course did not mean that he couldn’t actually identify a tree as a tree but that there no longer seemed to be a point in singling out a tree from its immediate surroundings when its position in the larger cosmos was no longer clear, when it no longer served as not merely a symbol of but also a synecdoche for the aspiration of life towards goals situated above and beyond mere biological existence—i.e., beyond mere “being.” And so, to plug this all back into the sort of old-fashioned third-person narration exemplified by the sentence from Schnitzler’s story, it is presumptuous even to write that a person of such and such a name is sitting at a table belonging to him in a house belonging to him because the very notion of a person bearing a certain name and standing in a certain relation to certain objects, a relation of ownership, has ceased to be self-evident. To be sure, the Adorno who excoriated Schnitzler’s story presumably would have conceded, at this very moment there are presumably thousands of men of certain names sitting at tables in houses in and for which there exist documents attesting to these men’s ownership of those houses (and perhaps even documents—e.g. or perhaps even i.e., merchants’ receipts—attesting to their ownership of these tables). All the same, this selfsame Adorno would have added, on a metaphysical plane it is wrong to refer to such men in relation to such objects in such a manner as Schnitzler has done because by the beginning of the fifth decade of the twentieth century the very notions of bearing a certain name, of residence, and of ownership have lost all solidity and stability.
And what, pray tell, has caused this loss of solidity to these notions? Well, it’s a long story…Oh really? Why then it axiomatically must be the one and only story that can still be authentically told. All right, all right, I suppose I should rather say it’s a long history…Now don’t even begin to pretend that the prefixing of that wee homonym of a possessive adjective to story is going to let you off the hook, for as everybody including you knows (sic on the singular verb-form, natch), story is just a sort of abbreviation or nick-word for history. To the contrary, I have let myself as far off the hook as a whale-shark (i.e., the largest of all actual fish, and hence the hardest to snag with an ordinary angler’s fishing-hook), inasmuch as—as I seem to have to be continually reminding you—the concept of the story that is the topick of this essay was propounded in an essay in the German language, and the German word for history is Geschichte, which is as etymologically distant from the German word for story, Erzählung, as Saarbrücken (the westernmost German-speaking city) is from Vienna (the easternmost one). For that etymological datum I shall take your word, but by that same toke-fest, you must acknowledge that “it’s no accident” that story is derived from history; that history is after all a kind of narrative, such that at least some portion of whichever strictures you end up conclusively applying to stories and storytelling will perforce apply to history and historiography “in a certain very real sense.” I must acknowledge the (f)actuality of that very non-accidentality, and I hereby do acknowledge it, and I promise that Lord willing I shall address the application of the just-mentioned strictures to those two Haitches. But I trust that for the moment you won’t mind my answering your question? By no means will I mind that. Splendid.
I would hazard that what has—or (to switch from the present perfect to the past perfect now that we are no longer writing from the point of view of Adorno in ca. 1940 but from the vantage point of our own historical moment) had—caused the very notions of bearing a certain name, of residence, and of ownership to lose all solidity and stability by the beginning of the fifth decade of the twentieth century was or were the great cataclysms of the second, third, and fourth decades of the twentieth century—chief among them the Great War, the Great Inflation, and the Great Depression (not to mention the then-just-commencing war that would rechristen the Great War the First World War). And this hazarding immediately prompts me to c***le b*ck to the inaugural image of “The Storyteller,” that of “a generation that had gone to school on a horse-drawn streetcar now [standing] under the open sky in a countryside in which nothing remained unchanged but the clouds,” an image meant to crystallize the situation of Benjamin and his contemporaries at the end of the 1920s. For of course among the things other than the clouds that had occupied that countryside before the war but that by then no longer occupied it or occupied it only in a radically altered shape must have been the houses, kitchens, kitchen tables and so forth that the members of that generation had taken for granted as their own inalienable possessions and that had since been radically alienated from them by annihilation, devaluation, or appropriation—annihilation by ordnance, devaluation by market forces, and appropriation by States that had in many cases not existed at all before or (as in the case of Austria) had existed in a radically different (geographical as well as political) shape. And such being the case, it “arguably” really was presumptuous to write à la Schnitzler of a man sitting at a table in his own kitchen. Fine, but Goethe wrote “Elective Affinities” more than a century before the First World War.
Surely he was then still in a position to write unpresumptuously of people’s inalienable possession of their own names, not to mention of their own kitchens, tables, and houses. Up to a point, up to a point, yes, but the French Revolution had unsettled property relations and the social hierarchies throughout Europe to a certain extent; for the most part this disruption had only impinged on a small segment of European society (or societies, if one tends to think of the beaux mondes of discrete nations and polities as discrete entities rather than constituents of a single pan-European beau monde), namely the aristocracy (think of the exiled French nobles obliged to take up work as butchers, bakers, candlestick-makers, and the like), and that only temporarily, but it had been a disruption that Goethe, being a member of that segment himself, must have been keenly (not to say paranoiacally) attentive to. And then there was the seemingly unending torrent of new theories and systems in natural philosophy, Goethe’s interest in which is attested to by the very title of the novel (the “affinities” in question being only quasi-metaphorically chemical ones) and his attitude to which was not altogether welcoming, as witnessed by his development of his own theory of light and color in opposition to Newton’s. And then finally one thinks of Goethe’s attitude to the so-called industrial revolution, which was prevailingly negative, as instanced by a letter of his old age in which he grouses that the Weltlauf is already in the decidedly raffish hands of the builders of railroads (again, I am obliged to rely on my memory; I believe the letter is quoted in Charles Rosen’s The Romantic Generation) and that even top-shelf savants of the old school like himself are effectively dinosaurs (not that he uses/used the word dinosaur, which probably wasn’t available to him then, although he may very well have used the word fossil, which certainly was [and yes, in German as well as English]).
In short, it seems reasonable to surmise that Goethe had what one might term intimations of the reign of contingency that fully set in with the First World War; and of course one gets sharper intimations of that in-setting as the nineteenth century progresses—Matthew Arnold’s “Dover Beach,” large chunks of the oeuvres of Dostoyevsky, Nietzsche, Hardy et al.-stroke-et al. Yes, yes, yes: but surely the question can be posed to much more mystifying effect from the opposite direction—posed, namely, as “What was it about the state of the world before Goethe’s time that allowed writers of narrative prose to name their characters as a matter of course?” Why, I admit that that is a legitimate re-posing of that question, and that thus re-posed the question is at least initially mystifying. But when I think of a novel like Tom Jones, a novel published, incidentally, in the year of Goethe’s birth, 1749, and perhaps the most toweringly self-confidently narrated of all novels; and I think, specifically, of the numerous prefatory episodes in that novel in which the author, Henry Fielding, addresses the reader without the slightest trace of complaisance and in one of which he goes out of his way to say that he is peppering the novel with untranslated Latin tags in order to weed out the semiliterate churls; when I think of these episodes in conjunction with Fielding’s position in the English society of his time, of his effortless stagecoach-shuttling to and from the management of his West-Country estate and the elaboration of the jurisprudence of metropolitan law enforcement in London; when I think of these things, I say, it suddenly does seem entirely natural for him to name his characters and write about their sayings and doings without a moment’s self-conscious reservation. Oh, I see. There’s nothing remotely unnatural about a name like Squire Allworthy.
Why, no, of course there’s nothing remotely natural about such a name ea ipsa, but it was entirely natural for Fielding, given that he was a country squire himself and well acquainted with the full range of country squires, to write about a person with all the attributes of his Squire Allworthy and to posit these attributes as eminently laudable ones and so, ultimately, entirely natural for him to assign to that person a transparently allegorical name like “Allworthy.” What then do you say to Samuel Richardson’s remark as reported by Dr. Johnson that “had he not known who Fielding was, he should have believed he was an ostler”? Why, to that remark I say that pace both Richardson and Dr. Johnson, it simply goes to underscore the self-evidence of Fielding’s status-conferred narrative authority, for only a gentleman insecure of his status as a gentleman would have felt self-conscious about depicting the minutiae of downstairs life at a country inn—the frequent occurrence of which depictions in Tom Jones is presumably what occasioned Richardson’s inclination to pigeonhole Fielding specifically as an ostler. Hmm, I see. But hadn’t England had its own violent revolution a mere hundred years and change before the writing of “Tom Jones”?
Why, yes, but pace Marx in a passage of The Eighteenth Brumaire that I have not yet had occasion to quote (and no, the present sentence regrettably does not quite afford such an occasion [unless the single inverted comma-braced word to the right of this parenthesis counts as a quotation]), it was by no means a “bourgeois” revolution, was by no means a revolution that dislodged the aristocracy and installed the middle class in its place, and indeed it was by every means a revolution that, minus a temporary usurpation of the monarch, left the polity-wide social hierarchy entirely untouched—not that there were not significant changes in the political economy of Britain during the Revolution, but they were changes of an evolutionary rather than a revolutionary nature. Very well, but by that selfsame toke-fest, mightn’t it be persuasively argued that the Anglosphere, in contrast to most of the continental Occident (but by no means all of it: think, for notable exception, of Switzerland), has enjoyed such continuity of social operations ever since then, such that today’s English-language novelists ought to be able to christen their characters and hold forth on them with a degree of aplomb not an inch (or whichever Imperial unit of measurement serves to quantify aplomb) less than Fielding’s? One might argue such an argument, but I don’t find it particularly persuasive.
For the Anglosphere did undergo changes that were ultimately as disruptive of social relations as the French Revolution and its sequelae were of those relations on the Continent; the Reform Act in Britain being perhaps the most prominent sign—not instance, mind you, but merely sign—of these changes, which consisted not in any direct reconstitution of the relation of the aristocracy to the lower orders but merely in a demographic explosion of those lower orders, an explosion that in turn, pace Thomas Malthus, was disruptive not because it presented any threat to humanity’s ability to feed itself, for it presented no such threat, but merely because it required a wholesale redefinition of the average person’s relation to the social whole, a whole in which he perforce played a less prominent role (sic on the dramaturgical loadedness of “role”!) than his predecessors had done, even if he, like his grandfather, was an Earl or Duke rather than a stevedore or street-crossing sweeper. The old social order of the Anglosphere was typified by a figure like Samuel Pepys, who was the son of a mere tailor and could count among his cousins both the second-most prominent military figure of his time and a full-blown bum, a strolling violinist. Pepys rounded out his life as the top civilian in the British navy.
A century-and-a-half later, a tailor’s son who ended up secretary of the navy (or its British equivalent) would have been termed a “Horatio Alger figure,” but the term is anachronistic in more than a technical sense when applied to Pepys because the “bottom” from which he rose was a much narrower one than the “bottom” from which Horatio Alger’s heroes were to rise and because in his ties of kinship he had ready to hand many means of rising from that bottom; complementarily, if Pepys had fallen into destitution from the heights, it would not have been apt to call his trajectory a “from shirtsleeves to shirtsleeves” trajectory because in his ties of kinship he retained material connections to the shirtsleeved even on attaining those heights and would have retained material connections to plenty of bigwigs even in his destitution. Well, then what about the Horatio Alger novels themselves? What about them? Well, they did after all get written. Ergo, it must have been possible to write about the phenomenon of rising “up from the bottom” after the age of the explosion of the lower orders. I don’t deny that it was possible to write about that phenomenon in a manner in which “aboutness” denotes an ultra-low resolution form of referentiality, the kind of referentiality comprehended by a single-sentence summary like, “This is the story of the son of a street-crossing sweeper who became the director of a railroad company.” But I don’t imagine any of the Horatio Alger novels came anywhere close to giving an accurate and coherent account of the experience of such a person’s ascent either from his own perspective or that of those who knew him. [You, your eyes becoming “like slits”]: You say you don’t imagine that the Horatio Alger novels even approximated such an account, which implies that you haven’t actually read any of them…Indeed, and indeed I have not read any of them. 
But I think it entirely fair to assume that if they had approximated such an account they would now be regarded as the greatest Anglophone literature of their time rather than as a combination of proto-airport novel and proto-boy’s-own-adventure novel. Well, then what about the greatest Anglophone literature of their time—say, the novels of Henry James? Did they not tackle such phenomena, or phenomena closely akin to them, more or less accurately and coherently? They did, but only by marginalizing those aspects of the phenomena that were most materially central to them. Take The Ambassadors’ hilarious refusal to divulge to the reader the common noun denoting the “small, trivial, rather ridiculous object of the commonest domestic use” that Chad Newsome’s family’s manufacturing firm produces. This divulgence has to be omitted because the reader’s knowledge of the identity of the object would break the enchantment of James’s account of the impeccable savoir vivre Chad has acquired in the course of his residence in Paris and his semi-covert liaison with a woman of the old French aristocracy. OK, but why mention the object at all, even apophatically? Because ultimately, “in a very real” if laughable “sense,” it is this object rather than Chad (let alone Lambert Strether who, despite being the author’s surrogate, and despite being a manifestly more complicated and absorbing figure than Chad, “in a very real sense” exists merely to showcase Chad) that is the true protagonist of the novel, because in the end the entire French phase of Chad’s life-history proves to have been a mere finishing school-education that has prepared him to enter into his career as a manufacturer-cum-dispenser of the object (or perhaps even more ridiculous objects) with greater panache (not to mention aplomb). And from Chad Newsome it is an easy—an almost completely effortless—transition to J.
Ward Moorehouse, the public relations tycoon who is in “a very real sense” the hero of John Dos Passos’s mighty U.S.A. trilogy, in the present writer’s opinion the greatest of all works of twentieth-century American prose despite its reputational eclipse by any number of novels by Faulkner, Hemingway, and Fitzgerald (not to mention many novels by such epigonal pipsqueaks as Bellow, Updike, and Pynchon), which eclipse itself is but proof of the Anglospheric literary establishment’s (and a large portion of the global literary establishment’s) gormless obliviousness of the long-since-elapsed evaporation of the raison d’être of storytelling and novel-writing, for for (sic on the repetition) despite its lazy universal interpellation as a sort of miniaturized twentieth-century version of Balzac’s Comédie humaine, its masterly-ly realized purpose is not to present the totality of early twentieth-century American society in all its supposed big, bold, brass band-propelled splendiferousness (such that the endless carping and whinging about Dos Passos’s omission of representatives of various marginalized peoples from his fabulae personae is completely misplaced) but rather to chronicle the transformation of the so-called American experiment into an appalling mass exercise in utter futility. I say, I say: my memory of the “USA” trilogy from sixth-form college in the late 1980s is a bit rusty…Sixth-form college, indeed? I had no idea you were a Brit, let alone a formerly higher education-bound one in his or her early-to-mid fifties. Well, one learns new somethings every day—not that there’s any point in mentioning any of those somethings, inasmuch as apparently none of them ever amounts to anything substantial, i.e., anything answering to the name of experience. Now, now, now: I have yet to assert categorically and definitively that experience is non-existent now. Aye-aye-aye, laddie—What, so you’re specifically a Scotsperson?
Nae, I was just taking the Buckfast-powered p*ss; tho’ I’m far from wishing to intimate that because I’m not a Scotsperson I’m necessarily a Sassenach—i.e., that I am an Englishperson and not a Welshperson, Manxperson, Cornishperson, Northern Irishperson, aut al. And far be it from me to intimate that I have automatically interpellated you as an Englishperson. But anyway(s), I hope that I was not wrong in automatically interpellating that “Aye, aye, aye” as the preface to a remonstration. Aye-aye-aye (or perhaps rather Nae-nae-nae [how one longs for an English equivalent of si in French]), you were not wrong thus to interpellate it, and the remonstration was to have been something to the effect of “I can see where this eldritch Scots-word-for-something-like-a-juggernaut-qua-unstoppable-forward-moving-entity is headed.” I applaud your clairvoyance, for I am by no means able to descry such a telos (or any other clear-cut one) to the present essay. But anyway(s), I believe that that remonstration was itself merely a by-me-imposed digression from some utterance about the U.S.A. trilogy that you were quite eager to interject. Yes: I was about to interject that I believed that in one of the novels in that trilogy, that J. Ward Moorehouse fellow is somehow instrumental in getting the U.S. involved in the First World War. And that belief of yours is correct, and instrumental is very much the mot juste, because he certainly doesn’t consciously initiate any action that leads to that involvement. Rather, in his capacity as a public relations man he is enlisted by the only nominally neutral U.S. federal government to produce anti-German propaganda, and because producing it involves being ever on the lookout for examples of potential German malfeasance, in the course of producing it he becomes ever more Germanophobic in mindset, such that when war is finally declared, he joins in the immediately ensuing jingoistic fervor as demonstratively as he can from his Manhattan office.
In other words—at least inter alia—he never experiences the buildup to the war and his contribution thereto as objective realities. Correct. But this lack of access to the mainsprings of his own psyche is not inaugurated by his participation in the pre-war propaganda campaign; it is, indeed, present at the very origin of his public relations career: at a certain point when he is working as a journalist covering the steel industry in Pittsburgh in the 19-oughties he suddenly finds himself thinking that he should give over reporting indifferently on both management and labor in favor of actively shilling for the magnates. Dos Passos gives us no explanation for the appearance of this idea in Moorehouse’s mind at this particular moment; it simply happens. And indeed Dos Passos presents all psychological events in this paratactic, cause-free fashion: his characters simply find themselves wanting to do something, work doggedly at trying to do it, and generally perish in the course of the attempt. But they perish not from mere overwork but because their initial impulses are rooted in certain features of the Zeitgeist that are themselves woefully evanescent, such that yoking oneself to them is an ineluctably losing proposition.
Thus a few years after the end of the war Moorehouse has a retirement-exacting heart attack in the course of contriving a publicity campaign for the old-fashioned snake oil salesman Doc Bingham (who plays a prominent role in the very first part of the first book of the trilogy, which is set at the very beginning of the century, and he survives to the trilogy’s end precisely because he is an old-fashioned huckster of the old republic rather than an organically new-style American like Moorehouse) in response to the federal government’s new unswerving determination to regulate the Wild-Western mare’s nest of businesses that would eventually coalesce under the heading of the term “the pharmaceutical industry.” From a(n) hermetic public-relations point of view the Bingham account is a windfall or a steal or a triumph or some other sort of very good thing, because no proto-industry is more sorely in need of good publicity at that moment than the proto-pharmaceutical industry, but from a zeitgeistmässig point of view the account is an ineluctably losing proposition because the federal government now has its sights as unbudgeably set on that proto-industry as it had them set on the German State a decade earlier. It sounds as though Moorehouse would have done much better to become a federal bureaucrat than to become a public-relations man. At first aural blush it does indeed sound that way, but then one must consider that the operative stability of the federal government over the course of two or three decades is by no means contingent on the psychic stability of the individual bureaucrats comprising it over that same interval. 
If Moorehouse had joined the federal government in the late oughties he might have gradually found himself being roped into the war-buildup in some other capacity only to find that that capacity was no longer valued as highly once the government had switched its priority to regulation after the war (I shan’t extend my scenario-spinning so far as to spin specifications of these two capacities, lest I contribute malgré moi-même to what Thomas Bernhard, or, rather, one of his translators, David McLintock, termed “the literature of the three-ring binder,” on which more anon, LW). I see: so the real villain of the trilogy is what would later be called Big Government; which is to say that the trilogy is in a “very real sense” the story of the so-called average Joe’s ever-more-futile struggle against the embiggening of government. No, for the trilogy has no single real villain. The growth of government is something that Dos Passos registers and registers with disapproval, but it is but one of the villainous forces in American society whose pernicious effects are charted by the trilogy. Public relations, as indicated before, is another; the outsized magnification and concentration of capital (which he terms “the Big Money,” a term that serves as the title of the third volume) is yet another; Europeanization—in politics, the arts, and sexual morality—is yet another; the hyperindustrialization of the technology of everyday life is yet another still. And it is wrong to describe the average Joe’s encounters with any of those villains as a “struggle” because none of the work’s average Joes (save perhaps a certain hapless schlub, a U.S. 
Navy-deserter turned foreign merchant-marine matelot “ironically” forenamed Joe) puts up a fight against any of them; quite to the contrary, upon encountering one of them they supinely surrender to it from the outset; they fall under its spell the moment some thought indissociable from it pops into their mind, as in the above example of the irruption of a public relations-associated thought in the mind of Moorehouse; thereupon they are helplessly led along by it as by a will o’ the wisp, which wouldn’t necessarily be an altogether bad thing (for one can after all think of plenty of examples of harmless wills o’ the wisps, Laurence Sterne’s hobbyhorse being the locus classicus and blueprint of all of them) were not the other villainous forces plotting their own inscrutable trajectories towards him and all the other average Joes all the while, such that at any moment he can be diverted onto an altogether different course by the supervention of a thought occasioned by one of them or fatally driven off course by some other average Joe pursuing one of them (as the aircraft-engine pilot-cum-engineer **** ***** literally is thanks to his infatuation with “the Big Money” in the person of a fabulously wealthy high-society dame). And inasmuch as these forces all were actually in prominent and mutually interfering play in the U.S.A. throughout the first three decades of the twentieth century, the U.S.A. trilogy possesses a high degree of what Adorno called Wahrheitsgehalt, meaning truth-content, meaning, essentially, truth. And yet, as you have already divulged, this supposed truth content-container grants the reader free access to the very phenomena that you yourself, extrapolating from Adorno’s stricture-and-a-half on Schnitzler, regard as being most inaccessible to the would-be storyteller or novelist—viz., the so-called thought processes, the so-called inner lives, of supposedly empirically actual human beings. How do you explain this discrepancy, sir(rah)? 
I explain it by rightly ascribing to it that epistemological character of certain phenomena whose non-demonstrable referentiality doubtless impelled Adorno to style Wahrheitsgehalt Wahrheitsgehalt (sic on the repetition) and not merely Wahrheit. It’s true that we can’t prove that there was ever a man called J. Ward Moorehouse into whose mind the thought of going into public relations suddenly emerged in 19-ought-somewhen. On the other hand, it is “arguably” equally true that if any past or current reader of U.S.A.—I write “past or current” inasmuch as it would seem that present Americans (yes, yes, yes, the present writer not excepted) are at least as much prey to such forces as their forebears and predecessors were a century ago—were to engage in an hour or two of reminiscent introspection, he would find himself running into bygone moments in which he had undergone the same sort of quasi-epiphany as the one undergone by Moorehouse, and would find that he had consequently been led to pursue a will o’ the wisp of some sort, albeit perchance (albeit again not necessarily) not one that had eventually led him to such a disastrous pass as the one reached by Moorehouse at the end of The Big Money. In this connection I once again have occasion to mention Jean-Paul Sartre, who averred that what he found both irresistibly entrancing and ineluctably repellent about U.S.A. was this very quality of personal verisimilitude, the fact that he found it almost irresistible to cast his own quotidian existence down to its smallest atoms in Dos Passosian narrative mode and that the result—“Jean-Paul met Paul-Jean for lunch at the café. They ordered steak frites and escargots. Jean-Paul and Paul-Jean agreed that the plight of the workers was lamentable. 
‘It was good that we got to discuss the plight of the workers,’ Jean-Paul reflected afterwards” (or something to that effect [again my capacity for accurate quotation is hamstrung by my lack of access to a decent circulating library {or my own n****dliness}])—invariably made him feel that he and everybody else he knew was or were a first-class heel because it conclusively proved to him with what a pitifully small handful of preoccupations they were all preoccupied. But were they all actually preoccupied with such a pitifully small handful of preoccupations? Indeed, by the publication-date of the first volume of U.S.A. had not a certain writer already conclusively proved that human beings of the early-twentieth-century Occident enjoyed far richer inner lives than those of Dos Passos’s characters? And who would that conclusive prover of the richesses of the Inneren of early-twentieth-century Occidentals be? Why, a certain plucky middle-sized Irishman answering to the name of Jimmy Joyce, of course. To the contrary, if anything—i.e., to the extent that a handful of unexceptional Dubliners may legitimately be taken as a gestalt-synecdoche for the early-twentieth-century Occidental psyche—he shewed just as conclusively as (albeit far more windily than) Dos Passos how radically impoverished those Inneren were, whichistersay that by eavesdropping more intently on the psyche than Dos Passos he merely exposed a greater mass of inconsequential thoughts therein. So you do at least concede that the stream-of-consciousness portions of Ulysses accurately register the contents of the early twentieth-century mind? Essentially, yes, which istersay that I think it does the best that grammatical English is able to do in the way of representing the sorts of phenomena that a person encounters when introspecting on the mnemic record of his own thoughts. 
So then, what’s not to love about Ulysses—or, rather, what about it’s not to fall down prostrate before it—nay prostrate behind it and deliver to it a prostate examination with one’s tongue? What’s not to love—or, indeed, even to be endured—about Ulysses is any sense of what significantly or even trivially distinguishes these phenomena from the phenomena of other minds at other times and places—or indeed these phenomena in the mind of any one of Ulysses’ characters from those in the minds of any of the others, or indeed the phenomenon (sic on the singular number) in the mind of any one of those characters at any given moment from the phenomenon in that mind at any other given moment. Ah, now at least with regard to the penultimate of these whats, you are indeed completely mistaken, for at some point in the novel Joyce specifies that the distinction between the minds of its central characters—Leopold Bloom and Stephen Dedalus—is the distinction between the scientific mind and the artistic mind. He does indeed assert that there is such a distinction between the minds of these two characters, but he by no means shows that there is any such distinction. I suppose passages such as the one in which Leopold Bloom speculates about the nature of cheese are supposed to be characteristic of a scientific mind and the passage in which Stephen Dedalus speculates about the reality and exclusivity of Shakespeare’s love for his wife is supposed to be characteristic of an artistic one, but I fail to see how either one is characteristic of any sort of mind whatsoever. 
The mere fact that the generation of cheese relies on processes analyzable by the so-called scientific method does not make speculation about its nature quintessentially scientific, just as the mere fact that Shakespeare’s attachment or lack thereof to his wife has become a matter of interest thanks to the existence of certain works of art—namely, Shakespeare’s plays and poems—does not make speculation about that attachment quintessentially artistic. And even if these facts effected either of these states of affairs, Bloom’s “scientific” speculations constitute a paltry fraction of his stream of consciousness, just as the younger Dedalus’s “literary” gossip constitutes but a paltry fraction of his: Bloom’s mind is, rather, dominated by sexual fantasies and the younger Dedalus’s by pedantic rumination on non-“literary” matters. But in neither case does the dominant strain characterize the character in the full and true sense of plausibly and significantly (that’s significantly in the sense of significance-bearingly, not in the sense of merely noticeably) distinguishing him from the other characters; rather, Joyce has not-very-artfully produced the semblance of characterization by apportioning the findings of his “spy in the house of me” (a term invented by one of the writers for Seinfeld whose name escapes me, a term by which this fellow [for he was a fellow] meant his own faculty for impressing on his memory even his most absurd and preposterous thoughts, such as the reflection that spilling water all over his trousers would be a more immediately effective way of concealing a small spot of water on them than waiting for that spot to dry, a faculty that I would argue he cruelly undersold in describing it merely as a source for his contributions to the show’s zany scenarios, for I would further [or nearer] argue that this faculty is the source of most of most so-called creative writers’ so-called material) between the two personages. 
In point of all inferable meta-psychological fact—in point, that is, of all that the present writer has been able to gather from his own spy in the house of me and the findings of other people’s spies in the house of me that those people have seen fit to share with him—the sorts of phenomena that pass through Stephen Dedalus’s mind and Leopold Bloom’s mind (and yes [lest one be tempted to think Joyce has at least preserved some supposedly actual distinction between masculine and feminine thoughts], Molly Bloom’s mind)—are the sorts of phenomena that regularly pass through every human being’s mind. For that very reason, they are or at least ought to be devoid of readerly interest, although it is easy enough to see why they attracted a considerable amount of readerly attention on their publication in Ulysses, for thitherto no writer had if not reported on them at all (for many a writer had over the centuries if not millennia reported on most of them in letters, diaries, and the like) then at any rate reported on them at such length within the confines of something styling itself a literary composition. 
Until Ulysses, they had rightly been relegated to the realm of what José Ortega y Gasset termed the infrarealistic in polemical reaction mainly to Ulysses (albeit to some extent also to Proust’s A la recherche, a together-lumping that LW I shall address and appraise in a separate thread once I have emerged from this Anglo-Saxon wormhole-cum-rabbit hole), or of that from which, in Samuel Johnson’s words, “every mind shrinks in horror” (altho’ it must be admitted that he wrote about this shrinkage in the context of his biography of Jonathan Swift, and specifically of his observation that Swift did not blush to include such material in his literary compositions in defiance of such shrinkage, but all invidious comparisons of Swift to Joyce are instantly disarmed by the consideration that Swift did not merely report on the phenomena themselves but rather described the external referents of the phenomena—namely human bodily functions—in the context of the satirical mode and thereby effectively seconded Johnson’s assertion of their shrink-from-worthiness). B-b-but what about the thing that really makes Ulysses Ulysses (sic on the repetition); viz. its more-than-figuratively-on-its-sleeve-worn Homeric allegorical substrate? What about it? Well, doesn’t that substrate ultimately redeem all the squalid infrarealistic detail by shewing how it’s all part of this wonderful scheme as grand and awe-inspiring as the constellation of constellations (sic on the near-repetition) in the firmament? To the contrary: to the patchy extent that it actually underlies the material of the work’s narrative, that substrate merely underscores the vapidity and vacuity of that detail. For what does that substrate do but subordinate that detail to the nightmare of myth, to the thesis that life is nothing but an ever-recurring recurrence of the same? I hope I shan’t be over-pedantic in pointing out, as I am about to do, that the Odyssey, the work of which Odysseus, a.k.a. 
Ulysses, is the hero, is not a myth but an epic. I don’t think it’s pedantic at all to point that out, but the Odyssey is manifestly inextricable from myth, and I can’t think of any way of disentangling its mythic from its non-mythic elements, let alone of elucidating how the non-mythic elements contributed to the poetics of Ulysses. Presumably by now Kenner or Ellmann or some other Joyce scholar has attempted such a project of disentanglement-cum-elucidation more or less successfully by some more or less non-trivial criterion. But I cannot help presuming—that is to say, while I am prepared to have what I am presuming overturned, I do not think it very likely that it can be—that the findings of this project would not undermine my above-tendered meta-thesis about Ulysses’s allegorical substrate. For the Odyssey—or at least its principal events, its highlights if you will, the events that form the preponderance if not entirety of the substrate—had certainly been effectively universally received as mythemes for centuries if not millennia by the time of Ulysses’ genesis. By ca. 1910, for centuries if not millennia Odysseus had been universally regarded as the eternal homesick wanderer, Penelope as the eternal long-suffering faithful housewife, Circe and Calypso as the eternal temptresses, etc., such that any man pursuing an arduous journey homeward had been instantly figurable and often figured as an Odysseus-type figure, every long-suffering faithful housewife as a Penelope-type figure, every home-wrecking minx as a Circe or Calypso-type figure, etc. And of course a moment’s reflection will remind the reader that the works of Joyce’s immediate and mediate predecessors, the novelists of the eighteenth and nineteenth centuries, are liberally peppered with overt and covert comparisons of their personages to figures from classical mythology and peri-mythology. 
Such that in its allegorical figuration of Leopold Bloom as Odysseus, Molly Bloom as Penelope, Stephen Dedalus as Telemachus, etc., Ulysses essentially—i.e., given that the non-explicitness of these figurations is effectively so much Where’s Waldo-ism avant la lettre, especially given further that few if any empirical readers have ever read the book in ignorance of them—simply makes a previously-only-intermittently-employed device an obsessively repetitive tic, much after the manner in which jazz contemporaneously routinized syncopation. (I will not go so far as to write off Joyce’s use of stream of consciousness as an instance of such routinization of an established device inasmuch as even the most proto-microrealistic of James’s or Flaubert’s interior monologues keep the character’s psyche more or less transparently on task vis-à-vis at least one of the work’s structuring threads or themes.) Isn’t it a bit more complicated than that?—a bit more complicated than Leopold Bloom qua homeward-bound husband echoing Odysseus qua homeward-bound husband, Molly Bloom qua long-suffering faithful wife echoing Penelope qua long-suffering faithful wife, etc.? For after all, Bloom differs from Odysseus in certain artfully pointed ways—notably, in the fact that he is not coveted as a conjugal partner by women other than his wife, that, indeed, at best he can secure coition with them by paying them, and that indeed vis-à-vis most of them he must solace himself with onanism while fantasizing about them. And after all, Molly differs from Penelope in certain artfully pointed ways, notably in not keeping all her suitors at bay and indeed inviting at least one of them into her bed; and after all, Stephen is but either a virtual or illegitimate son of Bloom; such that the substrate is at the same time an anti-substrate. 
Yes, yes, yes, and doubtless certain other characters differ from their counterparts in Homer in certain artfully pointed ways but I don’t see how any of this makes the allegory more complicated in the sense of being more enlightened. And it incidentally isn’t a jot more innovative than the first-level or posi-substrate. Such sending-up of the classics had been going on for centuries: there is even a name for it—low burlesque. Pope’s Dunciad and Offenbach’s Orpheus in the Underworld are perhaps the only canonical Ulysses-anteceding long-form works governed by low burlesque from start to finish, but virtually every Christmas pantomime ever written or staged contains examples of it. But even if Ulysses did contain the very first instance of low burlesque, what would be the merit or value of that containment? Why, I should think it’s obvious—viz., the merit or value of undercutting or puncturing the fatuity of the pretensions of the entire epic-cum-heroic mode, of showing how utterly impracticable is the realization of the epic-cum-heroic ideal in the real world. But is that realization really utterly impracticable? Are we simply to assume that no man has ever spent or could ever spend ten years trying to find his way back to his country of birth and residence? Or that no wife has ever spent or could ever spend ten years faithfully awaiting her husband’s homecoming? Well, I suppose not… Then what is the merit or value of undercutting or puncturing a narrative in which a man is said to do the one and a woman is said to do the other? 
In this low burlesque register of Ulysses are we not simply dealing with an early manifestation of the perfidious cardinal credo of the so-called sexual revolution, the credo that all attempts to constrain the libido are inherently futile, the credo that exactly a hundred years after the publication of Ulysses has led to the introduction of hardcore pornography into the curricula of American schools on the grounds that “the kids are just going to see it on the interweb anyway”? Schon gut, schon gut, but even if we dismiss every attribute of Ulysses that we have so far discussed as so much trivial or perfidious piffle, can this book not still be admired—nay, revered—as a marvelous tour de force of a panorama of life in Dublin in the very early twentieth century? No, because the version of life in Dublin in the very early twentieth century of which it presents a panorama is anything but admirable. I own that even before reading Ulysses, I did not have a flattering picture of the Irish in my mind’s portrait gallery of nationalities and ethnicities, but to the Irish’s benefit that portrait was monopolized by a single vice—viz., habitual drunkenness. On reading Ulysses I learned that the Irish were not only peerlessly thirsty dipsomaniacs but also peerlessly lazy sluggards, peerlessly randy and kinky sexual perverts, peerlessly malodorous hydrophobes, peerlessly smug egomaniacs, and peerlessly percussive crashing bores. That is the sum total of what I learned from my reading of Ulysses. Admittedly, it is a sum total composed of a very large number of subtotals, but not one of these subtotals conveys knowledge of anything but a particular nationality, and a nationality of no world-historical significance whatsoever. 
If only for the sake of forestalling your afoul-falling of the hate-speech laws (why, incidentally, are they always called hate speech laws given that 99.99954% of so-called hate speech incidents take the form of T***ts; surely they ought to be termed hate-writing laws or, if the .00046% of viva voce incidents must be acknowledged, hate-language laws), I should point out that today’s Ireland and Irishperson are almost unrecognizable in juxtaposition with the Ireland and Irishperson of a century ago. I mean Dublin in particular is the most amazing cosmopolitan city…Aye, aye, aye, laddie or lassie (sic on the Scotticisms [for who has ever felt more richly entitled to remonstrate about what Ireland is all about than a bloody Scotsman?]), I know Dublin is lousy with semi-celibate teetotal vegan so-called knowledge workers hailing from 300 countries and 30 continents, that within its borders it’s ten times easier to get a bowl of pho or trencher of beef awaze tibs than a pint of Guinness or jigger of Jameson’s, etc.—in short, that it’s virtually interchangeable with every other capital of every other small-to-mid-sized Occidental country. All o’ which goes to shew that Ulysses is a total washout-cum-dead end, that it portrays a people and a so-called way of life that are far deader than Dillinger (for Dillinger after all remains very much alive qua archetypal freelance gangster). So whence, then—given that Ulysses is if nothing else the most famous and respected English-language novel of the first quarter of the twentieth century—springs the profusion of at least purportedly serious Anglophone literature of the past hundred years? 
I do not know exactly whence it springs, but I know exactly how it behaves—namely, as if the socially and metaphysically corrosive events of the nineteenth and early twentieth centuries, the events remarked on by Benjamin and Adorno and concretized by Sartre and Dos Passos, had never occurred, and hence (or at least concurrently [for I am not quite sure that the obliviousness {if it even be an actual obliviousness rather than an assumed one} of the events is a cause of the phenomenon I am about to describe]) as though the numinousness of entities and notions like mother, father, husband, wife, son, daughter, profession, and home and the various home-stemmed words (e.g., homeland, hometown, and homeownership) could be taken for granted, hence (sic on the bare hence, for here in contrast to in the preceding parenthesis I am more confident that the link is causal) as if the inner lives of postulated characters could be written about without further ado, and indeed in quite a straightforward, pre-Ulysses and pre-U.S.A.-style way; as if people actually spent hours uninterruptedly thinking about their mothers qua their mothers, their hometowns qua their hometowns, etc., rather than sporadically qua (and vis-à-vis) any number of factitious notions and entities like gadgets, fads, and mass hysterias. Nay, as if in Mama Hamlet-esque protestation that mothers, fathers, etc. et et (sic on the repetition of et) al. are not only numinous but the most numinous entities and notions ever, they have written about these inner lives with a degree and kind of free-handedness that no pre-mid-twentieth-century novelist or storywriter would have presumed to arrogate. 
Where their predecessors, the exponents of the old-school interior monologue, at least confined themselves to attributing to their characters statements and sentiments that were plausibly tied to specific mental moments—instances of what one might term thoughts in the imperfect or past progressive mode-cum-tense, thoughts at the immediate moment of their thinking (“It occurred to Bob that he was hungry and should fix himself a sandwich,” “Suzy was about to leave for the library when she suddenly remembered that the library was closed for Candlemass and Groundhog Day,” and the like)—the novelists and short story writers of the mid-twentieth century and later have taken it upon themselves to report on what one might term thoughts in the simple past mode-cum-tense, the mode-cum-tense in which events in the non-immediate past of the external world, most typically historical events, are reported. It is perhaps no accident, as they say, that this mode-cum-tense came to flourish if not uniquely then at least most refulgently in literary so-called fiction specifically of the Anglosphere, because in English, in contrast to in the other major Occidental languages, the inflectional simple past rather than a compound inflectional tense is the past tense of default, such that in English one says, “I have cleaned my room” rather than “I cleaned my room” only when one needs to emphasize that the fact of the room’s having been cleaned has some specific bearing on the present—say, on one’s mother granting to one permission to go out and play with one’s little friends. How ironic that within minutes, as the eye’s bird flies, of excoriating these writers for their taking for granted the numinousness of the parent-child bond, you’ve adverted to an illustration snipped directly from the Big Book of Parent-Child Clip-Art! 
I don’t find it particularly ironic, inasmuch as I am after all not writing about my mother but about one’s mother, a mother who might have lived and possessed playing-granting privileges at any moment in the meta-history of motherhood, including the moments at which that life and that possession were at their most numinous (or at least nearly numinous [for numinousness is after all “in a very real sense” something not of this world]), and in any case I have adverted to motherhood by way of preparing to give especially strong emphasis to the galling presumptuousness of employing the simple past tense in reporting on certain kinds of mental events. Anyway, as I was saying, in the Anglosphere we use the simple past as the past tense of default, and so we are particularly vulnerable to being blind to ways in which it is being used inappropriately in a durative way. It is reasonable enough to concede the plausibility of a sentence like “Bob was angry at his mother” because “was angry” plainly denotes by default a fleeting, a momentary, mental state, and because we’ve all been there, as they say; because we can all remember moments at which we have been indisputably angry at a specific person, and most of us can remember moments at which we have been angry at our mothers; but a sentence like “Bob loved his mother” ought to raise red flags, as they say, because it implicitly denotes a durative state that is of at least debatable verifiability when transposed into the first-person singular. To be sure, one would like to think that one loves and always has loved one’s mother as long as one generally regards one’s mother in a kinder light than the author of Mommie Dearest regarded Joan Crawford, but who among us can both candidly and frankly say that he loves his mother right now, let alone six months ago, let further alone six years ago, let further alone at a certain moment or over a certain stretch of time within those preceding six months or six years? 
And who among us would be so presumptuous as to claim to know whether another person loved his or her mother at a certain moment or over a certain stretch of time? And yet it is just such a degree of epistemological certitude about such unascertainable affective states chez les autres that Anglophone novelists and short-story writers of the recent past-to-present habitually evince in their treatment of their characters, “using” them, in Stephen Mitchelmore’s words, “as vehicles for comment, with their inner lives as accessible as mustard in a jar.” Mitchelmore has wittily named this treatment “the Munro Doctrine” after one of its most recently illustrious wielders, the Canadian short story writer Alice Munro, but despite her 2013 Nobel Prize for Literature, she is outrivaled in fame (and should be outvied in notoriousness) by a writer who wielded the technique with even more obnoxiously tendentious heavy-handedness, John Updike, as Mitchelmore shows us via a quote from Updike’s 1996 novel In the Beauty of the Lilies nested within a quote from an anti-Updike screed by Gore Vidal:

Clark is in rebellion against the Communism of his mother and her friends pinks if not reds [sic] and, worse, unabashed enemies of the United States in the long, long, war against the Satanic Ho Chi Minh. "Mom, too, wanted North Vietnam to win, which seemed strange to Clark, since America had been pretty good to her." As irony, this might have been telling, but irony is an arrow that the Good Fiction Fairy withheld from the Updike quiver. Consequently, this non sequitur can only make perfect sense to a writer who believes that no matter how misguided, tyrannous and barbarous the rulers of one's own country have become, they must be obeyed; and if one has actually made money and achieved a nice place in the country they have hijacked, then one must be doubly obedient, grateful, too.    

“How,” Mitchelmore asks, “did the narrator know Clark found anything strange?” How indeed, and how, indeed, could Clark himself even have known that his mother wanted North Vietnam to win the war? Perhaps she had told him that she did, but presumably if she had, in telling him so she had at even her most candid been registering an approximation of her attitude to the war. The quoted sentence shows not only that Updike regards his characters’ minds as being as accessible as mustard jars but that he takes it for granted that his characters regard their fellow-characters’ minds as being as accessible as mustard jars because, apparently (i.e., inasmuch as he implicitly posits Clark’s position on the war as normative and thereby implies that he regards Clark’s ascription to his mother of a pro-North Vietnamese attitude as well-founded), he regards the minds of real people as being as accessible to each other as mustard jars—not, of course, thanks to any sort of immediate telepathy akin to the one-way telepathy he enjoys in relation to his characters, but thanks to people’s viva voce utterance of their purported opinions in declarative statements in natural languages like English. And why should he think such statements afford such accessibility? Vidal’s dyspeptically acerbic commentary on the sentence hints at an answer to this question, albeit to the detriment of its own claims to omniscience or at least to knowing more than John Updike. 
Vidal effectively asserts that Updike is a scoundrel and a simpleton for believing that a person’s enjoyment of material prosperity under the auspices of a certain government fatally vitiates any criticism such a person may direct at that government’s policies, and this assertion is doubtless a just one if Updike indeed really believes (or believed [for we are now talking about the empirical Updike, who is now deceased, not the Updike whose opinions can be directly inferred from In the Beauty of the Lilies and who “in a very real sense” still lives]) anything of the sort. But Updike doubtless believed—or in any case was certainly authorized to believe—that it was reasonable and just to believe what he believed about a prosperous American citizen’s criticism of U.S.-government policy because, as I have far-above effectively mentioned, by 1996 the Vietnam War had long since come to be incontestably regarded as the first war the United States (had) ever lost and therefore a war about which it was absolutely mandatory for every American (and indeed also most non-American Occidentals) to maintain passionate and deeply held opinions-cum-convictions; and specifically to maintain one of two opinion(s)-cum-convictions—the liberal or dovish or left-wing opinion-cum-conviction (i.e., the one implicitly espoused by every Hollywood movie about the war or its aftermath apart from Hamburger Hill) that the U.S. never should have become involved in Vietnam in the first place, that the Vietcong had been the good guys in the war because “say whatever you will about Communism [i.e., “I personally self-identify as a Commie but at least affect to respect your right not to do so {cf. 
and contrast the contemporaneous bienpensant(e) idée reçue “I am personally against abortion but I respect every woman’s right to choose”}”], they were at least fighting for their homeland” and the conservative or hawkish or right-wing opinion-cum-conviction—the opinion-cum-conviction held, or affected to be held, by Updike—that the United States’s intervention in Vietnam had been entirely just and that the termination of that intervention had been unjust and premature, that the U.S. would have won the war decisively had the nation (as legitimately embodied and empowered by the U.S. military and its most hawkish civilian overseers and partners) not been “betrayed by its enemies on the home front,” by the student protestors and their allies in journalism and Congress (and sometimes Presidents Johnson and Nixon as well, depending on one’s views on the situations of their respective hearts and the degrees of liberty enjoyed by their respective pairs of hands). Having accepted the incontestability of the idée reçue about the Vietnam War and plumped for the second of the only two opinions-cum-convictions entertainable about it, Updike cannot help ascribing a super-deep-seated and rock-solid adherence to this opinion-cum-conviction to a character of the period of the War whom he conceives of sympathetically, Clark, and complementarily ascribing a super-deep-seated and rock-solid adherence to the first of those opinions to a character of the period, Clark’s mom, whom he conceives of unsympathetically, and consequently cannot help interpellating as a political ingrate. It would indeed be scarcely an exaggeration to say that for the Updike of 1996, the American mind of the mid-1960s-to-early-1970s was nothing but a container for a good or bad belief about the Vietnam War—or, rather, for a good or bad belief about the Vietnam War plus a sub-handful of beliefs about other so-called hot-button issues of the time—e.g. 
(or, more probably, i.e.), civil rights and the so-called sexual revolution. The notion that any American of the time might not have had any opinion whatsoever about any of these issues, or have had only opinions about them that were too vague, ambivalent, or intermittently entertained to be readily verbalized, was more than figuratively unthinkable by him, or at any rate (because we can no more readily read his mind than he could read Clark’s or Clark’s mother’s) for him, meaning that his position as a respectable Anglophone literary writer of the late twentieth century precluded his even seeming to believe, however intermittently, that the contents of the minds of his contemporaries were not effectively exhausted by readily verbalizable opinions about such issues. In the triumph of the Munro Doctrine, of hyper-omniscient narration, in Anglophone literary writing, one witnesses the total corruption of storytelling by myth and the total corruption of the Anglophone social fabric by myth-corrupted storytelling. It is oppressive enough to be compelled to pay lip service to such mythified stories as the one in which the Vietnam War terminates in the United States’ thitherto unprecedented loss of a war, but it is insufferably oppressive to be compelled to pay eye-service—and hence, ineluctably (because the mind [or at any rate, the present writer’s mind and the minds of many other people as plausibly represented in the best pre-Munro Doctrine novelists] cannot help at least temporarily assimilating a goodly portion of what it takes in through reading) at least a smattering of mind-service—to the notion that meditation on such myths qua the articles of a creed is the only suitable occupation for the mind. Surely you don’t believe that thanks to Updike & co. Anglophones have ceased to think about anything but so-called hot-button issues. For after all, Updike & co. 
were and are literary writers, hence axiomatically writers whose writings quasi-axiomatically are consumed only by a tiny (or at largest small) minority of the Anglophone reading public. I certainly don’t believe Anglophones have actually ceased to think about anything but so-called hot-button issues, and I don’t even believe that to the extent that they have ceased to think about things other than hot-button issues, Updike & co. are immediately responsible for that cessation, because, as you have just asserted in negative, most Anglophones are utterly innocent of any direct acquaintance with their writings. Still, there is no—or at any rate precious little—denying that Updike & co. have written for and been favorably reviewed in periodical publications—e.g., the New Yorker, the Washington Post, the New York Times Book Review—that are enormously influential in the Anglosphere, publications that are (or at any rate were until very recently) highly respected, if not revered, even by the great mass of Anglophones who do not read them, who do indeed regard the producers of them as “in a certain very real sense” their betters, however scornfully they may dismiss them as poncey or pointy-headed city-slickers. Lookee here, Mable or Reuben, at this here nacho cheese dip-jar gazer calling the mustard-jar gazer green (that’s green as in “greenhorn” not green as in “green with envy,” natch)! How d’ye mean, Cletus? In what way or in what sense am I a nacho cheese dip-jar gazer? Why, in the way or sense that you’re ascribing an opinion or attitude—viz., reverence of the producers of the “New Yorker” etc. as their betters—not merely to a specific person or a tiny cluster of persons but to a demographic tranche that must be composed of hundreds of millions of people. 
That is a fairish cop, and I frankly and candidly concede that I do not know that even a single one of those hundreds of millions of people has explicitly thought to himself, “Them there pointy-headed or poncey city slickers are better than me.” Nonetheless, I maintain that my ascription of an attitude or feeling of inferiority to the great mass of Anglophones vis-à-vis the mini-mass of Anglophones responsible for the production of the New Yorker, etc. is much less presumptuous than Updike’s ascription of patriotic gratitude to Clark and unpatriotic ingratitude to Clark’s mother. For my ascription is the result of a synthesis that has been decades in the making and based on reasonable inferences that I have drawn from utterances that have been made to me, or made in my hearing, by hundreds if not thousands of bona-fide card-carrying non-pointy-headed, non-poncey, non-city-slicking Anglophones. From these utterances I am at minimum entitled to infer that such Anglophones at least do not attempt to conceal whatever picayune smattering of acquaintance with the pointy-heads’ and ponces’ productions they have acquired and thereby further entitled to infer that they hold these productions in a certain grudgingly high regard, the regard, specifically, of people who touchingly believe that they have encountered something that is beyond their intellectual reach (even as it is, in fact, more often than not something that is intellectually too lowly to be worth their while to pick up). So, what, at bottom, are you saying about the espousers of the Munro Doctrine qua influence on the Anglophone body super-politic? 
I’m saying that they effectively generate a kind of intellectual superego for that body-politic, that they give every member of that body-politic what one naturally cannot help styling its thinking points, the things that one should be thinking about all the time even if one can’t bring oneself to think about them eis ipsis (as opposed to thinking about them qua thinking points, which one can’t help doing at least occasionally) at all. (It should, moreover, be added, and added if not here then at any rate at no self-evidently more apposite place in this essay [this point being at least not entirely inapposite in the light of my not entirely felicitous invocation of a Freudian concept in the preceding sentence], that the espousers of the Munro Doctrine have themselves to some extent acquired their metaphysics of subjectivity from officially non-literary types, specifically from psychologists, anthropologists, and sociologists of various excremental layers of pop-dom.) Well, then, is there any sense in which the works of the espousers of the Munro Doctrine are superior to the trashiest so-called airport novel? Well, I suppose, to “steel-man” the would-be defenders of those espousers, one could plausibly argue that dwelling on anything reflectively at great length—and the novels of Updike et al. do undoubtedly reflectively dwell at great length on the Vietnam War and Civil Rights and motherhood and apple-piehood, and anti-apple-piehood, etc.—puts a writer on a higher intellectual plane than the one occupied by writers, like the airport novelists, who, with their punchy, cross-cutting cinematic approach to the world do not generally dwell or reflect on their subject matter at any great length at all. 
But by a complementary if not quite identical tokefest, it could plausibly be argued that inasmuch as both camps are doing something that is intrinsically mendacious, namely (in case you have forgotten), treating people’s minds as mustard-jars, the airport novelists are superior to the Munro Doctrine-espousers qua conveyors of truth content inasmuch as although they, the airport novelists, are not a jot less bumptious than the Munro Doctrine-espousers in their attitude to and treatment of the human psyche, they tend to devote a substantially smaller proportion of their print-foot to descriptions of phenomena purportedly occurring in that psyche, prevailingly besotted as they tend to be with describing things purportedly occurring in the extra-psychic world of inanimate things—mostly cars and airplanes and windows and trousers and other sorts of things that everybody already knows a great deal about, and mostly in ways that impart no new knowledge about these things to the reader, but occasionally (and at any rate much more often than the Munro Doctrine-espousers) things—e.g., feeler gauges, retorts, quantity surveyors’ calipers, etc.—about which most readers know little or nothing, and even very occasionally (and at any rate, etc.) about familiar things in ways that impart new knowledge about them. Moreover, by the same tokefest (i.e. 
the tokefest of the immediately preceding sentence, not the tokefest of the one before that), it might plausibly be argued that inasmuch as they devote a substantially larger proportion of their print-foot to inanimate things than do the Munro Doctrine-espousers, the airport novelists, although ultimately no better conveyors of truth-content, collectively exert a more salutary effect on the collective Anglophone ethos-cum-habitus, in encouraging Anglophones to devote a greater degree of outward-vectored attention to what David Riesman called “the hardness of the material,” to tangible and manipulable objects in the external world, than to allegedly psychically sited chimeras and intangible abstractions. So, the utter philistine—nay, the unreconstructed lout—who as a youngster reads nothing but police novels and consequently winds up becoming a workaholic police officer or a forensic pathologist is on the whole a contributor to the present system of life superior to the, shall we say, upper-upper middlebrow or lower-upperbrow reader who grows up reading John Updike and Alice Munro and consequently becomes a social worker who spends even a goodly proportion of her on-the-job hours wondering about whether her “relationships” with her husband, parents, children, et al. are authentically loving, equitable, autc. 
On the whole, yes, very probably, although this is not to say that the reader of airport novels is on the whole a superior contributor to the present system of life for having read airport novels eis ipsis; on the whole, I suspect, a young reader of the pre-Munro Doctrine literary prose canon would be more likely to finish up being a first-rate police officer or forensic pathologist than would a young reader of detective novels much after the manner in which Thomas Jefferson was probably a better statesman for having read Tristram Shandy [I mention Jefferson only because he is the only late eighteenth-century statesman whom I can recall explicitly mentioning TS as an influence, but presumably, in the light of TS’s phenomenal popularity, a goodly proportion of the remaining so-called founding fathers, as well as of their approximate counterparts on the other side of the pond, the period’s prime ministers and other conspicuous Members of Parliament, were likewise influenced by it]. Ah, yes, but Tristram Shandy was a brand-spanking new novel when Jefferson was a youngster, and its fabulae personae centered on Anglophones of his parents’ and grandparents’ times. Indeed the former was, and indeed the latter did. But whilst one blanches at extolling any such probably ineluctably factitious quality as timelessness, one finds oneself tentatively entitled to hazard the guess that the qualities of TS that proved most instructive to Jefferson qua statesman were not bolted to the eighteenth century like the furniture to the floor in a fast-food restaurant in a so-called inner-city neighborhood (if it be even logically possible for anything of eighteenth-century provenance to ape [!] 
anything of twentieth-century provenance), that they influenced him in a manner not entirely alienable from (albeit not entirely inalienable à la certain rights chez TJ) the manner in which he was presumably influenced by works from earlier historical periods—by Shakespeare’s plays, Otway’s Venice Preserved (for no even remotely Whiggish eighteenth-century politician was not a fanboy of this play), Don Quixote, Plutarch’s Parallel Lives, Cicero’s Dream of Scipio, etc. I cannot help noticing that this list contains but one novel. Well-spotted, and of course that singularity is “no accident”; I omitted to include any other novels in the catalogue partly because apart from TS there is no novel that Jefferson can be more or less assumed to have read other than DQ (altho’ to be frank and candid, in the light of the far-abovementioned Tom Jones’s TS-topping popularity, I should be greatly surprised if he had not also additionally read TJ [altho’ by the same tokefest, it is striking that he held up TS and not TJ as a universal primer in morals]), and partly because I do not believe novels (or indeed any other story-based or story-derived genres) have at any time in Occidental history deserved to dominate the reading matter of even aspirantly serious people. And yet by your own account, Jefferson’s epoch was one in which the novel was flourishing and in which it was possible to write in the novelistic mode with full-fledged and unblemished authenticity, whence an epoch in which it probably was possible to form a just and reasonable notion of how to comport oneself in the world from novels. It is a paradox, no? I don’t know if it’s quite a paradox, but it certainly is at least ironic in a super-Morissettian sense. 
It’s reminiscent of what Adorno said about movies when the so-called Hays code was still in force—viz., that a film that both adhered to every one of its stipulations and held its own as a great work of art could be made only in a setting in which the code did not exist (which of course leads one to surmise that now that anyone of any age can at the touch of a button witness every single act, image, and utterance prohibited by the Hays code, a setting at long last obtains for the making of such a film, but that is of course the subject of another essay, or at least a sentence or two that don’t belong in this essay). But in any case, knowing what Thomas Jefferson could have safely chosen as his course of reading is as irrelevant to us as the knowledge that if our aunt had balls she would be our uncle was until ca. 2015 (when of course it started to become a mandatorily open question whether our impregnating parent’s or birthing-parent’s sister was not our uncle despite having no balls). And so you would simply ban novels from every syllabus in every course of study at every education-level? No, I wouldn’t ban them, but I would assuredly prohibit their inclusion in syllabi absent a preliminary lecture on what can and cannot (sic on can instead of may, for we are dealing with downright impossibilities here) be learned from them—and of course, that lecture would probably have to be custom-tailored to every novelist, or at least every novelist traditionally regarded as great (and as for the novelists not traditionally regarded as great, although I am willing to hear out a succinct sales pitch for Gissing or Galt or Mrs. Gaskell qua indispensable knowledge-bearers, on the whole I am inclined to believe that it is the canonical novelists to whom we ought to be paying closest attention to the extent that we are to be paying attention to novelists at all) if only as a matter of happenstance. 
For whilst the most salient differences in degrees and kinds of truth-content between Dickens and Balzac are owing not at all to their self-consciousness qua practitioners of the supposed art or craft of novel-writing, it happens to be true that there is no other novelist who has gotten just the same certain things right and certain other things wrong as Balzac has or Dickens has. Vis-à-vis Balzac, one must be pre-apprised that although he can be trusted almost implicitly in the matter of reportage on the institutions, fashions, folkways, technologies, etc. of his time (or, to be more precise, the historical moment in which his novels are set, a moment typically antedating their composition by two to three decades [and hence a moment less aptly described as B.’s time than as his former time]), one must not expect his depiction of the psychology of his characters to be even coherent (such that three-quarters of the way through Lost Illusions David Séchard expresses a bitter feeling of ill-usage by his father that comes entirely out of left field [sic on the seeming anachronism of the metaphor {for after all, our reading of Lost Illusions takes place at a moment postdating the invention of baseball by about a century-and-a-half, even if Lost Illusions was completed more than a quarter-century before that invention}] because at the trilogy’s beginning Balzac described David as charitable to the point of guilelessness and has not qualified this description a jot since). 
And vis-à-vis Dickens, one must be pre-apprised that while his depiction of the psychology of his characters is dependably coherent (coherent, mind you, although by no means plausible), such that Scrooge’s metamorphosis from a permafrost-hearted skinflint into a butter-hearted male cash-cow proceeds in reliable and gradual lock-step with his encounters with a series of people (and ghosts of people) who expose to him various moral shortcomings of skinflintism as an ethos-cum-habitus, he is not to be trusted at all in the matter of historically accurate reportage on institutions, etc., such that it is impossible even to specify the decade of the “Christmas present” of A Christmas Carol, let alone the branch or version of commerce in which Scrooge has made his fortune; such that, in Vladimir Nabokov’s words [Speaking of Nabokov, is he exempt from excoriation as an exponent of the Munro Doctrine? By no means, but I am citing Nabokov as a critic of the mid-nineteenth-century novel, not as an exponent of the mid-twentieth-century one], the Dedlocks, the titled family at the nominal center of Bleak House, “are very much dead,” such that the functional aristocratic-ness of Lord Dedlock, the family’s head of unspecified rank, is exhausted by his sitting in place while suffering from his “hereditary gout.” You just referenced in passing the coherence of Dickens’s treatment of psychology whilst at the same time overtly terming that treatment implausible. I’d like you to unpack the fudge-collection for me, if you would. Why, if his portraiture of psychology lacks verisimilitude, is it even worth taking seriously at all by a lover of truth-content? It is worth taking seriously because it is somehow bound up with its unprecedented conveyance of a certain truth-content about the mind despite its implausibility as portraiture, its unprecedented conveyance of the extent to which thought is permeated with imagination even in the course of its most humdrum quotidian operations. 
(When I term it unprecedented I mean unprecedented in the forum of the novel, for preceding examples of it are abundant enough in so-called non-fiction genres, particularly those of a quasi-private nature such as the letters of Madame de Sévigné and the journals of Pepys and Boswell.) One sees this in, for example, Pip’s remark on the miniature gravestones of his deceased siblings at the beginning of Great Expectations, his observation that the gravestones’ shared lozenge shape and their arrangement in a row suggests to him that these children whom he never met all resigned themselves to death in cheerful unison. In drawing attention to this permeation Dickens anticipates Proust and thereby affords me the abovementioned occasion for discussing that writer’s place in the history of the novel qua ever-more equivocal conveyor of truth-content. For in virtue of reflecting on the mind’s operations in the first person, in relentlessly reflecting on the operations of his own mind (and however snippily and stridently pedants may demur to the contrary, Occam’s razor impels us to regard the “I” of A la recherche as none other than that of the author [But what about that massive chunk of the Recherche in the third-person, Un amour de Swann? The third-person-ness of that episode does indeed constitute something of a poser in this context; but I would tentatively hazard that it can be redeemed in epistemological terms by the “I”’s explicit framing of it, at the end of the episode that precedes it, as a retrospective digest of the anecdotes, such that we are encouraged to bracket the free-indirect discourse as speculation about the contents of Swann’s mind rather than as the direct presentation of those contents]), Proust provides the only fully epistemologically unimpeachable reportage in the history of the novel, or at any rate (what almost comes to the same thing) in the history of the novel up until his day. 
At the least epistemologically compromised extreme, we can forgive or even endorse Fielding’s presentation of Tom Jones’s thoughts in the third person because Fielding’s lofty social position vouchsafes him the authority to hold forth on the ideation of a feckless younker only illegitimately “to the manor born,” but we cannot imitate such a presentation in good faith because our society does not afford anyone the privilege of occupying a social position that is more than highly equivocally lofty (e.g., that of the functionally illiterate Hollywood superstar or the World’s Richest Person for the Warholian 15 Minutes). Proust requires no such epistemological disclaimer because he does not pretend to comment on any mental phenomenon that is not directly accessible to him (and by directly I do not mean necessarily immediately, because for the most part his reportage consists of synthetic generalizations about multiple phenomena rather than descriptions of phenomena in isolation). But it’s not as if Proust writes entirely on his own behalf and with utterly unpresumptuous nominalism, as if he is consistently effectively saying, “Don’t you worry your pretty or ugly little or big head about any of this; it’s all only about little ol’ me;” for he does use the pronoun “we” with striking frequency. Why, so he does, but this employment of “we” is by no means imperiously invasive, by no means mustard jar-gazing; his sentences with “we” as their subject by no means purport to be conveying proto-closed-circuit television footage of some collective psyche; rather, like my earlier generalization about the Anglophone masses’ attitude to the producers of the New Yorker et al., they propound reasonable inferences about people derived from comparisons of their own words, gestures, etc. with Marcel’s own words, gestures, etc. qua outward manifestations and consequences of his thoughts. 
It’s true that his employment of “we,” although not at all presumptuous, could be made even less presumptuous via the prefixing of a disingenuous query of “Are you with me here, folks?” to every “we”-subjected sentence, but what, vis-à-vis the conveyance of truth-content, would be the point of that? A person who uses “we” in the way in which Proust uses it (in contrast to the genuinely arrogant manner in which, say, the fan of an athletic team uses “we” instead of “they” in statements about the triumphs and travails of that team on the field or pitch, or the citizen of a country in statements about the achievements or indeed the crimes [for laying claim to great crimes one did not commit is no less arrogant than laying claim to great achievements one did not achieve] of his fellow-citizens that antedated his birth) is always implicitly to be understood as merely propounding his own so-called take on things, such that to qualify it with folksy self-depreciation (sic on “depreciation” for “deprecation”) is implicitly to depreciate it as merely half-true and possibly even completely false. Fine, but you must admit that this combination of first-person introspective analysis of Proust’s own psyche and inductively extrapolative generalizations about the psyches of others constitutes but a portion of the Recherche. 
I’m not going to be so cavalierly cocksure as to assert that it constitutes but a tiny portion, but I am going to be so cavalierly cocksure as to hazard a guess that the portions of it that are up to something quite different from such analysis-cum-extrapolation must be much larger than tiny, inasmuch as, by all the accounts that one reads in the abovementioned print-media organs, the New Yorker et al.—accounts that sadly but probably incontrovertibly reflect the experience of a reasonably representative sample of the empirical readers of Proust—for most Proust-readers the Recherche is above all else a rollicking period novel, a sort of Belle-Epoque analogue to the Jane Austen corpus (which it seems conveniently to equal [or near-equal] in length and number of installments), a work one reads principally for the pleasure of tracing the intricate folkways of a bygone age, an age both immeasurably more civilized than the present one and at the same time (and even more significantly) ludicrously superannuated in virtue of that intricacy, and hence, ultimately, for the experience of luxuriating in one’s own modernity, in having disencumbered oneself of that corset and bustle or starched shirt and tailcoat and slipped into a comfortable sweat-drenched and food-spattered sweat-suit. Indeed for such readers it is that very kind of work; we might say, on the model of my styling of cinematic depictions of old-school totalitarian dictatorships as bad-cop porn, that it is read by these people as prim-governess porn. Well then, if the prim governess’s shoe fits the princess’s foot, is not the princess obliged to wear that shoe, nay (to follow this quasi-antigraph of Cinderella to its logical telos) to attire herself cap-a-pie as a prim governess? By this do you mean that Proust’s depiction of the social world of his time is ultimately objectively as uninstructive regarding our time as Jane Austen’s of hers is, that its present-day interest is purely escapist in essence? That is exactly what I mean. 
Well, perhaps—and it could pain few if any living persons more than the present writer, the most ardent Proustian personally known to himself, to entertain this notion—that is after all true. At minimum, it seems to me, one must concede that the case for reading Proust propounded by Jean Améry in his lecture commemorating the centenary of the author’s birth, namely that the lecturer and his audience of the early 1970s were still living in the “late-bourgeois age” inhabited by Proust, no longer holds water; that we are now definitely living in a post-bourgeois age. We definitely are, and surely Améry was mistaken even in supposing that the bourgeois age had not ended by the early 1970s. You mean on account of rock ‘n’ roll, the sexual revolution, women’s lib, etc. etc.? Exactly. Well, I don’t deny that not merely the seeds but the weeds of the end of the bourgeois age were contained in these phenomena. Still, one must remember that 1972, the next year after Proust’s centenary, was the release-year of Luis Buñuel’s film The Discreet Charm of the Bourgeoisie, and that whilst that film obviously ought not to be mistaken for a Michael Apted-style documentary snapshot of the typical Frenchpeople of its time, it cannot be denied that it is unequivocally set in its own time and that its personages dress, speak, and act in ways typical of people of their time of life, social position, and wealth-level throughout the Occident of that time; or that the preoccupations of these characters are at least broadly identical to those of Proust’s novel—viz., dressing smartly, attracting the right sorts of people to their dinner tables, keeping the servants at as many arms’ lengths as possible while still availing themselves of their services, etc. This doesn’t strike me as a very good case that the early 1970s were part of the age of bygone bourgieness, for aren’t the preoccupations of today’s so-called elites essentially the same as those you have just listed?  
Perhaps—albeit only perhaps (and I shall address the empirical occasioners of this perhaps-ness anon), but “ironically” you have just divulged your own belief at your insu that we are still living in a bourgeois age, for only a believer in such a(n) SOA would reflexively identify the elites of a society as the definers of its Gesellschaftsgeist.  And today even if our elites—meaning the richest and most politically high-placed—are in a certain sense still very much bourgeois (and in a “certain very real sense” even more bourgeois than their counterparts of a half-century ago, to judge by the lately burgeoning demand for nannies, gardeners, and other old-fashioned-type servants [recall that The Brady Bunch was quite an outlier among quasi-naturalistic late-twentieth century American situation comedies in centering on a household with a live-in domestic servant {what with The Jeffersons’ featuring of a live-in maid being a kind of reparation for all the early twentieth-century radio shows and movies centered on white households with live-in black housekeepers and Mr. 
Belvedere being a deliberately quasi-anachronistic revival of an early twentieth-century cinematic franchise}]), they by no means set the tone of our age, as is attested by the currency of an argoteme a derivative of which you have just employed, bourgie qua resentful synonym of posh, upmarket, and connoisseur-worthy, along with its near-universal misspelling as bougie, neither of which would be possible were this still a truly bourgeois-toned age, for in the bourgeois age proper, outside academic scholarship, bourgeois was used adjectivally as a snooty pejorative synonym of dull, mediocre, and borderline-trashy, and as a working knowledge of French was a minimum requirement for membership of the bourgeoisie throughout the Anglosphere, every bourgeois of that age knew that bougie was the French word for a candle (specifically a tallow candle); hence, had there then been any impetus to derive an argoteme from bourgeois, that argoteme never would have made it into print minus a decisively disambiguating “r.” And I believe it is plain to anyone who keeps his finger on the pulse of the Zeitgeist-cum-Gesellschaftsgeist (if the very notion of keeping one’s finger on the pulse of the Zeitgeist be not an oxymoron, for it is in the very nature of a Zeitgeist-cum-Gesellschaftsgeist to thrust its oversized breasts or testicles into one’s face however assiduously one is striving to ignore it) that the tone-setters of our age have very different preoccupations. Don’t even get me started on how the so-called social media have supposedly completely introduced a whole new gesellschaftsgestaltend(e) ballgame, for it can be conclusively shewn that the popularity contests of F*****k, T*****r, and Instamatic ([sic] on the malapropism qua asterisk-obviator) are but classic class conflicts staged on a different battleground. 
I don’t doubt that that very shew can be conclusively shewn, but my withers are unwringable by such a demonstration, for unlike most polemicists across the so-called political spectrum, I do not regard the advent of the so-called social media as the decisive epoch-making event of our time—or to be more precise of the transition of our earlier time (or at least of the earlier time of those of us who were already around by, say, 1990) to our present time. Well then, what do you regard as that event? Surely not the advent of the ubiquity of proprietary names you described in “Proprietary Names: the Name/Proprietary Names: the Place,” for that ubiquity must have antedated the Proust centenary by at least two decades. Indeed I don’t regard the advent of that ubiquity as the decisive event, although I do believe that that advent significantly participated in the build-up to the event, and that the fortunes of the proprietary name since the event (notably the swelling of a sub-handful of proprietary names into a metaphysical prominence not enjoyed by even their most hyper-ubiquitous predecessors [e.g., Amazon by comparison with Coca-Cola], and the subsumption of certain clusters of proprietary names under other proprietary names with which they formerly seemed non-compossible [notably the subsumption of Marvel, Lucasfilm, the Muppets, etc. under the Disney magic rainbow-umbrella]) have participated in its aftermath in certain decisive ways. But to the event, sir, to the event! The event, it seems to me, is the apotheosization of stories, and preeminently among them, a certain class of stories, a story-class mentioned at the very beginning of this essay, the class of stories narrated in the juvenile-orientated so-called fantasy novels from those of J. R. R. Tolkien onwards. 
I own that this class of stories has become wildly popular, and probably even more popular than any other class of stories, but in what sense has this popularity signaled the definitive ejection of the bourgeoisie from the throne-room of the roost? Surely you are not going to argue that these stories are more popular among the proletariat or the peasantry than among the middle classes? No, no, of course not, for at least if class in the social, economic, or socioeconomic sense be defined by the old-fashioned criteria of degree of financial liquidity and manner of acquiring and maintaining that liquidity, this story-class (here, I must acknowledge the confusion occasioned by the use of class in two quasi-different senses while not apologizing for it in the slightest, for as should be patently clear from my attitude towards the class of stories in point since the just-mentioned essay-beginning, I regard the stories comprising this class as the lowliest, the basest, the most ignoble of all stories told or written to date) is slightly (mind you, only slightly) more popular among the middle classes than among the proletariat and peasantry. I regard this event as decisive because it seems to me that Occidentals—or at least virtually all Occidentals under the age of about seventy—are now chiefly preoccupied with positioning themselves in relation to some story from this story-class, with somehow fashioning themselves (yes, yes, yes, Prof. Greenblatt: in an uncannily {albeit by no means redeemably} Renaissance-like way) as typological echoes of the personages of these stories, and as repeating the events narrated in these stories. I scarcely know which of my myriad bristles to set bristling first against this conjecture. For starters, such identification with the personages of narratives dates at least as far back as the fourth century B.C., because Plato’s Socrates likens himself to Homer’s Achilles. 
Then one must consider that such identification seems to differ not an essential jot from the supposedly productive and virtuous identification of historical figures with their predecessors in “the drama of history,” an identification for which I am still waiting for you to make a case. And finally—for I fancy three out of a thousand bristles are quite devastating enough to explode such a soap-bubble-thin conjecture as the one now in point—it’s certainly rich to find you implicitly denouncing an immersive engagement with airport novels within minutes (as the eye-crow flies) of vouchsafing at least a Forsterian two cheers to such engagement. Regarding the pertinence of Socrates qua would-be Achilles, a number of objections might be raised and applied qua effective bristle-flatteners—for example, that Achilles is a figure not only from Homer but from myth, that Plato’s Socrates is “in a very real sense” an invention of Plato, and that that very same Socrates takes a very dim view of Homer in The Republic, but I think that a certain one is effective enough to be raised and applied on its own, namely that Socrates’s likening of himself to Achilles is after all a mere likening and a passing one at that, whereas the engagement of today’s younkers (and an alarmingly crescent proportion of today’s middle-agedsters and oldsters) with so-called fantasy stories takes the form of a full-fledged and lasting embodiment-cum-enactment, notably via the phenomena of so-called cosplay (the donning of the costumery of the characters of such stories) and so-called larping (the reenactment of events from such stories or events that are regarded as extending the corpus of such stories).  
Regarding the similarity to the identification with figures from “the drama of history,” one must rejoin, first, the obvious and devastating rejoinder that history is something that has actually happened, such that one can objectively appraise the justness of the identification as Marx did Napoleon III et al.’s identification with Napoleon I et al. (and, no less, Luther’s with Paul), and that, as mentioned before, drama in contrast to narrative is intrinsically outwardly orientated (although I admit I have yet to make my case for the preferability of plays to stories qua bases for living in the present [while at the same time not apologizing for not having yet made that case, as its proper place is my peroration, which is not yet in the offing]). And finally, as regards the pertinence of my previous guarded praise of airport novels, it should be obvious that that praise applied exclusively to airport novels that are set in some version of what more or less legitimately purports to be the real world and that are accordingly ultimately subject to debunking qua models of conduct by the standards of that world. Aye, aye, aye: but practically speaking is there really any difference between fancying oneself an entity, like a wizard or an elf, that one could absolutely never become in the real world because such entities do not exist at all in the real world and fancying oneself an entity like a spy or special-forces operative that one could hardly ever become in the real world because although there are such entities as spies and special-forces operatives, statistically speaking, hardly anybody ever becomes one? Yes, there is a very big difference because the existing history of these statistically rare but real entities automatically preempts a near-total identification with them because merely in virtue of having existed in the real world, such entities have suffered fates that have for one reason or other made them implausible or undesirable as so-called role models. 
It is all very well and eminently practicable up to a point to fancy oneself a James Bond-type figure, but James Bond, merely in virtue of being a spy, is tainted by the legacy of Kim Philby and Aldrich Ames. Such that, what, any accountant or quantity-surveyor who habitually indulges in Walter Mitty-esque fantasies in which he figures as Agent 007 is eventually compelled to entertain the possibility of Agent 007’s becoming a double agent and thereby to call into question the desirability of being a spy and there-further-by to do a better job at being an accountant or quantity-surveyor? Essentially, yes, although the phenomenon needn’t manifest itself as anything so epiphanic; indeed, it might manifest itself in an entirely negative way, in the cutting short of one’s total identification with 007 in virtue of one’s background awareness of double agents like Philby and Ames. But mightn’t one say the same thing about people’s identification with certain character-classes in so-called fantasy novels? Mustn’t one in imagining oneself as a Gandalf-type figure, as a good wizard, worry that one might be tempted to become a bad wizard like that one fellow, Saur…Surely not Sauron? No, no, of course not: I know Sauron’s not a wizard but rather a disembodied eye; but the name of the fellow is quite close to Sauron; it definitely begins with an “S” and some sort of “a” sound… Sarumin? Yes, that’s right; mustn’t a would-be Gandalf be constantly worried about succumbing to the temptation to cross over to the dark side like Sarumin? 
No, because Gandalf isn’t merely a “good wizard” (or perhaps is not even a good wizard, or even a wizard at all, despite being explicitly called a wizard by Tolkien), for the characters in these so-called fantasy novels are not members of common noun-labeled sets (as James Bond, for all his implausibility as an empirical spy, ineluctably belongs to the set of spies); they are, rather, just like the angels as specified by Aquinas—i.e., forms and individual entities at one and the same time, or members of a set of which they are the sole member. Gandalf can never become a bad wizard because he has nothing notionally in common with any wizard as defined (defined, mind you, not necessarily existent) in the real world, just as Elrond can never become a bad elf because he has nothing notionally in common with any elf as defined (defined, mind you, not necessarily existent) in the real world. You seem to know a suspiciously great deal about the Tolkieniverse for someone who professes to hate it as Tybalt hates hell, all Montagues, and Benvolio. I suppose I do, and that is because as a super-youngster—which is to say, between the ages of eight and thirteen—I was a much more ardent Tolkien fan than the next super-youngster (One must remember—or merely member, if this is new to you—that in those days, the days bookended by April 28, 1980 and ca. June 1, 1985 [for my Tolkien fandom did not survive my graduation from seventh grade, which occurred just over a month after my thirteenth birthday], the average American youngster, or at any rate the average American male super-youngster, was most ardent in his fandom of the likes of The Dukes of Hazzard and The A-Team). Whence then do you derive the supertanker-sized side, the pinnacular altitude, the 179.9…-degree nasal arc, requisite to contemning Tolkien? 
I derive it or them from my acquaintance with a place that, once one has visited it, one cannot fail to have acquired, namely the entire historical-cum-philological-cum-biographical context of The Lord of the Rings and the perhaps dozen or more other texts that piece out the Tolkieniverse. I should be ever so grateful if you would, à la a mother or nanny preparing a newly-teethed toddler his first solid meal, cut this prima-vista indigestibly huge wholeness into bite-sized chunks. And I shall be ever so obliging as to perform that job of coupage; and indeed I have been so obliging (not to mention prescient) as to alienate from that whole two of the more digestible bits, namely the respective histories of elves and wizards outside the Tolkieniverse. [You, doing a so-called slow burn] But you’ve just told me that those other elves and wizards have nothing to do with Tolkien’s elves and wizards. Indeed I have, and indeed the former do have nothing to do with the latter, but that having-nothing-to-do-with-ness, for all its ineluctability on a certain metaphysical register, has been imposed on the Tolkieniverse, or, rather, was imposed on it by Tolkien himself at the moment, or collection or succession of moments, at which he “created” (which is to say, in scare-quote-free terms, concocted or patched together) his universe of Middle Earth-plus-those super-dreary other realms whose names I am pleased to say I have forgotten (an oblivion abetted by my inability to supplement my three or four complete readings of The Hobbit and the Lord of the Rings with a single snooze-free reading of Unfinished Tales or The Silmarillion); and at a certain moment in 1985, I realized that the entire Tolkieniverse consisted of nothing but such impositions. Why did you not come to this realization until 1985? 
Were you really so precocious an acquaintance-maker of the Tolkieniverse (or so complete an ignoramus vis-à-vis the traditional corpus of very-small-children’s literature—notably, if not exclusively, the corpus of Benjamin’s prized genre of the fairy tale) as to envisage Gandalf as the de facto wizard and Elrond as the de facto elf? Of course I wasn’t; of course, like all Anglophone youngsters of my generation and gosh knows how many preceding generations, I had emerged from my infancy thinking of a wizard by default as the Merlin of the Arthurian legends (or at any rate of such 20th-century spinoffs of them as the Disney cartoon movie, The Sword in the Stone [itself a spinoff of T.H. White’s novel of the same title] and the short-lived situation comedy Mr. Merlin, wherein Merlin was to be found working as an auto mechanic in late twentieth-century San Francisco) and of an elf by default as…As one of the mascots of the Keebler brand of cookies? Well, most likely not one of them in particular, but certainly as a class of beings that included them, a class of beings signalized by being extremely small—small enough, indeed, to be carried in one’s (or at any rate, the average-sized grown-up human’s) hand, and distinguished from leprechauns only in lacking a shillelagh and not being invariably attired in green; hence, a class of beings that did not have room for Tolkien’s elves, who were signalized by being slightly taller than the average human, and indeed I dimly—albeit only very dimly—recall having to acclimatize myself to the notion of an elf as that kind of being (while at the same time experiencing the alienating feeling of not having to relinquish the notion of dwarfs—or, as Tolkien preferred to pluralize them, dwarves—as beings roughly half as tall as the average human {while yet again having to acclimatize myself to the notion that dwarves were not the shortest little people apart from elves, thanks to the invention of these beings, hobbits, who, I believe, were 
said to be slightly shorter than dwarves in the Tolkieniverse}). And perhaps if Tolkien had been more consistent in his misnaming, and had confined himself to a combination of entities taken from fairy tales and entities of his own invention, if he had simply peopled, placed, and thinged his world with dragons and spiders, whether the same size (as his dragons were) or larger (as his spiders were) than the ones I had read of elsewhere, with orcs (his invented plebeian, low-intelligence baddies, helpfully only schematically described), and ents (his invented walking trees), with places with names like Mordor and Gondor, I would never have become a jot the wiser as to that world’s factitiousness. Perhaps, in other words, I would have taken the extreme idiosyncrasy in his handling of certain familiar words as being par for the bizarre course of a nomenclature of a manifestly fictional universe, of a sort of proto-bizarro world more akin to Lewis Carroll’s than to his own actual one. But unfortunately for Tolkien (to the extent that a single youngster’s discovery of the factitiousness of his world more than ten years after his death can constitute a misfortune for JRRT), he had also either out of laziness or perverseness packed his world with entities that hailed from the more or less rightly called real world, that functioned more or less exactly the same as their counterparts in that world, and that sometimes even bore the same names thereas. For example, both Gandalf and the eponym and hero of The Hobbit, Bilbo Baggins, were fond of smoking something called pipe-weed. Whether one regarded this pipe-weed as a plant whose effects on the smoker were more or less consubstantial with those of tobacco or whether, in company with Tolkien’s more waggish and countercultural fans from the 1960s onwards, one took it for something rather more potent, one could not deny that pipe-weed did not correspond to any plant that figured in fairy tales. 
Indeed, to the best of my layman’s recollection (no latter-day Grimm-brother I!), fairy tales are completely devoid of substances that we would think of as functioning in the manner of drugs as we now understand them; there are potions and poisons, to be sure, substances (almost all of them liquid) that effect some instantaneous transformation in the person who consumes them, substances that cause him in the blink of an eye to turn into a non-human animal or fall into a thousand-year slumber, etc., but no substances that effect some gradual and more-or-less subtle change in the consumer’s mood or state of health; and then one thinks of smoking in general qua activity, and how utterly un-fairy-tale-like it is. I believe Dick or Harry Klein in that mid-1990s sleeper popular-bestseller of a work of academic literary criticism and paean to smoking and smokers, Cigarettes are Marvelous or Cigars are Da Bomb, makes the instantly persuasive assertion that smoking is an ineluctably modern phenomenon—not merely contingently, i.e., because tobacco was first cultivated in the New World whose discovery marked the beginning of the modern era—but intrinsically and essentially, i.e., because it’s just not the sort of thing an ancient or medieval person, a betoga’d bloke like Julius Caesar, or a be-cassocked bloke like St. Thomas Aquinas, would ever do, because the attitude of simultaneous preoccupation and distraction that it exacts is irreconcilable with the ancient and medieval mindsets-cum-habituses. And if smoking is unthinkable in antiquity and medievality, it is surely even less thinkable in the world of fairy tales, which at least in terms of the mindsets-cum-habituses of its inhabitants is a sort of mélange of the ancient and medieval worlds. And what about that beardy loon Tom Bombadil whom Frodo and co. encounter on their way into or out of Mirkwood (sp? 
[I’m determined as a matter of principle not to crack open a single Tolkien-penned book or Middle Earth-orientated reference in the course of this Tolkien-bashing sub-screed {even if this admission itself tends to undermine my credibility as a Tolkienophobe, inasmuch as memory is “in a very real sense” an epiphenomenon of love}]) Forest? What’s a guy called Tom—an English diminutive of a Greco-Aramaic forename—doing in a novel set in a realm that purportedly antedates the earliest still-extant traces of human history by millennia (and that is perhaps also more than figuratively buried under our world by miles of some unspecified material [I’m pretty sure the “Middle” in “Middle Earth” is a sort of vertical analogue to the “Middle” in “Middle Kingdom” qua Anglo-Chinese nickname for China])? But of course if one ventures westward from Mirkwood to the Shire one encounters a veritable bestiary of proper-nominal anachronisms (as I am sorely obliged to term them, for our language—and to the best of my knowledge every other language—lacks a word for entities that are not only out of their own time but also out of their own plane of reality, a lacuna that at first blush may seem to scream for filling but that at second blush redounds greatly to the credit of our forebears qua possessors of better things to do than dream up alternative planes of reality). True, the forenames of the star hobbits, Frodo and Bilbo, appear to be genuine neologisms (albeit that the eldritch common noun bilboes makes a memorable appearance in Hamlet), but Pippin is the name of the most famous medieval monarch after Charlemagne, and Merry and Samwise are transparently deliberately evocative of names super-current in the modern actual Anglosphere; and at Bilbo’s “eleventy-first” birthday, a character bearing the surname of Proudfoot is pernickety enough about English grammar to insist that a brace or group of his kinsmen must be termed Proudfeet.  
And just think of the name of the Shire itself: what is shire but another name for county, and specifically for a county in medieval and post-medieval Great Britain (for I believe that in all the other English-speaking countries including Ireland one designates a county by prefixing or appending the word county to the other component of its demonym)? Not that the name is at all inapt, for apart from the diminutiveness and subterranean residences of its inhabitants, the Shire is a letter-cum-picture perfect evocation of an English county of between the early eighteenth century and the mid-nineteenth, an almost criminally exact Venn-diagram section comprising the mutual overlapping of Tom Jones’s presumptive Somerset, the nameless provincial counties of The Vicar of Wakefield, and Thomas Hardy’s Wessex. (Note that I write overlapping, because although the Shire is crawling with yeomen and squires it is bereft of the nobles who peripherally figure in Goldsmith and Hardy [its simultaneous pullulation with persons of sufficient status to enjoy the liberties of the “freeborn Englishman” and bereftness of persons with sufficient status to exert irresistible tyranny over other freeborn Englishmen is of course an inalienable aspect of the Shire’s charm].) Not even the elves were suffered to exist in hermetic isolation from the real world, for I learned that Elvish was based on Finnish. 
The race of men, apart from their unselfconscious coexistence with language-using intelligent species with whom they were seemingly incapable of interbreeding, seemed sufficiently ontologically sturdy; one could imagine them inhabiting any part of the earth from Cornwall to the Caucasus and any phase of human history from the early Iron Age to the Late Middle Ages—in other words, at least according to the ontological schema wherein Middle Earth is simply the actual earth at a super-primeval, super-prehistorical stage of its history, one could imagine them developing into the humans of the Paleolithic age via a series of civilizational setbacks like the ones that obliged the Greeks to rediscover and improve writing and the Western Europeans of the Middle Ages to rediscover the Greek and Roman classics. But even these men of Middle Earth communicated with each other by means of runes, a form of writing pioneered by Germanic tribes of the second century A.D. And of course the more one learned about Tolkien’s biography the more understandable—not excusable, mind you, let alone cogent—one found it that Middle Earth had turned out to be a geographical-cum-historical potpourri of exactly the above-listed ingredients. 
Tolkien had, after all, been born to middle-class English parents, a salaryman of a multinational bank and his wife, in South Africa in 1892, which had meant that although he had been unable to help regarding England as the land of his forebears and although he had lived in England from the age of three onwards, he had never been able to look on England as his native land in a truly strong sense, which meant naturally if “ironically” that he had developed a super-strong attachment to both the Merrie Olde England of the nineteenth-century English petite bourgeoisie, the England of rural pubs and village greens—an intrinsically modern England despite its archaic orthography (which in any case echoed spelling conventions that had been extinct for a mere two centuries)—and the most ancient phase of English history, the phase bookended by the arrival of the Angles, Saxons, and Jutes (we mustn’t ever forget the Jutes!) in Great Britain in the fifth century and the Norman Conquest in the eleventh. And as a native of an ex-British colony myself—and, withal, an ex-colony that like South Africa scarcely bears any resemblance to the home country in its natural or human geography (i.e., Florida rather than some congenital clone of England like Massachusetts or Maryland)—I can certainly sympathize with the presumptive motives for the generation of Middle Earth. But in that selfsame capacity I was unable to remain obsessed with Middle Earth once I learned of its historical sources and substrates; at that moment, I knew that if I wished to get closer to what I prized most in Middle Earth, I would have to leave Middle Earth ever further behind and immerse myself in the literature and history of the Great Britain (and, to a lesser extent, the continental Europe) of the ca. fifteen centuries antedating Tolkien’s birth. 
And the rest, as they say, is history, or, rather, history and literature—i.e., at that moment you embarked on the course of reading (the reading of the writings of, inter alia, Fielding, Sterne, Herodotus, and Walter Benjamin) that has “got you where you are today,” i.e., at least in the most literal reading of “today” (but also in a fairly latitudinarian interpretation thereof, for this activity has obviously been many years in preparation), flogging the sitting duck-shaped dead horse that is the literary respectability of J.R.R. Tolkien (I term that respectability a sitting duck-shaped dead horse because it has after all been well over six decades since the first lash—and quite a formidable lash at that, a lash that may very well have been fatal on its own—was administered to it by Edmund Wilson). Yes (although I have, to put it mildly, some strong reservations about your concluding parenthesis). Well, then, assuming your reservations about my concluding parenthesis are themselves merely parenthetical, have you ever had even the faintest ghost—or in Tolkienian parlance wraith—of a legitimate bone to pick with Tolkien fandom? After all, if The Lord of the Rings &co. can serve as such a salutary gateway drug to more truth content-bearing books, surely the more Tolkien fandom there is the merrier (and pippin-ier) we all shall be. 
My bone to pick with Tolkien fandom is as legitimate, corporeal, and animate as any bone can ever hope to be inasmuch as the large-writ history that hath coincided with the small-writ history of my post-Tolkien course of reading hath conclusively shewn that like many a user of either of the conjectural real-worldial inspirations for pipe-weed, many and very probably most Tolkien fans do not move on to so-called harder drugs, that they are content to wallow in their Tolkienism for the entire duration of their naturals, or, at their most venturesome, to overlay their Tolkien-worship with the worship of some similarly (and therefore, the Tolkieniverse being the earliest of such universes [at least the earliest of such multi-bookshelf-exhausting dimensions], derivatively) garbled-piecemeal universe like that of the above-execrated Harry Potter novels, or of one of the too many-to-be-named god-awful so-called role-playing games (originally exclusively paper- and tabletop-centered but now almost exclusively electronic). But why ever the heck have they not moved on as you did? After all, the facts about Tolkien’s biography and sources are not top-secret secrets; nay, one assumes that these facts are even more readily available now than they were during your youngsterhood. To some extent the answer to this question is a mare’s nest of a flow chart connecting hundreds of chickens to hundreds of eggs via thousands of bi-directional arrows, but one can without hesitation at least identify a goodly half-dozen of the eggs or chickens themselves. 
One of them is undoubtedly the complete evaporation of the prestige and influence accruing to critics like Edmund Wilson and to polemics like the one of his mentioned in your above parenthesis and the usurpation of these critics’ place by critics actually aggressively flogging trash of even less merit than the Lord of the Rings (i.e., patchwork universes made of shabbier patches than the Britanniana comprising the Tolkieniverse and assembled thereoutof even more ineptly). Why, I recall that, about a decade ago, a selection of the writings of that god-awful poor great-grandson of E.A. Poe (himself a writer who might safely never have been born had it not been for his mainly beneficent influence on French literature, and on Walter Benjamin’s justly beloved Baudelaire in particular), H.P. Lovecraft (an edition with a preface by Stephen King, rightly derided on the occasion as “an author of penny dreadfuls” by the now-deceased Harold Bloom, the very last critic of Wilson’s level of discernment enjoying a fraction of his prestige and influence) was added to the Library of America founded by Wilson, who had rightly poured even more excoriating scorn on Lovecraft than on Tolkien. And in commemoration of the occasion, the BBC (Radio 4 as I recall) produced a mini-documentary on H.P. Lovecraft featuring some leading academic authority on Lovecraft (if the very notion of an “academic authority” on such a picayune scribbler as Lovecraft be not oxymoronic) naturally descanting on the supposed fact that HPL was now finally getting the attention and indeed reverence that he had supposedly always deserved and, in supposed open-and-shut proof of this always-already lacking reverence, mentioning the just-mentioned addition of HPL to the Library of America, “the very Library of America founded by Lovecraft’s most ardent adversary and scourge, Edmund Wilson. 
Score: Lovecraft Two-Zillion, Edmund Wilson, Love!” That to adduce the admission of Lovecraft into the LoA as proof of Lovecraft’s greatness was to beg the question, that all that that admission in fact proved was that since Wilson’s foundation of the Library it had fallen into the hands of people who mistakenly thought that Lovecraft was a great writer (and perhaps even more reprehensibly thought Stephen King was at least a good enough writer to introduce a great one), seemed not to have occurred either to the Lovecraftian or to the producers of the program(me), presumably because while they were aware of certain biographical details about Edmund Wilson, they had not grown up crediting Wilson and critics of his time and standing with any special degree of taste or insight. But did you grow up crediting Edmund Wilson &co. with any special degree of taste or insight? As I recall, your pre-youthful discovery of the shortcomings of Tolkien impelled you to seek out English literature, not American literary criticism. Semi-indeed, I didn’t grow up crediting Edmund Wilson &co. with any special degree of taste or insight because I didn’t read anything written by Edmund Wilson until I was in my early 20s and hence more or less completely grown up. And I doubt that I read any literary criticism by anyone at all until my late teens. Still, even before I cracked The Hobbit for the first time, I knew more than vaguely that there was such a thing as serious literature and that none of Tolkien’s books was an example of it. I recall that during that pre-The Hobbit-cracking epoch I cracked a volume from our family bookshelf, an anthology textbook from my mother’s college years called The Experience of Literature and edited by Lionel Trilling. 
Although I do not recall reading any of Trilling’s commentary on any of the literary works contained in the collection, I do recall reading five or ten pages of one of the works themselves, Tolstoy’s “(The) Death of Ivan Ilyich” (naturally in English translation [but don’t ask me to name the translator {although naturally he or she was doubtless Constance Garnett}, altho’ of course I could find out who he or she was easily, and doubtless even cheaply {i.e., even if I had to order The Experience of Literature from the usual source, at which a “Used, Adequate” copy with “visible wear on cover and numerous unidentifiable stains throughout” could not cost more than 10 American dollars (at least as of this writing [July 6, 2022], for in this microepoch of rising inflation, 10 American dollars may not suffice to purchase a can of supermarket own-brand cola [if so-called ESG standards have not made cola and aluminum cans legally unpurchasable by then] by the time this essay next sees empirical eyes, be they even my own qua self-proofreader)}]) before throwing the whole half-tome aside out of boredom. I recall, further, finding on the family bookshelf a yellowed paperback late-1950s or early-1960s edition of Dickens’s A Tale of Two Cities shortly after the broadcast of a television-miniseries adaptation of the book, a broadcast at which I had at least taken a voluntary and curious peek, and reading two or three paragraphs of the first chapter before throwing the compactly hefty Taschenbuch aside, partly out of visceral disgust with the smell and appearance of it, but also out of insuperable bemusement at the archaic grammar and syntax of the work itself. 
In particular I recall being put off and tripped up by the sentence that began “There were a king with a large jaw and a queen with a plain face,” a sentence which it seemed to me should instead have begun “There was a king…,” presumably because I had thitherto heard and seen the construction “there were” used exclusively in conjunction with numerically aggregated plural nouns like “two kings” and “two queens” (and “seven swans a-swimming” and “six geese a-laying”) and never in conjunction with concatenations of singular nouns like “a king and a queen.” Nonetheless, none-the fudging-less, by the moment of my abandonment of “The Death of Ivan Ilyich” and A Tale of Two Cities, a certain seed had been planted, as they say, in my Weltansicht, the seed of a certain notion of proper literature as something that was at least initially quite difficult and quite boring, a seed whose mere presence qua seed in my Weltansicht militated against my taking The Lord of the Rings fundamentally seriously even at the acme (or nadir) of my infatuation with it, and whose almost instantaneous efflorescence into a full-fledged tree (or, in strictly accurate figurative terms, preternaturally large flowering shrub) was guaranteed to take place the instant I fell out of love with the Tolkieniverse at the instance of any efficient cause, as indeed it did when, as recounted above, I eventually twigged the metahistorical-cum-metageographical incoherence of that Tolkieniverse. 
I do not doubt that such a seed was implanted in your Weltansicht or that it was guaranteed to effloresce on such a grand scale or that it did indeed eventually effloresce thereon, but by that same Tolkfest—i.e., given that the Trilling anthology, the Tale of Two Cities paperback, and the T2C television adaptation were all self-evidently mass-produced productions—mustn’t many thousands if not outright millions of your exact contemporaries have had that selfsame seed planted in their respective Weltansichten and consequently been likewise debarred from becoming lifelong Tolkien-worshipers and impelled to become eventual lifelong worshipers of proper literature? Nay, at this very moment (i.e., 6:16 a.m. EDT on July 7, 2022) mustn’t a statistically sizeable number of Anglophone youngsters be having that selfsame seed implanted in their respective Weltansichten—not, to be sure-ish, by Trilling-edited anthologies, yellowed paperback editions of Dickens, and TV-network-produced adaptations thereof, but surely by some serviceably analogous popularizing media of the present? Such that, what, my report on the current Weltgeist-shaping-cum-Zeitgeist-defining influence of so-called fantasy worlds needs must be greatly exaggerated? Such further that it can be assumed that a sufficiently statistically robust proportion of the Anglophone reading populace has all along been quite serviceably shoring up the Weltgeist and Zeitgeist against their succumbation to that baleful influence? Exactly. I believe I can put paid to both of these “such that” clauses via the adducing of one additional piece of evidence in two settings. 
The piece of evidence is the existence by my own youngsterian day of an embryonic version of what is now known as fan culture, a version thereof centering on feature-length animated-cartoon adaptations of fantasy novels including The Hobbit and The Lord of the Rings and the exhibition and viewing of these adaptations, along with other feature-length cartoons like Watership Down and material from the archive of such televisual-cum-cinematic science-fiction universes as Star Trek, at national and regional film festivals such as the annual one at the University of South Florida that my parents took me to three or four years (ca. 1979 to 1983) in a row. This subculture had not immediately sprung into existence with the mid-1950s publication of The Lord of the Rings; to be sure, The Lord of the Rings had been an immediate bestseller and indeed a sufficiently annoyingly high-profile bestseller to provoke a cantankerous mini-screed from Edmund Wilson, but in the mid-1950s even Tolkien’s most ardent admirers—for example, W.H. Auden—could not but know that their time was more appropriately lavished on other objects; in their eyes reading Tolkien could never have been anything more redeemable than the guiltiest of guilty pleasures and hence only an occasional diversion occupying a tiny fraction of their reading hours. Accordingly the very existence of the abovementioned subculture even in embryonic form a mere quarter-century later cannot but be interpreted as a sign of a substantial increase in the prestige if not necessarily the respectability of fantasy-novel fandom—for the dynamic in play was not a decrease in respectability but a displacement of respectability by coolness or hipness as the ideal ethos-cum-habitus—and a substantial decrease in the prestige if not necessarily the respectability of the admiration of serious literature. But even then, in ca. 
1980, fantasy literature-fandom was not yet prestigious in an absolute sense; it was not then yet cool or hip, and indeed, the Gesellschaftsgeist even of that microepoch had ready to hand and mouth certain pejorative terms for aficionados of fantasy literature and adjacent phenomena such as the above touched-upon role-playing game-playing (If this last construction seems unbearably clunky, one must remember that there is really no non-anachronistic alternative to it, for in ca. 1980 the concise freestanding gerund gaming was still employed exclusively as a synonym of gambling, as a word whose definition was “play[ing] at games of chance for money” [so the 1990 Concise Oxford Dictionary]; and the reflection that over the past 20 years gaming has come effectively to denote nothing but “playing role-playing games and shoot-’em-up video games for pleasure” speaks jeremiad-filled [and hence, inter alia, story-bereft] volumes about the moral-cum-intellectual vacuity of the new-style gamer, for the old-style gamer, for all his dissoluteness, did at least share with his contemporaries in the realm of productive labor the aim of reaping monetary gain from his defining activity.)—terms such as geek, dork, and nerd. But if fantasy literature-fandom was not yet hip or cool in 1980, what was preeminently hip or cool then? Why, I suppose, fandom of certain so-called underground or post-punk pop music bands along with certain semi-serious literature like the novels and memoirs of such latter-day libertines as Burroughs and Bukowski and certain semi-sub-commercial cinematic corpora such as those of David Lynch and John Waters. And would you maintain that this was an intellectually and morally preferable welt- und zeit-geistig state of affairs to one in which fantasy literature-fandom would have been hip and cool? I would and indeed do maintain this unreservedly. 
For as with the abovementioned gourmandization of traditional airport novels, in participating in the sorts of fandom catalogued in the last sentence but one, one was always somehow engaged with something with a referent in the actual world of the past or present. In reading a magazine review of a recording by, for instance, the then-present-day so-called industrial band Cabaret Voltaire one would quasi-automatically be impelled simultaneously to learn something about the Dadaist nightclub Cabaret Voltaire and the eighteenth-century philosophe Frank Voltaire; in listening to a recording by the then-present-day so-called post-punk band Magazine, one would quasi-automatically be impelled to learn something about Dostoyevsky’s oeuvre thanks to a song partly named after that writer’s Notes from Underground and featuring lyrics written from the point of view of that book’s narrator. The hipster filmmakers of the time, while by no means as well-read as their predecessors of the so (and on the whole, rightly)-called golden age of Hollywood (and I am by no means thinking only of the “artiest” of them, for even as blokey and “extravert” a director as Samuel Fuller found inspiration in so obscure a literary work as the diary of John Evelyn), took most of their points de repère from highbrow writers. Lynch wore his debt to Kafka on his sleeve so often that he could dine out on it (i.e., the debt or the sleeve, and figuratively or literally, take your pick of either or both [for this is after all the ever-so-wacky, ever-so-zany David Lynch we’re talking about here]) and Waters has said that he stopped caring about making commercially successful movies once he knew he didn’t have to check a book’s price before deciding to buy it. 
And the Beats and their progeny like Bukowski, although ultimately even more nugatory in intrinsic merit than Updike (being as they were in intrinsic terms but ninth or tenth pressings of Baudelaire and Whitman), were after all nothing if not pretentious, such that they could not help peppering their works with allusions to the work of their superior forebears with the shameless profusion of an Olive Garden waiter. In short, the purveyors of the most prestigious cultural wares of ca. 1980, although not necessarily all that cultivated themselves, were constantly proffering their readers and audiences what one would now term hyperlinks or rabbit holes leading to all manner of highbrow historically mediated phenomena. And “don’t even get me started” on the people involved in the production side of this hipster-driven sub-economy: playing in a pop band, even the most untutored primitive punk combo, required one to learn something about music and some intrinsically musical skill, be it only how to tune a string instrument and how to play in time with other performers; and playing in a band for any extended length of time, even a few consecutive months, forced one to develop a sense of discipline (yes, yes, yes, even if one spent the greatest portion of those months boozing or consuming controlled substances), be it only the discipline of showing up on time for practice-sessions and performances, and an appreciation of artistic form, be it only the rudimentary form of a Verse-Chorus-Verse-Chorus-Bridge-Chorus pop song. 
And of course, in a song-centered genre-cluster such as pop, the development of a sense of musical form was perforce attended not only by the development of at least a bare-bones appreciation of such elements of prosody as meter and rhyme but also and more significantly by a sense of what one might term referential exigency, a sense of what sorts of things were worth writing about and how to go about writing about them, even if those sorts of things turned out more often than not, at least initially, to be as banal as one’s rejection by a girl or a boy in whom one had taken and continued to take an amorous interest, for in the course of pursuing such banal themes, one’s boredom with the familiar and unenlightening treatment of them would eventually impel one to treat of them in unfamiliar and enlightening ways and to alight on such pricelessly infungible formulations as Morrissey’s “writing frightening verse to a bucktoothed girl in Luxembourg.” The discipline and hands-on experience of the objective world required in order to put together a film, especially films of such shoestring’d budgets as those of “the New Hollywood,” require little if any comment. In interviews recorded in the past twenty years, the surviving members of Waters’s casts and crews of the 1970s emphatically attest that despite their lack of accredited training in acting, set-designing, and the like, they were subjected by Waters to as grueling and exacting a regimen as they ever subsequently endured on the set of a big-budget mainstream Hollywood picture. 
And of course, Burroughs and co., despite their far from trivial dependence on so-called mind-altering drugs (and hence on a pseudo-escapist mode of being that was not completely unlike the one concurrently pursued by fantasy-literature fans), were at least affecting to be describing and recounting realities of bone-juddering immediacy, realities immersed in nature “red in tooth and claw,” and consequently at least occasionally succeeding in doing just that. “Fast forward,” as they say, roughly a quarter-century from 1980, and so no farther ahead than to 2005, and the former geeks, nerds, and dorks—the fans of fantasy literature and role-playing games—are uncontested rulers of the c******l roost. By 2005 it is not so much the case that, in the words of Huey Lewis in ca. 1985, “It’s hip to be square” (by which, I suppose, he merely meant that people like his fellow Back to the Future cast member Michael J. Fox were winning fame and wealth by portraying such decidedly un-dorky non-hipsters as preppy undergraduate economics majors and stockbrokers) as that it’s passé to be hip and trendy—and increasingly well-nigh-obligatory—to be a dork. And ever since 2005, serious literature—the body of writing comprising the entire 2,500-year-spanning canon of novels, epics, stories (short or otherwise), sacred scriptures, fairy tales, lyric(al) poems, plays, histories, and philosophical treatises—has been dwarfed (or, as Tolkien would doubtless have insisted, dwarved) if not dust-mited as a collection of points de repère by the canon of so-called fantasy writing. Tolkien’s writings mark the very quasi-primeval beginnings of this canon; in today’s collective imagination they occupy the place that used to be occupied by the Bible, Homer, Herodotus, and Plato; The Hobbit and The Lord of the Rings are today’s Iliad, Odyssey, Republic, Histories, and Book of Genesis all rolled into one. 
The books in the Harry Potter franchise occupy the place formerly occupied by Shakespeare or perhaps Dickens, which is to say that although they are of much more recent provenance than the founding texts they are generally much more greatly revered and ardently loved than the latter. And of course since the absorption of the Harry Potter franchise itself into sub-literary history in ca. 2010, dozens of other slightly more obscure franchises have come to occupy the place formerly occupied (in point of snob value if not in substantiality) by the great and at least reputedly great modernists—Proust, Joyce, Beckett, Kafka, Hemingway, et al. The effect of the accession of this body of sub-literary work as the c*****l lingua franca on anyone who has had the equivocal fortune to outgrow fantasy literature is more than merely analogous to that of a talking lion as imagined by Wittgenstein in the Philosophical Investigations: it would not be possible, says Wittgenstein, to learn the lion’s language, because a minimum condition of learning a foreign language is a shared world to which the foreign-language speaker’s language and one’s own language both refer, and a lion’s world has nothing in common with our own. 
A person who has enjoyed or suffered the just-mentioned equivocal fortune cannot get through a single non-solitary day (i.e., any day apart from one in which he encounters no living humans either immediately in the flesh or mediately via any electronic, acoustic, or paper medium) without being assailed by chunks of conversation and discourses liberally larded with unintelligible references to the corpus of fantasy sub-literature, and indeed, to the indisputable and exhaustive extent that any authentic work of art must register the state-of-the-art subjective experience of its historical moment of origin, if any authentic novel or novel-like text is still capable of being written, such a novel needs must center on a narrator-cum-central character repeatedly striving unsuccessfully to make sense of such conversation-chunks and discourses. I can think of little, if anything, more arrogant than your just-tendered implicit arrogation to or for yourself of the loci dramatis of Charlton Heston in Planet of the Apes (with inscrutably talking lions substituting for scrutably talking apes) and Vincent Price in The Last Man on Earth (with Tolkien, Rowling, et al.-quoting post-Gen-Xers substituting for vampires). For after all, the besetting quotidian phenomenon you are bewailing is merely the super-banal one of simply not being au fait about a certain state of affairs, a phenomenon that has doubtless been experienced by everyone in the world, whether living or dead, multiple times in the course of a life—the exact same flavor of alienation as that experienced by a lifelong baseball-only fan taking in his first televised cricket match, or by a dim-sum virgin perusing a dim-sum restaurant’s menu for the first time, or by a Roman-history ignoramus at his inaugural sighting or audition of the names Scaevola and Lars Porsena. 
Surely if you took the trouble to explore the corpus of post-Tolkien fantasy literature—and mind you, I by no means wish to imply that such trouble is necessarily worth taking—you would come to find a large proportion of these conversation-chunks and discourses as readily intelligible as you now find (or would find, in the requisite meta-cultural setting) conversation-chunks and discourses liberally larded with references to Shakespeare or Samuel Johnson. Fine: go ahead and think that, as you are avowedly incapable of thinking otherwise. But by that selfsame tokefest, I am incapable of thinking anything more arrogant, and indeed not only more arrogant but more willfully fatuous, than your obliteration of the drastic qualitative distinction between allusions to the corpus of so-called fantasy literature and allusions to such corpora as those comprised by the rules of a sport, or the annals of history, or the canon of serious literature, or even (for I believe in certain respects that foodie-ism is as pernicious as so-called fantasy literature in the will-o’-the-wisp-in-a-hall-of-mirrors-like character of its referential nexus) the recipes of a cuisine. For it is in the nature—an historically mediated and therefore admittedly contingent but for all that very well-established nature—of these latter sorts of corpora that upon encountering for the first time a reference to an entity hailing from them one has even in the midst of one’s ignorance a fairly well-defined notion of what one has been missing out on and what one will learn as a consequence of becoming better acquainted with that corpus. 
One knows that cricket like baseball is an athletic game with certain rules and that however ridiculous or irrational those rules may end up seeming to one as a baseball fan one will, upon close enough acquaintance with those rules, know what range of events to expect in the course of a cricket match (or test [the distinction between a test and a match being perhaps the most glaringly ridiculous of the rules in point]). One does not expect to learn, say, that the homonymy of cricket qua game-designator and cricket qua insect species-designator is not merely an inscrutable linguistic phenomenon (as it in fact is at least according to my 1990 COD, whose etymology of cricket qua game-designator reads in full “16th c.: orig. uncert.”) or even (as linguistic historians or lexicographers may have ascertained since 1990 for aught I know) a scrutable linguistic phenomenon attributable to some real but inconsequential connection between the game and the insect-species (say, the habitual disposition of early cricket batsmen to rub their legs against the sides of their buttocks before taking up their assigned position in front of the wicket) but a scrutable linguistic phenomenon attributable to, say, the demonstrable if inscrutable biological metamorphosis of a certain proportion of the players into entomologically unimpeachable crickets in the course of a match (or test).  
By the same tokefest, on hearing a name like Scaevola or Lars Porsena in conjunction with the phrase “Roman history,” one knows, even if one has not read a word of Livy or indeed even heard of Livy, that whatever one learns about Scaevola or Lars Porsena will eventually be intelligible as materially impinging, by however lengthy a chain of causation, on the twenty-first-century capital of the nation-cum-polity known as Italy; and one knows further that whatever one learns about that semi-legendary Roman and that semi-legendary Etruscan will evince no causal connection whatsoever to their exact contemporaries in Zhou-dynasty China and at best precious little connection to their exact contemporaries in Goa under the Bhojas. And finally, on hearing of a chap by the name of Hotspur or a blokess by the name of Mrs. Grundy and further hearing that the one is a character in a Shakespeare history play and the other a character in a Victorian novel, one can be fairly certain that the one owes at least a groat of debt to some historical personage who was involved in events that eventually eventuated in the accession of Henry VIII to the English throne and that the other is involved in a narrative nexus in or on which questions of social respectability and sexual propriety exert at least an obliquely prominent influence. 
In each such instance (yes, yes, yes, exactly as in the instance of an encounter with a traditional airport novel, but by and large a fortiori) one knows in advance that in pursuing the body of knowledge behind the reference one will eventually alight on something that will enlighten one about the world in the watertight-as-a-duck’s-bum Tractatus-Wittgensteinian sense—viz., “that which is the case” (“is the case” being a predicate that encompasses not merely that which is fully existent in the present but also that whose existence in the past is indisputably assertable [such that it is the case that Louis XIV was or used to be the King of France])—of which one is unbudgeably a part, and consequently about one’s place in that world in the fullest and richest sense—a sense encompassing all one’s potential accomplishments, pleasures, duties, etc., therein. Whereas in the instance of an initial sighting or audition of a reference to the realm of so-called fantasy literature one knows in advance that in pursuing the body of knowledge behind the reference one will never alight on anything that will enlighten one to the tune of a pico-lumen about the just-referenced Tractatus-Wittgensteinian world, because however ponderously any knowledge-member therein may initially seem to bear on some so (and for the most part rightly)-called real-world decision, it will with Ezekielian inexorability connect to some other member therein that cannot by hook or by crook be brought to bear a featherweight of bearing thereon. Initially, when taken in isolation, the interlude at the inn at Bree in The Fellowship of the Ring seems a serviceable enough analogue to the still-eminently imitable blokey pub-sited pissfests of Fellowship’s exact contemporaries, the so-called Angry Young Men novels and so-called kitchen-sink dramas, but any attempt at patterning one’s own ethos or habitus qua regular at one’s local boozer on the ethos or habitus of Frodo & co. 
during this episode will be stymied by the ineluctable fact that this episode is undislodgeably embedded in a welter of things that are not, never have been, never will be, and never can be the case—notably thereamong, an order of ghosts on horseback whose sole raison d’être is to prevent a magic ring from falling into a volcano, because the fall of that ring thereinto will precipitate a top-to-bottom reconstitution of the known moral-cum-meta-moral universe. But my dear fellow, you rejoin through a belly laugh of such insufferable condescendingness that I am much of a mind to send it back up your larynx via a sucker-punch or even a kick to the paunch that is serving as its resonator, you seem to be missing an eye-burstingly obvious fact about such fantastic elements, namely that they are largely or entirely symbolic in function or purpose. In tendering this demurral so smugly you are merely shewing how deeply and irretrievably your mind has fallen prey to the besetting intellectual rot of the age, for a moment’s reflection would have reminded you that this argumentum ex symbolo is both inapplicable to and gratuitous in any pertinent consideration of any personage, place, or object sited in the canon of serious narrative literature, a canon that the aficionados of so-called fantasy literature have no right to ignore inasmuch as their own canon is shamelessly superstructed on it in being centered on a succession of cycles of novels. One does not say that Tom Jones symbolizes youthful insouciance or that Scrooge symbolizes avarice or that the Baron de Charlus symbolizes homosexuality; one says, rather, that each of them exemplifies or instances or embodies his signature quality, meaning that he is to be understood as a faithful representation of an empirical person empirically bringing to bear this quality on the empirical world of the moment of the genesis of the literary work that he immediately inhabits. 
(That the plausibility of the representation is not to be taken for granted should go without saying, as the case of Scrooge in particular eloquently attests in virtue of the extremely schematic quality of Dickens’s depiction of Scrooge’s counting house and the transactions transacted therein.) And one derives instruction and enjoyment from reading about these characters in exact proportion to one’s interest in empirical people’s embodiment of these qualities in the empirical world of one’s present, or in the empirical world of one’s past as it impinges on that present-world. In appealing to the argumentum ex symbolo in defense of fantasy literature, one merely spackles over the impossibility of deriving any such instruction or enjoyment from the characters and other elements of that literature owing to those elements’ unassimilability to any version of the empirical world at any stage of its history. Oh, come now, my dear fellow! you rejoin through a splutter that is no less condescending than your belly laugh was, albeit that, being a splutter, it is actuated by outrage rather than mirth: This demurral of yours at my appeal to the argumentum ex symbolo is obviously instantly overturnable by the soundest and most watertight argument of all, the argumentum ex gratia, the argument on the evidence of popularity. For if people were incapable of deriving delight or instruction from fantasy novels, they would most certainly not be purchasing them in the tens of millions, let alone endlessly citing them in their quotidian chatter. 
I do not deny that people buy these books in the tens of millions or even that they derive delight from them (let alone that they cite them in their quotidian chatter, for I have already complained of this ubiquitous citation), but I do emphatically deny that anyone is deriving an iota of instruction from them, for instruction perforce consists in learning what is true (or, in the above-adduced case of the rules of an athletic game, what is consistently applicable in some internally cogent setting), and what people learn from reading so-called fantasy novels is incorrigibly false. So do you seriously maintain that from reading these novels people come to believe that they and their peers and compatriots, their friends and adversaries, are, say, magicians or quasi-human beings such as elves—and in any case, beings who can work their will in the world via supernatural means such as magical spells and charms? I believe that a certain proportion of them are coming to believe just that, and coming to believe it, indeed, thanks to a mechanism of epistemological legitimation built into many of the books themselves—for after all, the Harry Potter books are centered on an initially seemingly ordinary English boy who is living in an initially to-all-appearances prosaic, utterly disenchanted late twentieth-century England and who only belatedly discovers that he and a tiny minority of his peers are possessed of magic powers. (C.S. Lewis got this whole line of bullshit started with his mid-twentieth-century wardrobe backing onto a snow-carpeted sylvan realm inhabited by a kindly alibidinous satyr and a cruel ice queen-cum-enchantress straight out of Snow White, his Dickensian Victorian bachelor uncle who turns out to be a magician, and the like, in his Narnia novels.) 
The preponderance of this proportion probably grow out of believing that they have been granted magic powers as individuals, retaining the belief in such powers only via their initiation into cults such as Wiccanism and the Gnostic church, collectives that inculcate in their adherents a belief that they possess an esoteric form of knowledge capable of working palpable changes in the world—i.e., more than figuratively, a kind of magic. At its very least deranging extreme (which is by no means necessarily its least pernicious one), a steady reading diet of fantasy novels engenders in the reader’s mind a surfeit of Walter Mitty-esque fantasies ([sic] on fantasies qua repetition of fantasy in the plural, for why [in] the fudging heck do you think they call the novels in question fantasy novels?) wherein one imagines one’s quotidian self as a magician routinely using his magical powers to discombobulate, paralyze, or even annihilate such helpless and incorrigibly unmagical saps as one’s boss, spouse, mum, dad, and doubtless even (now that two generations of full-time fantasy lit-gourmandizers have come of age, such that we now horrifyingly cohabit with scads of fantasy lit-gourmandizing mummies and daddies) one’s children; not to mention a Weltansicht wherein a metaphysical and religious (or if you insist, Mr. Vulgarian, “spiritual”) essence is ascribed by default to all temporal powers, an essence qualitatively distinct from and inferior to that traditionally ascribed to those powers’ relation to the Judeo-Christian God in that it is both more immediate and more prosaic. 
In former ages one might indeed have conceived of some political figure whom one regarded as a tyrant or despot as satanic, and in conceiving of him thus one would indeed have been imputing to him all the malevolence and sinfulness a dedicated Satan-worshiper was conceivably capable of (and indeed occasionally—i.e., when his lifestyle gave a certain warrant for inferring as much [a warrant in the form of, say, his CCTV-documented regular attendance at all-night sex parties in which goat’s blood was ritualistically consumed]—even conceiving of him as an actual dedicated Satan-worshiper), but one would still have regarded him as a fellow human being and have regarded the Devil’s manipulation of him as prevailingly oblique and impersonal, much after the manner of God’s beneficent oblique and impersonal intervention in the world under the name of Providence. In contrast, the dedicated Tolkien fan (and remember that “in a very real sense” all fantasy-lit fans are dedicated Tolkien fans inasmuch as Tolkien is the Homer, Plato, et al. of the fantasy sub-literary canon) in conceiving of some geopolitical baddie as Sauron regards him not merely as a minion of the supremely evil being but as the supremely evil being himself; and in the Tolkieniverse the supremely evil being engages with the world in a prosaically immediate manner, a manner that is indeed less reminiscent of the Devil’s engagement therewith than of a twentieth-century dictator’s: Sauron has at his beck and call a group of cloak-and-dagger spy-like minions on horseback with whom he maintains constant radio-like contact, and he personally spies on the world at large and (when the secret-decoder-ring-like magic ring permits) at small via a giant bloodshot eye that cannot help both recalling Orwell’s Big Brother (remember: the publication of The Lord of the Rings postdates that of Nineteen Eighty-Four by half a decade) and anticipating Kubrick and Clarke’s HAL 9000 computer. 
The Tolkien fan consequently conceives of this baddie as a figure that is at once far more menacing and far less menacing than the secular evildoer of yore and corollarily conceives of evil itself as altogether less menacing because all the more intimately and therefore ignobly involved in the mechanical goings-on of the secular world. Naturally this prosaification of evil via its Tolkienification has fed into the fortunes of the far-above touched-upon founding myth of our epoch, the myth of Hitler as “the worst guy ever.” Those Anglo-Saxons of the generation who defeated Hitler might have been expected to be possessed of an abhorrence of his evilness whose intensity could never be matched by succeeding generations, and doubtless “in a very real sense” that intensity remains unmatched. All the same, in reviewing the archive of books, magazine articles, television programs, etc. produced by the members of that generation within the first two decades following the war one cannot but be struck by the insouciance-verging-on-flippancy of their attitude towards even the worst horrors inflicted by the Nazi regime. Charles Chaplin’s portrayal of the Führer himself as a bumbling figure of fun in The Great Dictator has often been excused on the grounds that the magnitude of the atrocities in the death camps was as-yet unknown in 1940, but the same get-out-of-Downtown-Nuremberg-jail-free card cannot be issued to Mel Brooks’s over-the-top light-making of the Third Reich in The Producers in 1968 or John Cleese’s farcical impersonation of an early-1970s Britain-residing Hitler running for his local council under the assumed surname of Hilter. And yet these humorous treatments of Nazis and Nazism were not at all controversial; they were indeed enjoyed and applauded by people of all ages, political persuasions, and religions. (For after all, Mel Brooks himself was—and is—a Jew.) 
Fast forward, as they say, to the third decade of the twenty-first century, and one finds a stand-up comedian being banned from YouTube and facing criminal charges for digi-filming and posting a routine centering on his training of his dog to make a Nazi salute; one finds a member of the British Royal Family being universally excoriated for having donned an SS uniform as a fancy dress costume. Is the reason for this sea-change in public attitudes to jocular impersonations of Hitler & co. that the magnitude of the atrocities in the death camps is even better understood now than it was a half-century earlier? Most likely not, if one is to judge by the dwindling of the torrent of newly published history books on the Third Reich to a mere trickle over that selfsame half-century (and it seems reasonable to judge by that dwindling inasmuch as one does not often find these earlier-published books being cited by one’s contemporaries). The reason for the sea-change, it seems to me, is that thanks to the universal quasi-scriptural status assumed by fantasy lit since then, both Hitler himself and the Nazi movement are understood in entirely different terms than they were a half-century ago. Hitler is now understood to be an avatar (in both the classic meta-religious sense and the tacky video-gamic sense) of Sauron, a conduit of an evil too intense to be confined in a single human body, and consequently construable only as part of a gestalt additionally composed of the S.S., Gestapo, and Wehrmacht qua so many ringwraiths, nazguls (?), and orcs; a gestalt distinctive in virtue (or perhaps, rather, in vice) not only of the unalloyedness of its evil essence but also in virtue of its exhaustiveness qua container of that essence. 
Once one has memorized the personnel of the Nazi regime and their interrelation in the hierarchy one has comprehended Nazism as fully as this new schema permits—which is to say, not a jot, inasmuch as the understanding of any phenomenon embedded in and mediated by history is ineluctably contingent on an understanding of its formation by other earlier historical phenomena that however inalienable from it they may be are qualitatively distinct from it. In Nazism’s case these phenomena include, inter multissima alia (and in no particular order of priority of substantiality [inasmuch as this order is of course highly disputable and hotly disputed]), Wagner-worship, Nietzsche-worship, post-World War I revanchism, late-stage industrialization, militarization, and intensified economic competition of Germany with the United States. And do you really think the old-style treatment of Nazism by Chaplin, Cleese, Brooks, et al. adequately captured the contributory force of all these phenomena in relation to Nazism? Of course not, and indeed the inadequacy of this treatment was recognized almost from the very start by (surprise, surprise!) Theodor W. Adorno, who quite rightly maintained that Chaplin’s portrayal of Hitler as a laughingly bumbling oaf ineluctably made him seem as intrinsically and utterly harmless as the laughingly bumbling oaf next door. Still, it cannot (or at least shouldn’t) be denied that the old-style treatment of Nazism captured a facet or portion of the truth about Nazism—the truth that, inter alia multissima pessima, the Nazis were extremely ridiculous, and indeed ridiculous in exactly the same way and for exactly the same reason as the laughingly bumbling oaf next door. The conception of Hitler & co. as a Sauron-&-co.-like gestalt captures no portion of the truth about Nazism, because although Nazism was evil in a more than usually real very real sense (whence the absence of the usual scare quotes), neither Hitler nor the rest of the Nazis was or were evil in anything like the way in which or for the reason for which Sauron-&-co. are evil ([sic] on the are, for as I learned in tenth-grade English class, one must always write about fictional characters in the present tense, “for they are as alive now as they were at the moment of their creation”); they were not mutually interactive and collectively self-contained agglomerations of a malevolent evil plasma or jelly; nor were they dispatched in anything like the way in which or even for the reason for which Sauron-&-co. are dispatched in The Return of the King. Well, naturally not, because the Nazis were not magically spirited away thanks to the casting of a magic ring into a magic volcano crater. Naturally not, but according to the Tolkienian conception of Nazism, the Nazis might as well have been spirited away by magic inasmuch as they vanished from the world the moment the mushroom clouds sprouted from Hiroshima and Nagasaki just as spectacularly as Sauron & co. vanished from Middle Earth with the eruption of Mount Doom. But for Daimler-Chrysler’s sake, Hiroshima and Nagasaki were in Japan, not Germany! Well, of course they were, but at the moment we are not dealing with the objective events of the war but rather with the war as imagined in the popular imagination and specifically the debased popular imagination of a populace composed of fantasy-lit readers, in whose minds the Japanese under Tojo & co. were but an auxiliary gestalt of Hitler & co. (yes, yes, yes: despite the fact that the U.S.’s entry into the war was provoked by a Japanese attack, that Japanese-Americans but not German-Americans were interned by FDR, et-fudging-cetera, et-fudging-cetera).  
But if according to the Tolkienian conception of Nazism, the Nazis vanished from the world back in 1945, how can that conception conceivably have exerted any influence on the popular conception of any latter-day statespeople, let alone determine that popular conception today?  It can have exerted that influence and can determine that popular conception today (and indeed will be able to continue to determine that popular conception for the indefinite future) on account of the ontological quasi-independence of the Tolkieniverse from the history of the empirical world, on account of the fact that the events of The Lord of the Rings did not actually take place between ca. 1933 and 1945, on account of the fact that although in the Tolkienian conception Hitler & co.’s essence was exhausted by Sauron & co., the essence of Sauron & co. is by no means exhaustible or even ever-so-slightly depletable by Hitler & co., such that the Sauron gestalt is ever-amenable to being instantly reinserted en bloc and in toto into empirical history the instant a present-day statesperson is likened to Hitler or even generically described as a despot or a dictator, inasmuch as Hitler has of course become the archetypal despot or dictator. (This perennial reinsertability of the Sauron gestalt is incidentally the most damning demonstration to date of the gormlessness of Tolkien’s assertion that the plot and fabulae personae of The Lord of the Rings owed absolutely nothing to the historical events of the period of its composition, for even if those events did not exert even an unconscious influence on that composition, in the long run The Lord of the Rings has both made those events its own and made them the pattern for all subsequent historical events via their extrusion through its plot and fabulae personae.) But even granted that the Tolkienification of post-ca.-1933 history is an inadequate means of grappling with that history—It’s not inadequate but wrong. 
All right, granted that it’s wrong, what is the right way of grappling with that history, given that you’ve already dismissed satire as inadequate, which perforce falls far short of right even if it is not completely wrong? The right way to deal with that history is simply to resign oneself to the impossibility of encompassing it within the bounds of a single literary mode or genre, or at any rate of encompassing it within any mode or genre but one that is so gosh-damn(ed) prosaic that one seldom bothers to think of it as literary at all—viz., documentary historiography. And by “documentary historiography” I take it you mean some sort of textual analogue of those documentary films about Hitler & co. that were notoriously popular on the History channel twenty to thirty years ago. No, I mean a textual actuality possibly likenable to nothing but itself and consisting of nothing but the compilation of documents more or less transparently germane (if not necessarily German) to the phenomenon in question. So in the specific case of Nazism, I would imagine some sort of massive anthology of every sort of text that either instantiated or recorded some phenomenon that in some transparently germane fashion fed into Nazism over the course of however many centuries or even millennia through which some degree of continuity of influence is traceable. So for instance—to take an example about which I am well-nigh completely ignorant—one has been told that the Nazis considered themselves “Aryans,” a designation that before them had referenced a group or lineage of people resident on the Indian subcontinent, and that their symbol, the swastika, ultimately hailed from some Indian-subcontinental sub-civilization. 
Presumably they got the idea of adopting or arrogating these India-originating elements from somewhere other than archeological or anthropological research directly engaged in onsite by future members of the NSDAP; presumably some Germanophone or other in the late nineteenth or early twentieth century wrote a book in which he stated something that led future members of the NSDAP to find it plausible to consider themselves Aryans and to adopt the swastika as some sort of personal or collective emblem. Accordingly, an acquaintance with this book will allow one to trace precisely to what extent and in what way the Nazis were and were not indebted to subcontinental India for their idea of the Germans qua so-called Herrenvolk, qua so-called master race, by allowing one to see the extent to which this book reported and distorted the meaning and function of Aryanism and the swastika in subcontinental Indian history. B-b-but isn’t there a danger that in acquainting oneself with these documentary sources one will discover that, that, that—What, that the Nazis were actually not completely wrong about something? In the supposedly genuinely immortal words of HRH J.H. Christ (at least as reported in some post-KJV English translation) “You said it, not me.” Why, of course there’s such a danger, and the mere fact that one has to be reticent about saying something like that attests to the potency of the Tolkienian conception of Hitler & co. as Sauron & co. It is as if one is afraid of conjuring up all the evil essence of Nazism in one go merely in the utterance of the very name Nazi in any other context than that of a curse. 
But the most potent “danger,” if one is obliged to term it such, is not that one will learn something good about Nazism as such but that one will learn something “bad”—that is to say, something discrediting—about someone or something that one has formerly regarded as unalloyedly good in virtue of his or its supposedly complete immiscibility with Nazism, and simultaneously and corollarily learn that someone or something one had formerly regarded as completely inalienable from Nazism was actually eminently alienable from it and therefore not altogether bad, not altogether discreditable. You’re obviously thinking of some quite specific person or thing or cluster of persons or things, so pray do stop this about-the-bush-beating, particularly if it is about an actual bush (or Bush). I shall gladly do that, and it is not about an actual bush or Bush (although I doubtless could make it about one if I wished to do so—i.e., by focusing on the 43rd U.S. president’s grandfather-cum-41st U.S. president’s father’s financial transactions with the Third-Reich German government). The thing in question is the ardency and seamlessness of Jewish participation in German nationalism from the time of its birth in ca. 1800 right on up to Hitler’s accession to supreme executive power in Germany in 1933; the fact that throughout this period a great many German Jews, and perhaps the majority of them, regarded themselves as Germans first and Jews second or even not at all and were regarded by people outside the German-speaking countries as Germans first and Jews second or not at all. 
One thinks, in connection with one of the earlier epochs of this period, of Heinrich Heine, who after Goethe’s death assumed the mantle of the most internationally illustrious German poet and who chauvinistically maintained that Shakespeare’s characters were more like nineteenth-century Germans than nineteenth-century Englishmen; and in connection with the last epoch thereof, of Arnold Schoenberg, a Jewish citizen (or, technically, subject) of quasi-theocratically Roman-Catholic Austria who converted to Lutheranism, the arch-German religion, at the age of twenty-four, who in his early forties enthusiastically enlisted in the Austrian army upon the outbreak of World War I, and who in his early fifties, in the mid-1920s, triumphantly maintained that his new twelve-note system of composition would assure the supremacy of German music for the next hundred years. Are you then denying the prominence and pervasiveness of anti-Semitism in pre-1933 Germany? By no means—like anyone even vaguely familiar with music history, I am aware that the most internationally illustrious German composer after the death of Beethoven, Richard Wagner, was a fervent and very public anti-Semite; and from Mark Twain’s account of his sojourn in Berlin in the 1890s I have learned that however they might have disagreed with each other on other matters, the various political parties in the imperial Reichstag were united by the fact that “they all hate[d] the Jews.” And of course one cannot pretend that late nineteenth and early twentieth-century Zionism was not as strongly actuated by European Jews’ desire to escape persecution in their own countries as by their desire to reclaim the Holy Land. What I am denying is that there was something intrinsically anti-Semitic about Germans’ pride—or even smugness—in being German and that the intensification of this pride over the course of the nineteenth and early twentieth centuries was ineluctably vectored toward the Holocaust. 
For pre-World War II European anti-Semitism was of course by no means confined to the German-speaking countries, as is instanced by the Dreyfus affair. But of course no present-day Anglophones, apart from super-enthusiastic Proust fans, the ones who not only read all seven volumes of A la recherche but also pursue the minutiae of his biography, have any longer even heard of the Dreyfus affair. And so, although many present-day Anglophones (although probably not most of them) are aware that France was occupied by the Germans and had an openly pro-Nazi government through most of World War II, most of those selfsame many think of the anti-German resistance fighters as the real Frenchpeople and of their fight against the Germans as being exactly coextensive with a fight against anti-Semitism (even though a good portion of those resistance fighters were hardly philo-Semites themselves). Consequently no Anglophone today regards French nationalism in a sinister light; today no Anglophone begrudges the French their overweening smugness about the infungible superiority of their baguettes and croissants and escargots and whatnot, or even regards this smugness as a manifestation of nationalism at all. Corollarily, every supposedly liberal or progressive Anglophone has come to regard nationalism itself as a phenomenon both invented and monopolized by the Germans and inextricable from anti-Semitism in particular and xenophobia in general, and as a sub-consequence of this corollary, it is enough for any present-day Anglophone to evince the faintest concern about the territorial and demographic stability of his home polity to be labeled a nationalist and corollarily a Nazi and corollarily a real-world minion of Sauron whose utter annihilation is not only not to be regretted but positively exigent. 
And yet if the history of the early twentieth century had taken a very slightly different turn, we might now be regarding the connection between nationalism and xenophobia in a very different light.  Do you actually meantersay that if the Germans had won World War I the Holocaust might have been instigated and carried out by the French rather than by the Germans? I actually meantersay that—with, of course, the proviso that that might is a very big might, but it is not a might of infinite size. One must remember (or be made aware for the first time, assuming one is not one of the abovementioned super-enthusiastic Proust fans), after all, that Dreyfus was charged with being a German spy, and that hence the Dreyfus affair was at least as much a Germanophobic affair as an anti-Semitic one, and further hence that, pace our (or at least my) high-school history teachers’ presentation of the start of the First World War as an almost purely mechanical process actuated by a Rube Goldberg (or Heath Robinson)-like assemblage of treaties and quasi-treaties that had been constructed by diplomats almost as a sort of intellectual party game with no connection whatsoever to the sentiments of the populaces of the polities for whom they had been deputizing, the French of the Belle Epoque were on the whole a nation of German-haters itching for a fight with Germany, such that had they lost the Great War they might very well have felt a flavor of ressentiment not radically different from that felt by the Germans after the signing of the Treaty of Versailles (which naturally in that case would have been the Treaty of Potsdam), viz., a sense that they (or specifically their non-Jewish majority) had been sold out to the Germans by the Jews in their midst.  
(As a quasi-side-note I should mention that Germanophobia seems to have complementarily trumped anti-Semitism in the United States in the run-up to the U.S.’s entry into the war, as instanced by the interpellation of a Jewish lawyer—a man presumably not accidentally surnamed Dreyfus—as a German spy in the immediately pre-entry-set portion of Dos Passos’s U.S.A.) As to whether in that case the French would have gone so far as to construct death camps…well, in weighing the answer to this question one is admittedly inclined to aver that there is enough substance to the Anglo-Saxon stereotypes of the Germans as efficiency-driven technocrats and the French as airy-fairy theoreticians to incline one to answer it in the negative. Still, one can quite easily imagine a post-World War II French government’s mass deportation of the country’s Jewish population to Palestine. And then in this connection one must consider the influence of both early-twentieth century Germans’ and early-twentieth century Anglo-Saxons’ attitudes to the French over the longue durée of the two-and-a-half centuries since the accession of Louis XIV, on the military front, and over the super-longue durée of the entire post-medieval era on the c******l front (if not some super-super-duper post-ca.-1066 longue durée on the c******l front, for one finds glimmers of the attitude in question even in Chaucer, and at least north of the Channel it surely has its roots in the Norman Conquest), attitudes that basically boiled down to a super-national inferiority complex—the sort of complex that is singularly inappropriately nicknamed the Napoleon complex, inasmuch as, for all post-World War II Brits’ and Yanks’ mutually backslapping belittlement of the French as perennial losers-cum-military freeloaders who “deserve a civil war so that they’ll finally win a war,” as John Cleese joshed only a few weeks ago as of this writing (August 1, 2022), up until World War I it was the French rather than the Germans or any of their pre-unification predecessors who were regarded as the preeminent military badasses of Europe (and indeed, although the Prussians certainly yearned for France-worthy near-hegemony from the reign of Frederick the Great onwards, by the same Tokay-fest, the Austro-Hungarians were satirized and adored for their quiescence, their need to be dragged kicking and screaming into any military enterprise), such that to the debatable extent that Mr. Putin is to be regarded as a territory-monger, it is far fairer to compare him in that capacity to Napoleon (whether I or III, for regardless of whatever degree of blame may accrue to Bismarck qua provocateur, the Franco-Prussian war was started by France) than to any Tsar. 
And on its just-mentioned c******l front, this complex consists in the sense among all Anglophones and Germanophones (and perhaps even a fortiori Russophones) that the French are simply more sophisticated, more discerning, classier, much more tasteful, and altogether much cleverer than them, a sense both inculcated and manifested by these non-Francophone countries’ respective seventeenth- and eighteenth-century aristocracies’ preference, in varying degrees, of French to the vernacular when speaking amongst themselves; a sense against which the Nazis’ styling of the Germans as the Herrenvolk must ultimately be seen as a pathetic protestation (and indeed, in all these countries the Jews were in some measure a convenient domestic scapegoat for a Francophobia that could only be intermittently expressed through wars), a sense that is still manifested in the apparently insuperable prestige of French cuisine and viniculture and the ludicrous ease with which the most preposterous ideas have been absorbed into the English intellectual mainstream thanks to their concoction and promulgation by Frenchpeople. Where exactly are you going with all this? Where I’m going with all this in the short term [obligatory Paris Metro or Eurorail-centered conceit preempted on account of the present writer’s unregenerate near-total ignorance of the trajectories of even the most prominent Paris Metro and Eurorail lines] is to some grotesque so-called alternate-world version of 2022 in which some anti-Jewish atrocity on the scale of the Holocaust had been perpetrated in ca. 1940 but had never since been greeted with any more deprecatory reaction than a mildly disapproving shrug from the so-called international community simply because it had been committed by the French rather than by the Germans. 
Where I am ultimately going with it is to the assertion that the sort of immersion in documentary historiography that I am recommending vis-à-vis the most myth-saturated historical phenomena will almost always (and possibly absolutely always) suggest if not reveal that these phenomena are both super-situated and super-contingent in their manifestation, meaning that they could not have manifested at any even slightly earlier or slightly later historical moment and that certain other phenomena of drastically different repercussion and ramification might just as easily have manifested in their place; and corollarily that such an immersion in even a handful of such phenomena will inexorably lead one to the conclusion that in pondering prospectively historical phenomena of the present, one would do best to consider them as things-in-themselves—as unique and unprecedented events or entity-constellations—as long and near-exhaustively as possible and to postpone reaching for comparisons to past-sited phenomena as long as possible and thereafter to avail oneself of such comparisons as sparingly as possible. So for example, to confine ourselves to the neighborhood of the Nazi-gestalt, if one encounters an instance of what palpably seems to be an abuse of authority, why not simply call it an abuse of authority, an expression that is meta-historically burden-free, rather than fascism, perhaps the most meta-historically over-laden designation in the history of history? 
This sounds essentially like the prescriptive substrate of your anciently tendered riffette on Santayana: “Those who imperfectly remember the past are condemned to flatter themselves that they are repeating it.” And that is essentially what it is, although with regard to the above-discussed influence of fantasy-lit on the interpretation of history, imperfectly remembering does not accurately describe the shortcoming in question, for the fantasy lit-fans are not merely bringing to bear an incomplete knowledge of history on their interpretation thereof but rather imposing thereupon a template that is utterly alien to history; such that in conceiving of a real-world personage of the present as a fascist they do not merely tog him out in their imaginations as a Brown-shirt or an SS officer, as did their merely ignorant countercultural predecessors of two generations ago; rather, in their chimera-saturated imaginations they transmogrify him from his physiological core outwards into an orc or the abovementioned all-seeing eye of Sauron. And such being the case, in other words, given that this manner of misconstruing history is entirely new, and that these misconstruers are themselves historical entities, if only en bloc, one is duty-bound to be on guard against interpreting this manner itself and its place in the context of other present-day phenomena in terms mis-borrowed from those of the past. 
For example, one is meant to take for granted nowadays that the United States is “deeply divided” between a bicoastal “blue-state” urban “elite” (i.e., an upper-middle-cum-upper class) and a rural, inland “red-state” lower-middle-cum-working class depreciated under the auspices of various pejoratives that I will not name, and I at least half-sincerely believe that this idée reçue is one of those rare idées reçues that richly deserve their hearty general reception; indeed, I at least a-quarter-sincerely believe that contra every idée reçue about the American past (every one of which idées reçues would tend to undermine the popularity of the “red states vs. blue states” idée reçue if it were considered in comparative juxtaposition therewith, but idées reçues do after all exhibit a depressing tendency not to be considered in comparative juxtaposition with each other), the division between urban American life and rural American life is sharper and more pronounced than ever before. What—even than at the time of the founding? Why, yes, because at the time of the founding the United States hugged the eastern seaboard, such that the country people could not help having dealings with the city people (i.e., not to put too fine a point on it, if they wished neither to be drowned nor massacred by ******s); and because the rural elites conformed to the English pattern of maintaining both a country estate and a house in town. Well, then surely during the antebellum period the divide was sharper and more pronounced? No, certainly not, for as the case of the milieus of Abraham Lincoln’s and Mark Twain’s Southern-cum-Midwestern boyhoods shews, in the antebellum period a goodly proportion of the rural populace—a proportion by no means exclusively composed of the rural upper crust—were even more avidly addicted to book-learning than the average upper-middle-class city-dweller. 
So bazically, your argument is that in defiance of all idées reçues about urban-cum-rural American life up to but not including the “red states vs. blue states” one, rural and urban life were virtually mutually indistinguishable until as recently as twenty years ago (i.e., until whenever it was people started distinguishing between “blue states” versus “red states”; i.e., shortly before or after that recent U.S. presidential election in which some dumbass at one of the television networks decided to flip the traditional election-map color scheme of blue for Republican-won states and red for Democrat-won ones [and thereby made nonsense of the century-old color semiology of American politics, wherein red had signified Communism and by extension all things “liberal” or “lefty” and hence Democrattish])? I don’t know if I’d go quite that far, but I think Lionel Trilling hit the horseshoe nail-cum-boot hobnail more or less squarely on the head when he remarked in about 1950 that in the United States “the opposition between urban and rural ideals has always been rather factitious,” whichistersay that whilst I would never deny that the daily life of an American cowboy or tractor-driver has always been radically different in certain material respects from the daily life of an American stockbroker or longshoreman, I merely maintain that, like Belinda Carlisle and her paramours avant la lettre and even slightly après la lettre (i.e., between the release of the 1991 pop song I am about to quote and the advent of the “blue states vs. red states” idée reçue), up until ca. twenty years ago rural Americans and urban Americans “lived the same dream and wanted the same thing,” whichistersay that whilst urban Americans and rural Americans of different social strata, life-walks, etc. might have harbored quite different fears and ambitions from those of certain of their neighbors, caeteris paribus or across the board rural and urban Americans’ fears and ambitions did not differ radically from each other. One sees ample evidence of the durability of this interregional concordia discors in, say, the emergence and development of the various American “musics” after the invention of the gramophone and radio. The music that came to be universally known as “country” music or “country-and-western” music by some point in the 1950s (as I recall, both Jack Benny and Leonard Bernstein referred to it as “hillbilly” music in that decade, but I cannot recall any country-music performer who came to prominence in the 1960s—say, Johnny Cash or Willie Nelson—being termed a “hillbilly” musician) was (as I have already disclosed in the immediately preceding parenthesis) originally called “hillbilly” music. “Hillbilly” differs from “country-and-western” both in genre of reference and in scope; it denotes a certain type of person, a sort of peasant or kulak, who lives in or on hills, and hence by implication (an implication implied by the geographical provenance of some of the music’s most celebrated performers and its principal organ of transmission, the radio program The Grand Ole Opry) the foothills of Appalachia. “Country-and-western” more generally denotes music produced either in some rural part of the country or in the western part thereof. 
Presumably the name-change was at least partly actuated by the desire of some demographically and financially significant portion of the music’s fans not to be pigeonholed as kulaks residing in the Appalachian foothills; presumably some of them regarded themselves as members of loftier and more respectable social aggregations than the peasantry, and presumably certain others, while happy to be pigeonholed as kulaks, wished to be associated with specifically Occidental peasant livelihoods like cattle-roping and bronco-busting. But neither the instrumental components of the music nor the subject-matter of its song-lyrics changed dramatically—or even yawn-inducingly—in response to the rechristening. To be sure, the country-and-western music of the twenty-oughties sounded (and sounds) noticeably different from the country-and-western music of the 1950s; to be more specifically sure, over the intervening half-century both the instrumental components of the music and the subject-matter of its song-lyrics (not to mention the production-style of its recordings) had drawn closer to those of the other “musics,” but these had never been all that stridently divergent from those of other “musics” in any respect to begin with. To be sure, in the Venn diagram overlap-zone comprised by the collective imagination of non-country-music fans and the collective imagination of self-mocking country-music fans there has long persisted a notion that country music song-lyrics are dominated by a set of topoi utterly alien to all other “musics”—e.g. (or perhaps even i.e.), one’s desertion by one’s woman or man, the death of one’s dog, and the terminal failure of the engine of one’s pickup truck. 
But truth be told, country-music song lyrics—the very earliest of hillbilly-music song lyrics included—have been dominated by the same two themes that have dominated the song-lyrics of all other American “musics” from ragtime to heavy metal to funk to trip-hop to tam tam-and-treble—viz., unrequited love and currently requited-cum-“everlasting” love, which just goes to show that, wie gesagt, regardless of in what part of the country or in what sort of “built environment” they lived, until very recently Americans “lived the same dream and wanted the same thing”—namely, to meet the right girl, woman, boy, or man and settle down permanently with him or her. Fine (or at least fine-ish, i.e., conceivably buyable), but the mere fact that they all lived the same dream and wanted the same thing did not mean that they all knew that they all lived the same dream and wanted the same thing, let alone that they all respected each other for sharing that same lived dream-cum-wanted thing. Indeed it meant no such non-wanted thing, and I won’t even deny that at the only slightly-sub-ideal level Americans often lived very different sub-dreams and wanted very different sub-things (i.e., pursued very different pastimes and means of arriving at and holding onto the shared dream and wanted thing) depending on what region and what sort of so-called built environment they lived in, or that they did not tend to look down their lorgnettes (or regionally or built-environmentally appropriate alternative apparatuses) at people in other parts of the country and in other so-called built environments. 
In the country one would tend to go looking for a paramour at a honky-tonk or (out West) a cattle market and give oneself Dutch courage to approach a prospective paramour by downing great lashings of Miller High Life or (out West) Lone Star, and one would take one’s beau, belle, or spouse out for an afternoon or evening’s entertainment to a tractor-pull or (out West) a rodeo, and for an evening’s dinner at one’s local barbecue shack or (in later years) Cracker Barrel. In the city one would go looking for a paramour at a nightclub or cocktail bar; one would give oneself Dutch courage (yes, yes, yes: if there is one thing that has reliably brought Americans of all regions and so-called built environments together at least since Washington Irving and Martin Van Buren’s day it is their love-hate relationship with the Dutch) to approach a prospective paramour by downing great lashings of a martini or a margarita, and one would take one’s beau, belle, or spouse out for an afternoon or evening’s entertainment to a play or symphony concert, and for an evening’s dinner at one of the city’s numerous restaurants specializing in the cuisine of France or some Asian country other than China. And as a country resident one would disparage the city-dwellers as a bunch of sissified fancy pants hoity-toity hifalutin soandsos (I do in fact recall a country-and-western song from the twenty-oughties that explicitly denounced city-dwellers qua martini-drinkers), while as a city resident one would disparage country dwellers as a bunch of pig-and-sibling-swiving, Bible-thumping, gun-toting, snake-handling soandsos. But in the old days the rural-urban dichotomy was mostly overridden by a dichotomy that seems since to have dwindled in significance almost to the point of unintelligibility—viz., the distinction between the great metropolitan areas-cum-regions and the provinces. Back in the old days one could not escape the suspicion of being a rube, of being a pig-and-sibling-swiver et al. 
aut etc., merely by living in any old (or new) American city and drinking at one’s own burgh’s martini bars, dining at one’s own burgh’s Mobil star-less French or Asian restaurants, and attending performances by one’s local stock theater company or part-time symphony orchestra; and indeed, Boston’s self-styling as “the Hub [of the world or universe]” notwithstanding, the only American city in which one could escape that suspicion by living therein was the one “so nice they[’d] named it twice,” New York, New York. What about Los Angeles, the City of Angels and Dreams (and of Dreams of Angels)? And Chicago—That Toddlin’ Town, the Windy City? And Washington D.C., the capital of the entire s****ing country? Of course New Yorkers have always regarded Los Angeles as a pseudo-city on numerous accounts—its notorious unwalkability, its c******l centering on the ignoble pseudo-art of cinema rather than the noble art of the theater, its supposed bereftness of edible bagels, blintzes, knishes, and other Kosheriana, etc.; and Chicago as but a shabby Midwestern knockoff of New York itself. And as for Washington, D.C.—why, in the light of its geographical and demographic tininess, it is regarded by New Yorkers as but an American Brasilia or Canberra. And if one happened to live in any other American city of roughly comparable demographic magnitude to Washington, why one might as well have been living in the suburbs of Omaha—nay, in the exurbs of Bismarck or Fargo—in point of the number of urbanity points one accrued thereby. Why, I recall Baltimore in the late 1990s being smugly disparaged as “one of the most provincial places on earth” from the pages of the Washington, D.C. City Paper, whose editorial office could not have been sited a furlong more than 40 miles from Fort McHenry. 
And while this rain of disdain from the Big Apple and the Three or Four Smaller but Not Completely Insubstantial Apples certainly prompted a fair proportion of the citizens of these 20 or 30 other cities to hang their heads in shame and pine for some so-called job opportunity that would give them an excuse to move to Gotham, or failing that, a series of vacation opportunities that would allow them to take in a Broadway show or Met opera at least once a year, from an even fairer proportion thereof it tended to elicit a certain defensiveness combined with a certain tendency to wear their provinciality on their sleeves as a badge of unshame if not quite pride. In one’s capacity as a member of this fairer proportion, however bored or repelled one might have been by tractor-pulls, rodeos, or whatever other plebeian sport or pastime was most popular in one’s province, one would at least occasionally check in on the local television stations’ coverage of that sport or pastime for the sake of being able to hold one’s own as an ostensible aficionado of or expert on that sport or pastime in the presence of any visiting citizen of the Big Apple or the Three or Four Smaller but Not Completely Insubstantial Apples whom one might unexpectedly encounter. And on treating to dinner a visiting citizen of one of these Apples, however nauseated a member of this fairer proportion might have been by crab cakes, mountain oysters, or whatever other plebeian (and self-evidently non-French and non-Asian) food-preparation for which one’s province was most celebrated (or notorious), a member of this fairer proportion would bypass even his province’s best or most expensive French or non-Chinese Asian restaurant in favor of its most “authentic” (i.e., usually, grubbiest and smelliest) purveyor of that preparation, and indeed would often drive dozens of miles into the exurbs to reach that purveying crab-shack, mountain oyster-mountain, autc. And such then (remember, pre-ca. 
2005) being the case, the statement, “American city-dwellers are completely out of touch with the way Americans who live in the Heartland [or “in flyover country”] live” would have verged on the oxymoronic—it would have verged thereon given that most American city-dwellers were quite content and unashamed to be residents of flyover country or the Heartland. (A scene from John Waters’s 1997 film Pecker, one of his later opera centering on relatively normal Baltimoreans instead of on freaks and perverts, epitomizes this provincial metropolitan ethos: on alighting from the bus that has just brought her back from a trip to New York, a young Baltimorean woman played by Christina Ricci kisses the patch of hometown asphalt readiest to mouth.) Moreover, in the old days, even in the hippest and snootiest neighborhood in New York, whether Hell’s Kitchen or Park Slope, the residents’ contempt for even those parts of the country most thinly populated by humans and most densely populated by livestock (and hence most densely populated by livestock-f**kers) knew certain limits, limits that were not only well-defined but also painfully shy of the degree and intensity of contempt that one might lavish on, say, a rat or cockroach. And it was so painfully shy thereof not because these super-hipsters were preternaturally altruistic but because their Alltag, their daily routine-cum-itinerary in their capacities as super-hipsters, brought them constantly into intimate contact with certain realities that inexorably reminded them of their material dependence on the labor and produce of people who lived many hundreds or even thousands of miles from Gotham. A typical night out at their favorite juke-joint would entail being on a first-name basis with spirits, cordials, garnishes, etc. hailing from all over the United States and the world. 
One of their typical nights out at the theater would center on the naturalistic production of a play with a provincial or rural or provincial-rural setting, whether the Deep South of Tennessee Williams, the Russia of Anton Chekhov, or the Main Street of Thornton Wilder. And both at and en route to or from the theater, the juke joint, or their favorite restaurant, they would more than figuratively rub elbows with out-of-towners in the form of bartenders, waiters, taxi-drivers, or just random fellow-pedestrians and public-transit users. In short, for all their love of living in the metropolis, they were never for an instant in a psychological position to regard their metropolitan existence as even fleetingly self-contained or self-sustaining. They undoubtedly regarded themselves as almost immeasurably more intelligent, more cultivated, and more refined than their fellow-citizens in the provinces, but they would not for all the top-shelf real estate in the five boroughs have wished these people out of existence because they knew full well that their continued enjoyment of their big-city amenities was immediately contingent on the well-being of those churls and louts, that their supply of Chateaubriand, beef awaze tibs, top-shelf spirits, etc. would have petered out into nothing in the absence of far-away farms, factories, distilleries, etc. in or on which these articles were produced; and extensive highways, waterways, and railroad lines via which they were transported into the city; and farmhands, laborers, drivers, et al. residing along the interstices of these lines of travel and transportation. 
Today (i.e., since 2010 at the most recent), for the first time in American history, one finds city dwellers from “sea to shining sea” united both in an unequivocal disdain for country-dwellers (and correlatively in their esteem for one another) and in a belief that country-dwellers are completely materially dispensable and indeed a sort of prospectively fatal deadweight on both the Volksgeist and the body politic. One finds them thus united because their urban-ness (I shrink from calling it urbaneness or urbanity because it is as far from classical urbaneness and urbanity as Rivendell is from ancient Rome) does not consist in or center on their interaction with their immediate so-called built environment in ways that either set them apart from one another or unite them to their suburban and exurban neighbors; because being a New Yorker no longer consists in repairing to places within New York City that are visibly and palpably unlike places in other American cities and consuming goods and services that are visibly and palpably unlike goods and services offered in other American cities, but rather in engaging in a set of practices that, while being palpably distinguishable from the practice-sets engaged in in any rural American locale, is indistinguishable from the practice-sets characteristically engaged in within the city limits of Seattle, St. Louis, Dallas, or even Sheboygan and Eau Claire. A practice-set consisting in what…dicking (or p***ying) about with one’s so-called virtual friends in so-called virtual worlds via one’s mobile phone or laptop? Yes, exactly. But don’t people d*ck or p*ssy about in so-called virtual worlds in the country as well? 
Yes, but in the country because one has to spend a goodly portion of one’s day milking or poking cows or busting broncos or driving a tractor or hoeing potato rows or…I don’t know (for fudge’s sake, altho’ I grew up in and now again live in the country, I did spend the central three fifths of my life in a city!), one can never fully mistake any portion of these virtual worlds for any portion of the actual world. Whereas in the city one spends all one’s time—including one’s working time (assuming one even has a job and is not living entirely on income from a trust-fund or annuity)—in some sort of virtual world. By one you presumably don’t mean everyone, and by all one’s time you don’t extra-figuratively mean all one’s time? Why of course I don’t there-either-by. It is of course a question of degree. Obviously, at least up until the so-called pandemic (and increasingly more commonly since the so-called pandemic), the majority of city dwellers had to work regularly outside their dwellings, but even for the ca. dozen years leading up to the so-called pandemic, the 24/7 all-virtual lifestyle was not uncommon in American cities, and it was certainly a highly respectable and sought-after lifestyle therein throughout that period. And in any case, hyper-recent city life is distinguished not only by its dramatic penetration by the virtual lifestyle but also by the even more dramatic recession therefrom of the various old urban genres of going out, leaving homebodyism of one form or other as the default lifestyle of the city-dweller’s leisure hours. Such that city-dwellers are no longer routinely coming into contact with people and objects originating in the rural landscape? Exactly. What—not even at the weekend neighborhood farmer’s market? [I, doing a spit-take over the top of my Jim Beam-&-Miller High Life boilermaker]: Of course not! 
The weekend neighborhood farmer’s market is the last place in the world (save another neighborhood’s neighborhood farmer’s market) in which to meet authentic rural people, for the vendor-scape of neighborhood farmer’s markets is notoriously dominated by city people making the feeblest of stabs at moonlighting as rural people via the miserably puny, etiolated produce of their minuscule allotment gardens and kitchens. In any case—and finally to rejoin the mighty mainstream of my argument—it would be not a jot detrimental to the city-dwellers’ material alienation from country-dwellers did their virtual worlds of choice consist even intermittently of reasonably mimetic simulacra of actual human-inhabited landscapes—if they spent their spare time working as virtual farmhands on virtually functioning farms, as spectators at or participants in virtual rodeos, etc. (granted, there presumably are video-games that offer such simulacra; the fact that the entire personnel of the so-called legacy media and usership of the major so-called social media platforms have recently shewn themselves to be ignorant of such a basic distinction of equestrianism as the one between reins and whips proves that the population of urban players of such games cannot be statistically significant). But given that their virtual worlds of choice are centered on scenarios—all of them derived from or participating in the sub-tradition of so-called fantasy literature—in which even certain basic laws of physics are inapplicable and in which the most cataclysmic changes are brazenly effected by magic, they are bereft of any imaginative capacity to regard their rural compatriots as fellow-humans, let alone as humans on whose labor their existence is even ever-so-tenuously or fleetingly contingent. 
And so they will at kindest yawn—and at most typical cheer (as at the sudden cessation of some irritating noise of uncertain provenance)—at any misfortune, however dire, or any persecutory government policy, however tyrannical, that befalls or impinges on any rural-residing person or collectivity of rural-residing persons? Even so. But given that the so-called levers of power are essentially exclusively in the hands of city-dwellers, what is to stop the federal government and the various state governments from at minimum allowing the rural populace to die out one drought and tornado at a time or at maximum (and therefore all the more likely-ly) exterminating that population at one go? Why, presumably, nothing is to stop these governments therefrom as long as the city-dwellers can continue to enjoy the preponderance of their ungodly virtual pleasures despite the attrition of the human population of the countryside. But happily, at least as far as the long-term vindication of truth is concerned (and I certainly sympathize all too keenly with those just-now inclined to interject, “What has the long-term vindication of truth done for me lately?”), the city-dwellers will sooner or later feel the pinch from an absence of adequate material support from the country-dwellers. 
And by the pinch you mean…I mean something much more catastrophic than the mere fleeting mini-fillips they have so far felt therefrom (e.g., the occasional five-minute-long interweb service outage or dilation of A*****n’s two-day delivery window into three or four days [and as for the genuinely irksome pinches so far administered to the American way of life insgesamt—e.g., lower-double-digit percentage increases in gasoline and food prices—they are in no position to feel them, owing to their abovementioned subsidization by trust funds and annuities and their not-yet-mentioned full-time pedestrianism {yes, yes, yes: the present writer himself has been a full-time pedestrian for more than a quarter-century, but he was once a part-time driver, and even if he had been a full-time pedestrian since his toddler-hood he would still have grown up and come of age in an epoch when the indispensability of the motorcar to the American way of life was unignorable by the most cloistered non-driver}]. Haven’t you forgotten a rather less transient by them-felt maxi-fillip—viz., the middle-double-digit percentage increase in the cost of renting a dwelling-space? Indeed I haven’t, but it isn’t quite accurate to say they have felt that fillip, for before it could reach their collective left bum-cheek [for as they are lefties, their left collective bum-cheek is the only one that matters], the federal government swatted it away with an eviction moratorium under the pretext of curbing the so-called pandemic.) By the pinch, I mean the sorts of so-called supply chain disruptions that prevent them even from eating at all as often as they like. Admittedly, for a goodly portly portion of them (in terms of both number of persons and gross tonnage), such a disruption long enough merely to compel them to miss a single midnight snack would prevent them from eating at all as often as they like. 
Indeed it would, but inasmuch as an at least more-than-badly portion of them (the old-school valetudinarian portion, the portion who still feel a modicum of shame at being overweight and are accordingly not entire strangers to fasting) could presumably survive a disruption long enough to empty out merely a supermarket-aisle’s worth of foodstuffs per household, we are probably going to have to wait for disruptions long and severe enough to result in universal near-famine. (Although, to be sure, the interval wherein the goodly portly portion alone feels the pinch and mobs of morbidly obese blue-hairs descend on our [ahem] seats of government demanding the right to a government-issued ration of midnight Twinkies will be entertaining enough.) And when the U.S. is languishing and indeed on its way to perishing under universal near-famine, the material dependence of city-dwellers on country-dwellers will belatedly be near universally recognized, and city-dwellers will have no choice but to re-establish their material ties to the countryside, and such a reestablishment will require the aside-setting of all stories and the up-taking of texts containing unadulterated, super high-octane non-narrative information—textbooks, technical manuals, blueprints (by which I mean literal architects’ blueprints, not bullshittic progressive policymongers’ metaphorical blueprints), maps, geographical and geological surveys, title deeds, testaments, and the like. And so stories of all sorts, types, genres, modes, etc., would have to go out the window for good (in both senses)? No, but the distinction between information-bearing and non-information bearing documents would have to be abolished, such that stories that did not ultimately bear total epistemological integration into the world as it has existed at some point in its history would have to go out the window for good (in both senses). 
So then, fantasy lit would have to go out that window…and presumably science fiction would as well…well, yes, or, rather, most likely 99% of both genres would go out that window, and over the longue durée that percentage would most likely slowly recede (or, to stick strictly to the metaphor of defenestration, the percentage that had been brought back into the house through the window would slowly increase) until it settled more or less permanently in the lower 90s (or upper single-digits), for as we have already seen in connection with Tolkien, and as middlebrow lit-crit wags have been saying about them for years, narratives in these two genres often reveal to us less about the worlds they imagine (about which they can in fact tell us nothing [although the middlebrow lit-crit wags would never acknowledge this]) than about the worlds of the moment of their geneses, and with the passage of time what they reveal to us about that moment is bound to increase because our own relation to that time is becoming ever-more-distant, such that we are ever-less-likely to share the author’s assumptions about the world. One thinks here, in this connection (although the example is infelicitously [albeit not fatally, as will be seen anon, LW] drawn from television, and hence from something closer to drama than to narrative), of the prominence accorded Paradise Lost’s “better to rule in Hell than serve in Heaven” qua Milton-reference in a certain episode of the original 1960s Star Trek series. 
The inclusion of the tag in the episode in that capacity reveals that at the time of the episode’s production, Anglophone elites and near-elites seemingly took it for granted (as elites and near-elites of today would shrink in horror from doing) that Milton would be a salient c*****l point de repère for at least another two hundred years, something that might easily be forgotten were one simply to read present-day Star Trek fans’ near-complete ignorance of the monuments of English-language literature back into the habitus of the show’s original producers and audience. Of course, over the super-longue durée, supposing this trend of treating all narrative-centered c*****l artefacts as documents of c*****l history continued, we (or, rather, our descendants) would not need to go questing (yes, yes, yes, like the original crew of the Enterprise on its five-year mission) further among the c*****l rubbish-heap of SF and so-called fantasy literature for further evidence of the mentalité (an untranslatable Gallicism by no means to be confused with our Sinatra-esquely brutish mentality) of the Anglophone elites and near-elites of the early-late twentieth century, for we (or, rather, they) would have formed a settled enough sense of that mentalité that any addition to the evidence-archive thereof would be so much Johnsonian pack-thread or Sternean trunk-hose—whence my abovementioned mention of a settled lower 90s (or upper single-digits); at a certain point we would know more or less exactly and definitively to what extent, in what sense, and out of what motive elite and near-elite Anglophones still admired Milton enough to quote him in the early-late 1960s, just as we now have a more exact and definitive understanding of the extent, efficient cause, and motive of late-Victorians’ desire to believe that Shakespeare was not the author of his own plays; namely (as Bill Bryson of all our contemporaries has perhaps most famously shewn), greatly, owing to their ignorance of the 
Elizabethan education system, and out of their ardent fetishization of university degrees, respectively. (Today, of course, the anti-Stratfordians are much less common than a hundred and fifty years ago, but that is because the proportion of Bardolaters in our elite and near-elite—and consequently, the proportion thereof of people who admire Shakespeare enough to wish to remake him in their own image—is much smaller than a hundred-and-fifty years ago, not because the Elizabethan education system is better understood or university degrees less ardently fetishized now). And what about the remainder of narrative prose literature, the super-canon thereof stretching from the airport novel to the Munro-doctrinaire novel to the so-called proto-realist novel of the seventeenth and eighteenth centuries (not that the novel of those two centuries is actually called that [at least not often enough to have reached my eyes or ears], but it might as well be, inasmuch as it is well-nigh-universally regarded as the prototype of its nineteenth-century successors) and full-blown so-called realist novel of the nineteenth and early twentieth? What is to happen to it? Why, it (or rather a proportion of it that starts at about 10% of the total archive in the epoch of the airport novel and swells to near 100% in the micro-epoch of Don Quixote, such that the entire retained portion may be not unfittingly likened to an ear trumpet or first-generation gramophone horn pointed into the past [not unfittingly, i.e., inasmuch as the proportion of the archive retained increases as one recedes into the past]) will take its place in our collective living room alongside the tiny percentage of the SF-and-fantasy canon kept therein. What, merely alongside it, not above it? 
Yes, merely alongside it, not above it, for while the proportion of that super-canon retained will always be higher than that of the canon of SF-and-fantasy, the purpose of retaining it will not differ a jot in kind from the purpose for which the most minable ninth-generation Lord of the Rings knockoff might be kept therein. B-b-but what about questions of aesthetic quality?  Questions of aesthetic quality, schmestions of shmaesthetic schquality! Oh, go to, go to! You of all people, a snob’s snob, are hardly in a position to apply the schm-prefix to that phrase. Agreed—to a certain very limited extent or after a certain very limited fashion. For the truth is that a disproportionately large proportion of the proportion of that super-canon that I would retain would consist of works that have quasi-traditionally been regarded as being of highest aesthetic quality—works such as the abovementioned A la recherche, Tom Jones, and Ambassadors ([sic] on omission of the definite article from the title, for as Kingsley Amis hath shewn [shewn, not argued] in his King’s English [a book whose full title is naturally The King’s English], the omission of the definite article from a definite-article-commencing title after an occurrence of the definite article or any other determiner [e.g., as in the immediately preceding case, a possessive adjective] is one of the snob’s snob’s most highly prized shibboleths])—and my wish to include them would be quasi-categorically indistinguishable from my admiration of them. But this admiration arises at most only secondarily from these works’ embodiment or exemplification of qualities that have quasi-traditionally been regarded as the hallmarks and criteria of aesthetic quality—e.g. verging-on-i.e., stylistic elegance, structural or formal proportionality, and the adroit management of symbols and allusions. It arises above all from their containment—not embodiment, but containment—of that Adornan (or is it Adornonian?) 
quality known as truth content. But have you not already depreciated these works up to a point for their lack of truth content? No, I have depreciated the mis-assimilation of the truth content contained in them, and specifically the tendency of the admirers of these works to apply without so much as a by-your-leave the truths about states of affairs in the world of their own genesis to states of affairs in the world of the present. But what else are these admirers of these works supposed to do with them?  For (to echo in reverse an expression you have lately used), what have states of affairs in the world of the past done for them lately? This demurral holds water only against an assertion—à la, say, the middle-period Foucault—that past ages are hermetically sealed off from the present such that nothing even remotely practicable in the present can be learned from them. But of course I have asserted no such thing and have indeed effectively asserted just the opposite (I would opt for the less clunky word antithesis but for its Hegelian associations, associations that in the context of the present essay—an essay including discussion of the thought of such Hegel-guided thinkers as Marx, Benjamin, and Adorno—would have discursive effects analogous to those of parallel fifths in music)—viz., that every particle of every past age impinges on the present at practically uncountable points in practically uncountable ways. But “ironically” or perhaps even “paradoxically” this omnivectored impingement precludes any immediate legitimate application of any narrative hailing from any moment—be it one sited as recently as ten seconds ago—to one’s present circumstances; the locus classicus of an application precluded by this proscription being (at least within the present writer’s oeuvre) a present-day spinster who applies a Jane Austen heroine’s pre-matrimonial stratagems to her own search for Mr. Right. 
So then, à la Nabokov, you would have every reader identify not with any particular character in a given novel but rather with that novel’s author? No: I would have every reader identify with none of the persons fictional or factual having anything to do with the blessed or cursed book; I would have the reader not put even the assumptions or intentions of the author (within the scope of which I of course include the implied intentions and assumptions of the implied author [for, as I believe Charles Rosen quotes William Empson as saying somewhere, if one doesn’t explicitly say that one is aware of the inside dope on a given subject, one will be taken to be utterly oblivious of all but the outermost dope thereon]) beyond reproach or examination. So, then, you would have every reader arrogate the privileges of the Deity Himself! To the contrary, I would have every reader, not arrogate but merely assume as his gosh-given right the rights (for no right can rightly be termed a privilege) of the critic, the selfsame right that allowed Dr. Johnson’s fellow-Sam Samuel Richardson to infer and then state that as an authorial-cum-narrative presence in Tom Jones, Henry Fielding exhibited the manners of an ostler. But didn’t you argue far above that in ascribing the manners of an ostler to Fielding, Richardson completely missed the point, that in ascribing these manners he merely underscored the unshakeability of Fielding’s authorial authority qua member of a class many strata above the one occupied by ostlers—viz., the rural squirearchy? Indeed, and I stand by that argument as steadfastly today (i.e., August 22, 2022) as on the day on which I tendered it. 
But just because Richardson missed the point about Fielding’s ostler-like manners (not, I hasten to emphasize, the point of them, a turn of phrase that would suggest that Fielding had consciously adopted the manners of an ostler, which for aught we know he did, but which we will most likely never know, given that apparently neither Fielding nor Richardson nor any of their contemporaries has or have left to us any document stating as much, however implicitly or obliquely) does not mean that his observation of those manners was worthless, for after all, those manners were there and Richardson was a great novelist in his own right and no dummy, a writer who could fully convey the intellectual mediocrity of one of his characters simply by having him repeat the phrase “let me tell you” a few too many times. No (or yes [for it seems to me that the affirmative and negative adverb will do equally well here]), Richardson’s observation of those manners marked a certain moment in the discovery of the ultimate and fundamental truth-content of Tom Jones—a moment, namely, when an intelligent man of the English urban middle class like Richardson was surprised to find a member of the English landed gentry comporting himself like an ostler, even in the pages of a book and even in the relatively understated and detached capacity of the narrator of a story including characters patterned on members of the lower orders; and both the mid-to-late eighteenth century Fielding fan’s attitude to Tom Jones, the attitude of those of Fielding’s contemporaries who had no problem with his narrative manners either because they (e.g., James Boswell) got along famously with ostlers (and plyers of trades on a par social-status-wise with ostlering) despite being members of the uppermost orders or were ostlers (or plyers of trades on a social par with ostlering) themselves and the attitude of the Fielding reader of the automotive age, a reader who is incapable of being put off by Fielding’s 
ostler-like manners because he has never had any dealings with ostlers or the plyers of trades on a par with ostlering social-functionwise because such trades no longer exist (even if he is a plyer of a trade that is on a par with ostlering social status-wise because such trades still exist in abundance)—both of these also constitute moments in the discovery of that ultimate and fundamental truth-content of Tom Jones.  But where is the aesthetic interest to be found in the discovery of these moments of truth-content? The aesthetic interest therein or thereof is to be found in the simple but over-bowling epiphanic realization that these moments of truth-content participate in the vast multi-millennial reality (not story, but reality) of mankind’s material and intellectual engagement with the human and non-human world. But what about the aesthetic interest traditionally derived from reading novels—the interest gratified by immersing oneself in the world of this specific novel and no other? That interest must simply be renounced altogether. Walltogethah? Walltogethah. There is simply too much danger of falling prey to nefarious mendaciousness inherent in and to such immersion of oneself in narratively-driven and centered texts à la William Hurt’s immersion of himself in the so-called sensory deprivation chamber of the Ken Russell-directed and Paddy Chayefsky-scripted (not to mention John Corigliano-scored) 1980 motion picture Altered States. Do you really believe a person will be transformed into a ravenously homicidal proto-human primate in direct and sole consequence of an afternoon of reading Pride and Prejudice or Rabbit, Run or The Fellowship of the Ring? 
No, I believe that in direct and sole consequence of an afternoon of reading one of these books one will be transformed into something potentially far more dangerous than a ravenously homicidal proto-human primate—viz., an early twenty-first-century Occidental biological human being who mistakenly believes he (or she) is a Regency toast or a beatifically harmless three-foot-tall hairy-footed humanoid hailing from a Solaris-like counterfactual oasis of rural early modern England in a counterfactual Gondwanaland inhabited by irredeemably harmful semi-humanoids; or vis-à-vis that pinnacle (or nadir) of the Munro-Doctrinaire epoch, Rabbit, Run, one will be transformed into something potentially far more dangerous than either of the preceding. What, an early twenty-first century Occidental biological human being who fancies that he’s a middle-aged American paterfamilias of the mid-twentieth century? Indeed not, and indeed the net effect of millions of early twenty-first century Occidentals’ succumbation to such a delusion might actually be beneficial, inasmuch as the middle-aged American paterfamilias of the mid-twentieth century was on the whole quite a clubbable and productive chap, and withal a provider of amenities that remain not entirely superfluous in the early twenty-first century. 
No, the something I am thinking of is an historically and socially fungible ego constantly beset by snippy remarks from an overbearing superego reflexively patterned on the smugly purportedly omniscient narrators of Munro Doctrinaire novels (and most reminiscent in comportment of the purported best friend who in the Ben Stiller Show parody of a shampoo commercial trails the fellow trying to flirt with a girl and constantly interrupts his every utterance to her by remarking aloud that he [the flirter] has dandruff); in other words, a person who cannot find himself alone for two seconds without supposing that someone with more insight into his character is judging him and finding that he has been doing everything the wrong way for his entire life so far, that he is “ironically” the very epitome of the very sort of person he fancies he hates the most, etc. So then are you saying it’s wrong to wonder if one is after all the epitome of the sort of person one fancies one likes the most, if one has been doing everything the right way; that one should take self-satisfaction as one’s starting and ending point when one is alone with one’s thoughts? Of course not, and to the semi-contrary, I suspect that there is all too little such wondering chez present-day Occidentals alone with their thoughts. But I maintain that such wondering should generally be super-particularistic in scope and guided entirely by a combination of firsthand experience and precepts promulgated by the classic promulgators of moral wisdom—viz., religious scripture and moral philosophy. 
One should be constantly reflecting à la the Horace of one of his Satires [which are themselves vehicles of moral philosophy] on whether one did or did not say or do the appropriate thing at such-and-such a moment, and if one finds that the answer is “no,” determine what one should say or do instead when one next finds oneself in a situation approximating the one of that moment; and to the extent that one is compelled to appeal to outside sources rather than one’s own memory for guidance on how to judge the appropriateness of this utterance or action and its successor utterances or actions, one should do so principally by appealing to works (like the satires of Horace or certain books of the Bible) that explicitly position themselves as dispensing guidance on what to say or do in certain situations. And what is wrong with appealing instead to a classic novel of the Munro Doctrinaire period, to putting oneself in the shoes of Updike’s Rabbit when he has said or done such a thing? Merely in describing it in those terms you have already shown what is wrong with it. 
For in a classic novel of the Munro Doctrinaire period, the hero (or more often super-anti-hero) is seldom if ever merely confronted with the matter of having said or done a certain specific thing that can be considered even provisionally in isolation from other things he has said or done or might say or might do; he is, rather, repeatedly and nearly incessantly confronted by having said or done something that impinges in some crucial or devastating way on the question of the very worthwhileness of the continuation of his existence; this because his “creator,” the hyperomniscient narrator, has seen to it that his every thought, utterance, and action bears some morally and metaphysically significant relation to all his other thoughts, utterances, and actions, such that the mere recollection that he has failed to leave a certain household appliance running (or, that he has failed to switch it off), that he has smiled or frowned (or failed to smile or frown) at a certain person at a certain moment, that he has failed to remember something about a certain person at a certain moment (or assigned a memory of a certain person to another person), autc., is capable of making him suddenly realize that he has been “living a lie,” that he has “never loved anyone other than himself,” that (as mentioned before) he has been “doing everything the wrong way.” So is one really better off taking one’s life-coaching from the trashiest self-help guru, from a Leo Buscaglia or a Deepak Chopra or a Tony Robbins, than from even the tip-toppiest novel of the Munro-Doctrine school-cum-epoch? 
Why, merely in terming what one might be taking from these trash-mongers life-coaching you are already shewing yourself to be (for the umpteen-trillionth time, of course) of the devil’s party, for no sane and would-be virtuous person turns to any author of any sort of book for life coaching; for no sane or would-be virtuous person regards his own life as a series of physical-training sessions alternating with a series of athletic contests or expects to receive from any author the degree or kind of insight and discipline that an athlete receives from his coach. But if your question were modified to read, So is one really better off consulting the precepts of the trashiest self-help guru…, the answer I would give to it would be (or should that would be be is?) “Yes, abso-bloomin’-lutely.” For in the course of such a consultation no sane and would-be virtuous person would regard these precepts by default as the pronouncements of an unchallengeable authority; he would, rather, judge them one at a time as assertions to be proved or disproved by his own and others’ experience in the so-(and for the most part rightly)called real world, and he would consequently most likely find himself flinging the wretched bouquin the length of whatever (or should that be whichever?) unit of space in which he had been reading it. And he would regard these precepts in that light and probably find himself flinging the bouquin partly because not even the most self-aggrandizing self-help author presents himself as an omniscient god, because however brazenly he would have his readers believe that he knows exactly what he is talking about and knows what he is talking about better than anybody else who has ever talked about it, he never pretends to found his assertions on any other, broader, or more solid foundation than his own experience. 
Whereas the author of a Munro Doctrinaire novel…Whereas the author of a Munro Doctrinaire novel always presents himself by default as an omniscient god in virtue not only of the fact that he has immediate access to his characters’ thoughts but also of the fact that he maintains a monarchical chokehold over the criteria by which their thoughts and actions are to be judged. And in consequence he cannot help inculcating in his readers the sense that these are the criteria by which they ought to judge their own thoughts and actions. And are these criteria in fact ones by which one ought not to judge one’s thoughts and actions? The answer to that question depends on the answer to the potentially unanswerable question (of) whether one ought not to do things that it is impossible to do. Surely you mean, rather, whether one ought not to try to do things that it is impossible to do, inasmuch as the notion of actually doing the impossible is oxymoronic. Yes, I suppose I mean that inasmuch as that. Well, then, presumably in these criteria—or, strictly speaking, in the type of person who meets all of them—we are simply dealing with the familiar notion of an ideal, in which case the impossibility of meeting them by no means furnishes an open-and-shut case (sic—with apologies—on the repetition of “case”) against trying to meet them, for there is, after all, an antient and respectable argument that even the most unrelenting, dogged, indefatigable attempt to meet an unattainable ideal is estimable given that one will consequently attain a higher niveau of excellence than one would otherwise have attained. Why yes, there is indeed quite an antient and respectable argument to that effect, but it is not applicable in this instance because the impossibility of employing the criteria in question is of a categorically different nature than the impossibility of attaining any ideal. 
For in an ideal we are always confronted with a standard that human beings might attain if they possessed more of some quality that they already possess in some substantial quantity—more physical stamina, more self-control, more courage, etc. In the criteria tendered by the Munro Doctrinaire, on the other hand, we are confronted by standards that require the possession of a certain quality that, at least as of now, no human being indisputably possesses to the slightest degree and that all human beings would probably have to possess to an exhaustive degree before one could apply them with even barely serviceable fruitfulness—namely, telepathy. It is presumably no accident that the expression “I’m no mind reader” came into common use as a rhetorical defensive weapon shortly after the Munro Doctrinaire novel came into its own in the 1950s—which is to say, the moment at which the English-speaking reading public began being fed with hyper-omnisciently narrated novels centered on characters who, as one would say nowadays, “looked like them” (i.e., the typical New Yorker reader-cum-New York Times subscriber) rather than on people at a safe socioeconomic-cum-lifestylistic distance from them, and thereby began to expect themselves to evince the same telepathic insight that the protagonists of Munro-Doctrinaire novels expected of themselves. Oh, for fudging out loud: you’re a fine one to hold forth on categorical distinctions given that you’ve forgotten the biggest categorical-distinction-hinging fact about these novels—namely that it’s the narrators of these novels, not their characters, who are hyper-omniscient. I have forgotten nothing of the kind, as is shewn by the occurrence of expected in the last sentence but one. 
The protagonists of these novels are indeed not hyper-omniscient, but they typically expect themselves to be hyper-omniscient, and their actually hyper-omniscient narrators never give the reader any reason to suppose that they are delusional to expect themselves to be hyper-omniscient, and indeed these narrators give the reader every reason to suppose that these characters are at their most rational when they are imposing this burden of hyper-omniscience on themselves most ruthlessly. The protagonist of such a novel is forever—or at least intermittently and far more than occasionally—supposedly realizing that he has never understood or never loved a certain other character or never, for a single instant, attended to that other person’s needs and desires, with the ineluctable implication that such understanding or loving or attention is attainable by anyone who, unlike the protagonist in the run-up to his epiphany, can be arsed to give a toss about trying to attain it. But of course—or, rather, of no course (i.e., inasmuch as the mere lingering in-print-ness of a single Munro Doctrinaire novel [let alone the actual ] would shew that scads of people do not regard the state of affairs I am about to describe as a matter of course)—one can never actually realize that one has understood or loved or failed to understand any other person or attended to another person’s needs or desires however perfunctorily or briefly. 
At most or best one can recall whether at such-and-such a moment one did or said for or to another person what one would have appreciated having done or said for or to oneself had one been in that person’s situation, to the extent to which certain external objective indicators allowed one to understand that situation; because of course (or of no course) one cannot conceive of another person’s mind (or psyche, if mind sounds too computer-like to your readerly eyes’ ears) except as a merely conjectural analogue of one’s own, which one can in turn understand only to the apparently ineluctably puny extent that one’s memory cooperates in providing access to it. (Yes, yes, yes: the Golden Rule is none too stealthily lurking in the foregoing assertion.) A brief interchange from a Seinfeld episode (note that I am citing a sitcom—i.e., an instance of a genre of drama (yes, yes, yes, despite the contingently cinematic {and hence merely semi-dramatic} medium of the sitcom in point)—and not a novel) clinches this Chinese-whispers-in-a-vacuum-like essence of the ethic-cum-metapsychology of other minds. Jerry, in advance of expounding the particulars of his latest unfolding amour to George, says, “Have I ever been anything less than completely forthcoming to you?” Whereupon George rejoins, “I don’t know: have you?” Whereupon Jerry re-rejoins, “I don’t know.” Jerry presumably starts out with the assumption that he has always been completely forthcoming to George because at the moment he cannot recall having ever kept a secret from George. And George presumably rejoins, “I don’t know: have you?” both because he knows he ultimately cannot know whether Jerry has ever kept any secrets from him and because he can no longer remember whether he has ever caught Jerry out in apparently keeping a secret from him. 
And Jerry presumably re-rejoins, “I don’t know” either because he has suddenly remembered that he has kept certain secrets from George or because he has suddenly realized that he may very well have kept many a secret from George and simply forgotten about such secret-keeping. (Of course all these “presumablies” are necessary because we can only infer the states of mind that have elicited these speech-acts; which just goes to show contra Benjamin that it is drama rather than storytelling that is the interpretation-inviting human-world presentation mode par excellence, that it is the playwright rather than the storyteller who perforce requires his audience to draw its own conclusions [yes, yes, despite the running commentary of the chorus, a device that was already moribund in the later classical tragedy, was as fully vestigial as the proverbial tetons on a boar by Shakespeare’s time, and is revived nowadays only occasionally and, pace all the ultimately still-unsubstantiated meta-epistemological pretensions of Brechtian dramaturgy, for a merely stylistic effect {i.e., for reminding the viewer that plays employing this device were once routinely written}].) This is not to say that people do not presumably have moments at which they suddenly believe that they have never loved or never understood another person adequately or at all but merely that such moments are unfounded in any verifiable state of affairs and (I suspect more controversially) that such moments are probably both far rarer than moments at which people change their minds into believing that they have always loved or understood another person more than adequately and moments at which they couldn’t say or care less one way or the other whether they have loved or understood another person. 
But the Munro Doctrinaire novelist can have no truck with such ineluctable befuddlement on the intersubjective front, partly because, in virtue of his long belief that people’s lives are “as accessible as mustard in a jar,” he has come correlatively to believe that the contents of those lives are as readily identifiable as mustard (and naturally mustard of the bright yellow type [altho’ at this point the conceit starts to disintegrate because at least in the States, bright yellow mustard is almost always stored in bottles, many of them made of opaque plastic]), partly because, in virtue of his retention of the storyteller’s obligation to provide narrative interest—to provide what is generally known as drama even in a narratological context despite the far-above-mentioned insuperable ontological difference between the narrative and dramatic modes—he is well-nigh-inexorably driven to give a certain moral emphasis to certain phenomenal-cum-psychological moments at the expense of certain other such moments to which he either attributes less pivotal moral significance or attributes no moral significance whatsoever in virtue of not dealing with them at all; he is driven to engage in a kind of meta-morally skewed metaphenomenal-cum-metapsychological analogue to both montage and prosody. And what can the consequence on the reader of such novels be but the assimilation of a belief that his own inner life prevailingly consists of a succession of clear-cut (i.e. both seamlessly spliced together and intelligible) intersubjective crises—and, perhaps what is even more disastrous, that the inner lives of his family-members, friends, enemies, coworkers et al. 
also consist of a succession of such crises (albeit crises that always complement rather than echo his own in patterns dictated by the other person’s socially imposed and inalienable relation to him [such that, for example, if Fred is constantly thinking that he should be thinking, “Have I been sufficiently attentive to my wife Gail’s need for pleasure during the act of coition?” he is complementarily constantly thinking that Gail is constantly thinking not, “Have I been sufficiently attentive to my husband Fred’s need for pleasure during the act of coition?” but, rather, “Has my husband Fred been sufficiently attentive to my need for pleasure during the act of coition?”]), such that, in short, one cannot but go about one’s daily business (or, increasingly likely-ly, not-go about one’s daily prostrated idleness) in an emotional mélange of guilt and paranoia? That sounds awful. But in gonnegtion with—or rather at right angles to—this awful-sounding SOA, might I tender one shaftlet, one sunbeamette, of hope? Chute, but this had better be one sturdy shaftlet or sunbeamette. I think it is. Well then, chute. Don’t mind if I do. You mentioned earlier that the expression “I’m no mind reader” came into common parlance just after the Munro-Doctrinaire novel became popular. I believe I used slightly more pedantic and tight-assed language than that, sir, miss, or madam, but they were words to that effect nonetheless. 
Well then: no sooner did you say that than it occurred to me that I hadn’t heard the expression “I’m no mind reader” in quite a long time indeed, namely, since the mid-late twenty-oughties, and that the person who had then caused me to hear it by saying it was in her—for she was a woman—mid-sixties, such that she may have simply been adverting to it out of habit, much as she might have adverted out of habit to some bit of 1960s-originating slang like “groovy” or “psychedelicious”; such that I cannot but hope that by now, a decade-and-a-half after this last hearing of “INMR,” the entire metaphysical-cum-metapsychological-cum-metaepistemological mind-cast engendered by the Munro Doctrine has passed not only out of vogue but out of common intelligibility. I can see why you cannot but hope that, for the disappearance of “INMR” from the current lexicon is undeniable, but I’m afraid that the reason for, the efficient cause of, this disappearance is one that tends to foster despair rather than hope—viz., the complete, universal naturalization of the Munro Doctrinaire-engendered mindset. Do you meantersay that the Munro Doctrinaire-engendered mindset is so universally and fully accepted that it no longer occurs to anyone to resile against it, to think the thought that impels one to protest that one is not a mind reader? Indeed, I meantersay that very saying. Cor’s whores! But [you, suddenly pulling yourself together] in the absence of any trace of this naturalization in the lexicon, isn’t it as reasonable to assume that the Munro Doctrinaire-engendered mindset has disappeared as that it has been naturalized? No, it isn’t. Mind you, if “INMR” had disappeared after the manner of a sinking ship or drowning victim and the void in the English lexicon left by it had closed up and left the language looking and sounding exactly as it had done in ca. 
1950 (or at any rate, if positively swimming with other sorts of trash, with trash hailing from an entirely different lexical storm sewer than the one collecting the lexical effluent of fandom of and sales resistance to the Munro Doctrinaire novel), why then, one might very well have been entitled at least to conjecture that the Munro Doctrinaire-engendered mindset had likewise disappeared. But alas! (or, rather, alas!-cum-no s**t, Dr. Watson!), “INMR” receded in almost exact synchrony with the rise of a lexeme that would have remained as obscure, as recondite, as Scrabble-confined, as prinzie or qapik, had the Munro Doctrinaire-engendered mindset not secured universal and complete naturalization—viz., empathy. Oh, go to! Go to! Empathy is hardly a Johnny cum lately ([sic] on the absence of hyphens) in the English lexicon. Indeed, I dare say empathy was in the full bloom of manhood and confidently cavorting on the tongues and pen-tips of English speakers when “INMR”’s great-great-grandsire was still in short trousers (and “INMR”’s great-grandsire was still a twinkle in “INMR”’s great-great-grandsire’s eye, natch)! I gran