What! I say, my foot my tutor?
—Prospero, The Tempest
0.
One of the last flashes of creativity I saw before I deactivated my Twitter account (more on that below, and on its significance for this Substack project) was a question launched by another user (whom I can’t locate now, without Twitter, but to whom I say “thanks”). “Are there,” this user asked, “more eyes or legs in the world?”
To be honest I’ve been thinking about little else for the past few weeks since I encountered this “prompt” (as American undergraduates now say, I’ve learned, of what I still call “paper topics”). Nor is this only the burrowing and obsessing of a curiosity that does not know when to quit. As I am about to show you, I think this question has profound implications for our understanding of certain fundamental matters at the heart of our ongoing debates about scientific realism. In particular, while I’m still on the fence about eyes, I don’t think legs, strictly speaking, exist, and I think the non-existence of legs offers an instructive illustration of the limits of the “manifest image” of the world. Moreover, I think this difference has vast consequences for our understanding of certain prejudices that run throughout the history of philosophy. For example, it becomes clear why René Descartes’s “I think, therefore I am” sounds like a serious and laudable stab at explaining the nature of our existence, while Thomas Hobbes’s retort, “Why not: ‘I walk, therefore I am’?” sounds like facetious trouble-making.
In a Hobbesian spirit, then, let us make some trouble.
1.
So, are there more eyes or legs in the world? Somehow, my parochial mind first interpreted the question to be one about human beings alone. We are bipedal, and we typically have two eyes, so there would seem to be a near-equivalence.
Presuming we do not follow Aristotle, who in On the Soul establishes that a blind eye is no more a real eye than is an eye sculpted in wood, since it is the function rather than the arrangement of the matter that makes the organ the organ it is, we can count as “having two eyes” any person who has all or most of that gelatinous substance in the eye socket in virtue of which the sighted are able to see. Aristotle’s reasoning is that “if the eye were an animal, sight would be its soul”, while if you were in the presence of a dead animal, say, your defunct dog, you would be abusing language if you were to keep insisting, in pointing at the cadaver, that you still “have a dog”. You had a dog. But today we reject most elements of Aristotle’s teleology, and so, for us, the non-functioning material part or corpse that might otherwise be an organ or an animal counts as an organ or an animal just as long as it holds together. So, only someone whose eye, blind or not, has been thoroughly gouged out can be said to be “missing an eye”.
If we agree so far, we will probably also agree that there are more people missing at least one leg —defined, let’s say, as amputation above the knee— than are missing at least one eye, given that car accidents, landmines, the progress of diabetes, and countless other unfortunate circumstances may eventuate in leglessness. This means that although we are bipedal, and binocular, the eyes almost certainly have a slight lead among men.
But this is really just the beginning, as the question was not about our species, but about eyes and legs in general. Many terrestrial animals belonging to the antiquated category of “quadrupeds” possess, as the name suggests, four legs. “Tetrapod”, which is really only that term’s Greek equivalent, remains a real taxon today, but it is inexact, as it includes many animals, among them birds and legless amphibians such as the Mexican burrowing caecilian, that have evolved away from four-leggedness. Some tetrapods have also evolved away from photoreceptivity, but the vast majority of animals in this wide taxon, which really includes all vertebrates other than fish, have four legs and two eyes. The fish, in turn —which John Dupré has shown to be merely a vestigial class, a class of “leftover” creatures rather than a class with necessary and sufficient conditions for membership—, would seem to balance things out a bit, as they tend to have two eyes and no legs. And so, it would seem that the answer to our initial question might be arrived at by counting the number of tetrapod species and comparing it to the total number of fish species.
It might seem that we’ve arrived at our answer, but this would be only a seeming, since in any case vertebrate species make up only somewhere between four and six percent of the total variety of animal species. The second-to-last flicker of intelligence I saw on Twitter came from Jacob Levy, who observed that if behind the veil of ignorance you were told that you could, if you like, come back into this world reincarnated as “an animal”, you should decline the offer, as it is so highly probable as to be basically a moral certainty that you would come back not as an elephant, a whale, a dog, or even a bat, a mouse, or an eel, but rather as some sort of arthropod. Coming back as anything other than an insect, a spider, a krill, or some other creature of that order is an anomaly practically as remarkable as winning the lottery. In other words, don’t bet on it. (I think Jacob underappreciates, however, the distinct joy that must attach to life as a beetle or a scorpion, too, and I personally would still probably take the offer knowing how it was likely to play out.)
2.
So it turns out not only that dwelling on human beings is parochial, but also that dwelling on any of the “paradigmatic” animals is parochial as well. If you ask a school-kid what their “favorite animal” is, and the kid tells you “Echinoptilidae”, you will think they are being facetious. That is just not what we mean when we say “animal”, any more than by “bird” we mean “penguin” or “ostrich” — what we mean is our own tiny little neighborhood of the kingdom, where we spin out our fables, where anthropomorphization is no great stretch of the imagination. The teacher just wants to hear “goat” or “bear” or something and to move on.
But goats and bears, even alongside humans and cattle, barely tip the scales. Our “two eyes, four legs” arrangement is a rounding error, alongside the 10,000,000,000,000,000,000 or so individual insects with six legs, and presumably comparable numbers of members of other arthropod species with eight legs (e.g., Arachnida), ten legs (e.g., many species of Crustacea), and more (e.g., Myriapoda, the “many-legged”, including centipedes and millipedes).
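(For those who like to see the arithmetic of the rounding error spelled out, here is a toy tally. Only the insect figure of roughly 10^19 individuals comes from the estimate just cited; every other population number below is a placeholder of my own invention, there to show the shape of the calculation rather than to report facts.)

```python
# Toy tally of legs. Only the insect figure (~10**19 individuals) is the
# commonly cited estimate; the other population numbers are invented
# placeholders, meant solely to illustrate how thoroughly the vertebrate
# contribution washes out.
populations = {
    # group: (estimated individuals, legs per individual)
    "humans":              (8e9,   2),
    "insects":             (1e19,  6),   # the commonly cited order-of-magnitude figure
    "arachnids":           (1e18,  8),   # placeholder guess
    "decapod crustaceans": (1e17, 10),   # placeholder guess
}

total_legs = sum(n * legs for n, legs in populations.values())
human_legs = populations["humans"][0] * populations["humans"][1]

print(f"total legs, on these guesses: ~{total_legs:.1e}")
print(f"human share of all legs:      ~{human_legs / total_legs:.1e}")  # a rounding error indeed
```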
We’ll get back to legs in a moment, but what about arthropod eyes? Counting is difficult here. Some have no eyes at all, but most have either median ocelli (“simple eyes”) or lateral compound eyes, and many have both sorts. Ocelli are mostly useful for dim-light detection, while compound eyes are specialized for discerning the outlines of figures in the visual field. Many species have single-lensed “stemmata” in their larval stage, but then move on to multi-lensed compound eyes after metamorphosis into adulthood. A compound eye, such as the horse-fly’s, includes thousands of distinct ommatidia, each of which has its own cornea, photoreceptor cells, and other parts that we associate with an individual eye.
If we count each ommatidium as “one eye”, then it is almost certain that there are more eyes than legs in the world. I am not inclined to count them in this way. As far as I can tell, the structural discreteness of ommatidia never entails even the possibility of functional discreteness, and the clustering of these several eye-like parts is not like, say, the clustering of hairs when you tie them in a ponytail. Rather, compound eyes, like our own eyes, evolved from what were initially only “eyespots” capable of detecting light and dark. Spherical eyes that could focus light into images were a later development, and the arthropod compound eye simply pushes this development further by outfitting individual photoreceptor cells with dedicated corneas, lenses, etc., while in vertebrate eyes the photoreceptors all share the same cornea, lens, and other parts. In an evolutionary sense, then, it seems that compound eyes are simply a further development of the singular photoreceptive anatomy of certain animals, rather than the coming together of several anatomical structures into one.
So, a compound eye is one eye, while ocelli are to be counted separately. Arthropods, then, may be said to have no eyes (e.g., blind water beetles, though even these have opsin genes associated with visual organs), two compound eyes (e.g., most flies), five eyes (e.g., the two compound eyes and the three ocelli of most bees), or eight eyes (e.g., most spiders). In short, it is very hard to say how many eyes there are out there, because this would require us first to survey all of the different arthropod species — some estimates say there are around ten million of these, while only about one million have been discovered and described at the present time; then we would have to count, or reliably estimate, the number of individual members of each of these species. That desideratum is a long way off.
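(If one nevertheless wanted to see what such a census would look like in outline, here is a sketch. The ten-million and one-million species figures are the estimates just cited; every species record in it is a hypothetical stand-in, not data.)

```python
# Outline of the eye census described above: survey every arthropod species,
# estimate its population, and multiply by eyes per individual, counting a
# compound eye as one eye. The species records below are hypothetical stand-ins.
ESTIMATED_SPECIES = 10_000_000   # rough estimate of arthropod species
DESCRIBED_SPECIES = 1_000_000    # roughly how many have been described so far

def eye_census(records):
    """records: iterable of (estimated individuals, eyes per individual)."""
    return sum(individuals * eyes for individuals, eyes in records)

hypothetical_sample = [
    (1e15, 2),  # a fly-like species: two compound eyes
    (1e14, 5),  # a bee-like species: two compound eyes plus three ocelli
    (1e13, 8),  # a spider-like species
]

print(f"eyes in this tiny hypothetical sample: ~{eye_census(hypothetical_sample):.1e}")
print(f"share of species even described so far: {DESCRIBED_SPECIES / ESTIMATED_SPECIES:.0%}")
```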
3.
Still, it seems at least in principle possible to count all the eyes out there. When it comes to legs, I don’t think we have even that small comfort, since I don’t think we have any coherent idea of what legs are.
Again, here, we think we know, but only because we start out from ourselves and dwell mostly on our closest neighbors. But even within our own taxonomic neighborhood there is some dispute. Largely bipedal mammals —or “tripodal” ones such as kangaroos— are often said to have front “legs” as well as hind legs, even though these remain practically unused as legs. The reason for this seems to be that bipedality is a sort of nobility, one of the key foundations of our own self-conception, and to see the anterior extremities as anything other than an additional pair of legs, to see them as “arms”, is to encroach upon our human singularity in the natural order.
More on that soon enough. For now what I want to emphasize is that, as with eyes, the distribution of legs among tetrapods barely even scratches the surface of the question of what legs are, and of how they are distributed. But unlike with eyes, there is no single well-defined function analogous to vision or photoreceptivity that we can point to and say: where this function is present, there are legs executing it. You might be tempted to say that that function is “walking”, but where walking leaves off and some other means of locomotion begins is anyone’s guess. Consider the starfish, which typically has five “arms” (an honorary designation), each of which has numerous tubular “podia” on its underside which it uses to brush the surface beneath it and slowly to propel itself along. Should each one of these count as a “leg”? Their function is rather more like that of motile cilia, the hair-like organelles on a cell surface that help it to move, than like that of the four extremities by which a gazelle executes what we easily recognize as running. Is every cilium a “leg”? What about the cephalopods, many species of which have eight “arms”, two of them partially adapted to execute a running-like motion on the ocean floor? What about animals with metameric or segmented bodies, such as millipedes, which typically have two pairs of spike-like appendages on each body segment (appendages which, in truth, number far fewer than one thousand, despite the creature’s name)?
Even if we restrict ourselves to insects, all of which have six “legs” attached to the thorax, it’s still not clear that these are any more similar to a quadruped’s legs than, say, animal eyes are similar to the photosensitive features of a sunflower’s physiology that facilitate its heliotropism. An insect’s legs have an entirely different evolutionary history from a mammal’s legs — at the time of the Urbilaterian, our last common ancestor with insects, 570 million or so years ago, there were certainly no legs of any sort. The six thoracic appendages that would evolve much later on the bodies of insects, and that would gradually come to do something somewhat like the locomotive work we observe in the four appendages of many tetrapod species, can be called “legs” if we want to call them that. But nothing about the world requires us to do so.
4.
We have been talking so far only about eyes and legs, but what I really want to talk about are minds. How many are there? Are they more like eyes, with a single causal history extending back to the first photoreceptor molecules in eukaryote protists 1.5 billion years ago, and with a stable and unitary function across all those years (namely, the reception of light)? Or are they more like legs, which only seem to exist when we “stay close to home”, and imagine the experience of other kinds of creatures to be fundamentally like ours?
It may be that the “clarity of purpose” that seems to attach to the visual organs explains, in part, why they have served throughout the history of philosophy as such a rich source of metaphors for the “higher” forms of non-visual mental activity. These metaphors are so embedded in the way we talk that we are almost unable to interpret them as metaphors (“the light of reason”, “I see what you’re saying”, etc.). I believe it was Martin Jay who referred to this habit of ours as a mark of philosophy’s deep-seated “oculocentrism”. To see is to be capable of at least some sort of proto-thinking, we seem to suppose, and indeed empirically we take the evolutionary emergence of photoreceptive “eyespots” as broadly contemporaneous with that of “experientialization”, or the emergence in evolutionary history of “something it’s like to be” a creature of this or that sort.
So it would not have sounded facetious if Thomas Hobbes had tried out, as a response to Descartes’s Cogito, the further variant “I see, therefore I am”. After all, in the Second Meditation Descartes himself had glossed the phrase “thinking thing” as “a thing that doubts, understands, affirms, denies, is willing, is unwilling, and also imagines and has sensory perceptions”, and plainly vision is to count as an instance of this last capacity. Why does seeing seem different than walking here? Hobbes’s exact objection to Descartes is not in fact that one might say Ambulo ergo sum just as easily as Cogito ergo sum, but rather that Descartes’s more elaborate claim, “I think, therefore I am a thinking thing” is no more justified than, “I walk, therefore I am a walking thing” (or, on some variants, “I walk, therefore I am a walking”). Hobbes’s point is that we might just as easily infer, from anything at all that we do, that that thing pertains to our essence — that we are, for example, a res ambulans.
But Descartes had in fact dealt with this sort of attempted substitution in the Second Meditation: the inference doesn’t work with walking, or eating, or breathing, since it is at least in principle possible that we are systematically confused about what our bodies are up to, or about having a body at all, as we see to some partial extent in what is today called phantom-limb syndrome (which Descartes probably saw up close, among the wounded soldiers, when he served as a mercenary in the 1620 Battle of White Mountain — war is hell, and can even turn you into a dualist). You can think you’re walking and be mistaken, but you can’t think you’re seeing and be mistaken, even if all you are seeing are hallucinations. And if you think you’re walking or seeing, whether you are mistaken or not, you are thinking; there is no conceivable scenario in which you are mistaken about what you are doing, and are in fact not thinking. Therefore you may infer from your thinking that you are a thinking thing, but not from your walking that you are a walking thing. Descartes prevails; Hobbes misunderstands.
And yet, Hobbes is not trolling. Or at least he is not just trolling. He is joining up with a long countercurrent in the history of philosophy that proposes bipedalism alongside, or sometimes even in place of, rationality, as our unique species marker. This is the motivating idea behind the jocular proposal in Plato that “man” might be alternatively defined not as the “rational animal”, but as the “featherless biped”. In 1699, when the English anatomist Edward Tyson has the opportunity to perform an anatomical study of an infant chimpanzee that had died within a day of disembarking its ship from Angola, there are two questions that preoccupy him beyond all others. First, are this animal’s vocal cords such that it could in principle be in possession of speech? Second, is this animal’s lower body such that it could in principle walk around on its two hind feet? Tyson answers the latter question in the affirmative, but cannot bring himself to concede the former: even though he sees nothing in the larynx and other vocal organs to suggest that their anatomy could not support the same capacities as those found in a human being, still, speech is an external sign of the inherence of reason, and so there must be something other than anatomy that makes it possible in humans and that is missing in apes — in other words, Tyson’s study of primate anatomy pushes him into dualism. While he concedes in the matter of bipedalism, he is almost as hesitant to do so as to attribute speech to the brute. The apparent encroachment on human particularity is almost as great when we admit that other animals “go on two feet” as when we admit that they have language.
Hobbes and Descartes could not have been aware of the different evolutionary histories of seeing and walking —though Tyson, in his 1680 Phocaena, was perfectly comfortable acknowledging that a dolphin’s fin bones are modified hoof bones—, where seeing has a single, unitary causal story behind it stretching back more than a billion years, and is always “focused” on a single function, with the result that it is extremely hard to contemplate it, as even Darwin observed, without resort to teleology; while walking —or at least some sort of poduncule-propelled “going”— has emerged countless times in evolutionary history, executed in a variety of somewhat similar ways by all sorts of different appendages. But it is perhaps this stability, this singularity of purpose underlying vision that causes us to elevate it so highly, and practically to see it (to “see” it) as synonymous with thinking, which at least for some people is in turn synonymous with being a thinking thing.
The countercurrent, again, reminds us of the limits of this synonymy. For Spinoza an individual is “a proportion of motion and rest”: wherever there is a going, as for example a walking, there is a single being (or at least the closest thing to a being you’re going to get in his ontology). Wherever in turn you have such an individual, you have thought, a mind or something analogous to the mind, since every thing is for Spinoza simultaneously an extended thing and a “thought thing” (where “thought” is the passive participle, not the noun). No thinking without going, in other words, or at least without alternation between going and staying.
The suggestion that going and thinking are part of a package usually appears in connection with arguments against the exceptional character of human beings, against the conceit of an ontological rift between us and all the others. If mindedness, or at least mind-likeness, can be inferred from going as easily as from seeing, well, then, we’ve got a world with lots of minds in it.
5.
I suspect that minds are more like legs than like eyes, in that we could not even in principle hope to make an exhaustive census of them. I’m not a full-fledged Spinozist on this question; I’m not convinced that we’ve got a thought thing wherever we’ve got a going thing (but who knows?). It’s just that I’m inclined to think that the scope of our attribution of mindedness, like that of “pedality”, is not dictated by the world.
It is curious to note in this connection that in the history of philosophy the rejection of rationality as constitutive of our species essence often involves a simultaneous turn to the hiking trails. From Rousseau’s alpine randonnées to Heidegger’s Holzwege, it is almost as if the philosophers have intuited that in order to demonstrate their rejection of the supremacy of reason, they must take to the forest paths and do a little walkabout. Over the course of the nineteenth and twentieth centuries, the figure of the flâneur emerged as a distinct expression of resistance to hyperrational, goal-focused modernity. The flâneur goes, and perhaps thinks in the mode of the daydream, but does not think as one is supposed to think in order, the thinking went (“the thinking went”!), to realize his full human potential as a “problem-solver”.
And today, of course, it seems nearly every Anglophone analytic philosopher has a picture of themselves on their webpage out on some hiking trail. I have even seen CVs where young job applicants list “hiking” among their “Other Interests”. What is that all about? I think this is meant to signal a mastery of what they call “balance” — a concept I have trouble grasping, even if I know it is valued by my peers. It is not that they want to signal they are going full Heideggerian down some dark forest path where the mind gets permanently lost in the thicket, but only that they have preserved a healthy equilibrium between “work” and “life”, where the former involves sober deployment of the rational faculty, while the latter involves a restorative and temporary indulgence of the sort of unprofessional sentiments that might be triggered by an encounter with nature.
We know that a huge amount of our neural bandwidth is taken up by spatial orientation and other elements of navigational cognition. It is worth mentioning here that cognitive scientists have studied the hippocampi of London taxi-drivers with mastery of what is locally called “The Knowledge”, as exemplary of what a human mind does at its most excellent. It is also worth noting that in many non-western contexts, this same cognitive ability that the cabbies display may be experienced as a sort of identity between the mind and its environment. Thus Australian Aboriginal “songlines” are at the same time both embedded in the geography of the continent in the form of natural features, and in the mind as vast bodies of memorized song. Where is the songline exactly — in the world or in the head? Is it a going or is it a thinking? It seems that this question would make no sense from the point of view of anyone who may be said truly to know the songs.
So it is not that going is an alternative to thinking, but only that it marks a shift to a different mode of thinking, the kind that Romantic philosophers have tended to value, and the kind that seems to be more continuous with what we may imagine it is like for other kinds of being to be.
When François Duchesneau and I were translating Georg Ernst Stahl’s Negotium otiosum, we got stuck on a peculiar attribution suggested by the German phlogiston theorist. He claimed Aristotle wrote that thinking is an ambulatio animae, a stroll of the soul, but we were unable to find any passage that would justify this attribution. In the Voyage du Monde de Descartes, a 1690 satire written by the French Jesuit Gabriel Daniel, the reader is invited to imagine a world in which Cartesian dualism makes it possible to separate the soul from the body, and to fly around at night as what Carlo Ginzburg would call a benandante. In Daniel’s story, an African servant gains knowledge of the secret, and decides to go out on a night-flight himself. His soul is far away as his body sits under a tree. In a village near that tree, a maiden is dishonored, and a sort of lynch-mob is formed to catch the perpetrator. When they come upon the African, he cannot think, for his soul is out strolling, but can only go, back and forth, a thoughtless automaton.
Can the soul stroll? On the dualist reading that Daniel is seeking to lampoon, it can, but only by leaving the body behind. In the experience of the singers of songlines, and I suspect also of Ginzburg’s Friulian peasants who fly about at night on stalks of fennel, the soul just is a strolling. Most of us continue to balk at this idea, out of fear that we will be unable to return from our night-flights, that we will no longer have the sense to know when to leave our “Other Interests” behind — and get back to work.
And now for a long and mostly unrelated postscript.
I’ve hated Twitter for a long time, and have been addicted to it for just as long. I opened my account in 2008, but only really started using it around 2018, when I began thinking about the book project that eventuated in The Internet Is Not What You Think It Is in 2022. My Twitter use, in that respect, was “research”, but like an undercover cop who starts shooting heroin in order to maintain cover, I got hooked. I often say that it was in these four years that I belatedly made the transition away from my life as an aging normie, and became something more like a typical young person in our present century: extremely online, always abreast of the most vanguard forms of irony, basically a hikikomori, disdainful of people my age and older who get their news and their manners of expression downstream from this vanguard, and don’t even know, like the fashion-ignoramuses exposed in Miranda Priestly’s ingenious monologue in The Devil Wears Prada, that they are not so much speaking and thinking, as channeling what’s already been spoken and thought. It is likely that I will be renormiefied, and I have to accept this as part of my withdrawal. I withdraw in disappointment, as just one moment in a slow, gentle “weaning from the things of this earth”, to adapt a phrase from Mary Shelley.
The things of this earth, really, are a bunch of shit, which Twitter just concentrates into a dense fecal supplement. My system finally revolted against these years of heavy intake at the moment when my friend Agnes Callard got mobbed for something so stupid I can’t even bring myself to describe it (here’s BuzzFeed to do that for me; and here, incidentally, is a lovely conversation I had with Agnes as the first episode of my “What Is X?” podcast).
A number of the petulant babies who attacked her on Twitter complained that her friends were circling their wagons in the name of friendship while not “responding to the charges”, not just the initial charges of throwing away Halloween candy, but also the other charges they went back and unearthed after the fact —“Show me the man,” they used to say under Stalin, “and I’ll find the crimes to match him”—, of failure to respect a picket-line (which I agree is bad, but that’s irrelevant); and of what really just amounts to having moeurs légères, no matter how much they try to dress it up in administrative language (these neo-Puritan zealots will not tolerate a single drop of idiosyncrasy, especially when it comes from a woman, and especially especially when that woman is wearing the scarlet letter of amorous self-determination). Anyhow when I saw this all I could think was: fuck your “charges”, we don’t recognize you as legitimate charge-bringers. Friendship is better and infinitely more important.
I know Agnes is resilient (and hilarious — I absolutely loved seeing her “commit to the bit”, doubling and tripling down to the utter stupefaction of the swarming mob that had never seen such a thing and simply could not process what they were seeing). And it’s not that I fear for myself, either — I’m already living in exile from the Puritan colonies. It’s just that when I see this kind of thing happening I feel like I’m literally going insane. It is profoundly alienating. I also truly think it’s just built into the way Twitter works, that is, that there will be no training it out by trying to persuade people to adopt better norms.
As for the leaders of this brute siege and their anon armies, I continue to try to see what is happening in “material” terms. They are, objectively, the rabble, and the rabble is a real force in world history. While at the fledgling Assemblée Nationale the bourgeois men were speaking loftily about the universality of human rights and so on, they left it to the Sans-Culottes to go and capture the king’s rhinoceros at Versailles and to stage a gruesome public execution of the beast. Maybe we need effervescent moments such as this in order for anything ever to change in history, beyond the chattering of the genteel assemblies. I don’t know. But as to the material causes of the hostility, as far as I can tell the rabble hate the forty-somethings with jobs because they think they’ll never have what their elders have, which they exaggerate wildly into some kind of life of luxury and unchecked power, even though what we really have is not so much power as just a massive heap of administrative responsibility, and not so much luxury as just the tiny bit of comfort we’ve managed to eke out before we’re too old and decrepit to continue, which is of course something everyone should have, and something I sincerely hope even the most vicious agents of online ressentiment will get. And if they do get it, I hope this will happen in an anomalous moment of history, so that they will not in turn become the targets of hostility from those who come up after them, and have nothing, and are hungry.
As the agents of futile ressentiment, they foment a never-ending online great cat massacre, a topsy-turvy counter-world where they’re all in charge and calling the shots, and they relish the bit of power they have there. One young person I saw, an exemplary leader of the Twitter rabble, had it exactly right when they noted that those older bloated mandarins are fools who stumble into that forum thinking the ordinary rules apply, and are consistently stunned, in their foolishness and their refusal to learn anything new, to find themselves subjected to unrelenting mockery that cares not at all about their credentials. The more I think about it, it is indeed good that there is such a place. I also think it would be terrible if this counter-world were to gain any institutional legitimacy in its own right. Every time a credentialed old normie wanders into that counter-world and starts trying to carry on productive conversations with the class of people who see themselves as in a position of material antagonism towards them, they are helping to confer such legitimacy. Twitter is inherently a place for unproductive venting — for executing rhinoceroses, so to speak. That’s what it is designed to be. I don’t understand why Agnes and so many other people keep pretending it’s a latter-day agora. Perhaps that’s part of “the bit” too.
Anyhow, here’s the thing: for me, at this point, Twitter is hardly necessary. For I’ve got Substack. I have far more followers here than I ever had there, and a good number of them pay me for what I write. I can write as much as I want, in full, free-flowing prose. And when I do, the rabble keep their distance, or, if they show up, they understand they’re in a different sort of space, where different rules apply. (Thank you, rabble, for your interest and engagement; as long as you behave I’m happy to have you here.) Why the hell would anyone prefer a site that compels you to write your thoughts in fragments, which then takes all the monetary profit that results from this writing for itself, and leaves you to fend off the swarms of enraged Sans-Culottes who are committed, as a matter of principle, to not taking in what you have to say with any charity or judiciousness? Why would anyone settle for an arrangement like that?
The one benefit of Twitter was that I was able, slightly, to amplify my Substack writing there. So I suppose now I can only rely on internal Substack network effects for growth, and also, my loyal readers, on you. If you are enjoying my ‘stack, please do help it to grow by spreading the word to your friends, frenemies, and downright enemies, and please also consider upgrading to a paid subscription. The principal benefit of paying for a subscription is that you increase the likelihood that I will continue to work on this Substack project into the indefinite future, rather than putting it aside for other sorts of remunerative writing, for which I always have more opportunities than I can feasibly pursue. For now, Substack’s the best arrangement there is, but that depends on your support.
Long-time follower here - thank you for ditching Twitter! I finally quit Facebook in 2018 (after years of trying), never used Twitter, thank goodness. I'd hoped I'd be part of a huge exodus, but since then it's been frustrating to see even the people most perceptive about "social" media mechanisms so unable to get free of them or imagine alternatives.
For instance, I was just reading a piece by Zeynep Tufekci in the NYT (https://www.nytimes.com/2022/11/04/opinion/elon-musk-twitter-free.html). She's very smart (as usual) about the toxic social media mechanisms, but winds up like this: '“Just get off social media” sounds as much of a solution to me as telling people to stop watching news about a war on TV — the war is on, and influencing so much regardless of personal decisions to stay free of it.'
No, isn't it actually more like calling on the soldiers to desert? It's clear that she doesn't want to see herself as part of the problem, and that makes me want to tear my hair out! Interestingly, the vast majority of the people in the comments section pounced on exactly the same sentence - they're just as fed up by journalism's Twitter addiction. Tufekci replies to a few comments, but keeps making the same self-referential point - that journalists have to be on Twitter because that's where journalism is happening... as if they weren't making it happen by being there.
So there's this bizarre spectacle of the majority of readers yelling at journalists to get off Twitter because that's not the "news" we want to read, and the journalists yelling back that, sorry, that's where the "news" and the "vibe" is at, and you as the reader will eat what's put on your plate. It shows so clearly that the "news" is a completely self-contained Twitter-driven system that only marginally has anything to do with the real world that's out there to be written about, or with the millions of readers who want to hear about the real world...
I would love to hear about journalists and writers who are thinking of alternatives and ways to decrease "social" media dependency. I do hope Substack will be a viable option, though I was disillusioned to read your recent post about how Substack is pressing you to do various things to "increase engagement". Given the modus operandi of Twitter and Facebook, it's hard to read "increasing engagement" as anything short of "algorithmically whipping people into an endless frenzy of hatred to fuel our corporation's infinite growth model."
Anyway... I really enjoy your writing, whether on webs, eyes, legs or whatever, and I'm so glad you decided to go on writing what you damn well please and not succumb to the seduction of the engagement algorithm. As Alexander Kluge says, "wir müssen antialgorithmisch arbeiten" (we must work anti-algorithmically)...
Justin, sorry for the new-subscriber posting error. I was worried I'd post twice. Oh well. To put your reply in context: my original post was to let you know I was using a passage from one of your essays as a chapter epigraph in a forthcoming book out from Doubleday next summer. It was something you wrote about the way science/materialism can "jump the fence" to realms beyond their ken. I saw it when Big Data invaded Shakespeare attribution studies.
More details in an email, and I will send you an Advance Reading Copy. Glad that this has led me to your Substack.
--Ron Rosenbaum, author of "Explaining Hitler" and "The Shakespeare Wars"