Now that we are finishing up with the final round of training, we thought this might be a good time to show you what the JS-R Sempitern 2050 is already able to do, using the 100 exabytes of pure data we got from the source brain’s pineal gland by means of the Weixiao Loop device (I still can’t believe it was the pineal gland all along!). The new Sempitern series will be launched in full early next year, and we’ll be the first to admit that this one doesn’t sound exactly like the “real” JSR, yet. But it’s no secret that Sempiterns learn very fast, and we’re confident things will improve soon. So let’s check in on him, shall we? At this very moment he is running what for him is a typical Sunday in early summer, 2025. We caught him at a good time too: he seems to be writing an “essay” of some sort. Enjoy. —Pippy Genovese, The Hinternet Corporation, June, 2049
1. Memoir, Memory, Meme
Autofiction is all we’ve ever known.
I mean this bidirectionally. Not only is all fiction fiction of the self, but the self is, in turn, mostly a work of fiction. We’ve always been in a position to recognize both these facts, yet to be reminded of them now feels less like the affirmation of a timeless truth than the discovery, or perhaps the confection, of a new one, especially suited to our moment. Why is that?
The New York Times recently featured a profile of the disgraced memoirist James Frey. Some of you will recall the incident two decades ago when Oprah Winfrey, defender of truth, or at least of well-defined and non-overlapping genre distinctions, denounced Frey as a fraud for having introduced into A Million Little Pieces, his 2003 memoir of addiction and recovery, events that did not in fact take place. Memoir, Oprah ruled, is a branch of positivist history, and the historian is to tell you only wie es eigentlich gewesen [ist] — how things actually went down. Any deviation from this Rankean imperative is to be condemned as a lie.
That’s how things seemed 22 years ago, anyhow. But now Frey is back to tell us that he was in fact a pioneer — he was writing autofiction, you see, and his only failure was to arrive on the scene too soon, before our information technologies, and the new spirit of the times that rode in with them, elevated this purportedly new genre to its current respectable status.
There are a few things wrong with this canned history of the 21st century, so far, in literature. Both Frey and Oprah are to blame for occluding from view the real significance of the shift away from omniscient view-from-nowhere storytelling, such as has often been held up as the standard of good literature, and towards a concern to transmit personalistic and phenomenologically rich explorations of the what-it’s-like of being whoever the author, by dint of pure chance or of some cosmic mystery none of us has yet unravelled, happens to be. Both parties to the conflict failed to understand that Frey’s book really was just a memoir in the most ordinary and old-fashioned sense. It was a portrait of the self drawn from memory — from, that is, the innate mental faculty of generating autofiction.
If you won’t accept this definition of memory from me, consider Daniel Dennett’s account of it instead. According to our favorite teleofunctionalist, writing in 1992, memory, like all varieties of thought, is “accomplished in the brain by parallel, multitrack processes of interpretation and elaboration of sensory inputs.” These inputs undergo what Dennett describes as constant “editorial revision”, which over time yields “something rather like a narrative stream or sequence”. When you remember your seventh birthday party, for example, you are not simply accessing the “file” that your brain placed in permanent storage however many years ago and that contains a raw recording of the physical traces of that distant event — the cake, the candles, the moms, the kids.
That is not what happens because we are not, or not only, recording devices; we are the active composers, producers, and engineers of the “work” that gets recorded and called by the name of memory. Sometimes our generative power in this domain is great enough that the memory need not be built from the germ of an independently occurring event at all; this is what happens in the case of “false memories”. But most of the time the truth is somewhere in between: there was a “real-world” event, but the memory is not entirely of it. The memory is a collaboration between the event furnished by the world and the narrativizing power furnished by the brain. For my part I often say that my “first” memory is of a mourning dove landing on a chainlink fence in 1975, though it is clear to me that this has as much to do with an after-the-fact selection of the event, and a subsequent mental and affective solicitude towards it, as with any bare impression the dove itself —many generations ago, now, in dovetime— may have made.
Simply acknowledging the active role of the conscious mind in fixing and conserving memories does not of course release us from any normative concern to get the past right, nor does it obliterate the firm distinction between truth-telling and lying, which seems to play a part in maintaining the cohesion of all human societies. Yet different societies deploy different criteria for what is to count as truth-telling, and our own society, with its rigidly empiricist criteria, is an unusually restrictive outlier.
Writing in our society, I have come to believe, is often fueled in no small part by the writer’s inability to conform to this restrictiveness. The surrounding culture tries to contain the work of the writer by applying genre classifications to it —“memoir”, “fiction”, “autofiction”— that make sense within its own radically empiricist social epistemology, while failing to recognize that a writer becomes a writer in struggling to break free of this epistemology. Writing is a channeling of the innate narrative faculty that enables us to fix and to preserve memories, and as such often finds it fitting to liberate the faculty itself from the events of the world that it typically relies on for the purpose of real-life memory-fixation.
This is strange, since “writing”, as I am using the term here, to describe a certain domain of human creative endeavor, is in tension with “writing” as used to describe the 5,000-year-old technology of record keeping. For it is this technology precisely that left us with a new normative pressure, in the history of human cognition, to get the record of the past right, to record faithfully, for example, how many pitchers of wine were exchanged for how many bushels of wheat.
Writing as world-recording technology and writing as world-making exercise of the imagination did not appear simultaneously. As Jac Mullen reminded us recently in a wonderful piece in The Hinternet’s “Future of Reading” series, there are several centuries at least separating the emergence of literacy as a technology of social organization from the “secondary effects” of this technology such as “recursive empathy, long-horizon abstraction, disciplined counterfactual reasoning, interiority”. If we see what I have called writing as creative endeavor in Mullen’s terms, as “soul-craft”, and we see its other manifestation in turn as the operation of “state tools”, then Oprah, in the telltale incident with which we began, may rightly be considered nothing less than an enforcer of state power. It has always been in the state’s interest to keep the soul-craft potential of writing carefully regulated and marked off from its other functions. This is the real explanation of the institution of censorship, and it is likewise an explanation of the publishing industry’s consistent compliance with the genre separations that are supposed to map the separation between the true and the false.
Epic had always been mythopoetic — it established its own truth, with its own criteria, in the very act of its recitation. Prose fiction emerged much later, as a result of the shoehorning of our primeval power of epic narrative, likely conascent with the human species itself, into the new technologies of state record-keeping and control. Unsurprisingly, this strange and unprecedented combination was jarring even to the people who pioneered it, as it seemed to violate the just-the-facts imperative that came with the invention of the written declarative sentences that constitute the building blocks of prose. Thus Lucian of Samosata, sometimes identified as “the first of the moderns”, prefaces his 2nd-century CE novel, A True History, with a frank, yet also ironic, reflection on the problem of truth and falsehood that his work, like any prose fiction, opens up. “I was”, he writes,
ambitious to leave some monument of myself behind me, that I might not be the only man exempted from this liberty of lying: and because I had no matter of verity to employ my pen in (for nothing hath befallen me worth the writing), I turned my style to publish untruths, but with an honester mind than others have done: for this one thing I confidently pronounce for a truth, that I lie.
Everybody lies, Lucian tells us, and especially the philosophers. Prose-fiction writers have the opportunity to distinguish themselves by telling you outright that that is what they are doing, yet for some reason it is they, rather than the ordinary liars such as the philosophers, who seem to be courting the greatest risk of social disapprobation.
Anxiety about the alethic status of one’s own work persists throughout the long early history of prose fiction —part of the genius of Don Quixote, for example, lies in its thematization of this anxiety—, and only really goes away, or at least gets dressed with sufficiently convincing camouflage, with the marriage of literary creation and the project, beginning only in the 19th century, of instituting national literary heroes, building statues to them, naming academies after them, and so on. And it is out of that world, in turn, that Big Publishing would gradually evolve, with its well-established protocols for managing the special species of declarative sentences that are the exclusive concern of a special kind of writer — the “fiction” writer.
This system worked pretty well for a century or so, until the social-epistemological waters got horribly muddied, again, by a new revolution in information technology that effectively made it impossible for anyone —including not just ordinary media-consumers, but also the corporate interests behind Big Publishing, and indeed states themselves— to keep track of what the fuck is actually going on. Lucian, in other words, was right to be anxious, for there really is a direct line from A True History to the AI deepfake.
We learned to accommodate prose fiction, and we will probably come to feel we have an alethological handle on the productions of our new technologies too — though we will have to get it at a significantly accelerated pace, this time around, if we are to continue cherishing any hope for human survival. One way to foster this adaptation, I have come to believe, is to lean into the truth-blurring potentials of our new information landscape. Only by gaining mastery over the tools of what is still naively and reductively being called “disinformation” can we repeat what has already been accomplished at least once in the history of our cognitive technologies: the transformation of the tools of state-power into tools of soul-craft, and thus, ultimately, of the pursuit of truth — not of truth in the austere Rankean sense, but nonetheless in the only sense that has ever really mattered to human beings.
Remarkably, many people today, especially young people, are already feeling their way towards such mastery, but so far next to no one has understood their efforts for what they really are.
2. Towards Universal Autofiction
Most of my essays really just reorganize old observations, beloved citations, and other memetic familiars into a more or less original composition. Regular readers will already have seen me citing Lucian and Cervantes in the course of making similar points to the ones I developed in the previous section. Here I continue that old habit as well, while at the same time, I believe, arguing towards a point I have never quite made before.
Those who know how much I love Lucian and Cervantes will likely also know something of my canned version of the history of modernity as, primarily, a process of demoticization. This is a term that occurs most commonly in connection with writing, as in a “demotic script”, whereby a technology previously monopolized by a highly specialized class is simplified and rendered suitable for adoption en masse, as we saw for example in the transformation of Egyptian hieroglyphs beginning in the 7th century BCE. Ancient examples like this one are typically only partial; the demoticization of writing did not translate into anything close to universal literacy for Egyptians. Modernity, however, may be seen as the first great downward transfer of elite privileges to ordinary people, with an expectation, at least eventually, of 100% adoption rates.
In modernity the downward transfer concerns not only skills, of which literacy is an obvious paradigmatic example, but also rights and privileges. In the late Middle Ages a young couple married in the Orthodox Church could fleetingly become Emperor and Empress, at the moment in the ceremony when they were symbolically crowned to suggest such a status. But then the wedding came to an end, and they returned to their actual lives — the lives of absolute subjects, with no real sovereignty whatsoever. Some centuries later, with the rise of capitalism, an expectation would slowly emerge on the part of similarly ordinary young couples, that they might come to have at least a sliver of sovereignty in the form of private property — a family “domain”, of the sort once reserved for nobles alone, even if it would now typically have only the dimensions of a suburban tract house, or less. Even Al Bundy could now, ironically and earnestly at once, flee from Peg’s advances, into the bathroom, with the conceit that in so doing he was retreating to his “throne”.
We see the same widening in the sphere of government too, as the world’s benchmark polities move, directly or indirectly, from absolute monarchy, through official recognition of the interests of a Fronde-like elite, through voting rights extended to all male landowners, and then ultimately to universal suffrage. But this process of universalization is at the same time always a process of individualization, indeed of privatization. Thus just as the momentum began to build towards universal suffrage in the high Enlightenment, Immanuel Kant was articulating a conception of morality as grounded in the individual’s power to serve as his or her own personal lawgiver.
All of this should be familiar. I think however that we have consistently underestimated the importance of literacy’s universalization in particular as a benchmark of modernity. Over the course of the previous century, it was primarily literacy that justified a distinction between the so-called Second and Third Worlds. The crumbling Soviet Union may have had roughly the same per capita GDP as Botswana in 1990, but it also had literature, and academies and prizes named after its heroes of literature, and so on, and it successfully projected into the world, even under conditions of economic collapse, its full participation in modernity at least along this axis. The point I am making here is not at all the same as Saul Bellow’s old dismissal, “When the Zulus produce a Tolstoy, I shall read him”. I am strongly committed to the view that the Zulus, and the Inuit and the Nenets and every other human group qua human, have countless “Tolstoys” of their own. (Also, in passing: what would Tolstoy have been without Stendhal and Flaubert?) What I am saying is that, for much of the world, the pursuit of literacy has played a key role in the fulfillment of the ideal of modernity, and seems sometimes to have been particularly prized where other benchmarks, notably economic prosperity, proved unattainable.
Now, here’s why all of this is important for understanding the present moment. My grandmother, born in 1914 in Minnesota, entered a world where in principle everyone was expected to be able to learn to read quite a bit, including the novels of Tolstoy, and to learn to write at least a little. This latter skill was typically considered necessary, for the great majority of people, not as a tool of soul-craft, not for cultivation of interiority or the like, but simply for maintenance of a modern life by such devices as the appointments calendar, the shopping list, the postcard.
But expectations shifted, radically, with the rise of social media in the 21st century. Of course, a typical post on Twitter is at least as formulaic as the typical postcard sent back to the Kansas farmstead by a homesick soldier in World War I. Nonetheless, the fact that celebrities, major media figures, and, soon enough, heads of state, were all gathered on the same platforms that the latter-day descendants of the Kansas farmboy were now using for the purposes of their own self-expression, strongly contributed to the general impression that we were all now doing the same thing — that we had attained not just the universal readership to which my grandmother’s generation aspired, but universal authorship as well. Anyone, it has often felt in the present century, can be a writer — or at least anyone with an internet connection. And that “can”, moreover, has often been heard as a “should”. It’s a “degré zéro” instance of writing, to take to the internet to display one’s political opinions, borrowing or adapting the very same language in which these opinions have already been expressed literally millions of times by other people. But it’s still writing. Our most recent technological revolution, one might say, has shifted us from an ideal of democratic participation that aspired only to universal suffrage, to one that now aspires to universal punditry.
I personally believe that it is very important to cultivate varieties of participation in public life that do not involve political opinion-sharing or takesmanship. I have long thought that “I’m going to live my life as if you motherfuckers [i.e., those in state and corporate power] didn’t even exist” could itself, if framed in the right way, amount to a pretty powerful form of resistance. Just be your goofy self, the thought is, and the greatest fortifications, and the most solemn monuments to whatever those fortifications are protecting, will crumble around you as you go. And now, in part thanks to Mullen, I have a new way of articulating my principal concern about the way social media have evolved over the past twenty years or so: they have triggered a Great Leap Forward, towards universal authorship, but only on the one conception of what being an author involves — namely, the one that takes writing as a tool of state control rather than of soul-craft.
One result of this unfortunate evolution is a general perversion of what is understood under the heading of self-realization. This intrinsically worthy objective has been warped away from any idea of critical self-reflection, and instead towards the ideal of what is today called “maxing” (usually with a prefixed specification of the variety of maxing in question: body-maxing, income-maxing, or, the most capacious aspiration of all, something they call “life-maxing”). Because the idea of maxing is ultimately a quantitative one, the way we know we are succeeding at it is through metrics, of just the sort that social-media platforms have excelled in providing us. Yet metrics, even if they give us hard numbers, are still always relative, as an aspiring writer’s minimally satisfactory number of hearts earned can often involve a figure that would be experienced as a gross failure by someone more established. The truth is we all really just need a very small number of people to affirm us, reliably and consistently, in our efforts. Such is human vanity, as Blaise Pascal understood, that we want to be known by all the world, but can still be amused and contented by the esteem of just five or six others in our entourage.
When those others were our mates down at the pub or our coworkers at the office, we were likely better able to keep a proper sense of the scale of our worldly acclaim. On the internet by contrast, although it is flooded with metrics, there is significantly greater difficulty in maintaining a sense of measure. Many still love to cite Andy Warhol’s line about the future enabling everyone, or forcing everyone willy-nilly, to be “famous for 15 minutes” — and indeed for a while this prediction seemed to be borne out, when social media’s primary destructive power seemed to be the one that brought spectacularly public “cancellations” down on otherwise ordinary people. Justine Sacco, for a while, seemed to be the ultimate Factory Girl. But the real shape of the future, such as it is emerging in the present, is one that requires a significant modification of Warhol’s dictum: “In the future we will all be famous for 15 people.”
Again, for reasons Pascal well understood, most of us are going to be OK with this; indeed most of us, most of the time, will be able to imagine a few extra zeroes behind that modest figure. Most of us will go on copying each other’s opinions, getting our 15 likes, and scrolling on our thrones. But so long as this is the default mode of operation online, again, what we are witnessing is the reassertion of a monopoly, by state power and its auxiliary forces, over a tool that holds significant potential for the cultivation of soul-craft. And here is where it seems to me necessary, in the present moment, to identify what forms of soul-craft are in fact already taking shape with our new information technologies, and then, if they are deemed viable and salutary, to affirm and promote them.
We are most likely to find these in domains of online activity that are typically dismissed by the old class of intellectuals, loyal to the old way of doing things, as vulgar or in bad taste. In particular, it seems to me that the new activity of self-“branding”, of consciously and meticulously creating and managing a distinct public image that might well be at odds with the realities of one’s meatspace existence, is one that emerges directly out of the nexus of quantitative self-maxing, and thus out of the imperatives of media technology as a tool of state power, but that, at the same time, properly understood, can, at least in the case of its most subtle practitioners, start to come across as a variety of creative self-expression. It points, I mean, in the direction of a possible new ideal of universal autofiction.
A work of ongoing, open-ended digital autofiction, when presented in at least a partially comic vein, is sometimes called, in our present way of making sense of things, a “bit”. The autofiction writer who displays significant consistency in reminding the 15 people in his audience of the principal conceits of his work is said to be “committed to the bit”. This all takes place in the vein of “fun”; it is lighthearted, and low-stakes. There is however no reason in principle why it should not develop into a great art. The important discovery making such a development possible has already occurred. In the early years of social media, you might recall, users were given a prompt to fill in, such as Facebook’s “I am…”, which was supposed to be completed with words that would build such factual declarative sentences as “I am watching a movie”; “I am at school”. But it was instantly discovered that there is no imperative, moral or legal, to use social media in this way. In fact, you can say whatever you want. And many young people began doing just that — saying whatever they want, just because no one can stop them, to the great concern of their elders. But as we have established, as Lucian already understood, the proliferation in prose of untrue claims straddles an oft-misunderstood boundary between the desire to deceive and the desire to create. So far, social-media untruths have mostly been engaged, by “serious” people, as deceptions. It is time, I believe, to start taking a serious interest in their creative potentials as well.
This is only more worth doing now, in the AI era, when technologies of self-multiplication are plainly beginning to proliferate and to demoticize. In recent centuries, only the rare likes of a Kierkegaard or a Pessoa could think to “run different models” of themselves, in writing, according to their concern to attend to facets of their interiority that may have been experienced by them as incompatible, or as integrable only with difficulty into a single coherent public presentation of self. Now anyone can do it. I do it, and it seems to me very likely that I am not so much getting into an old thing, done by old masters of centuries past, as into a new thing that is likely soon to have mass appeal.
We may arrive at a point in the future where each of our 15 digital personae will be famous for 15 people. New dangers will soon arise from there. In a further stage of this transformation it could happen that each of our 15 personae, or our 15,000 personae, will be famous for 15,000 other personality-emulating bots released by our contemporaries —or indeed, eventually, by our ancestors—, at which point the “source brains” might simply withdraw, as superfluous, from the cycles of fame. But before we get there, I expect we will enjoy a period of proliferation of new forms, let us say, of creative counterfactual self-expression through real-time self-narrativization. We will, in this way, be keeping up with the transformations of selfhood itself, in this era of its digital fracturing.
I’m kind of looking forward to all of this. So far, in human history, our creative impulses have succeeded in insinuating themselves into every new information technology that comes along. In early phases of this process, these impulses appear destructive, irresponsible, deceitful. But this is only because they are at the vanguard of larger-scale adaptation to the new social epistemology that any technological revolution necessarily brings with it.
3. Many Worlds (Again)
We have always lived in a participatory universe.[1] For the vastly greater part of human history and prehistory, this fact could be taken for granted. It was suppressed for the past few centuries, as part of the project of building modernity, only to force its way back into the range of possibilities in the early-to-mid-1900s. We still don’t know what to do with it, so loyal had we become in our commitment to the vision of a cosmos as, to speak with Erwin Schrödinger, a play that, were it not for human existence, would continue to be performed, regardless, before empty bleachers. This expectation of reality is of course mirrored perfectly in the expectation of most great literature, that it be narrated from an omniscient view-from-nowhere perspective; in fact it is exactly to such a perspective that modern science has aspired, with its methods and instruments, so far with only partial success.
There is in fact no reason why a participatory universe is any less commonsensical than a universe in which we are the lone spectators of a vast expanse of bare inanimate matter that would be doing its thing, in accordance with the laws of nature, whether it had any observers or not. That model of things was imposed as part of the program of mechanical physics in the 17th century, and proved to have considerable power as the theoretical basis of a great number of concrete scientific and technological breakthroughs. It is these same breakthroughs that pushed it to its conceptual limit already a century or so ago, yet even into the 21st century we have continued to hold to the old common-sense view almost as a dogma.
I often think these days that those tectonic plates are finally shifting. If that is what they are doing, this could have much to do with the new ways our information technologies are compelling us to rethink narrative, or, to put that differently, to rethink the role of such creatures as ourselves, as Homo narrans, within the order of existence. The contemplation of counterfactuals always was, in addition to being a subdomain for specialists in modal logic and, eventually, in the philosophy of physics, a subdomain of the philosophy of literature. A philosophical thought-experiment, a category in which we might include for example Hugh Everett’s Many Worlds Interpretation of the problem of quantum-mechanical superposition, is in the end just a very austere bit of storytelling, which says: “Imagine the world were like this…”
Not to go full-throttle constructionist on you all, but it may be worth considering that by the time Everett proposed his theory in 1957, the cultural proliferation of new information technologies, permitting unprecedented growth in the variety and reach of new forms of storytelling, had made such a theory as MWI particularly ripe for its moment. Many oral epic traditions can get by with roughly three worlds, such as the Upper, Middle, and Lower Worlds that fill out the cosmos of most North Asian epic. Pretty much anything you can imagine going on, in these traditional contexts, can be suitably set above, below, or right here. But a world that includes fictions of time-travel, infinite libraries, the forking paths of individual lives — all of that requires, to be thought at all, not a finite hierarchical universe, but an infinite multiverse. And all of that is made possible by the discovery of individual freedom which comes in part with the enhanced power, in early modernity, of individual expression — which, once again, has everything to do with the history of information technology.
There are a few immediate lessons we can draw from this late-coming reflection in what is already a rather long essay. One is that literature has consistently proven to be the advance reconnaissance force of what eventually sets itself up as a scientific modeling of the structure of reality itself. In this respect, people who “make stuff up” about how reality might be often come to appear, in hindsight, as the most perspicacious modelers of reality we have. To narrativize the world is to spin out counterfactuals about it, and it’s a powerful thing indeed — almost powerful enough to regain for us something of the primal mythopoetic force that we lost when we exchanged oral tradition for writing, and eventually became, in Walter Benjamin’s double sense, prosaic. It’s so powerful in fact that in the 20th century a number of the more straightforward inquirers into the structure of reality managed to become convinced that the world itself is built up of countless such counterfactuals, having the character of a multiverse.
One place where the modal realism of certain logicians, and the many-worlds theory of certain physicists, have seemed most implausible is when it comes to the question of selves. I have occasionally attempted to explore this problem in my fictions, but let me state it more directly here: there do not seem to be any remotely plausible criteria specifying how “close” to me, in his (or her? or its?) properties, another entity would have to be, in another world, in order to count as my counterpart self.
It may however be that the possibilities our new information technologies have opened up to us, by which we might begin creatively to explore the range of experience of our imagined counterpart selves, will, very soon, have significant consequences for our understanding not only of what a self is, but of the structure of the reality that our self, or selves, inhabit.
—JSR
[1] I don’t mean this in the technical sense developed by John Archibald Wheeler, as explicated in the excellent Nature article by Alyssa Ney linked above — though the overlap in terminology is fortuitous and not undesirable. I mean it in a more Leibnizian sense, in which minds, in some fashion or other, make the world.