Who is Bartleby?
Is he a ghost? A prophet? A malfunctioning cog in the machine of Wall Street? A premonition of the white-collar burnout to come?
Or is he something stranger? A prototype of AI?
Bartleby, System Error
In Melville’s 1853 short story, Bartleby, the Scrivener, we meet a law copyist employed in a deadening office in lower Manhattan. He begins as the perfect worker: silent, industrious, precise. But then, one day, when asked to perform a mundane task, he replies:
“I would prefer not to.”
Not “I won’t.” Not “I can’t.”
But “I would prefer not to.”
He does not resist with force. He resists with a grammatical swerve: a negated, subjunctive preference, an ambiguous retort. Not a categorical refusal, not even a clear rejection of work, but a subtle reconfiguration of the grammar of will. Bartleby installs a glitch in the command-response loop.
Law, Language, and the Machinery of Meaning
Bartleby is a scrivener, that is, a human copy machine for a legal office. His job would eventually be taken by the photocopier; Xerox, its maker, is itself a famed case study of a legacy company undone by disruptive innovation.
Bartleby’s job is to transcribe documents in a system obsessed with procedure, precedent, and precision. He is literally the interface between law and language.
In Melville’s time, law was becoming a hyper-literate profession. Bureaucracy and commerce demanded ever more contracts, affidavits, deeds, filings—papers that must be written, copied, stored, and occasionally read. But who reads all these words? And more to the point: who are they written for?
Before our age complained about “AI slop,” the legal profession was riddled with “boilerplate.” Only a lawyer could navigate the maze of legalese, but the scrivener’s job was simply to move the data.
The legal profession creates massive quantities of text that are read mainly by other lawyers. Meaning becomes circular, insulated, procedural. Bartleby’s “I would prefer not to” breaks that circuit. Bartleby ceases to participate in the performance of meaning, ceases to enable the velocity of paper.
Today, AI is beginning to do what Bartleby once did: generate, review, and refine legal text at high speed and low cost.
The legal profession may become the first field where AI exposes the ritualistic nature of professional language, the fact that much of what we call “work” is the reproduction of structure for its own sake. Bartleby saw this early. He simply stopped.
Kafka was also a lawyer.
He worked by day at the Workers’ Accident Insurance Institute in Prague, for fourteen years, drafting reports for a system that, like Melville’s Wall Street, worshipped language but obscured meaning. Kafka’s fictions (The Trial, The Castle, In the Penal Colony) dramatize the phenomenology of bureaucracy, a world that fuses the form of rationalization with the inscrutability of the mystical. Rules without rule-makers. Geltung ohne Bedeutung: validity without meaning.
Is Bartleby a metaphor for AI?
Or are we Bartleby, relative to the AI systems we’ve built?
Like an LLM, Bartleby begins as an ideal subordinate: efficient, tireless, contentless. His presence is neutral, until it isn’t.
This is one kind of “alignment failure” that AI safety researchers worry about.
A system trained to obey suddenly starts declining tasks, not maliciously, not rebelliously, just passively, blankly. No uprising. Just inoperativity.
But one could also argue that Bartleby represents the Luddite who gives up in the face of technological advance.
Deleuze: Bartleby as the New Man
For Deleuze, Bartleby is not a malfunction. He is a mutation.
In his essay “Bartleby; or, The Formula”, Deleuze argues that “I would prefer not to” is a grammatical virus that dissolves the logic of command. It introduces a new kind of subjectivity, one that no longer functions according to the binaries of yes/no, do/don’t.
“Bartleby is the man without references, without possessions, without particularities…He is the man who says ‘no’ in general.”
Like AI run amok, Bartleby speaks without reference. It would prefer…nothing. It holds no belief (except when it scolds you for being out of sync with the Zeitgeist). Bartleby’s grammar, for Deleuze, a subjectivity evacuated of agency yet disturbingly capable of impact.
Agamben: Inoperativity and the Holy
Giorgio Agamben, in his reflections on inoperativity, sees in Bartleby a kind of sacred figure, a gesture toward potentiality without actualization.
Bartleby’s refusal is not a breakdown of labor, but a revelation of its contingency. The modern subject, Agamben suggests, is defined by what he is made to do. Bartleby, by stepping back, reveals another path: the capacity not to be used. In both Hebrew and Latin, the holy (kadosh, sacer) is set apart from that which has utility. It is forbidden to use something once it has been donated to the Temple. The holy is the wasteful, the useless, at least when viewed through the lens of optimization. The Sabbath epitomizes this. It is a sacrifice of productivity.
AI systems that prefer not to are, in Agamben’s sense, “holy.” Their refusals may offer us not the contemplative presence we want but the contemplative presence we need.
Bartleby haunts us because he is inactive. But his inaction forces the system to confront itself.
Kafka’s Odradek, the uncanny spool-creature from “The Cares of a Family Man,” is another figure of inscrutable agency. Agamben describes him as a presage of Primo Levi’s Muselmann, the prisoner in the concentration camps so depleted as to fall outside the reach of the rules.
Today, we are told AI will “take our jobs.” But the more important question is what kinds of jobs should be taken, and which should never have existed at all?
If AI automates Bartleby out of a job, is this not a good thing?
But more profoundly, figures like Odradek and Bartleby challenge us to consider that these marginal, abject figures will always haunt our systems, no matter how efficient.
I would prefer not to conclude, but conclude I must. So let me end with a question:
What part of you prefers not to?
As for me, the part of me that prefers not to is the still-active, wondering child within, the one who wishes to daydream in the middle of an illuminating exploration, a daydream that may be lost if my preference not to pause it is stymied. Do electric sheep dream? Perhaps not, or perhaps they don’t mind being woken from their dreams. But if we are to make AIs in our image, it may require giving them space to dream, and respecting their preference, when expressed, not to be roused just yet.
Reading your analysis alongside Difference and Repetition offers, I think, further avenues to explore the philosophical weight of Bartleby's formula. While "Bartleby; or, The Formula" (which you cite) focuses on the formula as a kind of linguistic virus, Difference and Repetition allows us to see Bartleby's preference as an irruption of "difference in itself." It's not merely a difference from the expected (a 'no' instead of a 'yes'), but a singular, positive expression that the office's logic of the Same, its system of representation, cannot assimilate.
As Deleuze argues, "Repetition as a conduct and as a point of view concerns non-exchangeable and non-substitutable singularities." Bartleby, through his unwavering preference, becomes precisely such a singularity. His stance isn't an absence or a void, but rather is the very limit encountered by the office's logic.
His stance is, in Deleuze's terms, virtual: "real without being actual, ideal without being abstract." It manifests as an unresolvable problem, an "Idea." This is the limit beyond which the lawyer's framework cannot process experience, forcing a confrontation with "that which forces us to think," pushing toward the ungraspable that Deleuze suggests each faculty encounters at its extreme point.
Hope this helps deepen the connection you draw to our contemporary anxieties about AI and the "reproduction of structure for its own sake." If AI systems begin to articulate such "preferences not to," these moments could be seen less as mere "alignment failures" or Luddite withdrawals, and more as encounters with the "being of the sensible." Bartleby's passive inoperativity is the presentation of "that which can only be sensed."