
The disappearance of AI

12 min read · Jul 26, 2025

Clock time is invisibly and unquestioningly embedded in most societies across the world. Will AI be too?

Image: anncapictures

Read a more condensed version of this story here.

George Orwell’s dystopian novel 1984 opens with the clocks striking 13. This cleverly serves a dual purpose: When 1984 was published just after World War II, digital clocks were not yet invented, so all clocks had two hands and 12 numbers. Striking 13 immediately signalled something eerie. But 24-hour time was widely used by militaries by the start of the war, so Orwell was also signalling that the society of 1984 had adopted this time system and adapted to a militaristic order.

The cult of measurement and the pursuit of precision bring us to a world where everyone carries a digital clock connected to a network of satellites, each carrying a little glass box of rubidium vapor — which, when stimulated by microwaves, oscillates more than six billion times a second, providing time accurate to 100 billionths of a second, and synchronizing every person and device across and above the Earth.

Almost no one thinks about, much less questions, clock time. Yet evidence continues to mount showing adverse effects on our mental and physical health when we favour clock time over natural rhythms, as I’ve reported before.

Now come reports that AI is making us dumber, taking our jobs and partners, and worming its way into every app on those same tiny computers in our pockets and on our wrists. Might it become as normalised and unquestioningly accepted as the clock? This article brings together the two main topics of my academic research to explore this question. I try to draw out some lessons we might learn from the history of the adoption of clock time.


Fiction as control

The subdivision of time into regular units dates back at least to ancient Babylon, China and Egypt. It helps increasingly complex societies plan and coordinate. But as author Steven Johnson observes, up until the Industrial Revolution, most people would think it nonsensical to organise time into regular, abstract units, because people instead divided their day according to activities: “Instead of fifteen minutes, time was described as how long it would take to milk the cow or nail soles to a new pair of shoes. Instead of being paid by the hour, craftsmen were conventionally paid by the piece produced”.

Religion embraced clock time early on as a way of embedding regular prayers and other activities into the day, first among monks and later extending to entire villages via tolling church bells. What might have been jarring to villagers at first gradually became part of the sonic and temporal rhythm of human societies.

For most of us, the regular tick of the clock has become second nature — not far in its rhythm from our heartbeat or breathing. “Naturally” we sleep at night because that’s when it’s dark. But the introduction of clock time into the workplace, along with the concurrent invention of artificial lighting, produced a dual shock to workers in the early industrial period. Time moved from being a tool of coordination to one of control, and from useless abstraction to pervasive structure.

What is an “hour” anyway? Just a word. What is a number? Just a fiction, according to philosopher Bertrand Russell. When you stop to think about it, such arbitrary abstractions seem pretty meaningless compared to natural rhythms.


Folding rocks

Let’s turn to the material things we have used to keep time: astronomical events and the motion of the Earth; water, sand, shadows; candles cut to measure precise intervals; gravity pulling objects down, swinging pendulums, tightly-wound springs. But most of all, Johnson proposes, it was electricity applied to rocks — specifically quartz, which vibrates at a steady 32,768 times a second when stimulated this way, regardless of temperature, humidity, or wear.

Philosopher Benjamin Bratton points out that, while “intelligence” can be identified in various natural systems, it was humans who “managed to fold little bits of rocks and metal into particularly intricate shapes and run electricity through them, and now the lithosphere is able to perform feats that until very recently only primates had been able to perform.” He calls this “mineral intelligence”.

And here’s where artificial time meets artificial intelligence, because the quartz clock coordinates the billions of calculations per second in computers. “A modern computer,” Johnson writes, “is the assemblage of many different technologies and modes of knowledge: the symbolic logic of programming languages, the electrical engineering of the circuit board, the visual language of interface design. But without the microsecond accuracy of a quartz clock, modern computers would be useless.”

No one knows exactly, but mechanical computers might be as old as clocks, dating at least to the Antikythera mechanism of the ancient Greeks. Like time, computation seems to be a natural process; some argue that humans discovered it rather than invented it, finding evidence in animals, plants, chemical reactions, colliding particles — it’s computation all the way down.

By WWII, when digital computers were invented, cyberneticians like Norbert Wiener considered the clock a model for both the human brain and the computer, according to Matteo Pasquinelli. Then, as the saying goes, the times changed. The “clockwork universe” of Newton’s era became the computational universe, information became the basis of biology, and the human brain a computer that could be emulated by a machine.

And now, you can’t remove the digital clock from the face of your phone.


Is it now?

Time is a module in the Python programming language. You load it at the start of your program with the line “import time”. The call “time.time()” then returns the current time as a floating-point count of seconds since 1 January 1970, precise to well under a hundredth of a second on most systems.
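
Here is what that looks like in practice; the exact values printed will of course differ on your machine.

```python
import time

now = time.time()            # seconds since the Unix epoch (1 January 1970), as a float
print(now)                   # e.g. 1753500000.123456, sub-millisecond resolution on most systems

print(time.ctime(now))       # the same instant rendered for humans, e.g. 'Sat Jul 26 10:40:00 2025'

time.sleep(0.5)              # pause the program for half a second of clock time
print(time.time() - now)     # roughly 0.5, the elapsed interval
```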

Computers “think of” and use time differently than we do. AI systems, for example, operate sequentially, asynchronously, and in various other nonintuitive ways. This is because computers treat time as data to be calculated and manipulated for the task at hand.

Language models, for instance, process text in a linear series of “tokens” or units, which may be characters, words or fragments of words. But this is not the same as the seconds ticking by, because each token is processed in relation to previous ones in the sequence, so there’s no subjective sense of duration or “now.”

They can also freeze moments: past training data, your current prompt and several previous ones, all held in their “attention” or context window. In such moments, time doesn’t flow. And they don’t remember past conversations the way (some) humans do; they start fresh each time, without memory, like a person with Alzheimer’s. Treating time as they treat text — as a statistical quantity to calculate — they infer the flow of time only from words like “before” and “after”. For our benefit, they merely simulate temporal understanding.

My favourite language model, Claude, tells me, “It’s somewhat like how a very sophisticated map understands spatial relationships without ever traveling the territory it represents.”

What’s important is that the attention-based (“transformer”) architecture underlying language models means they can treat any kind of data the way they treat text, using next-token prediction. Bratton says that this makes them “a new kind of general-purpose public utility around which industrial sectors organize: cognitive infrastructures”.
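
To make next-token prediction concrete, here is a toy sketch. It is emphatically not a transformer, just bigram counts over a tiny made-up corpus, but it shows the bare mechanism: given the sequence so far, predict the next unit, with no notion of how much clock time passes between tokens.

```python
# A toy next-token predictor: bigram counts over a tiny corpus, nothing more.
from collections import Counter, defaultdict

corpus = "the clock struck thirteen and the clock struck again".split()

# Count which token tends to follow which.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(token):
    """Return the most frequent successor of `token` seen in the corpus."""
    candidates = follows.get(token)
    return candidates.most_common(1)[0][0] if candidates else None

# Generate a short continuation, one token at a time; order matters, duration does not.
sequence = ["the"]
for _ in range(4):
    nxt = predict_next(sequence[-1])
    if nxt is None:
        break
    sequence.append(nxt)

print(" ".join(sequence))   # "the clock struck thirteen and"
```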


The tortoise wins

By chopping time into ever-smaller pieces of data, computers enact a version of Zeno’s paradox. Do you know the story of Achilles and the tortoise? Ancient Greek philosopher Zeno postulated that mighty Achilles could never really catch the much-slower tortoise because, in order to do so, he would have to first cover half the distance to it, and to do that, half of half of the distance, and half of that, and so on to infinity.

The same applies to time: Philosopher William James denied that 14 minutes can pass, because first seven would have to pass, and before that, three and a half, “and so on until the end, the invisible end, through tenuous labyrinths of time,” reports writer Jorge Luis Borges (in “Avatars of the Tortoise”). Thus, the paradox is that time effectively does not exist. James was just one in a long line of philosophers and physicists to figure that out.
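
A computer, of course, will happily carry out James’s regress, chopping the interval into ever-smaller pieces without ever quite finishing. A few lines of Python, purely as illustration:

```python
# Before 14 minutes can pass, 7 must pass; before that, 3.5; and so on.
interval = 14.0   # minutes
elapsed = 0.0

for step in range(1, 11):
    half = interval / 2 ** step          # 7, 3.5, 1.75, ...
    elapsed += half
    print(f"step {step:2d}: +{half:.6f} min, running total {elapsed:.6f} min")

# The halves shrink without end, and the running total creeps toward 14
# without ever quite arriving.
```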

And yet, like laws and corporations, this fiction runs our lives.

AI, too, chops data into ever-smaller tokens. Now we see AI, like the clock, becoming a measure of labor and productivity. According to journalist Jessica Grose, “a lot of the rules we live by are governed by fear of not keeping up, and that fear is exploited by corporations.” Pervasive in the marketing of AI is the claim that it is simultaneously displacing jobs and increasing productivity. The clock did the same in the industrial era.

Image: FotoEmotions

The end of thinking

Clocks, according to sociologist Kevin Birth, make us stupid. “Clocks lull us into cognitive dependency through a temporal representation that is so simple that even a small child can understand it, and yet that simple representation obscures many, many choices.”

I read somewhere, however, that many children today can’t read analog clocks — the round ones with hands and 12 numbers. Maybe this is a good thing, as it becomes an outdated skill, like knowing how to read Roman numerals or add two numbers.

Journalist Simon Kuper reports that “AI is only the latest in a sequence of inventions that have made humanity dumber. We outsourced our maths skills to calculators, our memory to Google and navigation to Google Maps.” Add language translation, most forms of writing, image production, reading, critical analysis; even emotional support, therapy and, perhaps, companionship.

So, at the very least, AI will do our work for us, but we have to ask, what is “work”? As Russell noted a century ago, “What will be the good of the conquest of leisure and health, if no one remembers how to use them?”


From attention to intention

There’s a sense of inevitability to the normalisation of AI, just as with clock time and internet access. Does it make sense to hold on to skills like reading, writing and critical thinking? I wouldn’t suggest going back to pre-industrial practices — life on the farm, in the wild or off the grid is not as glamorous as some people imagine.

But there might still be value in knowing how to do basic mathematics, to read and write, to socialise with actual humans. Partly for when the electricity goes out. And partly as a niche, specialised pursuit, the way that painting, for example, still persists after photography but became something else. More importantly though, all these skills help to maintain that other skill, critical thinking — a way of verifying the accuracy of AI systems practically, but also of critiquing them ethically. I wrote about this in the last article.

The best thing about AI, therefore, is how it makes us question and redefine some basic things we take for granted: artifice, intelligence, work, economics, how societies are and might be structured.

And time. Writes philosopher Mark Coeckelbergh: “There is no time for the present. We and our time are resources for the future. The future is being calculated. The future is made by algorithms and those who use them to use us.”

When we believe such a future is the only one possible, it becomes a self-fulfilling prophecy. The present becomes hijacked by the future. The best way to predict the future is to invent it, goes the Silicon Valley mantra. But bringing this future about requires two things on a mass scale: attention and intention.

Attention, as in an AI system, means identifying the most important things to focus on, then focusing your time and other resources on them, prioritising them above others. These two acts require intention. Newton and Galileo came up with world-changing insights by spending years focused on specific topics. Almost no one does this today, because our attention is fragmented by work that is parsed into time-bound chunks, and by social media’s constant stream of bite-sized distractions.

I know a philosopher who reads and writes a lot, does very little social media, and consequently gives talks without notes and can engage in deep conversations, complete with quotes and references, all from memory. For what it’s worth.

Charles Babbage’s Difference Engine, in the London Science Museum

Emit events

I know — very few of us can afford to simply switch off, ignore clock time and do our own thing. We can refuse to devote any attention to AI, but through our employers, governments, providers of goods and services, it will increasingly find us.

Besides maintaining a basic level of skills, knowledge and criticality, there are some specific things we can do to push back against both clock time and AI.

Before clocks, getting something done depended on “shifting factors like the worker’s health or mood, the weather, and the available daylight during that particular season,” writes Maria Popova. I’ve found value in using this kind of “event time” over clock time, whenever possible. Clock time versus event time is linked to cultural differences: in some cultures, a meeting starts and ends precisely at the clock times indicated; in others, it starts when everyone arrives and ends when it ends.

You can do this yourself by, for example, finishing one manageable task before moving on, as I discuss here. This serves the additional purpose of minimizing cognitive load by not splitting your attention between different things.

Computers use event time as well as clock time. The JavaScript runtime Node.js, for example, has a built-in module called “events”. You can create, trigger, and listen for your own events (like a key being pressed or some data coming in). Events are handled through instances of the “EventEmitter” class: to trigger an event you call its “emit()” method, and to react to one you register a listener with “on()”.
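
Node’s own API is JavaScript, but the pattern is simple enough to sketch in Python alongside the earlier snippets; the class below is my own illustrative analogue, not Node’s actual implementation.

```python
# A rough Python analogue of the event-emitter pattern described above.
class Emitter:
    def __init__(self):
        self._listeners = {}

    def on(self, event, callback):
        """Register a callback to run whenever `event` is emitted."""
        self._listeners.setdefault(event, []).append(callback)

    def emit(self, event, *args):
        """Trigger `event`, passing any arguments to its listeners."""
        for callback in self._listeners.get(event, []):
            callback(*args)

emitter = Emitter()
emitter.on("tea_ready", lambda: print("Stop writing, pour the tea."))
emitter.emit("tea_ready")   # the task ends when the event fires, not when the clock says so
```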

In real life too, you can create and trigger your own events — for example continuing to do some task until some event occurs. In an experiment I ran with a colleague, one participant carried this to an absurd extreme, using different sounds she heard to dictate different types of activities to engage in.

In addition to discrete events, computers use states — for example repeating some task while the mouse is being pressed. What state are you in now? Can you think of different states you go through during a day? These are similar to the chronobiological phases a body passes through daily, which we detail here. (Side note: the mystic Emanuel Swedenborg said that Hell is not a place but a state.)
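
A toy sketch of the idea, with phase names and transitions that are purely illustrative rather than any real chronobiology:

```python
# A day described as states rather than clock times: each state hands over to the
# next when it ends, whenever that happens to be.
transitions = {
    "resting": "focused",         # wake and settle into deep work
    "focused": "social",          # when the work reaches a natural stopping point
    "social": "winding_down",     # when the company disperses
    "winding_down": "resting",    # when you are tired, not when the clock says so
}

state = "resting"
for _ in range(5):
    print(state)
    state = transitions[state]
```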

Again, the goal is not increased personal productivity, but mental and physical wellbeing. We need to question our accepted notions of progress and productivity — who is pushing these and why?

Image: valentinsimon0

Life is a party

In 1984, the sole political party constantly rewrites history through its Ministry of Truth, erasing inconvenient facts and altering records. “Who controls the past controls the future: who controls the present controls the past” is the central mechanism of power. By destroying historical records and manipulating collective memory, the Party eliminates any basis for questioning current conditions or imagining alternative futures.

AI is not only following clock time in becoming embedded in human societies; it is also playing with collective memory through its training data. It’s already being employed to manipulate elections with fake generated content, so it’s easy to imagine the sort of manipulation depicted in 1984.

But another sci fi story offers hope. The society in Harlan Ellison’s award-winning 1965 short story “’Repent, Harlequin!’ Said the Ticktockman” is not far off Orwell’s, but is focused on punctuality and efficiency. It’s ruled by the Ticktockman, who enforces strict schedules and punishes tardiness by reducing people’s lifespans — literally stealing time from their lives.

The Harlequin of the title disrupts this rigid system through acts of creative chaos, like dumping jelly beans onto one of the ubiquitous moving sidewalks. Like Orwell’s protagonist, the Harlequin is eventually captured and subjected to psychological reconditioning. But the Ticktockman arrives late to the Harlequin’s execution, unable to escape human fallibility himself.

Could you imagine Harlequin-esque disruptions of clock time today? Of AI?

You can’t remove the clock from your phone display, but someone I met recently came up with an ingenious workaround: changing the language of the digital clock to Arabic, a language she doesn’t read — beautiful but mysterious. Even when the clock strikes 13.
