
How Does the Brain Process Time?

Leonard Kelley holds a bachelor's in physics with a minor in mathematics. He loves the academic world and strives to constantly explore it.

The funny thing about memory is that it involves recalling the past in our present to help anticipate the future. Dean Buonomano has given the brain the title of a time machine because of this, just not necessarily in the conventional sense. In fact, pinning down the behavior of memory is tricky for this very reason: it is a feature of temporal data. And that, folks, gets very tricky very quickly.

Basic Questions

Somehow, our brain takes our cumulative experiences so far, and when they happened, builds a mental model of how the world works, and can then predict how far from any given moment an event is likely to happen. Somehow, our brain develops “the sense of time” from hearing, seeing, feeling, tasting, and smelling our surroundings and not from “a form of energy or a fundamental property of matter that can be detected.” Somehow, our brain allows for reflection and prediction based upon all of this. Somehow, however, isn’t sufficient for us, is it? (Buonomano 20-2)

Really a Day?

How good is your internal clock? Our estimation of time's passage fluctuates greatly depending on our situation. We have all had time stretch when bored or fly by when having fun. One aspect we frequently mess up is our circadian rhythm, roughly our sleep schedule. A 24-hour day, with about 8 hours of rest and 16 hours of wakefulness, was found to be convenient by humans but really has no other basis for being the length of a day, and animals are not as tuned to it as you would think (35-7).

Sure, we have a fundamental sense of a day based on light and darkness, but our sleep/wake cycle isn’t as clear-cut as that. In fact, nocturnal animals usually have a circadian cycle of less than 24 hours while diurnal animals have one of more than 24 hours. So, this may lead one to wonder: how will an animal cycle without the alternation of day and night? (Ibid)

Thus arose isolation experiments, which aimed to see how circadian cycles were impacted without those external cues. One of the most famous of these experiments was conducted in 1972 by Michel Siffre, who stayed in a cave for 6 months. Now that is dedication to science, people. He was able to communicate with the outside world via phone but had no external sources of light to indicate what time of day it was (37-9).

And to no one’s surprise, once removed from normal cycling, the human mind goes through depression and forgetfulness. Michel’s circadian cycle went nuts, too. He eventually ended up in a 48-hour cycle, with 16 straight hours asleep and 32 straight hours awake. After 6 months (really, 179 days), Michel exited the cave and reported that only 151 days had passed. He had lost nearly a month of his life, from his vantage point (Ibid).

Does this just happen to humans? Well, an experiment with rats points to no, and in fact revealed the sleep center of the brain. Called the suprachiasmatic nucleus, this sleep center has only about 10,000 neurons and is located below the hypothalamus and above the optic chiasm. In experiments from the 1980s, rats with altered centers were found to have a free-running circadian cycle, like Michel (40-1).

But one bonus of this brain structure is that it is easily removable and transplantable (in fact, it’s one of the few places within the brain that operates like this, mainly due to its compartmentalized nature). So scientists took the altered centers and transplanted them into normal rats, which quickly developed the sleep habits of the donor rat (Ibid).

And if that wasn’t weird enough, there is evidence that individual cells can have free-running circadian cycles! Look at the oscillations of incoming and outgoing proteins, and they do form a cycle. How long? If temperature and fluidity are kept constant, then…24 hours. Whoa, it seems cells didn’t get the message about the arbitrariness of the day's length! (42)

So time perception seems to be broad, with complex life forms like us and single cells demonstrating different methods for tracking it. In fact, that hints that the suprachiasmatic nucleus is not in charge of all our time perception. The isolation experiments also point to this. People were asked to report when they felt 10-120 seconds, 16 hours, and 44 hours had passed by (52-4).

The results showed that during the 44-hour window people scaled 1 of their hours to 3.5 actual hours, a 1:3.5 ratio; the 16-hour window had a 1:2 ratio; and the 10-120 second window had a 1:1 ratio. Smaller time intervals were pretty accurate, but long time spans were not. Different pieces of the brain govern different time scales of perception, it seems (Ibid).
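These scaling ratios are just arithmetic, and a quick sketch makes the distortion concrete. The helpers below are purely illustrative (the function names are mine, not Buonomano's):

```python
def felt_hours(actual_hours, scale):
    """Subjective hours experienced when each felt hour spans
    `scale` actual hours (the ratios from the isolation studies)."""
    return actual_hours / scale

def subjective_ratio(reported, actual):
    """Ratio of subjectively reported time to actual elapsed time."""
    return reported / actual

print(round(felt_hours(44, 3.5), 1))        # 44 real hours feel like ~12.6
print(round(felt_hours(16, 2.0), 1))        # 16 real hours feel like ~8.0
print(round(subjective_ratio(151, 179), 2)) # Siffre's cave stay: ~0.84
```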


The brain is many things, but a clock it is not. We use external references with some periodic motion to define the passage of time, as the isolation experiments demonstrated rather well. But events on the millisecond-to-second range are often too quick to notice, and yet we do react within that range.


One model that accounts for this is the internal clock model, where certain neurons fire at a specific frequency and are counted by other neurons, and that count gives a beat for the timing of a microevent. Neurons certainly do oscillate, but count? Not so much, as we now know (101-2).
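For the record, the internal clock model is easy to state in code. This is a toy pacemaker-accumulator sketch (the 50 Hz pacemaker rate is an arbitrary assumption of mine, not a figure from the book): ticks from a fixed-frequency oscillator are counted, and the count itself is the time estimate.

```python
def pacemaker_estimate(elapsed_s, pacemaker_hz=50.0):
    """Internal clock model sketch: count ticks from a fixed-frequency
    pacemaker, then read the count back as a duration. The flaw noted
    above: real neurons oscillate, but nothing appears to count."""
    ticks = int(elapsed_s * pacemaker_hz)   # the accumulator
    return ticks / pacemaker_hz             # duration implied by the count

print(pacemaker_estimate(1.5))    # 1.5
print(pacemaker_estimate(0.732))  # 0.72 (quantized to the tick rate)
```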

So it is time (probably not the only pun you will find here) to delve into the brain and see what’s going on. And trust me, there's plenty. The brain is composed of roughly 100 billion neurons and hundreds of trillions of synapses. Neurons send and receive electrical signals, and synapses are the junctions between neurons where the transference occurs (27-9).

To clarify a bit, we have a presynaptic neuron, which sends the signal, and the postsynaptic neuron, which receives it. We gauge the strength of a synapse by how much the presynaptic neuron impacts the postsynaptic neuron, and neural circuitry is plastic: it can adapt and change the strength of its synapses (Ibid).

Signs point to short-term synaptic plasticity (STSP) as a more likely model than the internal clock. In STSP, changes to synaptic strength can be gauged based on their usage. And the time range of these changes? Why, over the desired millisecond-to-second range, as “short-term fluctuations” increase or decrease synaptic firing. The amount of that change depends on the time interval between the synaptic firings, with the maximum change usually occurring within 100 milliseconds post-firing and subsequent peaks decreasing much like ripples on a pond (106-7).

Buonomano contends that it is through STSP that the brain's time management over the millisecond range is achieved. To demonstrate, he considers a 2-neuron circuit (for simplicity's sake). The presynaptic neuron fires at 50-, 100-, or 200-millisecond intervals between spikes. The first response will be a constant 1 millivolt, but the second will be altered by STSP. If the postsynaptic neuron only fires for a certain spike value, then we could have a timer of sorts (based on the time interval required for the STSP to yield the spike we desire). There is evidence that simple circuits of excitatory and inhibitory neurons with STSP behavior have this interval selectivity, but how a neuron selects this sensitivity is still a mystery (107-9).
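To make the idea concrete, here is a minimal Python sketch of that two-neuron timer. All constants (a 60% resource depletion per spike, a 120 ms recovery time, and the firing band) are illustrative assumptions of mine, not fitted values from the book:

```python
import math

def second_epsp(dt_ms, base_mv=1.0, depletion=0.6, tau_ms=120.0):
    """Amplitude (mV) of the 2nd response after an inter-spike interval dt_ms.

    Short-term depression: the first spike uses up a fraction `depletion`
    of the synaptic resource, which then recovers exponentially with tau_ms."""
    recovered = 1.0 - depletion * math.exp(-dt_ms / tau_ms)
    return base_mv * recovered

def interval_detector(dt_ms, lo_mv=0.65, hi_mv=0.80):
    """The postsynaptic neuron 'fires' only when the 2nd response lands in
    a band, which makes the pair selective for a range of intervals."""
    return lo_mv <= second_epsp(dt_ms) <= hi_mv

for dt in (50, 100, 200):
    print(dt, round(second_epsp(dt), 3), interval_detector(dt))
# 50  0.604 False
# 100 0.739 True
# 200 0.887 False
```

With these numbers the pair responds only to the roughly 100 ms interval; changing the depletion or recovery constants shifts the preferred interval, which suggests one way such circuits could tile the millisecond range.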

Of course, a two-neuron network just isn’t realistic. One cubic millimeter of brain contains about 100,000 neurons and hundreds of millions of synapses. How these circuits process space and time is a hot topic, especially when considering the potential STSP implications. One possible theory, developed by Buonomano and Wolfgang Maass, is state-dependent networks. Its overall idea is physics simplified: if you know the state of a system at a certain point, then you can backtrack from it and predict the system going forward. How could we do this with neuron circuits? (109-12)

Well, one has to map out the firing phase of a neuron network at several time intervals and note the strength at 50, 100, and 200 millisecond intervals. Then, one can note any STSP changes that occur and so build a model that can be backtracked based upon a later firing. That is why it is state-dependent, for it relies on the prior firings to build a history of behavior. It is likely that the brain tracks subpopulations of neurons and notes the state-dependent network firing at a given moment to gauge the timing (Ibid).
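Here is a drastically scaled-down sketch of the state-dependent idea, with assumed recovery time constants (80, 150, and 300 ms) that are mine for illustration: the recovered synaptic resource across a few synapses forms a "state" unique to the elapsed interval, and a nearest-neighbor readout can decode time from that state alone.

```python
import math

def network_state(dt_ms, taus_ms=(80.0, 150.0, 300.0), depletion=0.6):
    """Hypothetical 'state' of a tiny network dt_ms after a spike: the
    recovered resource at three synapses with different recovery time
    constants. Each interval leaves a distinct fingerprint."""
    return tuple(1.0 - depletion * math.exp(-dt_ms / tau) for tau in taus_ms)

def decode_interval(state, candidates=(50, 100, 200)):
    """Readout: pick the candidate interval whose predicted state is
    closest (squared distance) to the observed one."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(candidates, key=lambda dt: dist(network_state(dt), state))

for dt in (50, 100, 200):
    print(dt, "->", decode_interval(network_state(dt)))
```

The point of the sketch is that no neuron counts anything: elapsed time is implicit in the network's state, just as the model proposes.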

Now, this is all fine and dandy for really small time scales, but what about your perceived notions of the past, present, and future? Surely the neurocircuitry governing when you perceive something as happening has too much complexity to fully understand. Well, perhaps we are beginning to get a handle on how the brain does this. Work by Marc Howard and Karthik Shankar developed a mathematical model that accurately captures how the mind processes the past, and it is through indirect methods. The brain doesn’t keep a temporal data section like it does for other pieces of information (Cepelewicz).

Instead, a Laplace transform takes the temporal data and converts it into something else that the brain can then interpret later using an inverse Laplace transform, essentially arriving back at its initial state. Evidence for this has sprouted in what are known as time cells: neurons that always fire a certain time span after a stimulus has been received, regardless of its kind. If you know which cells fired, then you can determine the correlating stimulus (Ibid).
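The flavor of the Howard-Shankar scheme can be sketched with a bank of leaky integrators, each a Laplace-domain channel with its own decay rate, and a crude inverse built from differences between channels. The decay rates below are arbitrary choices of mine; the point is only that each paired difference peaks at its own delay after a stimulus, like a time cell.

```python
import math

def leaky_trace(s, t):
    """Laplace-domain memory channel: response at time t to an impulse
    stimulus at t = 0, for a unit with decay rate s (in 1/s)."""
    return math.exp(-s * t)

def time_cell(t, s_slow, s_fast):
    """A crude inverse-transform readout: the difference of two channels,
    e^(-s_slow*t) - e^(-s_fast*t), peaks at a characteristic delay,
    mimicking a 'time cell'."""
    return leaky_trace(s_slow, t) - leaky_trace(s_fast, t)

def peak_time(s_slow, s_fast):
    """Analytic peak of the difference of exponentials:
    t* = ln(s_fast/s_slow) / (s_fast - s_slow)."""
    return math.log(s_fast / s_slow) / (s_fast - s_slow)

# Three hypothetical cells tuned to progressively later moments:
for s_slow, s_fast in ((4.0, 8.0), (2.0, 4.0), (1.0, 2.0)):
    print(round(peak_time(s_slow, s_fast), 3), "s")  # 0.173, 0.347, 0.693
```

Reading off which "cell" is most active thus tells you how long ago the stimulus arrived, which is the sense in which the Laplace representation can be inverted back into a timeline.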

Further evidence for this modeling came from work by Albert Tsao, who was researching the lateral entorhinal cortex, the companion structure to the medial entorhinal cortex which is used for spatial navigation. Both of these feed into the hippocampus, which is the center for episodic memory, but did the lateral entorhinal cortex really have time information encoded into it? (Ibid)

An experiment was set up with rats looking for food in a box, being removed for a few minutes, then placed back in. This was done 12 times over about 90 minutes, with brain activity recorded at that cortex and the surrounding areas throughout. Each time a rat was placed back into the box (whose walls alternated between black and white), that cortex fired off, and since the environment was different, the rat was remembering the timing of the last event (Ibid).

Based on these cells firing and the increased/decreased activity of other nearby areas, a model started to emerge showing the brain’s ability to develop a timeline of events. When the rats were then put onto a figure-8 racetrack and ran through it in different directions, the lateral entorhinal cortex fired the same, indicating the rats couldn’t tell the trials apart, but the neuron activity around the cortex did seem to note the passage of events during the race. The pattern seen fit the Laplace transforms developed by Howard and Shankar (Ibid).

Learn from the Past, Prepare for the Future

It may be surprising to learn that different pieces of the brain govern our prospective and retrospective thinking, and yet they are frequently used in tandem when weighing potential behavior in the present. Unless you use some external process to record the timing, our prospective timing is dependent on our retrospection, a.k.a. memories, which is troubling because of their fluid nature (Buonomano 59-60).

As anyone who has been excited can tell you, perception seems to accelerate, while boredom takes forever to pass. Therefore, it’s safe to say that the level of interest in the activity we are participating in is inversely related to our awareness of the time passing by. And the transition from prospective to retrospective timing seems to take only the highlights of the events and condense their time value (Ibid).

But where is some science to back this up? Can we actually note these transitions, or even their individual behavior? Time scaling does seem related to the activities we do, but really this is code for the cognitive load we have at a given moment. So fluctuate that and start taking some brain scans.

One such experiment is done with a deck of cards. A task with little thinking required would be turning the cards face up, while a higher-thinking task would be sorting the cards into piles by suit. People were then asked to estimate the length of the task they had just performed. Those who knew beforehand that they would be asked had to be prospective in their thinking: the lower-load people averaged 53 seconds while the higher-load people averaged 31 seconds (in self-reporting). But people who were not told beforehand averaged roughly the same amount of time (33 seconds versus 28 seconds). So, when aware of the timing, we tend to induce dilating effects on our subjective time. Interest does impact our timing (61-2).

Another experiment verifying this really demonstrates the power of perception over our supposed timing of reality. In a study conducted by Virginie van Wassenhove, a static circle was shown to people for 500 milliseconds, then again for either 450 milliseconds or 550 milliseconds. People were then asked to indicate whether the second circle was shown for a shorter or longer time, and most people's answers accurately correlated with the actual time frame. But then the experimenters made the second circle grow in size while keeping the same time lengths, and people overwhelmingly said the second circle was longer in duration, no matter what (62-3).

Of course, all of this was just a glimpse into the timing mechanism of the brain. More awaits us but at least these early steps will point to a promising future of time comprehension in terms of neuromechanics.

Works Cited

Buonomano, Dean. Your Brain Is a Time Machine. W.W. Norton & Company, New York, NY. 2017. Print. 20-2, 27-9, 35-42, 52-4, 59-63, 101-2, 106-12.

Cepelewicz, Jordana. “How the Brain Creates a Timeline of the Past.” Quanta, 12 Feb. 2019. Web. 09 Mar. 2021.

This content is accurate and true to the best of the author’s knowledge and is not meant to substitute for formal and individualized advice from a qualified professional.

© 2022 Leonard Kelley
