Multi-tasking or parallel processing? Operating Systems versus processing in the brain.

Whether to multi-task or to process tasks in parallel is a well-researched topic in computer science, tested and implemented in operating systems. When the early (and several successive) Microsoft Windows versions were released, it seemed as if you could work on different programmes at the same time. But these processes were not actually handled by the computer at the same time – in fact, the sub-processes were interleaved and got access to the processor in successive turns until the processes were finished. Multi-tasking, but not all tasks at the very same time. Or take the background (“&”) processes in UNIX, with or without a priority level set for carrying out the process – same case. Plug in more processors, be it in one computer or distributed over multiple computers, and the computer(s) can process the tasks truly in parallel (a small sketch of the difference follows below the quote). These solutions were invented without actually knowing how nature does it in the human brain, which, had it been known, could have been used to emulate it in the computer. For, as Patch has written about Masaccio, and Da Vinci some two centuries before him – coincidentally also the slogan of the systems biology-focussed Microsoft Research–University of Trento centre in Italy:

“Those who took inspiration from anything other than nature, master of masters, were labouring in vain.”
“Quegli che pigliavano per altore altro che la natura, maestra de’ maestri, s’affaticavano invano.” (the original sentence from Da Vinci’s Trattato della pittura, ca. 1500)
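
Back to the operating system for a moment: here is a minimal sketch – in Python, with made-up task names and durations, purely illustrative and not how Windows or UNIX actually schedule anything – of interleaved multi-tasking on one processor versus truly parallel execution on several:

import multiprocessing

def run_interleaved(tasks, quantum=1):
    # Round-robin: each task gets the single processor in successive turns.
    queue = list(tasks)
    while queue:
        name, remaining = queue.pop(0)
        step = min(quantum, remaining)
        print(f"{name}: runs for {step} time unit(s)")
        if remaining - step > 0:
            queue.append((name, remaining - step))  # back of the queue

def run_to_completion(task):
    name, duration = task
    return f"{name}: finished after {duration} time unit(s)"

if __name__ == "__main__":
    tasks = [("compile", 3), ("print_job", 2), ("backup", 1)]  # hypothetical tasks
    run_interleaved(tasks)  # one processor, the tasks take turns
    with multiprocessing.Pool(processes=3) as pool:
        # several processors: the tasks really do run at the same time
        print(pool.map(run_to_completion, tasks))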

So, how does the brain cope with processing multiple tasks? And is that anything like currently done in operating systems? Did computer scientists ‘reinvent’ the wheel, or can they learn something from task processing strategies the brain employs? 

This month’s PLoS Biology contains an article about process handling in the brain. Biology, cognitive science, neuroscience, psychology and all that, and not one of the 92 references refers to a computer science publication. It is fascinating nevertheless.

Sigman and Dehaene [1] conducted experiments where the subjects had to perceive stimuli – numbers and tones – and decide whether the number presented was larger or smaller than 45, and similarly for the frequency of the tones. The order of and interval between the stimuli were randomised, and the researchers then looked at the speed of and delays in the subjects’ responses in performing the tasks. For instance, if the interval between the stimuli is short but the time to complete the two tasks equals the sum of the times needed when each task is performed independently, then serial processing is going on in the brain; if it is shorter than that sum, some parallel processing is going on as well; if it is longer, then dual-task interference and management overhead are to blame.
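
That inference fits in a few lines of code; a rough sketch (the function name, tolerance, and example numbers are mine, not the authors’):

def processing_mode(rt_combined, rt_task1_alone, rt_task2_alone, tolerance=0.05):
    # Compare the time to complete both tasks together with the sum of the
    # times each task takes on its own (the purely serial prediction).
    serial_prediction = rt_task1_alone + rt_task2_alone
    if rt_combined < serial_prediction * (1 - tolerance):
        return "some parallel processing"
    if rt_combined > serial_prediction * (1 + tolerance):
        return "dual-task interference / management overhead"
    return "serial processing"

print(processing_mode(1.1, 0.6, 0.5))  # 0.6 s + 0.5 s alone, 1.1 s combined -> "serial processing"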

Their main conclusions are that, in addition to a “central bottleneck” (i.e., tasks are, roughly but not exclusively, executed on a first-come, first-served basis), there is an active process of task setting; hence, a “central executive” that manages the whole thing. This central executive has four distinct architectural properties: it collects information from different modules, it cannot proceed in parallel with other processes of the same type, it is sustained for a mere few hundred milliseconds, and it is highly stochastic. An operating system certainly is not stochastic.
Anyway, concerning the brain: when you are faced with having to perform multiple tasks, you first plan the best or preferred sequence for executing the tasks, and only then carry out the processes – akin to the adage ‘think before you act’. This planning-and-thinking management component for dual-task processing seems to involve three successive central stochastic decision processes: 1) task choice, 2) selection of the first response, and 3) selection of the second response. There is also such a thing as task disengaging, whereby the execution plan set for the first task is suppressed in order to move on to the next one. Overall, there is an interaction between bottom-up task processing based on the input stimuli and top-down decisions from the brain’s management centre that determines what is done first.
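
To get a feel for how such a central bottleneck plays out over time, here is a toy simulation – my own crude simplification, not the authors’ model – in which each task has a perceptual, a central, and a motor stage with stochastic durations; perception of the second stimulus can overlap with the first task, but the central stage handles only one task at a time:

import random

def stage():
    return random.gauss(200, 40)  # stochastic stage duration in ms (arbitrary numbers)

def dual_task(soa):
    # Two tasks whose stimuli arrive `soa` ms apart; the central stage is the bottleneck.
    p1, c1, m1 = stage(), stage(), stage()
    p2, c2, m2 = stage(), stage(), stage()
    rt1 = p1 + c1 + m1                      # task 1 passes straight through
    central_free = p1 + c1                  # when the bottleneck is released
    start_c2 = max(soa + p2, central_free)  # task 2's central stage may have to wait
    rt2 = start_c2 + c2 + m2 - soa          # response time measured from stimulus 2
    return rt1, rt2

for soa in (0, 300, 900):
    rt2s = [dual_task(soa)[1] for _ in range(10000)]
    print(f"SOA {soa:>3} ms: mean RT to task 2 ~ {sum(rt2s) / len(rt2s):.0f} ms")

At short intervals the second response comes out slower, simply because its central stage has to queue behind the first task – the kind of pattern the dual-task experiments exploit.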

Some fun facts from the results obtained:
– Response times are slower if the subjects are less certain about which task is presented first and when it is presented. Unpredictability slows down your actions.
– Responding to the second task does not start as soon as its stimulus is perceived, but waits to be unlocked by the process of responding to the first task.
– There may be two bottlenecks in the brain system: response selection and response initiation. That is, task setting and task disengagement – management overhead takes its time.
– Some caution: processing certain stages of tasks serially or in parallel may be a matter of experience. Try to teach that to a computer. 

With two competing processes that want access to the central executive, the winner is not fully determined at the time the stimulus is presented, but the process that can be performed the quickest has an advantage over the process that requires more resources. It is like a sad printer-queue management system where printing small documents always takes precedence over printing large files from computers that have a shoddy connection to the print server. Well, more precisely, it is not necessarily the complexity or the total duration of the task to be carried out that matters; the duration of perceptual processing counts too. Many more details can be found in the article and its references.
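
The printer-queue analogy is essentially a shortest-job-first policy; a small illustrative sketch (job names and sizes made up), bearing in mind that in the brain’s version it is the length of the perceptual stage rather than the total job size that decides the race:

import heapq

def sad_print_queue(jobs):
    # Shortest job first: the document that can be handled quickest always
    # wins, regardless of when it was submitted.
    heap = [(pages, name) for name, pages in jobs]
    heapq.heapify(heap)
    while heap:
        pages, name = heapq.heappop(heap)
        print(f"printing {name} ({pages} pages)")

sad_print_queue([("annual_report", 120), ("memo", 2), ("thesis_draft", 300)])
# memo first, then annual_report, then thesis_draft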

How they would investigate the possibility of – or rule out with certainty – interleaving subtasks (of the ‘remainder’ of the task to be performed after stimulus processing) of the two main tasks, I do not know, but that also depends on how broadly one defines the tasks. It would also be nice to know the percentage of time spent on task planning when an increasing number of tasks have to be carried out simultaneously. Can it get ‘stuck’ in the planning process? When does the central management module get overloaded? How large is the effect of learning on task processing, and can one achieve more parallel processing thanks to the learning factor? Which processes and tasks are more amenable to being processed in parallel, and which types can only be done serially?

Either way, the brain’s process handling seems to be a little more complex, taking more variables into account than the multi-tasking and multi-processing of an operating system. On the other hand, with computers one can plug in more processors for parallel computing, which is a no-no for the brain. But parallel processors still need central process management too.
So, the brain does a bit of both serial and parallel processing. That would be interesting for a computer process-management system: maybe finer-grained distinctions between the sub-processes can allow for further optimisation of process execution?

[1] Sigman M, Dehaene S (2006) Dynamics of the Central Bottleneck: Dual-Task and Task Uncertainty. PLoS Biology 4(7): e220.