What is the meaning and importance of vigilance?

Vigilance is a concept that holds great significance in our daily lives, yet it is often overlooked or misunderstood. It refers to the state of being watchful and alert, especially with regard to potential dangers or risks. In today’s fast-paced and ever-changing world, the need for vigilance has become even more crucial. From ensuring our personal safety to safeguarding our communities and nations, being vigilant plays a vital role in maintaining security and stability. In this essay, we will delve into the meaning and importance of vigilance, and how it impacts various aspects of our lives.

Meerkat Keeping Watch


In modern psychology, vigilance, also termed sustained concentration, is defined as the ability to maintain concentrated attention over prolonged periods of time. During this time, the person attempts to detect the appearance of a particular target stimulus. The individual watches for a signal stimulus that may occur at an unknown time. The study of vigilance has expanded since the 1940s mainly due to the increased interaction of people with machines for applications involving monitoring and detection of rare events and weak signals. Such applications include air traffic control, inspection and quality control, automated navigation, military and border surveillance, and lifeguarding.


Origins of Research

The systematic study of vigilance was initiated by Norman Mackworth during World War II. His 1948 paper, “The breakdown of vigilance during prolonged visual search”, is the seminal publication on vigilance. The study investigated the tendency of radar and sonar operators to miss rare, irregular events near the end of their watch. Mackworth simulated such events by having test participants watch an unmarked clock face over a 2-hour period. A single clock hand moved in small, equal increments around the clock face, with the exception of occasional larger jumps. This device became known as the Mackworth Clock. Participants were tasked with reporting when they detected the larger jumps. Mackworth’s results indicated a decline in signal detection over time, known as a vigilance decrement. The participants’ event detection declined by 10 to 15 percent in the first 30 minutes and then continued to decline more gradually over the remaining 90 minutes. Mackworth’s method became known as the “Clock Test” and has been employed in subsequent investigations.
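As a rough illustration of the Clock Test’s structure, the sketch below generates a tick sequence in which a simulated clock hand usually advances one step but occasionally makes a double jump that the observer must report. The tick count and jump probability here are illustrative assumptions, not Mackworth’s actual parameters.

```python
import random

def mackworth_clock_ticks(n_ticks=7200, jump_prob=0.005, seed=42):
    """Generate hand movements for a simplified Mackworth Clock run.

    Each tick the hand advances 1 step; rarely (with probability
    jump_prob) it advances 2 steps -- the target event to be reported.
    All parameter values are illustrative, not Mackworth's.
    """
    rng = random.Random(seed)
    return [2 if rng.random() < jump_prob else 1 for _ in range(n_ticks)]

ticks = mackworth_clock_ticks()
# 'targets' holds the tick indices of the rare double jumps an
# observer would have to detect over the simulated 2-hour watch.
targets = [i for i, step in enumerate(ticks) if step == 2]
```

Pairing such a schedule with recorded observer responses is what lets later analyses separate hits from misses over the course of the watch.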


Vigilance Decrement

Vigilance decrement is defined as “deterioration in the ability to remain vigilant for critical signals with time, as indicated by a decline in the rate of the correct detection of signals”. Vigilance decrement is most commonly associated with monitoring to detect a weak target signal. Detection performance loss is less likely to occur in cases where the target signal exhibits a high saliency. For example, a radar operator would be unlikely to miss a rare target at the end of a watch if it were a large bright flashing signal, but might miss a small dim signal.

Under most conditions, vigilance decrement becomes significant within the first 15 minutes of attention, but a decline in detection performance can occur more quickly if the task demand conditions are high. This occurs in both experienced and novice task performers. Vigilance had traditionally been associated with low cognitive demand and vigilance decrement with a decline in arousal pursuant to the low cognitive demand, but these views are no longer widely held. More recent studies indicate that vigilance is hard work, requiring the allocation of significant cognitive resources, and inducing significant levels of stress.


Vigilance Decrement and Signal Detection Theory

Green and Swets formulated Signal Detection Theory, or SDT, in 1966 to characterize detection task performance while accounting for both the observer’s perceptual sensitivity and willingness to respond. SDT assumes an active observer making perceptual judgments under varying conditions of uncertainty. A decision maker can shift their response criterion to allow more correct detections, but at the cost of more false alarms; this is termed a criterion shift. The degree to which the observer tolerates false alarms to achieve a higher rate of detection is termed the bias. Bias represents a strategy to balance the consequences of missed targets against those of false alarms. As an example, the lookout during a bank robbery must set a threshold for how “cop-like” an approaching individual or vehicle may be. Failing to detect the “cop” in a timely fashion may result in jail time, but a false alarm means a lost opportunity to steal money. Sensitivity, characterized by d’, provides a bias-free measure: it is the distance between the means of the signal and non-signal (noise) distributions, scaled by the standard deviation of the noise. Mathematically, d’ is computed by subtracting the z-score of the false alarm rate from the z-score of the hit rate. Application of SDT to the study of vigilance indicates that in most, but not all, cases the vigilance decrement is not the result of a reduction in sensitivity over time. In most cases a reduction in detections is accompanied by a commensurate reduction in false alarms, such that d’ is relatively unchanged.
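The d’ calculation described above can be sketched in a few lines of Python; the hit and false-alarm rates below are made-up numbers chosen only to illustrate a criterion shift, not data from any study.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity index: z(hit rate) minus z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # z-score via the inverse standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Early in the watch: many detections, but also more false alarms.
early = d_prime(0.90, 0.10)   # ~2.56

# Late in the watch: fewer detections and proportionally fewer false
# alarms, so sensitivity is nearly unchanged.
late = d_prime(0.80, 0.05)    # ~2.49
```

Because the hit rate and the false-alarm rate fall together, d’ stays roughly constant; that pattern is the signature of a criterion shift rather than a loss of sensitivity.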


Vigilance Taxonomy: Discrimination Type and Event Rate

Mental workload, or cognitive load, based on task differences can significantly affect the degree of vigilance decrement. In 1977, Parasuraman and Davies investigated the effect of two task variables on d’, and proposed a vigilance taxonomy based on discrimination type and event rate. They employed discrimination tasks that were either successive or simultaneous, presented at both high and low event rates. Successive discrimination tasks, where critical information must be retained in working memory, generate a greater mental workload than simultaneous comparison tasks. Their results indicate that the type of discrimination and the rate at which discriminable events occur interact to affect sustained attention. Successive discrimination tasks produce a greater vigilance decrement than simultaneous discriminations, such as comparisons, but only when event rates are relatively high. For detection tasks, empirical evidence suggests that an event rate at or above 24 events per minute significantly reduces sensitivity. Further investigation has indicated that when the discrimination task is difficult, a decrement can occur even when the mental workload is low, as with simultaneous comparisons, at both high and low event rates.

The effect of event rate on monitoring task performance can be affected by the addition of non-target salient objects at varying frequencies. Clock test research conducted in the late 1950s and 1960s indicates that an increase in event rate for rare irregular low salience signals reduced the vigilance decrement. When non-target “artificial” signals similar to target signals were introduced, the vigilance decrement was also reduced. When the “artificial” signal differed significantly from the target signal, no performance improvement was measured.

Other dimensions beyond event rate and discrimination task difficulty affect the performance of vigilance tasks and are factors in the Vigilance Taxonomy. These include but are not limited to: sensory modality, or combinations of sensory modalities; source complexity; signal duration; signal intensity; multiple signal sources; discrete versus continuous events; intermittent versus continuous attention requirement; observer skill level; and stimulation value.


Measuring Mental Workload During Vigilance Tasks

Initial Vigilance Taxonomy studies relied on assumptions regarding the mental workload associated with discrimination tasks, rather than a direct quantification of that workload. Successive discriminations, for example, were assumed to impose a greater workload than simultaneous discriminations. Beginning in the late 1990s, neuroimaging techniques such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI) and transcranial Doppler sonography (TCD) have been employed to independently assess brain activation and mental workload during vigilance experiments. These neuroimaging techniques estimate brain activation by measuring the blood flow (fMRI and TCD) or glucose metabolism (PET) associated with specific brain regions. Research employing these techniques has linked increases in mental workload and allocation of attentional resources with increased activity in the prefrontal cortex. Studies employing PET, fMRI and TCD indicate that a decline in activity in the prefrontal cortex correlates with the vigilance decrement. Neuroimaging studies also indicate that the control of vigilance may reside in the right cerebral hemisphere in a variety of brain regions.


Associated Brain Regions

Reductions in arousal generally correspond to reductions in vigilance. Arousal is a component of vigilance, but it is not, as one might assume, the sole source of the vigilance decrement.

As such, subcortical brain regions associated with arousal play a critical role in the performance of vigilance tasks. Because the amygdala plays an important role in the recognition of emotional stimuli, it appears to be an important brain structure in the regulation of vigilance.

Subcortical brain regions associated with arousal include the basal forebrain cholinergic system and the locus coeruleus (LC) noradrenergic system. Both regions are components of the reticular activating system (RAS). The basal forebrain cholinergic system is associated with cortical acetylcholine release, which in turn is associated with cortical arousal. Blocking the release of acetylcholine in the forebrain with GABAergic compounds impairs vigilance performance.

Several cortical brain regions are associated with attention and vigilance. These include the right frontal, inferior parietal, prefrontal, superior temporal cortices and cingulate gyrus. In the frontal lobe, fMRI and TCD data indicate that brain activation increases during vigilance tasks with greater activation in the right hemisphere. Lesion and split brain studies indicate better right-brain performance on vigilance tasks, indicating an important role for the right frontal cortex in vigilance tasks. Activity in the LC noradrenergic system is associated with the alert waking state in animals through the release of noradrenaline. Chemically blocking the release of noradrenaline induces drowsiness and lapses in attention associated with a vigilance decrement. The dorsolateral prefrontal cortex exhibits a higher level of activation than other significantly active areas, indicating a key role in vigilance.

The cingulate gyrus differs from other brain regions associated with vigilance in that it exhibits less activation during vigilance tasks. The role of the cingulate gyrus in vigilance is unclear, but its proximity and connections to the corpus callosum, which regulates interhemispheric activity, may be significant. Reduced activation in the cingulate gyrus may be a by-product of asymmetrical frontal lobe activation initiated in the corpus callosum.


Vigilance and Stress

Stressful activities involve continuous application of extensive cognitive resources. If the vigilance decrement were the result of less brain activity rather than more, vigilance tasks could not be expected to be stressful. High levels of epinephrine and norepinephrine are correlated with continuous extensive mental workloads, making these compounds good chemical indicators of stress levels. Subjects performing vigilance tasks exhibit elevated levels of epinephrine and norepinephrine, consistent with high stress levels and indicative of a significant mental workload. Vigilance tasks may therefore be assumed to be stressful, hard mental work.


Individual Differences in Performance

Large individual differences in monitoring task performance have been reported in a number of vigilance studies. For a given task, the vigilance decrement between subjects is generally consistent over time, such that individuals exhibiting relatively high levels of performance on that task maintain that level over time. Across different tasks, however, individual performance differences are not consistent: any one individual’s performance may not correlate well from one task to another. An individual exhibiting no significant decrement while performing a counting monitoring task may exhibit a significant decrement during a clock test. Relative performance between subjects may also vary based on the nature of the task. For example, subjects whose task performance is well correlated for a successive task may exhibit a poor performance correlation for a simultaneous task. By contrast, subjects performing similar monitoring tasks, such as radar versus sonar target detection, can be expected to exhibit similar patterns of task performance.

Levine et al. propose that individual differences in task performance may be influenced by task demands. For example, some tasks may require rapid comparisons or “perceptual speed”, while others may require “flexibility of closure”, such as detection of some predefined object within a cluttered scene. Linking task performance differences to task demands is consistent with the Vigilance Taxonomy proposed by Parasuraman and Davies described above, and also supports the hypothesis that vigilance requires mental work, rather than being a passive activity.


Reducing the Vigilance Decrement with Amphetamines

Considerable research has been devoted to the reduction of the vigilance decrement. As noted above, the addition of non-target signals can improve task performance over time if the signals are similar to the target signals. Additionally, practice, performance feedback, amphetamines and rest are believed to moderate temporal performance decline without reducing sensitivity.

Beginning in the mid-1940s, research was conducted to determine whether amphetamines could reduce or counteract the vigilance decrement. In 1965, Jane Mackworth conducted clock test experiments in which half of 56 participants were given a strong amphetamine and half were given a placebo. In separate trials, Mackworth also provided performance feedback, either accurate or false. Mackworth analyzed detection and false alarm rates to determine d’, the measure of sensitivity. Participants dosed with amphetamine exhibited no increased sensitivity but did exhibit a highly significant reduction in vigilance decrement. In feedback trials, sensitivity increased while the performance decline was significantly reduced. In trials where both amphetamine and feedback were given, sensitivity increased and there was no significant vigilance decrement.


Practice and Sustained Attention

Training and practice significantly reduce the vigilance decrement, reduce the false alarm rate, and may improve sensitivity for many sustained attention tasks. Changes in strategy or bias may improve task performance. Improvements based on such a criterion shift would be expected to occur early in the training process. Experiments involving both audio and visual stimuli indicate the expected training performance improvement within the first five to ten hours of practice or less.

Training improvements may also occur due to the reduced mental workload associated with task automaticity. In pilotage and airport security screening experiments, trained or expert subjects exhibit better detection of low salience targets, a reduction in false alarms, improved sensitivity, and a significantly reduced vigilance decrement. In some cases the vigilance decrement was eliminated or not apparent.


Effects of Aging

Vigilance research conducted with subjects across a range of ages conflicts regarding the ability to maintain alertness and sustained attention with age. In 1991, Parasuraman and Giambra reported a trend towards lower detection rates and higher false alarm rates with age when comparing groups aged 19 to 27, 40 to 55, and 70 to 80 years. Deaton and Parasuraman reported in 1993 that beyond the age of 40, a trend towards lower detection rates and higher false alarm rates occurs in both cognitive tasks and sensory tasks, with higher and lower mental workloads respectively. Berardi, Parasuraman and Haxby reported in 2001 no differences in the overall levels of vigilance and the ability to sustain attention over time when comparing middle-aged (over 40) and younger subjects. Age-dependent differences in cognitive tasks may vary with task type and workload, and some differences in detection and false alarms may be due to reduced sensitivity of the sensory organs.


Lack of Habituation

Early theories of vigilance explained the reduction of electrophysiological activity over time associated with the vigilance decrement as a result of neural habituation. Habituation is the decrease in neural responsivity due to repeated stimulation. Under passive conditions, when no task is performed, participants exhibit attenuated N100 event-related potentials (ERPs) that indicate neural habituation, and it was assumed that habituation was also responsible for the vigilance decrement. More recent ERP studies indicate that when performance declines during a vigilance task, N100 amplitude is not diminished. These results indicate that the vigilance decrement is not the result of boredom or a reduction in neurological sensitivity.
