September 08

Car Talk

For many Americans—and southern Californians in particular—a good chunk of our lives occurs in the confines of a car above a tangle of highways (or side streets). Time spent in traffic is the pits, so it’s no wonder that drivers might dabble in multi-tasking. Driving itself involves coordinating many tasks, both perceptual (scanning traffic, checking mirrors) and motor (using turn signals, accelerating, braking, and so on). It may come as no surprise, then, that taking on further undertakings, like talking on the phone, can lead to sub-optimal driving performance [e.g., 1-2].

It may seem intuitive that multi-tasking would be harder than performing a single task, and that performance on any one task might suffer when another task is added (think rubbing your belly while patting your head). But it’s also true that some tasks interfere with one another more than others—for example, the act of simultaneously rubbing my belly and speaking out loud (e.g., exclaiming “MMMMMMM” after a tasty meal) has never posed a problem for me.

Thinking beyond belly rubbing, researchers have speculated that some types of language might be more likely to interfere with driving than others [3]. This brand of hypothesis comes from research suggesting that language comprehension—the process of deriving a meaningful understanding of written or spoken language—can in some cases rely on motor and perceptual representations in the brain. For example, when people process language about different types of motion, areas of the brain involved in planning or producing actual motion are at times recruited [4]. Moreover, when people process language involving specific types of perceptual content (e.g., language about things that are up, like the sky, vs. down, like a basement), they show specific interference when making perceptual judgments about images in the upper vs. lower parts of the visual field [5]. Bergen and colleagues hypothesized that because driving relies so heavily on visual perception and motor function, sentences containing motor and visual descriptions would be particularly likely to interfere with driving performance.


Example driving simulator. Source: Wikimedia Commons.



Using a fancy-schmancy driving simulator with realistic controls (steering, braking, and so on), Bergen and colleagues were able to study what happens when people listen to different types of content while driving, all in the controlled environment of a lab. In their study, participants had a 180-degree view of the road and were trained to follow a car at a steady distance, braking when the lead car braked, and so on. Meanwhile, they had to listen to four different types of language, each presented in a different block of the study, and respond by saying whether each sentence was true or false (in each condition, half were true). These included sentences describing motor events (‘To open a jar, you turn the lid counter-clockwise’); sentences containing visual descriptions (‘The Golden Gate bridge is shiny and silver’); sentences containing mostly abstract descriptions (‘There is a total of 100 senators in the U.S. Senate’); and control sentences instructing participants simply to ‘Say the word true’ (or false).

As a critical part of their design, the simulator was synchronized with the offset of the spoken sentences so that the lead car braked just as the sentence was concluding. The researchers then looked both at braking time (as a measure of potential interference with immediate attention) and following distance over the whole block (as a measure of higher-level cognitive processes related to driving). They predicted that if only general attention resources are shared between driving and language comprehension, then all kinds of language should disrupt driving equally (except for the control condition). In contrast, if motor- and/or vision-specific resources are tapped into during both language and driving, then driving would be disrupted only by these types of language (and not abstract language or the control condition).

Surprise! The researchers found evidence for both types of resources. On the immediate measure of braking time, people were equally impacted by all forms of language, other than the control. But on the measure of average following distance over the course of each block, distance was highest for the visual block, and slightly higher for the motor block compared to the abstract block.

The authors suggested a couple of possible interpretations of these results. For one, people might have some level of awareness that their resources (e.g., visual) are being taxed, and adjust their driving to be more conservative. Another possibility is that people aren’t necessarily adjusting to be more conservative, but that the additional task (processing language about visual content) interferes with the goal of driving forward.

One takeaway from this study is that while some types of language may indeed influence some aspects of driving more than others, all types of language impaired the immediate ability to react to traffic. Though listening to factoid sentences through a speaker isn’t exactly the same as being on the phone with someone, it approximates an experience in which a conversational partner isn’t present in the environment. Indeed, other research using driving simulators suggests that passengers, unlike phone conversation partners, can stay in tune with the driving situation: splitting attention between the road and a phone call impairs driving more than splitting attention between the road and a passenger, providing one reason why talking on the phone can be particularly dangerous [6].

Someday soon, we may all be glued to laptops in the back seat while our self-driving cars propel us toward our destination. But for now, limiting communication via technology in the car seems like the safest bet.


  1. Brown, I.D., Tickner, A.H., & Simmonds, D.C.V. (1969). Interference between concurrent tasks of driving and telephoning. Journal of Applied Psychology, 53(5), 419-424.
  2. Lamble, D., Kauranen, T., Laakso, M., & Summala, H. (1999). Cognitive load and detection thresholds in car following situations: safety implications for using mobile (cellular) telephones while driving. Accident Analysis and Prevention, 31, 617-623.
  3. Bergen, B., Medeiros-Ward, N., Wheeler, K., Drews, F., & Strayer, D. (2013). The crosstalk hypothesis: Why language interferes with driving. Journal of Experimental Psychology: General, 142(1), 119-130.
  4. Tettamanti, M., Buccino, G., Saccuman, M.C., Gallese, V., Danna, M., Scifo, P., Fazio, F., Rizzolatti, G., Cappa, S.F., & Perani, D. (2005). Listening to action-related sentences activates fronto-parietal motor circuits. Journal of Cognitive Neuroscience, 17(2), 273-281.
  5. Bergen, B.K., Lindsay, S., Matlock, T., & Narayanan, S. (2007). Spatial and linguistic aspects of visual imagery in sentence comprehension. Cognitive Science, 31, 733-764.
  6. Drews, F.A., Pasupathi, M., & Strayer, D.L. (2008). Passenger and cell phone conversations in simulated driving. Journal of Experimental Psychology: Applied, 14(4), 392-400.