Mindreading, Driving, and Limitations for Self-Driving Cars

Driving involves reading minds. You can’t simply treat other drivers as objects. If you want to be a safe and effective driver, you need skills to imagine what others are thinking and trying to do. You need to read their minds.

Mindreading and Theory of Mind

Driving through a city is not the same as navigating an obstacle course. When you drive, not only do many of the obstacles move, but many of them have their own goals and intentions. Other people are driving those cars, trucks, and buses. There are people walking or cycling through town. And maybe there’s a dog, cat, or some other animal crossing the road.

Figuring out what the other person is trying to do helps you predict and respond to their actions. Are they turning? Is that pedestrian going to cross the road? Is that child likely to run into the road?

Theory of mind is the cognitive and emotional skill that enables us to figure out what other people know, plan, and intend. Having a theory of mind allows you to recognize that other people have their own thoughts and emotions that are separate from yours. Developing a strong theory of mind capability lets you figure out and track what the other person is trying to do.

Young children gradually develop theory of mind skills. It’s why they have a hard time lying or playing hide and seek (see “Teach Your Children to Lie”). They don’t understand what others know, so playing games that involve deception is an impossible challenge. But even as adults, we can still be challenged to know and track what others are thinking and feeling (Bernstein et al., 2017). This is one of the constant challenges of communicating – what does that person know and what do I need to tell them? (See why we are sometimes like 3-year-olds.)

Mindreading and Driving

When we drive, we constantly use theory of mind. We use information about the other people around us to predict their movements. If we treat others merely as objects to be avoided, we will be less successful than if we are aware of who they are. Cars, trucks, cyclists, pedestrians, children, and animals behave differently (Hyman et al., 2014). We need to be able to track where each person is trying to go. We can then use that information to predict and navigate around them. I started thinking about theory of mind and driving because of a recent BlueSky post I saw from Olivia Guest. She noted that self-driving cars need, but don’t have, theory of mind skills.

Among adults, ease of using theory of mind predicts safe driving. People who score higher on theory of mind tasks are quicker at detecting driving hazards (Goodhew et al., 2025; Nori et al., 2024). The link between mindreading and driving probably reflects a better ability to predict what other road users are planning, and thus better anticipation.

Billiard Balls or People

What does this mean for self-driving cars? Everything hinges on how you understand the way that cars (and other things) move on the road. This would determine the way you’d build an AI system to navigate.

You could simply treat each object as a billiard ball. You’d model predictions of future movement based on location, speed, and acceleration. This probably works on freeways, at least most of the time.
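The billiard-ball approach can be sketched in a few lines: predict each object's future position purely from its current kinematics, with no model of goals or intentions. This is an illustrative sketch, not any production system; the function name and scenario values are my own assumptions.

```python
# "Billiard ball" prediction: extrapolate future position from position,
# velocity, and acceleration alone. No goals, no intentions, no minds.

def predict_position(pos, vel, acc, t):
    """Constant-acceleration extrapolation: x(t) = x0 + v*t + 0.5*a*t^2,
    applied independently to each coordinate."""
    return tuple(p + v * t + 0.5 * a * t * t
                 for p, v, a in zip(pos, vel, acc))

# A hypothetical car 50 m ahead, closing at 10 m/s while braking at 2 m/s^2:
print(predict_position(pos=(50.0, 0.0), vel=(-10.0, 0.0),
                       acc=(2.0, 0.0), t=2.0))
# → (34.0, 0.0): after 2 s the gap is 50 - 20 + 4 = 34 m
```

Notice what this model cannot express: whether the braking car intends to stop, to turn, or to yield to a pedestrian. That is exactly the information a theory of mind supplies.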

But for city driving, you need more. You need the intention and goal of the other drivers. In slow moving, complex situations, you need to understand what the other person is trying to accomplish. You need a theory of mind.

You could build self-driving AI systems based on either of these approaches. You could build a system to solve a problem and ignore how people solve the problem. In contrast, you could try to understand how people solve the problem and model that.

Solving the problem without attending to what people do can get you a long way. You can build a very good chess-playing machine, for example, that can beat most people. But you can’t beat grandmasters. To do that, you must understand how experts solve the problem. You need AI that models how people understand the problem (Guest & Martin, 2023; van Rooij et al., 2024).

We are all experts at mindreading, at having a theory of mind. It is so basic that people are often unaware of how much they use it. As a driver, pedestrian, and cyclist, I’m aware of how much I try to understand and predict the actions of every driver near me. Seeing their heads and eyes helps a lot. We rely on where people look to understand their goals. We constantly use these mindreading skills on city streets.

Mindreading by Self-Driving Cars?

Do self-driving cars engage in mindreading? Probably not. And this is one of the serious limiting factors for navigating complex spaces with lots of other people, each engaged in their own goals. This was the point that Olivia Guest made about self-driving cars needing theory of mind. Interestingly, this is probably why self-driving cars use humans as outside guides in complex situations (Tagerman, 2026). Since robot cars don’t have a theory of mind, they call in the experts in mindreading. Analyzing those goals makes it easier to predict what the other drivers and people will do. If we expect robot cars to navigate safely, they’ll need a theory of mind: they need to acknowledge that other road users have minds, and to read them.

About the Author: Tony Ramos
