How Do We Decide Which Strangers to Trust?

That’s what a group of researchers is hoping to figure out—and to do so, they’re using a pretty cute humanoid robot named Nexi.
Specifically, they want to know whether our judgments about trustworthiness are based on nonverbal gestures and cues, and if so, which ones. So they’ve programmed Nexi to make certain subtle gestures while speaking to volunteers. As you can see, Nexi can create a range of facial expressions by moving her eyes, eyebrows, and mouth:

She can also move her lower arm, wrists, thumb, and fingers. The researchers can control Nexi’s every movement, allowing them to test and identify which signals might lead a person to trust her (or distrust her). As researcher David DeSteno, a psychologist at Northeastern University, predicts:

People tend to mimic each other’s body language, which might help them develop intuitions about what other people are feeling—intuitions about whether they’ll treat them fairly.

After the volunteers chat with the robot for 10 minutes, they’re asked to play an economic game in which they have to predict how much money Nexi will give them at her own expense—and, at the same time, decide how much they will give the robot. How the volunteers ultimately decide to treat Nexi might not depend on any one particular gesture, DeSteno says in an early description of the research, but more likely results from “a ‘dance’ that happens between the strangers, which leads them to trust or not trust the other.”
