Researchers use Virtual Human Interaction Lab to study body language

By Hannah Knowles | Staff Writer
September 2, 2014

Two Stanford researchers in the Department of Communication are using the technology of the Virtual Human Interaction Lab (VHIL) to quantify body language, linking nonverbal behavior to creativity and successful learning.

Jeremy Bailenson, founding director of the VHIL and associate professor of communication, and Andrea Stevenson Won, a fourth-year communication doctoral student, used a video game sensor called the Kinect to track subjects' movements during experiments without relying on markers worn on the body. The technology allowed the scientists to analyze large amounts of body-language data more precisely.

"People have been studying nonverbal behavior for a long time," Won said. "The most exciting thing about what we did was finding a way to track and draw conclusions from nonverbal behavior as a sort of good accompaniment to what people naturally see."

The VHIL was founded in 2003 to explore the implications of people's online interactions through avatars, or virtual bodies. Taking advantage of the lab's gaming devices, the communication researchers studied the head and body movements of more than 100 people, who interacted in pairs.

In their first experiment, published in the journal IEEE Transactions on Affective Computing, Bailenson and Won found that certain nonverbal behavior could predict the success of learning. One member of the pair, the "teacher," was given five minutes to teach a "student" about water efficiency while the Kinects recorded. Afterward, the student was asked to recall the lesson. A comparison of the data from each recording revealed that students who showed extreme upper body movements tended to retain less information. Students also performed worse when teachers moved their heads or torsos suddenly.

The second experiment, published in the Journal of Nonverbal Behavior, focused on body language and creativity. Pairs brainstormed ways to conserve water, and their creative output was measured by the number of ideas they produced. Once again, a pattern in nonverbal behavior emerged. Backing up previous research on the topic with more objective data, the researchers found that pairs in which the individuals moved more synchronously with each other produced more ideas.

"Our research demonstrates that a 'digital footprint,' the nonverbal behavior automatically tracked by video game and virtual reality hardware, can be very telling," Bailenson said. "When people are online they have an illusion of being anonymous, but combining tracking data with machine learning provides many clues about who you are and what you are doing."

The scientists say that their results are one piece of a complex, much larger puzzle.

"It's important to point out – we don't want to say that based on our one experiment we've figured out exactly what people need to do to teach or to learn effectively," Won said. "It's a very complicated set of interactions. I would not want to say that, based on our study, everyone should hold perfectly still while they're learning."

Still, nonverbal behavior's ability to predict performance has potential real-world applications, and the results of the experiments suggest ways to improve learning and creativity in certain situations.
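To make the idea of "quantifying" body language a bit more concrete, here is a minimal sketch of how a pair's movement synchrony could be reduced to a single score from the kind of per-frame joint positions a Kinect reports. The joint count, frame rate and correlation-based score below are illustrative assumptions, not the researchers' published analysis.

```python
# Illustrative sketch only: one common way to turn two streams of
# markerless tracking data into a single "synchrony" number.
# This is NOT the VHIL researchers' actual pipeline.

import numpy as np


def movement_magnitude(joints):
    """Per-frame movement: mean distance each joint travels between frames.

    joints: array of shape (n_frames, n_joints, 3) with x, y, z positions.
    Returns an array of length n_frames - 1.
    """
    deltas = np.diff(joints, axis=0)                  # frame-to-frame displacement
    return np.linalg.norm(deltas, axis=2).mean(axis=1)


def synchrony_score(joints_a, joints_b):
    """Pearson correlation of the two partners' movement time series.

    Higher values mean the pair tended to move (and pause) together.
    """
    a = movement_magnitude(joints_a)
    b = movement_magnitude(joints_b)
    return np.corrcoef(a, b)[0, 1]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for two 60-second recordings at 30 fps with
    # 20 tracked joints each, purely for demonstration.
    person_a = np.cumsum(rng.normal(size=(1800, 20, 3)), axis=0) * 0.001
    person_b = person_a + rng.normal(scale=0.01, size=person_a.shape)
    print(f"synchrony score: {synchrony_score(person_a, person_b):.2f}")
```

A score like this could then be compared against an outcome measure such as the number of ideas a pair produced, which is the general shape of the relationship the studies describe.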
In practice, people who naturally synchronize well could be paired on a project to maximize productivity, and online learning could be tailored to a student's nonverbal cues.

"We hope to expand the learning research to design systems that can dynamically change a teacher's voice, gestures and course content based on the learner's movements," Bailenson said.

The researchers are currently conducting a new experiment on synchronization that they hope will reproduce their earlier results and give more insight into why synchronization matters, though for the moment they cannot reveal much about the study.

"The results are very cool, and I'm excited to follow up on them," Won said.

Contact Hannah Knowles at 15hknowles 'at' castilleja 'dot' org.