Developing Computers to Read Audience Engagement
March 3, 2016
When people are really engrossed in something that they are reading, watching, or playing, they tend to subconsciously reduce involuntary movements. You’ve probably seen it with small children who are normally quite active but sit in rapt attention while watching cartoons. A recent study performed at Brighton and Sussex Medical School used computers to “read” the body language of subjects and determine how engaged they were with the material they were watching.
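The study itself doesn’t publish its algorithm here, but the basic idea — less fidgeting means more engagement — can be sketched in a few lines. The snippet below is a hypothetical illustration, not the researchers’ method: it scores “motion energy” as the average pixel change between consecutive video frames, using synthetic data in place of a real camera feed.

```python
import numpy as np

def motion_energy(frames):
    """Mean absolute pixel difference between consecutive frames.

    frames: array of shape (T, H, W) holding grayscale intensities.
    Under the study's premise, a lower score (less movement)
    would suggest a more engaged viewer.
    """
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean()

# Synthetic example: a "still" viewer vs. a "fidgety" one.
rng = np.random.default_rng(0)
base = rng.integers(0, 256, size=(1, 32, 32))
# Nearly identical frames plus sensor noise: very little motion.
still = np.repeat(base, 30, axis=0) + rng.normal(0, 1, size=(30, 32, 32))
# Frames that change completely every step: lots of motion.
fidgety = rng.integers(0, 256, size=(30, 32, 32)).astype(float)

assert motion_energy(still) < motion_energy(fidgety)
```

A real system would of course need to isolate the person from the background and distinguish deliberate movement from the involuntary kind, but the comparison above captures the signal the researchers were after.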
The goal was to determine whether such technology is viable, with an eye toward building computers or robots that can read their users and tell whether they’re engaged. Such technology has a lot of applications. Learning software could measure whether students were engaged and respond accordingly. Companion robots, which many predict will become common in the near future, could read the humans they are supposed to be helping or otherwise interacting with to make sure they are as effective as possible.
For directors or video game developers, this technology could be very useful. While people can offer their subjective opinions about whether or not they enjoyed a film, show, or game, they might leave some things out, whether consciously or otherwise. They may not realize just how interested they were, or weren’t, during the experience, or they may choose not to express some aspects of their experience for any number of personal reasons. But if the computers they were watching or interacting with could read their movements to track how engaged they were, content producers would be able to tweak their productions to maximize audience engagement.
That’s all a ways down the road, of course. The study only involved 27 participants, so more research is absolutely needed, but it’s a promising start to some really interesting technology.