Are you tired of seeing a flat avatar of yourself in VR that shows only a limited set of expressions, if any at all? Avatars are one of the things that have captured the interest of many people diving into virtual reality. If we're pitching virtual reality as the future of our world, it only makes sense to create avatars that can fully imitate our real-world actions, including facial expressions, body movement, and other less obvious human gestures. Reaching that level in VR would be quite something, but it's definitely not without its challenges.
That’s where Emotional Beasts comes in. Launched by Guillermo Bernal in 2017, it is an effort to create emotive avatars in virtual reality by adapting off-the-shelf VR hardware and an existing game engine. For his work bringing something close to real-life human expression into VR, Bernal was awarded the 2019 Harold & Arlene Schnitzer Prize in the Visual Arts by MIT.
“If you go to any state-of-the-art virtual reality platform, you’ll see avatars with faces that are static masks. I’d like to give them facial expressions, to show whether they are happy or surprised or even angry.”
~ Guillermo Bernal, MIT Media Lab
That’s not something that can be done easily. To create an emotive avatar, the computer driving the avatar has to sense the same emotions as the live human behind it. And even if that problem is solved, there’s the question of secondary effects. Imagine a computer taught to read emotions so that a virtual setup can help with mental-health diagnoses; what’s to stop someone from using the same capability to manipulate people into buying things? As Bernal’s technology advances, its legal and ethical implications become an issue of their own.
For Emotional Beasts to work, Bernal modified a VR headset, equipping it with sensors capable of accurately reading a broad range of emotions. He began with a Vive headset and placed sensors strategically around the rig. He added bio-signal sensors to a headset that already included a camera tracking the user’s eye movements. He also measures how much the wearer is sweating, using dry electrodes on the forehead, to gauge how agitated they become over time and use. A heart-rate sensor sits at the user’s temple, and the microphone is tweaked to observe and record how tone of voice shifts with different emotions.
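To make that sensor setup concrete, here is a minimal, hypothetical Python sketch of how readings like skin conductance, heart rate, and voice pitch might be folded into a single arousal score. The field names, scaling ranges, and weights are illustrative assumptions, not Bernal's actual processing.

```python
from dataclasses import dataclass


@dataclass
class BioSample:
    """One snapshot of the headset's bio-signal sensors (hypothetical fields)."""
    skin_conductance_us: float   # dry-electrode reading from the forehead, in microsiemens
    heart_rate_bpm: float        # pulse sensor at the temple, beats per minute
    voice_pitch_hz: float        # fundamental frequency estimated from the microphone


def normalize(value: float, low: float, high: float) -> float:
    """Clamp a raw reading into the 0..1 range given an assumed resting-to-peak span."""
    return max(0.0, min(1.0, (value - low) / (high - low)))


def arousal_score(sample: BioSample) -> float:
    """Blend the normalized signals into one rough arousal estimate (weights are illustrative)."""
    sweat = normalize(sample.skin_conductance_us, 1.0, 20.0)
    pulse = normalize(sample.heart_rate_bpm, 60.0, 140.0)
    pitch = normalize(sample.voice_pitch_hz, 100.0, 300.0)
    return 0.5 * sweat + 0.3 * pulse + 0.2 * pitch


# Example: a mildly agitated reading maps to a mid-range arousal value.
print(arousal_score(BioSample(skin_conductance_us=8.0, heart_rate_bpm=95.0, voice_pitch_hz=180.0)))
```

A score like this could then drive an avatar's expression blend, but how the real system maps signals to expressions is not described in detail here.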
Together, these sensors let Emotional Beasts read a participant’s anxiety, stress level, and physical state, and relate those physiological changes to their decisions and actions. Using Unreal Engine, Bernal created a VR space that gets its data from custom algorithms running as Python scripts, with the headset’s sensors plugged into a set of microcontrollers.
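As a rough sketch of that pipeline, the snippet below reads lines of comma-separated sensor values from a microcontroller over a serial port and forwards them as UDP packets that a game-engine listener could consume. The device path, baud rate, packet format, and the 127.0.0.1:7000 endpoint are all assumptions for illustration, not the project's actual protocol.

```python
import json
import socket

import serial  # pyserial: pip install pyserial

SERIAL_PORT = "/dev/ttyUSB0"          # assumed device path for the microcontroller
ENGINE_ADDR = ("127.0.0.1", 7000)     # assumed UDP endpoint the game-engine scene listens on


def stream_sensors() -> None:
    """Relay raw 'sweat,heart_rate,pitch' lines from the microcontroller to the engine as JSON."""
    link = serial.Serial(SERIAL_PORT, baudrate=115200, timeout=1)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # timeout or empty line; keep polling
        try:
            sweat, heart_rate, pitch = (float(v) for v in line.split(","))
        except ValueError:
            continue  # skip malformed packets from the serial link
        payload = json.dumps({"sweat": sweat, "heart_rate": heart_rate, "pitch": pitch})
        sock.sendto(payload.encode("utf-8"), ENGINE_ADDR)


if __name__ == "__main__":
    stream_sensors()
```

On the engine side, a small listener would decode each packet and feed the values into whatever logic animates the avatar each frame.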
The prototype was designed in 2018, but the data collected so far is still limited given the vast scope of the project. If the ethical and legal issues can be resolved, this system could prove to be a groundbreaking addition to the field of VR. Imagine having a VR presence that is nearly as expressive as our real bodies, if not more. Once all of that has been worked out, the possibilities for this technology are endless.