MetaHumans in the HoloLens 2 – First Steps!

Starting research into the uses of AR in speech pathology therapy. How viable is it to use Unreal Engine MetaHumans and Microsoft's HoloLens 2 in apps designed to expose patients to a range of emotions portrayed through body language and facial expressions? Can we use the headset's eye-gaze tracking to determine where patients are focusing, so that therapists can help shift their attention to the important areas? This is the very first step: getting MetaHumans running in the HL2. The HL2 recording doesn't capture the quality of the humans, but as I walked up close, even knowing it was a projection, I still expected him to turn and face me. So incredibly real.
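To make the gaze idea concrete: the HoloLens 2 eye tracker exposes the wearer's gaze as a ray (an origin and a direction), and classifying focus then reduces to intersecting that ray with the MetaHuman's face and checking which region the hit point lands in. Below is a minimal geometric sketch in Python, not HL2 or Unreal API code; the region names, coordinates, and the assumption that the face lies on the plane z = 0 are all hypothetical placeholders for illustration.

```python
import numpy as np

# Hypothetical face regions on the MetaHuman's face plane (z = 0),
# as (name, x_min, x_max, y_min, y_max) in metres. Real apps would
# attach colliders to the character mesh instead.
FACE_REGIONS = [
    ("eyes",  -0.06, 0.06,  0.02,  0.06),
    ("mouth", -0.04, 0.04, -0.06, -0.02),
]

def gaze_hit_region(origin, direction, regions=FACE_REGIONS):
    """Intersect a gaze ray with the face plane z = 0 and return the
    name of the region containing the hit point, or None."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    if abs(direction[2]) < 1e-9:   # ray parallel to the face plane
        return None
    t = -origin[2] / direction[2]  # solve origin.z + t * dir.z = 0
    if t < 0:                      # face plane is behind the viewer
        return None
    hit = origin + t * direction
    for name, x0, x1, y0, y1 in regions:
        if x0 <= hit[0] <= x1 and y0 <= hit[1] <= y1:
            return name
    return None
```

For example, a gaze ray from one metre away aimed straight at a point 4 cm above the face's centre, `gaze_hit_region((0, 0.04, 1.0), (0, 0, -1))`, lands in the "eyes" region. Logging these region labels over a session is one plausible way a therapist could see where a patient's attention actually went.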