UTS Data Arena Mocap Session

Dr Lucy Bryant and I were at the UTS Data Arena capturing some emotions for the Unreal Metahumans in our Hololens Therapy App. The fantastically clever Thomas Ricciardiello ran the session for us. I can’t wait to see how it looks in the app. Stay tuned to find out!

Metahuman Designs for Therapy


I’m really loving creating the characters for our Speech Therapy App.

A definite advantage of using Metahumans is that we can change the character easily, at the press of a button. If we can provide patients with a wide variety of characters to practise reading emotions on, then they will be better equipped when they go out into the real world. Epic have provided us with an amazing tool and we’re lucky to be able to use it to make real improvements in the quality of patients’ lives.

Unreal Multiplayer with Metahumans

I now have all the major functionality working in a multiplayer version of the therapy app. There was another very steep learning curve with multiplayer functionality, but I’m slowly starting to get my head around it. Luckily this app doesn’t have a lot of interaction requirements to figure out – just enough to make it really interesting.
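For anyone curious what the multiplayer side looks like under the hood, the core pattern in Unreal is server-authoritative property replication. Below is a minimal sketch of that pattern using Unreal’s C++ API (it only compiles inside an Unreal project, and the actor name, property, and OnRep callback are my own illustrative choices, not the actual classes in our app):

```cpp
// TherapySessionActor.h — hypothetical actor for illustration only.
// Sketch of Unreal's replicated-property pattern: the server changes a
// value, the engine pushes it to clients, and each client reacts locally.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "TherapySessionActor.generated.h"

UCLASS()
class ATherapySessionActor : public AActor
{
    GENERATED_BODY()

public:
    ATherapySessionActor()
    {
        bReplicates = true; // server owns the state; clients receive updates
    }

    // Which emotion animation is currently playing; replicated to all clients.
    UPROPERTY(ReplicatedUsing = OnRep_CurrentEmotion)
    int32 CurrentEmotion = 0;

    // Runs on each client when the server changes CurrentEmotion.
    UFUNCTION()
    void OnRep_CurrentEmotion()
    {
        // e.g. play the matching Metahuman animation on this client
    }

    // Register which properties replicate.
    virtual void GetLifetimeReplicatedProps(
        TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(ATherapySessionActor, CurrentEmotion);
    }
};
```

The nice part of this design is that only the server decides what the Metahuman does; each headset just mirrors the replicated state, so the therapist and patient always see the same thing.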

Next steps are to create the animations our Metahumans will be portraying, build more cool Metahumans, and implement some of the nice lighting setups I was working on earlier.

Unreal Remote Streaming Multiplayer to the Hololens 2

For our therapy app it would be highly beneficial if the therapist could also wear a headset and engage in the experience with the patient. That way they will be able to guide patients through the lessons easily. To do this with Metahumans I will need to use remote streaming from a PC, so that the heavy lifting is done by the PC and not the Hololens. I’ve finally got it all working – just with basic shapes for now, but next stop will be Metahumans!

UE4 Hololens Multiplayer

After a bit of an enforced break due to catching COVID, I’ve started back on our therapy app, looking at how to make a multiplayer Hololens experience. Therapists will be able to join patients in the therapy to guide them and encourage positive outcomes.

I found two excellent resources that showed me how to do this.

Microsoft Tutorial – https://docs.microsoft.com/en-us/samples/microsoft/mixedreality-unreal-samples/hl2collab—hololens-multiplayer-networking-in-unreal/

Unreal Engine Multiplayer docs – https://docs.unrealengine.com/5.0/en-US/networking-and-multiplayer-in-unreal-engine/

MHC Lighting Presets

Epic have released MHC Lighting Presets – a set of lighting setups designed by cinematographer Greig Fraser (Dune) that are free to use in your Metahuman setups.

The presets enhance the rendering of your Metahumans in the Hololens, but I had to make a few edits to the settings when I copied them into my Unreal Hololens project. So if you find they don’t quite look the same in yours, check out what I did here.

Alpha Demo

I’m starting to get the hang of all things Unreal and Hololens 2, so I’ve built a very basic app with the main requirements in rudimentary form. I’ve just used Mixamo animations to start with, but will create some much more fine-tuned and subtle animations soon.

Metahumans in the Hololens 2 – First Steps!

Starting research into the uses of AR in Speech Pathology Therapy. How viable is it to use Unreal Engine Metahumans and Microsoft’s Hololens 2 in apps designed to expose patients to various emotions portrayed through body language and facial expressions? Can we use the gaze detector to determine where patients are focusing, so that therapists can help shift their concentration to the important areas? This is the very first step – using Metahumans in the HL2. The HL2 recording doesn’t show the quality of the humans, but as I went up close, even knowing it was a projection, I still expected him to turn and face me. So incredibly real.