Speech Therapy
Unreal Multiplayer with Metahumans
I now have all the major functionality working in a multiplayer version of the therapy app. Multiplayer was another very steep learning curve, but I’m slowly starting to get my head around it. Luckily there aren’t a lot of interaction requirements in this app to figure out – just enough to make it really interesting.
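To give a feel for what that multiplayer work involves, here’s a minimal sketch of replicating session state (which Metahuman is shown and which emotion animation is playing) in UE4 C++. The class and property names are my own placeholders, not the app’s actual code.

```cpp
// TherapySessionState.h – hypothetical example of server-authoritative replication.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "TherapySessionState.generated.h"

UCLASS()
class ATherapySessionState : public AActor
{
    GENERATED_BODY()

public:
    ATherapySessionState()
    {
        bReplicates = true; // send this actor's state to all connected clients
    }

    // Which Metahuman is currently on display; set on the server, pushed to clients.
    UPROPERTY(ReplicatedUsing = OnRep_ActiveMetahuman)
    int32 ActiveMetahumanIndex = 0;

    // Which emotion animation is playing (e.g. "Happy", "Calm").
    UPROPERTY(ReplicatedUsing = OnRep_Emotion)
    FName ActiveEmotion = NAME_None;

    // Clients request changes; the server stays authoritative.
    // (Server RPCs must be called on an actor the client's connection owns.)
    UFUNCTION(Server, Reliable)
    void ServerSetEmotion(FName NewEmotion);

protected:
    UFUNCTION()
    void OnRep_ActiveMetahuman() { /* swap the displayed Metahuman locally */ }

    UFUNCTION()
    void OnRep_Emotion() { /* play the matching emotion animation locally */ }

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override;
};

// TherapySessionState.cpp
#include "TherapySessionState.h"
#include "Net/UnrealNetwork.h"

void ATherapySessionState::GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const
{
    Super::GetLifetimeReplicatedProps(OutLifetimeProps);
    DOREPLIFETIME(ATherapySessionState, ActiveMetahumanIndex);
    DOREPLIFETIME(ATherapySessionState, ActiveEmotion);
}

void ATherapySessionState::ServerSetEmotion_Implementation(FName NewEmotion)
{
    ActiveEmotion = NewEmotion; // replicates out and triggers OnRep_Emotion on clients
}
```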
Next stop is creating the animations our Metahumans will be performing, building more cool Metahumans, and implementing some of the nice lighting setups I was working on earlier.
Unreal Remote Streaming Multiplayer to the Hololens 2
For our therapy app it would be highly beneficial if the therapist could also wear a headset and engage in the experience alongside the patient – that way they can easily guide patients through the learning. To do this with Metahumans I will need to use remote streaming from a PC, so that the heavy lifting is done by the PC rather than the Hololens. I’ve finally got it all working – just with basic shapes for now, but next stop will be Metahumans!
UE4 Hololens Multiplayer
I found two excellent resources that showed me how to do this (a minimal host/join sketch follows the links):
Microsoft Tutorial – https://docs.microsoft.com/en-us/samples/microsoft/mixedreality-unreal-samples/hl2collab---hololens-multiplayer-networking-in-unreal/
Unreal Engine Multiplayer docs – https://docs.unrealengine.com/5.0/en-US/networking-and-multiplayer-in-unreal-engine/
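For reference, the basic host/join flow covered in those docs boils down to something like this – a sketch using standard engine calls, with the map name and IP address as placeholders:

```cpp
// Minimal host/join sketch following the standard UE networking flow.
// "TherapyRoom" and the IP address below are placeholders.
#include "Kismet/GameplayStatics.h"
#include "GameFramework/PlayerController.h"

// On the hosting PC: open the map as a listen server so clients can join.
void HostSession(UObject* WorldContextObject)
{
    UGameplayStatics::OpenLevel(WorldContextObject, FName("TherapyRoom"), true, TEXT("listen"));
}

// On the Hololens client: travel to the host's address.
void JoinSession(APlayerController* PlayerController)
{
    if (PlayerController)
    {
        PlayerController->ClientTravel(TEXT("192.168.1.10"), ETravelType::TRAVEL_Absolute);
    }
}
```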
Unreal UMG spawned on hand gestures
This is a really neat way to add a control panel to your Hololens app that hides away when not in use. Here I have it switching which Metahuman is displayed and which emotion animation they play.
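As a rough sketch of the idea (placeholder names throughout, and the palm-facing check via GetMotionControllerData is my assumption about one way the gesture could be detected, not necessarily how the app does it):

```cpp
// HandMenuActor.h – hypothetical sketch of a hand-attached UMG control panel.
// Assumes the "HeadMountedDisplay" and "UMG" modules are in the project's build dependencies.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/WidgetComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Kismet/GameplayStatics.h"
#include "HandMenuActor.generated.h"

UCLASS()
class AHandMenuActor : public AActor
{
    GENERATED_BODY()

public:
    AHandMenuActor()
    {
        PrimaryActorTick.bCanEverTick = true;

        // World-space UMG panel holding the Metahuman and emotion pickers.
        MenuWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("MenuWidget"));
        RootComponent = MenuWidget;
        MenuWidget->SetWidgetSpace(EWidgetSpace::World);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Hand-tracking pose for the left hand (assumed to be in world space here).
        FXRMotionControllerData HandData;
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Left, HandData);

        bool bShowMenu = false;
        if (HandData.bValid)
        {
            // Approximate "palm facing the user" by comparing the grip's up vector
            // with the direction from the hand to the camera.
            const FVector PalmUp = HandData.GripRotation.GetUpVector();
            const FVector ToCamera =
                (UGameplayStatics::GetPlayerCameraManager(this, 0)->GetCameraLocation() - HandData.GripPosition)
                    .GetSafeNormal();
            bShowMenu = FVector::DotProduct(PalmUp, ToCamera) > 0.6f;

            // Keep the panel floating a few centimetres above the hand.
            MenuWidget->SetWorldLocation(HandData.GripPosition + PalmUp * 5.0f);
        }

        MenuWidget->SetVisibility(bShowMenu); // panel hides away when the hand drops
    }

protected:
    UPROPERTY(VisibleAnywhere)
    UWidgetComponent* MenuWidget;
};
```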