For our therapy app it would be highly beneficial if the therapist could also wear a headset and join the experience with the patient. That way they can easily guide patients through each activity. To do this with MetaHumans I will need to use remote streaming from a PC, so that the heavy lifting is done by the PC rather than the HoloLens. I’ve finally got it all working – just with basic shapes for now, but next stop will be MetaHumans!
I found two excellent resources that showed me how to do this.
Microsoft Tutorial – https://docs.microsoft.com/en-us/samples/microsoft/mixedreality-unreal-samples/hl2collab---hololens-multiplayer-networking-in-unreal/
Unreal Engine Multiplayer docs – https://docs.unrealengine.com/5.0/en-US/networking-and-multiplayer-in-unreal-engine/
This is a really neat way to introduce a control panel to your HoloLens app, one that hides away when not in use. Here I have it switching which MetaHuman is displayed, along with the emotional animation applied to them.
After going through multiple video tutorials and ending up completely confused and frustrated, I finally figured it out and have compiled my own step-by-step instructions with images. If you’re like me and just want a quick, to-the-point tutorial, then this post is for you.
EPIC TUTORIALS – Check out this fantastic quick video tutorial on the optimal fields for placing elements when you are creating a HoloLens experience. There’s so much to think about when designing AR for therapy.
I’m astounded to be able to announce that I have received one of the Epic Games MegaGrants!
With it I will continue my research with Lucy Bryant of the UTS Speech Pathology Department into using HoloLens technology to benefit patients.
Watch this space for updates on what we’re creating together and how it will contribute to exciting developments in future therapies.