Proof of Concept: UE Metahuman Face Animation data use in Houdini
- Oluseyi Ekanem
- Oct 22
- 2 min read
The Rat Trap Project – Post 12: Proof of Concept
This week, I set out to achieve something ambitious — to take facial animation data from Unreal Engine’s Metahuman and apply it directly to a Metahuman rig inside Houdini, maintaining full control and editability within Apex. What followed was a fascinating deep dive into Houdini’s rigging systems — and what a journey it’s been.
My initial plan was to understand how facial animation is handled in Houdini using abstract controls. Since the Metahuman rig functionality in Houdini is still quite new, it’s understandable that there aren’t many tutorials or workflows documented yet. So, I reached out to the SideFX support team.
They were candid — there’s no fully defined workflow yet. However, they pointed me toward a presentation on Project Violet, which integrates Metahumans. That gave me useful insight into exporting FBX animations from Unreal Engine and importing them into Houdini. I wasn’t able to achieve my full goal using the Project Violet approach, but I’m deeply appreciative of the learning journey that rabbit hole provided.
During that exploration, I revisited the incredible work of Junichiro Horikawa, whose VEX for Algorithmic Design series first opened my eyes to procedural thinking in Houdini. His lesson on dictionary data types reminded me of just how powerful and flexible Houdini’s data model can be for rigging and animation.
Then, a few days ago, Diyz3n on YouTube released a video that opened new pathways of thinking for me. In his approach, he successfully animated the Metahuman face, but he had to step outside of Apex to do so.
That’s where my workflow diverged. Since I already had the same rig setup both in Apex and from the Unreal Engine FBX animation, I realized I could stay entirely within Apex.
I modified the Houdini Metahuman rig, creating a tag for the facial points in the Metahuman head skeleton and initialising FK transforms for them. This created transform objects associated with those points in the APEX rig. Then, applying the controlsupdateparms approach, I was able to bring the Unreal Engine facial animation directly into Houdini — inside Apex — and then use Scene Animate to further refine the motion with animation layers.
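At its core, the controlsupdateparms step amounts to feeding a dictionary of per-control parameter values into the rig on each frame — the same dictionary-driven thinking Horikawa's lessons highlight. As a rough conceptual sketch only (all names here, including the control name and function, are illustrative — the actual work happens on APEX nodes inside Houdini, not through this Python):

```python
# Conceptual sketch of the controlsupdateparms idea: a frame of imported
# FBX facial animation is expressed as {control_name: parameter values}
# and merged into the rig's tagged FK controls. Illustrative names only;
# this is not the APEX API.

def update_control_parms(rig_controls, frame_sample):
    """Merge one frame of sampled transforms into the rig's control parms.

    rig_controls: {control_name: {"t": [x, y, z], "r": [rx, ry, rz]}}
    frame_sample: same shape, typically sparse (only animated controls).
    """
    for name, parms in frame_sample.items():
        if name not in rig_controls:
            continue  # ignore samples with no matching tagged control
        rig_controls[name].update(parms)  # overwrite only the sampled parms
    return rig_controls

# Hypothetical example: one jaw control driven by imported animation.
controls = {"FACIAL_C_Jaw": {"t": [0.0, 0.0, 0.0], "r": [0.0, 0.0, 0.0]}}
sample = {"FACIAL_C_Jaw": {"r": [12.5, 0.0, 0.0]}}
update_control_parms(controls, sample)
print(controls["FACIAL_C_Jaw"]["r"])  # -> [12.5, 0.0, 0.0]
```

Because only the sampled parameters are overwritten, anything not driven by the import (here, the translation) stays editable — which is what keeps the door open for layering secondary animation on top in Scene Animate.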
This integration was the breakthrough I was aiming for. It allows facial mocap data from Unreal to flow seamlessly into Houdini, while still giving me the ability to make secondary adjustments directly in Scene Animate — a flexibility not possible in the Diyz3n approach.
Going forward, I plan to refine this workflow and submit a tutorial on my method to the Houdini Tutorial Contest, which is currently ongoing.
A huge thank you to Junichiro Horikawa for inspiring my understanding of procedural workflows, and to Diyz3n for sparking new ideas through his recent video.
Every step — the mentorship, the experiments, and the breakthroughs — reinforces why learning Houdini remains such a thrilling and creative journey. This challenge was a tough one to crack, but closed doors couldn't stop my zeal. As my people say, "If dem close the door, we go bust am open." And that's what we did this week, navigating uncharted waters.
#TheRatTrap #PaitanMedia #AfricanAnimation #Metahuman #HoudiniSolaris #SideFX #UnrealToHoudini #3DModeling #USD #Houdini #PipelineDesign #AnimationNigeria #UnrealEngine #Nollywood



