
Animating facial expressions via iPhone - MetaHuman Animator for Unreal Engine

[16:09 Fri, 24 March 2023, by blip]

There's a lot happening in the motion/performance capture space right now - the latest news comes from Epic Games. For their MetaHuman Creator, which lets users create photorealistic digital human figures, new capabilities were presented that significantly accelerate and simplify the animation process. The new MetaHuman Animator is intended to benefit not only computer game developers and Hollywood studios, but also indie producers and hobbyists.

MetaHuman Animator face capturing via iPhone


With a simple iPhone camera - i.e. without a 3D setup or markers - it will soon be possible to perform face capturing within a few minutes and transfer it to a rigged "MetaHuman"; for the latter, of course, sufficiently powerful hardware is needed. Those who want to can still use a helmet camera and possibly achieve even more accurate results. The following presentation shows how well this works - even fine nuances of facial expression come across well, although it is of course still recognizable as an animation.



For the system to apply facial expressions as faithfully as possible to any avatar, the face to be tracked must first be analyzed. First, a MetaHuman actor identity is created, with a standardized rig and mesh - only a short recorded sequence is required. With the help of this identity, the positional changes of the rig during the tracked performance can be detected and transferred to the target MetaHuman character.

MetaHuman actor identity


Capturing the performance

Since MetaHuman Animator also supports timecode, the performance capture of a face can easily be combined with body capture. Tongue movements during speech are also supposed to be imitated realistically based on the audio data. As you might guess, deep learning algorithms are, of course, at work here again.
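The timecode-based combination mentioned above can be illustrated with a minimal sketch. This is not Epic's API - just a generic, hypothetical example of how two separately recorded takes (face and body) can be paired up via a shared SMPTE timecode at a common frame rate:

```python
# Generic sketch (not Epic's API): merging separately recorded face and
# body capture takes via a shared SMPTE "HH:MM:SS:FF" timecode.
# FPS and the dict-based sample format are illustrative assumptions.

FPS = 30

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an SMPTE timecode string to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def merge_by_timecode(face: dict, body: dict) -> dict:
    """Pair face and body samples recorded at the same timecode frame.

    `face` and `body` map timecode strings to pose data; only frames
    present in both takes are kept.
    """
    body_by_frame = {tc_to_frames(tc): pose for tc, pose in body.items()}
    merged = {}
    for tc, face_pose in face.items():
        frame = tc_to_frames(tc)
        if frame in body_by_frame:
            merged[frame] = {"face": face_pose, "body": body_by_frame[frame]}
    return merged

face_take = {"01:00:00:00": "smile", "01:00:00:01": "blink"}
body_take = {"01:00:00:01": "wave", "01:00:00:02": "step"}
print(merge_by_timecode(face_take, body_take))
# only the frame shared by both takes survives the merge
```

Because both streams share a clock, no manual sync (e.g. a clapperboard) is needed - the frame count derived from the timecode acts as the join key.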

The new MetaHuman Animator features will be part of the MetaHuman plugin for the Unreal Engine, which (like the engine itself) can be downloaded for free. However, it won't be publicly available for a few months.

MetaHuman Animator works with an iPhone 11 or later; also required is the free Live Link Face app for iOS, which will be updated with some additional capture modes to support this workflow.

By the way, iPhones have recently also become sufficient for simple motion capturing - more precisely, two of them are needed. The matching app, Move.ai, was introduced just last week and, among other things, also supports the Unreal Engine.

More info at www.unrealengine.com
