Some of the earliest animators traced over filmed footage of an actor to capture motion and then drew their characters on top of it. Decades later, animators adapted the idea, using motion capture (mocap) suits fitted with reflective markers to transfer an actor’s movements onto computer-generated characters.
The next step in this evolution is to do away with the mocap suit and use only video footage as the source, while also extending the concept to facial expressions and the voice – more animation done with fewer resources, in real time.
The EBU has established a new working group that aims to achieve precisely that, using off-the-shelf hardware and software, and a workflow and pipeline that let collaborators work smoothly across different physical locations.
The group will first compile a complete overview of the state of the art and the solutions available on the market, define a methodology for benchmarking them, and then carry out that analysis.
Next, the group will integrate chosen solutions into a game engine pipeline, and finally, demonstrate the result during a live show.
The group is open to both EBU Members and members of the wider industry.