In Visual Effects, creating realistic facial performances remains an open challenge. Blendshape deformation reproduces the action of different muscle groups and produces convincing static results, but on its own it is not sufficient to generate believable, detailed facial performances for animated digital characters.
To increase the realism of facial performances, standard facial rigs can be enhanced with physical simulation. However, setting up a simulation rig and tuning material properties to match the performance is not trivial and can require considerable time and many iterations.
We present a workflow that generates an activation map for the fibres of a set of superficial patches we call pseudo-muscles. The pseudo-muscles are identified automatically by applying k-means clustering to the data from the blendshape targets in the animation rig, and the direction of their contraction (the direction of the pseudo-muscle fibres) is computed from the same data. We use an Extended Position-Based Dynamics (XPBD) solver to add physical simulation to the facial animation, controlling the behaviour of the simulation through the activation map. We show results obtained with the proposed solution on two digital humans and one fantasy cartoon character, demonstrating that the identified pseudo-muscles approximate facial anatomy and that the simulation properties are properly controlled, increasing realism while preserving the work of the animators.
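To illustrate the clustering step, the sketch below clusters vertices into pseudo-muscle patches from blendshape displacement data and estimates a fibre (contraction) direction per patch. The array shapes, the number of clusters, and the use of a principal-component direction are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def identify_pseudo_muscles(neutral, targets, n_clusters=12):
    """Cluster vertices into pseudo-muscle patches from blendshape deltas.

    neutral : (V, 3) rest-pose vertex positions (assumed layout)
    targets : (T, V, 3) blendshape target positions (assumed layout)
    Returns per-vertex patch labels and one fibre direction per patch.
    """
    # Per-vertex displacement of every target relative to the neutral pose.
    deltas = targets - neutral[None, :, :]                     # (T, V, 3)

    # Feature vector per vertex: its displacements across all targets,
    # so vertices moved together by the same targets end up in one patch.
    features = deltas.transpose(1, 0, 2).reshape(len(neutral), -1)  # (V, 3T)
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(features)

    # Fibre direction per patch: dominant direction of the patch's
    # displacements, taken here as the first principal component.
    directions = np.zeros((n_clusters, 3))
    for k in range(n_clusters):
        patch = deltas[:, labels == k, :].reshape(-1, 3)
        if patch.size == 0:
            continue
        _, _, vt = np.linalg.svd(patch - patch.mean(axis=0),
                                 full_matrices=False)
        directions[k] = vt[0]                                  # unit fibre vector
    return labels, directions
```

The resulting per-patch fibre directions could then be used to drive activation-dependent behaviour in the XPBD simulation, for example by modulating constraint parameters along each fibre according to the activation map.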