Nicograph International 2024 Full paper (Honorable Mention Award)
Authors: Yuhao Dou and Tomohiko Mukai
Abstract: This paper proposes an unsupervised framework for retargeting human facial animations to different characters. Our method uses a branching structure of two parallel autoencoders and a variant of generative adversarial networks. The two autoencoder branches, composed of graph convolutional networks, share a common latent space through which retargeting between different mesh structures can be performed. The shared latent codes are obtained by graph pooling operators, and the character face is reconstructed from the latent codes by unpooling operators. The graph pooling and unpooling operators are designed around the multiple landmarks used in optical facial motion capture systems. The GAN-based unsupervised learning method requires no paired training animation data between source and target characters. Our experimental results demonstrate that the proposed framework provides a reasonable estimate of a target facial expression that mimics the source character.
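The retargeting idea described in the abstract — encode a source mesh into a shared latent space via landmark-based pooling, then decode with the target character's branch — can be illustrated with a minimal sketch. This is not the authors' implementation: the vertex counts, the row-selection "pooling", and the linear "unpooling" map are all simplified stand-ins for the paper's learned graph convolutional operators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vertex counts for the source and target meshes,
# and the size of the shared latent code (one value per landmark).
N_SRC, N_TGT, LATENT = 120, 90, 16

# Landmark-based pooling stand-in: indices of tracked landmark vertices.
landmarks_src = rng.choice(N_SRC, LATENT, replace=False)

def pool(vertices, landmarks):
    """Graph pooling stand-in: keep only the landmark vertices."""
    return vertices[landmarks]

# Unpooling stand-in: a fixed linear map from the latent code back to
# the full target mesh (in the paper this is a learned graph unpooling).
W_unpool_tgt = rng.standard_normal((N_TGT, LATENT)) * 0.1

def retarget(src_vertices):
    """Encode with the source branch, decode with the target branch."""
    z = pool(src_vertices, landmarks_src)  # shared latent code
    return W_unpool_tgt @ z                # target-character reconstruction

src = rng.standard_normal(N_SRC)   # a source facial pose (per-vertex offsets)
tgt = retarget(src)
print(tgt.shape)
```

The key structural point is that the two branches never exchange vertices directly: only the landmark-derived latent code crosses between them, which is what lets meshes with different topologies be paired without supervision.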
Acknowledgements: This work was supported by PlatinumGames Inc.
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The definitive version of record was published in Nicograph International 2024. https://doi.org/10.1109/NICOInt62634.2024.00022