A Study on the Enhancement of Digital Modeling on Character Emotional Presentation Ability in Film and Television Art

By: Zhe Xu 1
1China-Korean Institute of New Media, Zhongnan University of Economics and Law, Wuhan, Hubei, 430000, China

Abstract

In the era of the experience economy, film and television art is an important means of satisfying users’ entertainment needs, and the emotional expressiveness of its characters directly affects its communication effect. Taking the PAD three-dimensional emotion space as the theoretical basis and combining a first-order motion model (FOMM) with a generative adversarial network (GAN), this paper constructs a framework for digital character emotion expression. The study verifies the effectiveness of the emotion computation method in the MATLAB development environment, evaluates the quality of digital character synthesis through experiments based on the TensorFlow framework under Linux, and conducts an experience test of emotional characteristics with 80 college students. The results show that the accuracy of the proposed emotion computation method reaches 84.78%, which is 14.86% and 5.16% higher than that of the OCC and fuzzy inference methods, respectively; the constructed FOMM-GAN digital character synthesis model achieves a peak signal-to-noise ratio of 35.63, an average gradient of 5.96, and a synthesis time of 94.27 ms, significantly outperforming the comparison methods; and the mean scores of the subjects’ experience of the digital characters’ emotional perception, adaptability, anthropomorphism, engagement, and initiative were 1.654, 1.688, 1.538, 1.513, and 1.484, respectively, indicating that the digital characters have a strong ability to express emotions. The study confirms that the digital character modeling method based on PAD space and FOMM-GAN can effectively enhance the emotional expressiveness of film and television characters and provides new ideas for film and television art creation.
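The abstract evaluates synthesis quality with the peak signal-to-noise ratio (PSNR). As a point of reference, the sketch below shows the standard PSNR computation for 8-bit images; it is an illustrative implementation of the general metric, not the paper's evaluation code, and the image arrays used in the example are placeholders.

```python
import numpy as np

def psnr(reference, synthesized, max_val=255.0):
    """Peak signal-to-noise ratio (in dB) between two same-shaped images.

    PSNR = 10 * log10(max_val^2 / MSE); higher values indicate that the
    synthesized image is closer to the reference.
    """
    diff = reference.astype(np.float64) - synthesized.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Illustrative usage with a random 8-bit "reference" and a lightly
# perturbed "synthesized" frame (placeholder data, not from the study).
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-5, 6, size=ref.shape)
syn = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, syn):.2f} dB")
```

Under this convention, the reported value of 35.63 corresponds to a small mean-squared error between the synthesized character frames and their references.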