Title: Physically based cosmetic rendering
Authors: Huang, Cheng-Guo
Huang, Tsung-Shian
Lin, Wen-Chieh
Chuang, Jung-Hong
Department of Computer Science
Keywords: skin rendering;translucent rendering;cosmetic rendering
Issue Date: 1-May-2013
Abstract: Simulating realistic makeup effects is an important research issue in 3D facial animation and the cosmetic industry. Existing approaches based on image-processing techniques, such as warping and blending, have mostly been applied to transfer one person's makeup to another. Although these approaches are intuitive and need only makeup images, they have drawbacks, such as distorted shapes and fixed viewing and lighting conditions. In this paper, we propose an integrated approach, which combines the Kubelka-Munk model and a screen-space skin rendering approach, to simulate 3D makeup effects. The Kubelka-Munk model is used to compute total transmittance when light passes through cosmetic layers, whereas the screen-space translucent rendering approach simulates the subsurface scattering effects inside human skin. The parameters of the Kubelka-Munk model are obtained by measuring the optical properties of different cosmetic materials, such as foundations, blushes, and lipsticks. Our results demonstrate that the proposed approach is able to render realistic cosmetic effects on human facial models, and that different cosmetic materials and styles can be flexibly applied and simulated in real time. Copyright (c) 2013 John Wiley & Sons, Ltd.
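The abstract's core step, computing a cosmetic layer's reflectance and transmittance from measured optical properties, can be illustrated with the standard single-layer Kubelka-Munk two-flux equations. This is a minimal sketch, not the authors' implementation: the function name and the sample coefficient values are assumptions; in the paper the absorption coefficient K and scattering coefficient S would come from measurements of actual cosmetics, evaluated per color channel.

```python
import math

def kubelka_munk_RT(K, S, d):
    """Diffuse reflectance R and transmittance T of a single pigment
    layer under the Kubelka-Munk two-flux model.

    K: absorption coefficient, S: scattering coefficient (S > 0),
    d: layer thickness. Standard closed-form solution:
        a = (S + K) / S,  b = sqrt(a^2 - 1)
        R = sinh(bSd) / (a*sinh(bSd) + b*cosh(bSd))
        T = b         / (a*sinh(bSd) + b*cosh(bSd))
    """
    a = (S + K) / S
    b = math.sqrt(a * a - 1.0)
    sh = math.sinh(b * S * d)
    ch = math.cosh(b * S * d)
    denom = a * sh + b * ch
    R = sh / denom   # light scattered back out of the layer
    T = b / denom    # light transmitted through to the skin below
    return R, T
```

In a pipeline like the one described, R and T would be evaluated per wavelength band (e.g. per RGB channel) for each cosmetic layer, and the transmitted fraction T would modulate the light entering the screen-space subsurface-scattering pass for the skin beneath.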
URI: http://dx.doi.org/10.1002/cav.1523
ISSN: 1546-4261
DOI: 10.1002/cav.1523
Volume: 24
Issue: 3-4
Begin Page: 275
End Page: 283
Appears in Collections: Articles

Files in This Item:

  1. 000319003500015.pdf