HairControl: A Tracking Solution for Directable Hair Simulation
We present a method for adding artistic control to physics-based hair simulation. Taking as input an animation of a coarse set of guide hairs, we constrain a subsequent higher-resolution simulation of detail hairs to follow the input motion in a spatially-averaged sense. The resulting high-resolution motion adheres to the artistic intent, but is enhanced with detailed deformations and dynamics generated by physics-based simulation. The technical core of our approach is formed by a set of tracking constraints, requiring the center of mass of a given subset of detail hairs to maintain its position relative to a reference point on the corresponding guide hair. As a crucial element of our formulation, we introduce the concept of dynamically-changing constraint targets that allow reference points to slide along the guide hairs, providing sufficient flexibility for natural deformations. We furthermore propose to regularize the null space of the tracking constraints based on variance minimization, effectively controlling the amount of spread in the hair. We demonstrate the ability of our tracking solver to generate directable yet natural hair motion on a set of targeted experiments and show its application to production-level animations.
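To make the constraint formulation concrete, the following is a minimal sketch (not the authors' implementation) of the two quantities the abstract describes: a center-of-mass tracking residual against a reference point that can slide along a polyline guide hair, and a mass-weighted variance term of the kind that could regularize the constraint null space. All function names, the arc-length parameterization, and the NumPy representation are illustrative assumptions.

```python
import numpy as np

def guide_point(guide, s):
    """Evaluate a point on a polyline guide hair at normalized arc-length
    parameter s in [0, 1]; letting s vary over time models a sliding
    constraint target (hypothetical parameterization)."""
    seg_len = np.linalg.norm(np.diff(guide, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])
    t = s * cum[-1]
    i = min(np.searchsorted(cum, t, side="right") - 1, len(seg_len) - 1)
    a = (t - cum[i]) / seg_len[i]
    return (1.0 - a) * guide[i] + a * guide[i + 1]

def tracking_residual(detail_pts, masses, guide, s):
    """Tracking-constraint residual: center of mass of a subset of detail
    hair vertices minus the reference point on the guide hair."""
    com = np.average(detail_pts, axis=0, weights=masses)
    return com - guide_point(guide, s)

def spread_variance(detail_pts, masses):
    """Mass-weighted variance of the detail points about their center of
    mass; minimizing it controls the amount of spread in the hair."""
    com = np.average(detail_pts, axis=0, weights=masses)
    return np.average(np.sum((detail_pts - com) ** 2, axis=1), weights=masses)
```

A solver enforcing the tracking constraint would drive `tracking_residual` to zero at each step while using the variance term to resolve the remaining null-space freedom; the actual constraint targets and weighting in the paper may differ.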
Document Type: Research Article
Publication date: December 1, 2018