Learning-Based Bending Stiffness Parameter Estimation by a Drape Tester

Xudong Feng, Wenchao Huang, Weiwei Xu, Huamin Wang

Real-world fabrics often possess complicated nonlinear, anisotropic bending stiffness properties. Measuring the physical parameters of such properties for physics-based simulation is difficult yet unnecessary, because numerical errors persist in simulation technology. In this work, we propose to adopt a simulation-in-the-loop strategy: instead of measuring the physical parameters, we estimate the simulation parameters that minimize the discrepancy between reality and simulation. This strategy offers good flexibility in test setups, but the associated optimization problem is computationally expensive to solve by numerical methods. Our solution is to train a regression-based neural network that infers bending stiffness parameters directly from drape features captured in the real world. Specifically, we choose the Cusick drape test method and treat multiple-view depth images as the feature vector. To train our network effectively and efficiently, we develop a highly expressive and physically validated bending stiffness model, and we use the traditional cantilever test to collect the parameters of this model for 618 real-world fabrics. Given the whole parameter data set, we then construct a parameter subspace, generate new samples within the subspace, and finally simulate and augment synthetic data for training purposes. Our experiments show that the trained system can replace cantilever tests for quick, reliable, and effective estimation of simulation-ready parameters. Thanks to this system, our simulator can now faithfully reproduce bending effects comparable to those in the real world.
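The estimation pipeline described above, training a regressor that maps simulated drape features to bending stiffness parameters, can be sketched in miniature. This is a minimal sketch only: the feature and parameter dimensions, the one-hidden-layer network, and the synthetic stand-in data are all illustrative assumptions, not the authors' architecture or data set.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEAT = 3 * 16 * 16   # e.g. 3 depth views of 16x16 pixels (assumption)
N_PARAM = 4            # number of bending stiffness parameters (assumption)

# Synthetic training data standing in for simulated Cusick drape tests:
# a hidden linear map generates drape features from bending parameters.
true_map = rng.normal(size=(N_PARAM, N_FEAT))
params = rng.uniform(0.0, 1.0, size=(256, N_PARAM))
features = params @ true_map + 0.01 * rng.normal(size=(256, N_FEAT))

# One-hidden-layer regressor trained with plain gradient descent.
W1 = rng.normal(scale=0.01, size=(N_FEAT, 64))
W2 = rng.normal(scale=0.01, size=(64, N_PARAM))

def forward(x):
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    return h, h @ W2                     # features -> stiffness parameters

losses = []
lr = 1e-2
for step in range(500):
    h, pred = forward(features)
    err = pred - params
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation by hand for the two weight matrices.
    dW2 = h.T @ err / len(features)
    dh = (err @ W2.T) * (h > 0)
    dW1 = features.T @ dh / len(features)
    W1 -= lr * dW1
    W2 -= lr * dW2

print(losses[0], losses[-1])   # training loss should drop
```

In the paper, the features come from simulated drape tests over a parameter subspace fitted to the 618 measured fabrics; the sketch only shows the regression shape of the problem.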

Neural Cloth Simulation

Hugo Bertiche, Meysam Madadi, Sergio Escalera

We present a general framework for the garment animation problem through unsupervised deep learning inspired by physics-based simulation. Existing trends in the literature already explore this possibility; nonetheless, those approaches do not handle cloth dynamics. Here, we propose the first methodology able to learn realistic cloth dynamics in an unsupervised manner, and hence a general formulation for neural cloth simulation. The key to achieving this is to adapt an existing optimization scheme for motion from simulation-based methodologies to deep learning. Then, analyzing the nature of the problem, we devise an architecture that automatically disentangles static and dynamic cloth subspaces by design, and we show how this improves model performance. Additionally, this opens the possibility of a novel motion-augmentation technique that greatly improves generalization. Finally, we show that it also allows control over the level of motion in the predictions, a useful tool for artists that has not been available before. We provide a detailed analysis of the problem to establish the foundations of neural cloth simulation and to guide future research into the specifics of this domain.
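The core of the unsupervised formulation above is that a predicted cloth state is scored by a physics-based energy rather than by ground-truth simulation data. The mass-spring energy below is a stand-in assumption to illustrate that idea; the paper adapts a full simulation objective, including inertial terms that make the learned dynamics realistic.

```python
import numpy as np

def cloth_energy(x, edges, rest_len, k=50.0, mass=0.01, g=9.81):
    """Physics-based loss for one predicted cloth state x of shape (V, 3).

    Elastic spring energy plus gravitational potential; minimizing this
    over network predictions requires no ground-truth vertex positions.
    """
    d = np.linalg.norm(x[edges[:, 0]] - x[edges[:, 1]], axis=1)
    elastic = 0.5 * k * np.sum((d - rest_len) ** 2)   # spring energy
    gravity = mass * g * np.sum(x[:, 2])              # potential energy
    return elastic + gravity

# Tiny 2x2 cloth patch hanging from its top edge (illustrative setup).
verts = np.array([[0, 0, 1], [1, 0, 1],
                  [0, 0, 0], [1, 0, 0]], dtype=float)
edges = np.array([[0, 1], [2, 3], [0, 2], [1, 3]])
rest = np.ones(len(edges))

e_rest = cloth_energy(verts, edges, rest)

stretched = verts.copy()
stretched[2:, 2] -= 0.5          # pull the two hanging vertices down
e_stretched = cloth_energy(stretched, edges, rest)

print(e_rest, e_stretched)       # the overstretched state scores worse
```

A network trained to minimize such an energy over many body poses learns plausible drapes without any simulated supervision, which is the trend the abstract extends to dynamics.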

Motion Guided Deep Dynamic 3D Garments

Meng Zhang, Duygu Ceylan, Niloy J. Mitra

Realistic dynamic garments on animated characters have many AR/VR applications. While authoring such dynamic garment geometry is still a challenging task, data-driven simulation provides an attractive alternative, especially if it can be controlled simply by the motion of the underlying character. In this work, we focus on motion-guided dynamic 3D garments, especially loose garments. In a data-driven setup, we first learn a generative space of plausible garment geometries. We then learn a mapping into this space that captures the motion-dependent dynamic deformations, conditioned on the previous state of the garment as well as its relative position with respect to the underlying body. Technically, we model garment dynamics, driven by the input character motion, by predicting per-frame local displacements in a canonical state of the garment, which is enriched with frame-dependent skinning weights to bring the garment into global space. We resolve any remaining per-frame collisions by predicting residual local displacements. The resulting garment geometry is used as history to enable iterative roll-out prediction. We demonstrate plausible generalization to unseen body shapes and motion inputs, and show improvements over multiple state-of-the-art alternatives.
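The deformation model in the abstract, local displacements applied in a canonical garment state and then skinned into global space with per-vertex weights, can be sketched as plain linear blend skinning. The two-bone setup, the identity displacements, and all numbers here are assumptions for illustration; in the paper both the displacements and the skinning weights are network predictions.

```python
import numpy as np

def skin(canonical, displacements, weights, bone_mats):
    """Deform in canonical space, then blend-skin into global space.

    canonical:     (V, 3) canonical garment vertices
    displacements: (V, 3) predicted per-vertex local displacements
    weights:       (V, B) per-vertex skinning weights (rows sum to 1)
    bone_mats:     (B, 4, 4) bone transforms to global space
    """
    x = canonical + displacements                            # canonical-space offset
    xh = np.concatenate([x, np.ones((len(x), 1))], axis=1)   # homogeneous coords
    blended = np.einsum('vb,bij->vij', weights, bone_mats)   # per-vertex transform
    return np.einsum('vij,vj->vi', blended, xh)[:, :3]

# Two vertices, two bones; the second bone is translated along x.
verts = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
disp = np.zeros_like(verts)                 # network output, zero here
w = np.array([[1.0, 0.0], [0.0, 1.0]])      # hard assignment for clarity
bones = np.stack([np.eye(4), np.eye(4)])
bones[1, 0, 3] = 0.5

out = skin(verts, disp, w, bones)
print(out)   # first vertex unchanged, second follows its bone
```

Feeding the resulting geometry back in as the previous garment state gives the iterative roll-out prediction the abstract describes.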

Mixed Variational Finite Elements for Implicit Simulation of Deformables

Ty Trusty, Danny M. Kaufman, David I. W. Levin

We propose and explore a new method for the implicit time integration of elastica. Key to our approach is the use of a mixed variational principle. In turn, its finite element discretization leads to an efficient and accurate sequential quadratic programming solver with a superset of the desirable properties of many previous integration strategies. This framework fits a range of elastic constitutive models and remains stable across a wide span of time step sizes and material parameters (including problems that are approximately rigid). Our method exhibits convergence on par with full Newton type solvers and also generates visually plausible results in just a few iterations comparable to recent fast simulation methods that do not converge. These properties make it suitable for both offline accurate simulation and performant applications with expressive physics. We demonstrate the efficacy of our approach on a number of simulated examples.
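The stability claim above, remaining well behaved across a wide span of time steps and material stiffnesses, is the defining property of implicit integration. The sketch below reduces it to a single very stiff spring stepped with implicit Euler and a per-step Newton solve; the paper's mixed variational finite element discretization and SQP solver generalize this idea to full elastic models, and the 1-DOF setup and constants here are illustrative assumptions.

```python
import numpy as np

m, k, dt = 1.0, 1e6, 0.1   # mass, very stiff spring, large time step

def step(x, v):
    """One implicit Euler step for m x'' = -k x.

    Solves the residual g(xn) = m (xn - x - dt v) + dt^2 k xn = 0
    with Newton's method (one iteration suffices here, since g is linear).
    """
    xn = x
    for _ in range(20):
        f = m * (xn - x - dt * v) + dt**2 * k * xn   # residual g(xn)
        df = m + dt**2 * k                           # derivative dg/dxn
        xn -= f / df
        if abs(f) < 1e-12:
            break
    vn = (xn - x) / dt
    return xn, vn

x, v = 1.0, 0.0
for _ in range(100):
    x, v = step(x, v)
print(x)   # bounded: explicit Euler would explode at this dt and stiffness
```

Explicit Euler at this time step and stiffness diverges immediately (its amplification factor exceeds 1), whereas the implicit update stays bounded, which is why implicit schemes like the one proposed can take large steps on approximately rigid materials.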
