IGL-HKUST/UNIC

UNIC: Neural Garment Deformation Field for Real-time Clothed Character Animation

Chengfeng Zhao¹, Junbo Qi², Yulou Liu¹, Zhiyang Dou³, Minchen Li⁴,
Taku Komura⁵, Ziwei Liu⁶, Wenping Wang⁷, Yuan Liu¹†

¹HKUST    ²Waseda    ³MIT    ⁴CMU    ⁵HKU    ⁶NTU    ⁷TAMU
†Corresponding author

arXiv · Project Page

🚀 Getting Started

1. Environment Setup

We tested our environment on Ubuntu 20.04 LTS with CUDA 12.1, gcc 9.4.0, and g++ 9.4.0.

conda create python=3.10 --name unic
conda activate unic

pip install torch==2.5.0 torchvision==0.20.0 torchaudio==2.5.0 --index-url https://download.pytorch.org/whl/cu121
pip install -r requirements.txt

conda install -c fvcore -c iopath -c conda-forge fvcore iopath
pip install "git+https://github.com/facebookresearch/pytorch3d.git@stable"

git clone https://github.com/unlimblue/KNN_CUDA.git
cd KNN_CUDA
make && make install
cd ..

2. Run Demo

python -m test --cfg configs/config_train_unic_jk.yaml

🔬 Training

1. Data Preparation

Overall, the data should be organized as follows:

<your_data_root>/unity_smpl/
|-- hanfu_dress
|   |-- sequence_*
|   |   |-- deformation
|   |   |   `-- s_*.obj
|   |   |-- motion
|   |   |   `-- s_*.obj
|   |   `-- animation.fbx
|-- jk_dress
|   |-- sequence_*
|   |   |-- deformation
|   |   |   `-- s_*.obj
|   |   |-- motion
|   |   |   `-- s_*.obj
|   |   `-- animation.fbx
|-- princess_dress
|   |-- sequence_*
|   |   |-- deformation
|   |   |   `-- s_*.obj
|   |   |-- motion
|   |   |   `-- s_*.obj
|   |   `-- animation.fbx
|-- tshirt
|   |-- sequence_*
|   |   |-- deformation
|   |   |   `-- s_*.obj
|   |   |-- motion
|   |   |   `-- s_*.obj
|   |   `-- animation.fbx
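
Before pre-processing, it can help to verify that your data root actually matches the tree above. The following sketch is a hypothetical helper (not part of the repo; the function names are ours, and only the garment/folder names are taken from the layout shown) that walks the expected structure and reports anything missing:

```python
from pathlib import Path

# Hypothetical sanity check (not part of the repo): verifies that a data
# root follows the unity_smpl layout described above before pre-processing.
GARMENTS = ["hanfu_dress", "jk_dress", "princess_dress", "tshirt"]

def check_sequence_dir(seq_dir):
    """Return a list of problems found in one sequence_* folder."""
    problems = []
    for sub in ("deformation", "motion"):
        sub_dir = seq_dir / sub
        if not sub_dir.is_dir():
            problems.append(f"missing folder: {sub_dir}")
        elif not list(sub_dir.glob("s_*.obj")):
            problems.append(f"no s_*.obj files in: {sub_dir}")
    if not (seq_dir / "animation.fbx").is_file():
        problems.append(f"missing file: {seq_dir / 'animation.fbx'}")
    return problems

def check_data_root(data_root):
    """Map each garment name to the problems found under <data_root>/unity_smpl/."""
    root = Path(data_root) / "unity_smpl"
    report = {}
    for garment in GARMENTS:
        garment_dir = root / garment
        if not garment_dir.is_dir():
            report[garment] = [f"missing folder: {garment_dir}"]
            continue
        seqs = sorted(garment_dir.glob("sequence_*"))
        if not seqs:
            report[garment] = [f"no sequence_* folders in: {garment_dir}"]
            continue
        report[garment] = [p for s in seqs for p in check_sequence_dir(s)]
    return report
```

An empty list for every garment means the layout is complete; any string in the report points at the exact folder or file to fix.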

2. Data Pre-processing

First, install the FBX Python SDK:

wget https://damassets.autodesk.net/content/dam/autodesk/www/files/fbx202037_fbxpythonsdk_linux.tar.gz
tar -xzf fbx202037_fbxpythonsdk_linux.tar.gz --no-same-owner

mkdir <your_path_for_fbx>
chmod 777 fbx202037_fbxpythonsdk_linux
./fbx202037_fbxpythonsdk_linux <your_path_for_fbx>

cd <your_path_for_fbx>
conda activate unic
python -m pip install fbx-2020.3.7-cp310-cp310-manylinux1_x86_64.whl

Tip

If you encounter the error libc.so.6: version GLIBC_2.28 not found when importing fbx, try adding the line deb http://security.debian.org/debian-security buster/updates main to your /etc/apt/sources.list, then run the following commands:

sudo apt update

# if NO_PUBKEY error occurs, run the commented command:
# sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 112695A0E562B32A 54404762BBB6E853
# sudo apt update

sudo apt list --upgradable
sudo apt install libc6-dev libc6
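
Before editing apt sources, you can check the local glibc version directly from Python. A minimal sketch, assuming only the standard library (`version_geq` and `glibc_at_least` are hypothetical helper names, not repo functions):

```python
import platform
import re

# Hypothetical helpers (not from the repo): check whether the local glibc
# meets the 2.28 minimum that the FBX wheel appears to require.
def version_geq(a, b):
    """Numeric comparison of dotted version strings, e.g. '2.31' >= '2.28'."""
    to_tuple = lambda v: tuple(int(x) for x in re.findall(r"\d+", v))
    return to_tuple(a) >= to_tuple(b)

def glibc_at_least(required="2.28"):
    """True/False when glibc is detected; None on non-glibc systems (e.g. macOS, musl)."""
    libc, version = platform.libc_ver()
    if libc != "glibc" or not version:
        return None
    return version_geq(version, required)
```

If this returns True, the GLIBC error likely has another cause; if False, upgrading libc6 as above (or using a newer base image) is the way forward.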

Second, run the pre-processing code:

python -m data.preprocess.unic

The pre-processed data will be saved in the <your_data_path>/pre_processed/ folder.

3. Train UNIC

Note

We tested our training code on NVIDIA RTX 3090 and NVIDIA RTX 2080Ti GPUs.
Set USE_DDP = False in train.py to disable DDP (Distributed Data Parallel) during training if needed.

As an example, run the following command to train a deformation field for the jk dress:

python -m train --cfg configs/unic_jk_dress.yaml --nodebug

Checkpoints will be saved in the checkpoints/ folder. In our experiments, we use epoch300.pth for all comparisons, evaluations, and presentations.
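
If your training run stops at a different epoch, a small helper can pick the checkpoint closest to the one used in the paper. A sketch assuming checkpoints are named epoch<N>.pth as above (`find_checkpoint` is a hypothetical name, not a repo function):

```python
import re
from pathlib import Path

# Hypothetical helper (not a repo function): picks a checkpoint file named
# epoch<N>.pth from a folder, preferring the requested epoch (300 here,
# matching the paper's evaluations) and falling back to the latest one.
def find_checkpoint(ckpt_dir, epoch=300):
    by_epoch = {}
    for p in Path(ckpt_dir).glob("epoch*.pth"):
        m = re.fullmatch(r"epoch(\d+)\.pth", p.name)
        if m:
            by_epoch[int(m.group(1))] = p
    if not by_epoch:
        return None  # no checkpoints found
    return by_epoch.get(epoch, by_epoch[max(by_epoch)])
```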

Acknowledgments

Thanks to the following works that we refer to and benefit from:

Licenses

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

About

Official repository of paper "UNIC: Neural Garment Deformation Field for Real-time Clothed Character Animation".
