FabricDiffusion – SIGGRAPH Asia 2024

Visual Content Production | Unreal Engine 5

humansensinglab.github.io/fabric-diffusion/

About the Project

FabricDiffusion introduces a method for transferring fabric textures from a single clothing image to 3D garments of arbitrary shapes. Leveraging a denoising diffusion model trained on a large-scale synthetic dataset, the system generates distortion-free, tileable textures that integrate seamlessly into Physically-Based Rendering (PBR) pipelines.

The method preserves fine texture details even under occlusions, distortions, and complex poses, enabling realistic relighting and visualization of garments under diverse lighting conditions.
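
As a rough illustration of what "tileable PBR textures" means in practice, the sketch below checks the wrap-around seams of a set of generated texture maps and groups them into a single material description. This is not the authors' code; the map names, file paths, and seam metric are assumptions for illustration only.

```python
# Illustrative sketch only -- not the FabricDiffusion codebase.
# Assumes the model has already produced a set of tileable PBR maps
# (albedo, normal, roughness) as PNG files; names/paths are hypothetical.
import numpy as np
from PIL import Image


def seam_error(texture: np.ndarray) -> float:
    """Mean absolute difference across the wrap-around seams.

    A low value suggests the texture tiles without visible borders.
    """
    tex = texture.astype(np.float32) / 255.0
    horiz = np.abs(tex[:, 0] - tex[:, -1]).mean()  # left edge vs. right edge
    vert = np.abs(tex[0, :] - tex[-1, :]).mean()   # top edge vs. bottom edge
    return float((horiz + vert) / 2.0)


def load_pbr_material(prefix: str) -> dict:
    """Group the generated maps into one PBR material description."""
    maps = {}
    for name in ("albedo", "normal", "roughness"):
        img = np.array(Image.open(f"{prefix}_{name}.png"))
        maps[name] = img
        print(f"{name}: seam error = {seam_error(img):.4f}")
    return maps


if __name__ == "__main__":
    # Hypothetical output path for a single garment's texture set.
    material = load_pbr_material("outputs/garment_042")
```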

For more technical details, visit the official project page:
https://humansensinglab.github.io/fabric-diffusion/

Project Video

My Role

I contributed to this project by creating and optimizing all of the visual content used in the SIGGRAPH Asia 2024 publication and the project website. My specific responsibilities included: