Project Profile

Goals of the project

Reverberating captures a designer’s hand gestures to sculpt expressive, wearable forms around the human body. Its intuitive, interactive interface combines 3D scanning, 3D modeling, and 3D printing so that non-expert users can quickly craft intricate, ergonomic, ready-to-print designs.

Nature of the Collaboration



Programming (for interactive 3D modeling) and Fabrication (3D printing)


  • Programming: Java using Processing, toxiclibs, and simplekinect
  • Fabrication: SLS printing in nylon and elastomer
  • Computer vision: Microsoft Kinect


The project started as a very technical pursuit: how to embed fabrication constraints into a deformable 3D model. A bug in the original software led to a lovely aesthetic discovery that was never intended to be within the scope of the project: I forgot to clear an array, so multiple 3D modules were extruded through time. Finally, I integrated 3D scanning so I could easily make something I knew would fit around my body (and avoid expensive printing errors from mis-scaling a design).
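The happy accident above can be sketched in plain Java. This is an illustrative reconstruction, not the original Processing code: the class name, method names, and the use of a simple list of gesture labels are all assumptions. It only shows the mechanism, that skipping a per-frame `clear()` makes extruded modules accumulate across frames instead of being replaced.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the array-clearing bug: each "frame", the current
// hand-gesture module is extruded into the scene. Clearing the list first
// yields one module per frame; forgetting to clear it (the bug) layers
// modules through time into a single accumulated form.
public class ExtrusionBug {
    static List<String> modules = new ArrayList<>();

    // Returns how many modules are in the scene after this frame.
    static int extrudeFrame(String gesture, boolean clearEachFrame) {
        if (clearEachFrame) {
            modules.clear();      // intended behavior: replace last frame's module
        }
        modules.add(gesture);     // extrude the module for the current gesture
        return modules.size();
    }

    public static void main(String[] args) {
        // Intended: list cleared each frame, so one module remains on screen.
        for (int frame = 0; frame < 3; frame++) {
            extrudeFrame("gesture-" + frame, true);
        }
        System.out.println(modules.size()); // prints 1

        // The bug: list never cleared, so modules pile up across frames.
        modules.clear();
        for (int frame = 0; frame < 3; frame++) {
            extrudeFrame("gesture-" + frame, false);
        }
        System.out.println(modules.size()); // prints 3
    }
}
```

In the real sketch the accumulated geometry is what produced the layered, reverberating aesthetic the project is named for.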


Some initial proof-of-concept prints tested whether my 3D modeling-to-printing workflow actually worked.

Later, I tested designing and printing at different scales: a smaller scale around the wrist, and a larger scale for part of a building façade.

Lastly, I found an intermediate scale to print around the bust.

Challenges encountered

This entire project developed from a ‘bug’ in my original software. Being open to discovery and to input from my software tool helped me extend my imagination and creativity beyond their current limits. Just as a sculptor is inspired by and responds to wood or stone, we can respond to code as a material.

Major outcomes

Featured in a 3D printing fashion show; a conference paper; coverage on numerous tech, art, and fashion blogs; the cornerstone of my PhD research.

Innovations, impact and successes

This project shows how layers of technical knowledge (in 3D modeling, fabrication, and ergonomic design) can be embedded into a software interface. As a result, experts and non-experts alike can intuitively use their hand gestures to 3D model complex forms around physical contexts. These designs can be 3D printed immediately and fit back onto the body.