Research

Multimodal Tactile Perception for Contact-Rich Manipulation

I design tactile sensing systems that combine optical deformation sensing, vibration, and acoustic cues to infer contact state, interaction events, and manipulation outcomes. The sensors are built to be compact and deployable while still providing rich signals for learning-based inference.


Selected Projects: VisTac, VibTac, TacScope

VisTac overview

VisTac

Controllable-transparency elastomer enabling unified tactile imaging and through-sensor vision.

Watch on YouTube

VibTac setup

VibTac

High-bandwidth tactile perception using synchronized optical deformation, vibration, and acoustic signals.

Watch on YouTube

TacScope prototype

TacScope

Minimally invasive tactile probing for subsurface geometry reconstruction in soft media.

Watch on YouTube


Bio-inspired and Compliant Interfaces for Embodied Sensing

I explore bio-inspired and compliant robotic interfaces that couple mechanical morphology with sensing, reducing reliance on precise models and enabling robust interaction in uncertain environments. A central theme is whisker- and fiber-inspired sensing of contact and flow.

Selected Project: FibTac

Watch on YouTube


Learning-Enabled Sensor Fusion and Robust Inference

Across these platforms, I develop learning-based methods that fuse multimodal signals to improve robustness under occlusion, noise, and partial observability.
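As a rough illustration of the kind of fusion these methods build on, the sketch below shows a minimal late-fusion baseline in PyTorch that maps a tactile image, a vibration window, and an acoustic window to a contact-state prediction. It is a simplified stand-in rather than the architecture used in any of the projects above; the branch designs, window lengths, sampling assumptions, and class count are all placeholders.

```python
# Illustrative only: a minimal late-fusion baseline, not the method used in the
# projects above. Architecture choices, input shapes, and class counts are placeholders.
import torch
import torch.nn as nn

class LateFusionContactNet(nn.Module):
    """Fuses a tactile image, a vibration window, and an acoustic window
    into a single contact-state prediction via feature concatenation."""
    def __init__(self, n_classes: int = 4):
        super().__init__()
        # Tactile image branch: (B, 3, 64, 64) -> 64-d feature
        self.img = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 64),
        )
        # Vibration branch: (B, 1, 2048) accelerometer window -> 64-d feature
        self.vib = nn.Sequential(
            nn.Conv1d(1, 16, 9, stride=4, padding=4), nn.ReLU(),
            nn.Conv1d(16, 32, 9, stride=4, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, 64),
        )
        # Acoustic branch: same 1-D structure applied to a microphone window
        self.aud = nn.Sequential(
            nn.Conv1d(1, 16, 9, stride=4, padding=4), nn.ReLU(),
            nn.Conv1d(16, 32, 9, stride=4, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, 64),
        )
        # Fusion head: concatenated per-modality features -> contact-state logits
        self.head = nn.Sequential(nn.Linear(3 * 64, 64), nn.ReLU(), nn.Linear(64, n_classes))

    def forward(self, img, vib, aud):
        z = torch.cat([self.img(img), self.vib(vib), self.aud(aud)], dim=1)
        return self.head(z)

# Smoke test with random tensors standing in for synchronized sensor windows.
net = LateFusionContactNet()
logits = net(torch.randn(2, 3, 64, 64), torch.randn(2, 1, 2048), torch.randn(2, 1, 2048))
print(logits.shape)  # -> torch.Size([2, 4])
```

Because each modality is encoded separately before fusion, a branch whose input is occluded or noisy can be dropped or down-weighted at inference time, which is one simple route to the robustness described above.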