Research
Multimodal Tactile Perception for Contact-Rich Manipulation
I design tactile sensing systems that combine optical deformation, vibration, and acoustic cues to infer contact state, interaction events, and manipulation outcomes. The resulting sensors are compact and deployable while still providing rich signals for learning-based inference.
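To give a flavor of what each modality contributes, the sketch below computes one crude feature per channel: a deformation-change cue from consecutive tactile images, vibration energy, and an acoustic spectral centroid. It is a minimal NumPy illustration; the function name, arguments, and sampling rate are assumptions for this sketch, not the interface of any project listed below.

```python
import numpy as np

def contact_features(tactile_prev, tactile_curr, vib_window, audio_window, fs_audio=16_000):
    """One crude feature per modality for contact-event detection.

    tactile_prev, tactile_curr: consecutive grayscale frames from an
        optical tactile sensor (2D arrays).
    vib_window:  1D accelerometer samples over a short window.
    audio_window: 1D microphone samples over the same window.
    Names and the sampling rate are illustrative, not any sensor's API.
    """
    # Optical: mean absolute frame difference as a deformation-change cue.
    deformation = np.mean(
        np.abs(tactile_curr.astype(float) - tactile_prev.astype(float))
    )

    # Vibration: RMS energy, a standard impact/slip indicator.
    vib_rms = np.sqrt(np.mean(np.square(vib_window)))

    # Acoustic: spectral centroid, which tends to shift upward on sharp contacts.
    spectrum = np.abs(np.fft.rfft(audio_window))
    freqs = np.fft.rfftfreq(len(audio_window), d=1.0 / fs_audio)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)

    # Stack into a feature vector for a downstream event classifier.
    return np.array([deformation, vib_rms, centroid])
```

In a learned pipeline, such hand-crafted features would be replaced by per-modality embeddings, which is where the fusion work in the last theme comes in.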
Selected Projects: VisTac, VibTac, TacScope
Bio-inspired and Compliant Interfaces for Embodied Sensing
I explore bio-inspired and compliant robotic interfaces that couple mechanical morphology with sensing, reducing reliance on precise models and enabling robust interaction in uncertain environments. A central theme is whisker- and fiber-inspired structures for contact and flow sensing.
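One concrete example of how morphology can simplify inference: under a small-deflection cantilever model with a single point contact, the bending moment at the whisker base is M = F·d, so the contact distance follows directly from two measurements at the root. The sketch below assumes this idealized model; real whiskers are tapered and flexible, and the function name is hypothetical.

```python
def whisker_contact_distance(base_moment, base_force, eps=1e-9):
    """Distance d from the whisker base to a point contact.

    Small-deflection cantilever with one point load F at distance d:
    base bending moment M = F * d, hence d = M / F.
    base_moment: bending moment at the root [N*m]
    base_force:  transverse force at the root [N]
    """
    return base_moment / (base_force + eps)

# Example: 2e-4 N*m of moment with 1e-2 N of force -> contact at ~0.02 m.
d = whisker_contact_distance(2.0e-4, 1.0e-2)
```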
Selected Project: FibTac
Learning-Enabled Sensor Fusion and Robust Inference
Across these platforms, I develop learning-based methods that fuse multimodal signals to improve robustness under occlusion, noise, and partial observability.
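As a minimal sketch of one standard recipe for this kind of robustness, consider late fusion with modality dropout: each modality is encoded separately, and whole embeddings are randomly zeroed during training so the fused head cannot over-rely on any single channel. The PyTorch model below is illustrative; the architecture, dimensions, and dropout rate are placeholders, not the actual models used in the projects above.

```python
import torch
import torch.nn as nn

class LateFusionNet(nn.Module):
    """Late fusion with modality dropout (illustrative placeholder model)."""

    def __init__(self, dims=(64, 32, 32), hidden=128, n_classes=4, p_drop=0.3):
        super().__init__()
        # One small encoder per modality; dims are placeholder feature sizes.
        self.encoders = nn.ModuleList(
            [nn.Sequential(nn.Linear(d, hidden), nn.ReLU()) for d in dims]
        )
        self.p_drop = p_drop
        self.head = nn.Sequential(
            nn.Linear(hidden * len(dims), hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, inputs):
        # inputs: list of per-modality tensors, each of shape (batch, d_i).
        feats = [enc(x) for enc, x in zip(self.encoders, inputs)]
        if self.training:
            # Modality dropout: zero entire embeddings at random, which
            # simulates occlusion or a failed sensor channel.
            feats = [
                f * (torch.rand(f.shape[0], 1, device=f.device) > self.p_drop).float()
                for f in feats
            ]
        return self.head(torch.cat(feats, dim=-1))
```

In practice, the linear encoders would be replaced by convolutional encoders over tactile images and spectrogram encoders over vibration and audio, but the dropout-at-the-embedding-level idea carries over unchanged.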


