I am passionate about bridging the gap between human-like perception and robotic capabilities. I am currently exploring how this can be done in the aviation domain by developing new features for an autonomous pilot.
• Enhancement of perception algorithms to estimate object states through the integration of vision and tactile sensor data.
• Development of these algorithms in a simulated environment and ensuring their successful transfer and performance on real-world advanced robotic platforms with multi-fingered hands (Sim2Real).
• Exploration of temporal approaches to improve the stability of the estimates over time.
Master’s Capstone: Applications of the Soft Bubble Grippers: Visuotactile Sensing for Visuomotor Policy Learning
• Redesigned 3D-fabricated parts for compatibility with the UR5e robot and Robotiq grippers.
• Wrote a ROS package to interface with Intel RealSense cameras and implement pose and shear-deformation estimation functionalities.
• Trained and evaluated diffusion- and transformer-based policy learning models on teleoperated demonstrations of various manipulation tasks, including Push-T and object handover/placement.
Other research:
• Developed a Sim2Real framework to classify objects and estimate poses from GelSight tactile data.
• Evaluated the effect of object-centric representations on a diffusion-based policy learning model.
• Project 1: Configured an MQTT connection from a soft robotic compression garment's Wi-Fi microcontroller to a central hub server through Docker. Collaborated with an interdisciplinary team of computer science faculty and students. Integrated a conversational voice assistant to provide real-time, in-home, unobtrusive sensing and on-body stimulation (e.g., pressure, heat). Participated in a formal interview about my contribution and in a demonstration of the controllability and functionality of the technology.
• Project 2: Constructed a remotely controllable soft robotic compression garment to assist novice meditators. Assembled and soldered hardware for a Bluetooth and MOSFET-driven actuator circuit. Programmed an Arduino in C/C++ to control the system. Developed a user interface in Processing (Java) to visualize which body areas stimuli were presented to. Tested the garment system through a guided-meditation user study (n = 10).
• Project 3: Assisted with data collection and system testing for a hybrid shape memory alloy (SMA)-pneumatic wearable upper-limb exoskeleton. Recorded power consumption, flexion angle, and flexion time for a mannequin arm with various SMA layouts.
• Project 4: Designing a neural-network-based controller for SMA-driven model arm flexion/extension using MATLAB and Simulink. Comparing the performance of various data-driven, model-free controllers on the system.