Optimus & Autopilot
2024 — 2026
Palo Alto, California, United States
Led AI visualizations. Built a 3D visualization library with navigable temporal scenes, primitives, multi-bot views, and 4D Gaussian splats. Thousands of visualizations have been made to date, and it is the primary visualization tool for the AI team.
Led QA tooling. Revamped testing workflows with automated video search, walking metrics, test playlists, and aggregated dashboards - reducing friction for all code deployments.
Solely built full-stack metrics for Optimus Mission Control - data ingestion, deployment tracking, and fleet utilization - with core dashboards across navigation, calibration, connectivity, locomotion, and more. Featured in staff meetings and executive reviews.
Designed real-time connectivity dashboards critical to the Optimus showcase at the 10/10 Event, used by 50+ engineers during the event. Enabled network debugging and Wi-Fi-to-cell fallback planning amid network congestion from 2,500+ attendees.
Led software tooling for 2025 GFTX Shareholder Event, creating unified live dashboards (health, battery levels, connectivity, ML policy) with streaming security lockdowns.
Engineered geospatial telemetry pipeline with intuitive UI to visualize bot location, pose, and connectivity over time. Reduced multi-bot deadlock interventions by 80%.
Led Optimus data collection mission control and mobile app development, supporting world model development.
Solely built the data ingestion pipeline for thousands of speech sessions recorded for VLA/VLM model improvement, in collaboration with xAI's Grok team.
Created org-wide alerts channel for bot incidents, interventions, and service outages - enabling rapid triage at scale.
Led the Optimus intervention queue project, analogous to Robotaxi interventions, scaling to thousands of bots.
Created a tactile sensor visualizer for robot hands and data collection.
Built a CI robot interface - enabling engineers to view health issues, send commands, and automate setup for automated testing.
2023
Visualized robot data such as quaternions, positions, and joint states, and enhanced end-to-end manipulation efforts as seen in the Optimus Investor Day video. A key feature I built lets ML researchers and engineers compare and visualize various signals in vector space and perform centralized comparisons.
Accelerated AI training from human demonstrations, built tools for saving and loading user-based debugging templates, and facilitated real-time code execution for data logging and command dispatch.
Implemented features for visualizing robot projections on live camera feeds (as seen in AI Day), monitoring health signals, and providing alerts related to the Electrical Control Unit (ECU).
Took ownership of a testing workflow tool that streamlined test creation and execution. Also contributed to rendering objects in vector space, displaying running tasks and services on the ECU, and surfacing logs to users effectively.
Worked on tooling to test, train, and analyze foundation models for both autonomous driving and humanoid robots. Enhancements included in-depth infra debugging logs and automated calculation of daily key metrics presented to executives for high-level engineering decisions.
Worked on robot joint calibration and signal recording features, enabled the application to send multiple commands to robots simultaneously, fetched Controller Area Network (CAN) signals, and displayed user-facing alerts based on signal values and calculations.
Design and implement large-scale data processing pipelines that handle a diverse set of Autopilot related data such as images, sensor inputs, and human labels.
Design and implement tools, tests, metrics, and dashboards to accelerate the development cycle of our model training.
Work closely with frontend engineers to integrate seamlessly with backend systems.
Work closely with AI researchers to support evolving research projects and implement new production features.
Designed and developed a scalable crane driver using Golang that enabled seamless communication with hundreds of Giga Berlin cranes to perform continuous pick-and-place operations and read status updates for enhanced efficiency and productivity
Developed full-stack functionality using React to control cranes based on type, predetermined locations, and custom user input, mapped to a relational database of valid inputs for Automated Storage and Retrieval Systems.
Optimized the robot fleet manager's track creator using React, reducing load times and improving response times by 25x.
Designed and built custom fleet manager functionality to load and unload custom payloads at selected waypoints, with dynamic UI visualization of payload contours, scaling to 100+ Autonomous Guided Vehicles (AGVs) in Giga Shanghai.
Created a Python server to simulate handshakes for hundreds of HMIs, cranes, and other automated factory equipment.
Placed 3rd internationally against colleges such as Vanderbilt, UPenn, Tufts, UCSD, and Edinburgh.
Education
University of Waterloo
Bachelor of Applied Science - BASc
Iroquois Ridge High School