New York, New York, United States
• Led the Tabular Foundation Model (IngesTables) project and drove its adoption across multiple product areas at Google.
• Designed, implemented, and optimized a few-shot object-detection pipeline, reducing a tedious, error-prone week-long workflow to a single 7-hour step.
• Designed and implemented "partial run" capability in TFX, enabling teams to reuse artifacts and rapidly prototype their ML pipelines in a production environment, replacing the previous caching-based solution.
• Co-organized the NeurIPS 2020 competition "Predicting Generalization in Deep Learning," a CodaLab competition that evaluated participants' code submissions. Minimized the operational toil of running the competition by autoscaling GPU nodes using Kubernetes on Google Kubernetes Engine with Docker.
• Generalized the AdaNet algorithm by using DNN generalization-gap predictors (in addition to Rademacher complexity) to drive neural architecture search.
• Derived a tighter upper bound on the Rademacher complexity of CNNs with skip connections.
• Designed and implemented features in AdaNet, a TensorFlow library for neural architecture search. Integrated AdaNet into Google products.