· Engineered solutions in a Python/Django/PostgreSQL back-end environment, including API endpoints, integrations, Airflow tasks, Django management commands, and admin features.
· Integrated live data feeds (MDS, GBFS, and custom APIs) into Zoba’s systems.
· Communicated data requirements to customers and helped guide their engineering work.
· Validated customers’ data feeds, addressing issues or raising concerns with their teams.
· Developed solutions to streamline integrations and provide operations with additional internal tools.
· Gained experience in our TypeScript/React/Next.js front-end through pair programming.
· Contributed to the simulator and optimizer for Zoba’s expansion into delivery optimization.
· Balanced timely completion of roadmapped work with urgent requests from ops, sales, and customers.
ABOUT: Zoba is a decision automation platform for micromobility operators, optimizing the efficacy of their operations in 200+ cities. Zoba generates recommendations for vehicle deployments, pickups, rebalances, and battery swaps by simulating the effects of each intervention, finding the optimal set of operational tasks given fleet distribution, battery, weather, day, and time. Zoba then enables the efficient completion of these tasks by batching and sequencing them according to operational capacity.
I worked on the Advanced Data Analytics team within Asset Management, which is responsible for the procurement and scraping of hundreds of alternative datasets. My task was to design and implement an improved system for monitoring and detecting issues with the quality of these datasets.
· Designed a Python application to calculate daily data quality scores on 100+ datasets.
· Wrote in-depth documentation so the application could be scaled and launched across several teams.
· Launched a Tableau dashboard and email notification system for monitoring data quality.
· Saved 300+ hours yearly by automating existing data quality checks with Python and SQL.