Focused on the 0-to-1 development of high-stakes production systems. Currently responsible for the end-to-end technical lifecycle of a well-funded trading system, spanning data ingestion, transformation, and automated execution.
Architected and owned the end-to-end data infrastructure for a trading system, integrating Bloomberg and open-source APIs into high-availability ingestion pipelines. Engineered a modular backend to automate complex workflows and research cycles, ensuring seamless scalability and operational stability. Established a production-grade deployment environment via Docker and AWS to ensure high-integrity execution and system reliability. Focused on building high-ownership, 0-to-1 systems where technical autonomy is the baseline.
Redesigned the core trading platform, which monitors up to $90M in daily trades, adding early failure detection to prevent downtime. Built a machine learning strategy to benchmark internal RL research and contributed regularly to strategy reviews and paper discussions. Also developed analytics used directly in investor pitch decks to support fundraising.
Data for Good Scholar. Worked at the intersection of data science and human rights policy. Partnered with Rights CoLab and the ISSB to extract financially material ESG language using NLP and data science, in an effort to standardize human rights policies globally. Built custom web scrapers and scripting pipelines to collect and process financial disclosures from international sources.
Built interactive dashboards in Looker to monitor cloud spend and usage, surfacing key trends and inefficiencies. Helped identify ~$10K/month in potential savings and improved visibility across 50+ KPIs.
In partnership with Columbia Build Lab and Columbia Business School. Built a full data pipeline from scratch using NLP to extract insights from interview audio. Designed backend systems to process, store, and serve results, enabling real-time visualizations in Tableau.