Software engineer focused on building production AI agent systems for enterprise SaaS. I design multi-step agentic workflows that go beyond chatbots: systems that reason, act, and deliver measurable business outcomes at scale.
Experience
2022 — Present
San Francisco Bay Area
Lead architect for Highspot’s flagship AI platform capabilities, Deal Agent and Meeting Intelligence, powering real-time coaching and decision support for go-to-market teams.
Architected Deal Agent, a multi-turn agentic system that unifies CRM, buyer engagement, and meeting signals to deliver real-time deal coaching and next-best-action recommendations. Designed a two-phase agent architecture with token-level streaming and production-grade reliability for enterprise-scale usage.
Built Meeting Intelligence to extract structured insights from sales conversations and transform them into coaching signals at scale, with deep integrations across Gong, Microsoft Teams, and Zoom.
Owned framework and LLM strategy for agent systems. Evaluated and selected the core architecture stack, including Pydantic AI, and designed a multi-provider model infrastructure spanning OpenAI, Anthropic, and Google. Drove cost and performance optimization while maintaining production quality across high-volume workloads.
Designed scalable agent infrastructure with streaming pipelines, observability, and safeguards required for enterprise AI deployment.
Co-inventor on patent-pending AI systems for conversational and deal intelligence.
2019 — 2022
Kansas City, Missouri Area
Contributed to the architecture and development of a large-scale operational insight platform supporting data-intensive healthcare workflows.
Designed and deployed cloud infrastructure on AWS using CloudFormation, Lambda, Kinesis, ECS, Fargate, EC2, S3, Route53, CloudWatch, and IAM, enabling scalable and resilient production environments.
Developed distributed data pipelines using Apache Beam and Flink to structure and archive terabytes of operational data into S3 in consumable formats for downstream analytics.
Worked across the full software lifecycle from design to release, improving code quality and scalability through refactoring, design pattern adoption, and rigorous code reviews.
Built automation tooling and CI/CD pipelines to streamline developer workflows and operational processes, reducing manual overhead.
Implemented unit, integration, and performance testing strategies to ensure reliability of production systems under high load.
Collaborated with stakeholders to translate business requirements into technical solutions and participated in user requirement sessions.
Mentored new engineers through onboarding, code reviews, and continuous feedback, strengthening team development practices.
Operated and monitored cloud systems using Jenkins, Zabbix, Spinnaker, Grafana, and New Relic to maintain production stability.
2017 — 2018
Cincinnati Area, KY
Developed an interactive web-based bioinformatics platform in Python to mine and analyze large public and private biological datasets for research use.
Implemented machine learning models including regression, classification, and ensemble methods to support predictive analysis on biomedical data.
Optimized model performance through systematic hyperparameter tuning using grid search and evaluation pipelines.
Applied text mining and NLP techniques to extract gene-level insights from large research article corpora and structured text databases.
Built data ingestion pipelines to stream external datasets into local research systems, enabling continuous data availability for analysis.
Designed analytical algorithms to process biological signals and translate raw data into researcher-friendly knowledge outputs.
Collaborated with research stakeholders to translate scientific and operational requirements into data-driven solutions.
2016 — 2017
Cincinnati Area, KY
Conducted predictive analytics and statistical modeling using Python, R, MATLAB, and Tableau to identify trends in complex datasets.
Designed algorithms that reduced reliance on centralized data hubs by enabling distributed data collection and modeling workflows.
Evaluated multiple data mining projects and applied statistical techniques to generate actionable analytical insights.
Developed Hadoop MapReduce programs to process large-scale datasets and extract meaningful patterns from high-volume data.
Education
University of Cincinnati
Master's degree
University of Cincinnati
Data Science Certification
Vellore Institute of Technology