# Dhruv Mevada

> Senior Software Engineer

Location: Fremont, California, United States
Profile: https://flows.cv/dhruvmevada

Highly motivated backend software engineer with 10 years of experience in Java, Python, microservices, and REST API design, experienced in delivering end-to-end solutions. Proficient in object-oriented design and development and in working in fast-paced agile environments. Mentored junior engineers to overcome challenges and grow. Interested in solving large-scale problems using distributed systems, big data frameworks/tools, and databases.

## Work Experience

### Software Engineer @ Epic Games

Jan 2025 – Present

### Senior Backend Engineer @ Windfall

Jan 2025 – Jan 2025 | San Francisco, California, United States

Windfall is the platform for modern wealth intelligence. As part of the platform team, worked on various initiatives to improve data validation, monitoring, and ingestion of customer data.

• Implemented a match-validation job in Apache Spark to flag potentially bad records based on various heuristics. Scaled it to process over 12 million records, and built pre- and post-processing jobs in Java to handle input table creation and stream results to BigQuery and Postgres. This work improved the company's data quality, reducing customer churn by 18% and saving the company approximately $150,000.
• Implemented alerting functionality, in Kotlin, in Windfall's custom event-driven orchestration framework that runs data processing jobs for customers, reducing the number of support tickets filed by customers.
• Implemented a new Kotlin parser to convert customer CSV files, ranging from a few megabytes to multiple gigabytes, to JSONL (JSON Lines). Improved CSV parsing time using the Apache Commons CSV library and decreased the turnaround time for processing customer data.
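To illustrate the CSV-to-JSONL conversion above: the production parser was written in Kotlin on top of Apache Commons CSV, but the core transformation can be sketched with the Java standard library alone. All class and method names here are hypothetical, and the naive comma split is a simplification of what a real CSV parser handles.

```java
import java.util.ArrayList;
import java.util.List;

public class CsvToJsonl {
    // Convert one CSV row into a single JSON Lines record, pairing each
    // value with its column name from the header row.
    static String toJsonLine(String[] header, String[] row) {
        StringBuilder sb = new StringBuilder("{");
        for (int i = 0; i < header.length; i++) {
            if (i > 0) sb.append(",");
            sb.append('"').append(escape(header[i])).append("\":\"")
              .append(escape(i < row.length ? row[i] : "")).append('"');
        }
        return sb.append("}").toString();
    }

    // Minimal JSON string escaping (backslashes and quotes only).
    static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    // Naive splitter: a real parser (e.g. Apache Commons CSV) also handles
    // quoted fields, embedded commas, and multi-line records.
    static List<String> convert(List<String> csvLines) {
        String[] header = csvLines.get(0).split(",");
        List<String> out = new ArrayList<>();
        for (int i = 1; i < csvLines.size(); i++) {
            out.add(toJsonLine(header, csvLines.get(i).split(",")));
        }
        return out;
    }

    public static void main(String[] args) {
        for (String line : convert(List.of("name,city", "Ada,London", "Linus,Helsinki"))) {
            System.out.println(line); // one JSON object per line
        }
    }
}
```

Emitting one self-contained JSON object per line is what lets downstream jobs stream and split multi-gigabyte files without parsing the whole document at once.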
### Senior Software Engineer @ Activision Blizzard Media

Jan 2022 – Jan 2024 | San Francisco, California, United States

Contributed significantly to the Ads Monetization Platform's engineering team, implementing core features to improve scalability, availability, and performance. The platform serves real-time ads to 300M+ MAUs across ABM's game portfolio, processing 55,000+ requests/sec and generating $400M+ ARR.

• Developed a feature in the campaign manager, in Java, to group ad placements by name, enabling accurate tracking of ad performance by game and audience.
• Built a publish-subscribe system in Java using Google Pub/Sub, with MySQL-based distributed locking, to push real-time updates of ads configuration data from the campaign manager to the mediation service, reducing update latency from 2 hours to 30 seconds.
• Designed and implemented a scheduler job to upload incremental experiment view changes to Google Cloud Storage, improving update frequency from 24 hours to 15 minutes and enhancing downstream data consumption.
• Improved scalability and robustness by removing session affinity in the mediation service, enabling ad requests to be processed by any pod. Used Redis for user data storage and distributed locking, preventing overload on game servers.
• Separated the experiment and whitelist view generation jobs, allowing independent uploads to Google Cloud Storage and streamlining A/B testing and audience management.
• Spearheaded the creation of a Java library to externalize application properties, enabling updates without redeploying services, and conceptualized and established a central config store to decouple configuration from code.
• Developed a JavaScript generation framework to dynamically override ad behavior in games using an in-house query language (AQL).
• Refactored legacy Java code to optimize memory usage and streamline event delivery to the event service, improving system performance.
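The distributed locking mentioned above typically follows an acquire-with-token / compare-and-release pattern: each worker takes the lock with a unique token and may release it only if its token still matches, so a slow worker cannot delete a lock another worker has since re-acquired. A minimal sketch, with a `ConcurrentHashMap` standing in for Redis (all names are hypothetical, not the production API):

```java
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class SimpleLock {
    // Stand-in for the Redis keyspace; against real Redis this would be
    // SET key token NX PX <ttl>, with the TTL guarding against crashed owners.
    static final ConcurrentHashMap<String, String> store = new ConcurrentHashMap<>();

    // Try to take the lock; returns a token on success, null if already held.
    static String acquire(String key) {
        String token = UUID.randomUUID().toString();
        return store.putIfAbsent(key, token) == null ? token : null;
    }

    // Release only if we still own the lock (atomic compare-and-remove).
    static boolean release(String key, String token) {
        return store.remove(key, token);
    }

    public static void main(String[] args) {
        String t = acquire("user:42");
        System.out.println(t != null);                  // first caller gets the lock
        System.out.println(acquire("user:42") == null); // second caller is refused
        System.out.println(release("user:42", t));      // owner releases cleanly
    }
}
```

With real Redis, the compare-and-remove step must be atomic on the server side (commonly a small Lua script), since a plain GET-then-DEL from the client would race.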
### Senior Software Engineer @ Mapbox

Jan 2021 – Jan 2022 | San Francisco, California

Mapbox's Search POI (Points of Interest) team was responsible for building data pipelines that ingest maps data from various sources and load it into the data warehouse. The warehouse contains all of the information needed to power search, and Mapbox generates about $100 million in ARR.

• Spearheaded the development of the Foursquare v3 data ingestion pipeline, loading roughly 600GB of maps data into the data warehouse using Python/PySpark, Apache Airflow, and AWS. This pipeline boosted customer engagement with search results by 21%.
• Designed and implemented logic to deduplicate and conflate map data records from various vendors, including Foursquare, Safegraph, and OSM, resulting in a 12% decrease in errors in customer search results.
• Delivered high-quality maps data to downstream teams and created internal tools to streamline pipeline operations, significantly improving efficiency.
• Conducted interviews for 10+ candidates, contributing to team growth and talent acquisition.

### Software Engineer III @ Cisco

Jan 2019 – Jan 2021 | San Jose, California

Cisco's Ultra Cloud Core is a cloud-native solution that supports the 3GPP telecom standards for 5G. Among the many 3GPP standards and specifications is the definition of the Policy Control Function (PCF), one of the core 5G network functions, responsible for enforcing policy control and providing the appropriate Quality of Service (QoS). Worked as a Software Engineer on the PCF team.

• Developed key features for the PCF policy engine, enhancing service discovery, load balancing, and request processing to improve system efficiency and scalability.
• Resolved a critical scalability issue in production by identifying and fixing an OutOfMemory error caused by 100+ concurrent connections to the PCF REST endpoints, ensuring system stability.
• Designed and implemented a microservice in Golang to generate Yang, Render, and ConfigMap files for customers using Kubernetes, scaling to 35+ columns and 4,000+ rows. This microservice eliminated human error in complex 5G configuration deployments.
• Developed a Vue.js UI feature for exporting custom reference data stored in MongoDB, improving data accessibility for users.
• Engineered a critical feature in a multi-threaded network-cutter tool, distinguishing IPv4/v6 connections and identifying connection drops between server-side components (PCRF) and the User Data Cache (UDC). This feature enabled up to 50 peers to be handled simultaneously for improved troubleshooting.
• Interviewed 8+ candidates, contributing to team growth and technical hiring efforts.

### Software Engineer II @ CA Technologies

Jan 2017 – Jan 2019 | Santa Clara, California

CA APM is an industry leader in the Application Performance Management (APM) domain, with revenues exceeding $200 million, providing deep visibility into applications, real-time alerts, triaging capabilities, and much more. Worked on the agent team, specifically on the Java agent; the agent is the core component of APM because it is the real-time data-gathering unit. The Java agent redefines classes using Bytecode Instrumentation (BCI), captures real-time data, and does smart processing based on code complexity and runtime characteristics to provide intelligent and useful insights for the end user. The team follows stringent coding standards and makes heavy use of Java's concurrency library.

• Implemented a feature to determine the compute unit.
This critical metric is used in APM SaaS to calculate agent usage and determine cost for all SaaS customers, and it generates millions in revenue.
• Implemented an automatic, dynamic clamping feature to clamp and unclamp metrics using Java's concurrent data structures.
• Designed and developed a new microservices agent with a 35% smaller footprint so that it can be used in Docker.
• Implemented a new sustainability metric in Java that makes it easier for all 250+ on-premise customers to view metrics.
• Fixed a critical defect in the agent's extension manager that prevented new extensions from starting.
• Implemented critical code in Cloud Foundry's Java buildpack by adding support for newer Java agent properties, and developed the entire framework for supporting APM's Java agents via Cloud Foundry's Liberty buildpack. This open-source change was available to all 250+ on-premise and SaaS customers.
• Improved the quality of the Java agent by fixing several critical defects, including intermittent database connection problems between the agent and the enterprise manager.

### Software Engineer @ CA Technologies

Jan 2015 – Jan 2017 | Santa Clara, California

Application Performance Monitoring (APM) Java Agent Team:

• Implemented transaction trace decoration, in Java, by injecting correlation IDs into all transaction traces generated by the agent.

Service Virtualization Team:

• Developed a lightweight service virtualization tool called TestDoubles using Node.js and the hapi framework.
• Developed a CLI (command-line interface) in shell script as a second way for users to interact with the TestDoubles backend.
• Implemented the TestDoubles backend and REST APIs in Node.js, architecting it as a microservice for high scalability.
• Built and automated CI/CD pipelines using HashiCorp's open-source tools, along with Docker, Vagrant, NPM, and Shippable/Semaphore.
• Wrote shell scripts to perform quick scalability testing.
• Wrote end-to-end unit tests for TestDoubles, also in Node.js, using Jasmine.
• Released TestDoubles as open source on npm (the Node.js package manager), DockerHub, and GitHub.
  - https://www.npmjs.com/package/testdoubles
  - https://github.com/DevTestSolutions/TestDoubles

### Software Developer Intern @ Apollo Education Group

Jan 2014 – Jan 2015 | San Jose, California

• Implemented a Maven plugin for Bamboo, in Java, to bundle artifacts and deploy services to the repo.
• Created a POC of the new UI for the Boomstick tool using AngularJS and D3JS.
• Integrated a circuit-breaker REST API into the Boomstick tool, in Ruby, to monitor the health of nodes.
• Developed features for the Boomstick tool in Ruby, such as automatic versioning of services, integration of external modules, and a callback API for Bamboo.

### Software Developer Intern @ Gap Inc.

Jan 2013 – Jan 2013 | San Francisco, California

• Developed and improved features for an internal Hacker News clone (Gap IT's Hacker News) using Ruby on Rails, HTML, and CSS.
• Patched the registration, login, counting, and commenting portions of the site, in Ruby.
• Eliminated bugs in the authentication and registration flows, and ensured that the votes and comments sections worked correctly.
• Redesigned the site's user interface using CSS and made it relevant to IT.

### Software Engineer Intern @ Cerner Corporation

Jan 2012 – Jan 2012 | Kansas City, Missouri

• Converted code from the Java Hibernate framework to JDBC to simplify the code and reduce the abstraction layer.
• Replaced generated HQL with native SQL to improve runtime performance.
• Implemented JUnit tests to verify code behavior; code was reviewed by other software engineers.
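As a concrete illustration of the deduplication/conflation work in the Mapbox role above: one common approach is to build a blocking key from a normalized name plus rounded coordinates, so near-identical records from different vendors collide on the same key. The pipeline itself was Python/PySpark; this Java sketch uses hypothetical names, and the key format and precision are illustrative only.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class PoiDedup {
    record Poi(String name, double lat, double lon, String vendor) {}

    // Normalize the name and round coordinates to ~4 decimal places
    // (roughly 11 m), so small vendor-to-vendor variations still
    // produce the same blocking key.
    static String key(Poi p) {
        return p.name().trim().toLowerCase().replaceAll("[^a-z0-9]", "")
                + "|" + Math.round(p.lat() * 1e4)
                + "|" + Math.round(p.lon() * 1e4);
    }

    // Keep the first record seen per key; a fuller conflation step would
    // merge attributes and prefer the most trusted vendor instead.
    static List<Poi> dedupe(List<Poi> pois) {
        Map<String, Poi> byKey = new LinkedHashMap<>();
        for (Poi p : pois) byKey.putIfAbsent(key(p), p);
        return List.copyOf(byKey.values());
    }

    public static void main(String[] args) {
        List<Poi> merged = dedupe(List.of(
                new Poi("Blue Bottle Coffee", 37.77631, -122.42318, "foursquare"),
                new Poi("blue bottle coffee", 37.77632, -122.42319, "osm"),
                new Poi("Tartine Bakery", 37.76144, -122.42408, "safegraph")));
        System.out.println(merged.size()); // 2: the two coffee records collapse
    }
}
```

Exact-key blocking like this is cheap and parallelizes well; fuzzier matches (misspelled names, larger coordinate drift) need a second pass with similarity scoring.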
## Education

### Master's Degree in Computer Science

Santa Clara University

### Bachelor of Science in Software Engineering

San José State University

## Contact & Social

- LinkedIn: https://linkedin.com/in/dhruvmevada

---

Source: https://flows.cv/dhruvmevada
JSON Resume: https://flows.cv/dhruvmevada/resume.json
Last updated: 2026-04-11