# Douglas Campbell

> Software Engineer

Location: San Carlos, California, United States
Profile: https://flows.cv/douglascampbell

Highly skilled and experienced software engineer with expertise in designing, developing, and deploying scalable, data-driven applications. Proven ability to lead and mentor teams and to deliver impactful results across industries. Expertise in cloud platforms (AWS, GCP), big-data technologies, data pipelines, CI/CD, and security best practices.

## Work Experience

### Software Engineer @ Suki
Jan 2024 – Present | Redwood City, California, United States

All things engineering! A wide array of integration efforts with healthcare Electronic Medical Record systems such as Athena, Epic, Meditech, and Cerner.

### Senior Principal Engineer @ Dun & Bradstreet
Jan 2022 – Jan 2024

- Cloud Migration: led the complete migration of two data-product business lines from AWS to Google Cloud Platform (GCP) for all Netwise Data and Platform teams. Two disparate architectures were unified into one, with separate implementations of the same ETL execution framework, yielding simplicity and uniformity and eliminating the need to deeply understand dead code and architecture.
- BQM2 (BigQuery Materializer 2, open sourced; detailed under the ShareThis experience) was leveraged to orchestrate the loading, transformation, and export of all data products for two multi-million-dollar business lines.
- Data Quality and Scale Control and Monitoring Framework: created a comprehensive, highly customizable data-quality assertion framework for the migrations, allowing quality and scale gates to be inserted at any stage.
- Developed an N-Way Unique Contribution analysis framework and applied it to a variety of use cases across Netwise, Audience Targeting, and Eyeota.
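The N-Way analysis described here answers two questions: what each source uniquely contributes, and how sources overlap. As a rough, hypothetical sketch of that idea (not the actual Dun & Bradstreet implementation; source names and record IDs are invented), the core computation over record sets might look like:

```python
# Hypothetical sketch of an N-way unique-contribution analysis:
# for each source, count records that no other source contributes,
# plus pairwise overlaps. All names and data below are illustrative.

from itertools import combinations

def unique_contribution(sources: dict[str, set]) -> dict[str, int]:
    """Number of records each source contributes that no other source has."""
    return {
        name: len(records - set().union(*(r for n, r in sources.items() if n != name)))
        for name, records in sources.items()
    }

def pairwise_overlap(sources: dict[str, set]) -> dict[tuple[str, str], int]:
    """Size of the intersection for every pair of sources."""
    return {
        (a, b): len(sources[a] & sources[b])
        for a, b in combinations(sources, 2)
    }

sources = {
    "netwise": {1, 2, 3, 4},
    "eyeota": {3, 4, 5},
    "audience": {4, 5, 6},
}
print(unique_contribution(sources))  # {'netwise': 2, 'eyeota': 0, 'audience': 1}
print(pairwise_overlap(sources))
```

A production version would run these set operations as SQL aggregations over BigQuery tables rather than in-memory Python sets, but the accounting is the same.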
N-Way provides easy answers to often-asked questions: What does (or will) this data source uniquely contribute to our data offering or business? Where does the source agree with other sources? What is the overlap, and with whom?

- Highly expressive but simple CI/CD system: implemented a collection of simple, intuitive, forkable template repositories to maximize team-member productivity and reusability without sacrificing customizability. Technologies: GitHub, GCP Cloud Build, and GKE (Google's Kubernetes offering).
- Large-scale distributed GCP-to-AWS data transfer appliance: designed and implemented a distributed multipart upload system built on boto3, Pub/Sub, and Cloud Functions. Allowed customer-required single-file archives (up to 5 TB), as well as chunked archives, to be created and transferred in minutes rather than hours.

### Principal Engineer @ ShareThis
Jan 2014 – Jan 2022 | Palo Alto, CA

- Privacy: lead engineer for engineering-wide GDPR/CCPA compliance.
- Infrastructure and Security: CI/CD owner and maintainer; automation of global infrastructure built around GitLab, GitHub, Kubernetes, Jenkins, IAM, and Secrets Manager.
- Data In/Out: near-real-time ingestion and enrichment of 3-4+ billion events daily (Go + Java + DynamoDB); extensible event-decorator framework; numerous Go repositories.
- Analytics: creator of BQM2, which powers most ShareThis Domo cards and the first phases of the ad-tech audience offering.
- Data Science: designed key pieces of data-science pipelines; limited but enthusiastic use of Scala/Spark/EMR.
- Cost Engineering: developed a drop-in replacement storage scheme for the ShareThis core data asset, saving ~50% of storage and transfer costs. Rallied team members through frequent all-hands-on-deck cost-saving and security swarm activities.

### Senior Software Engineer @ Chegg Inc.
Jan 2012 – Jan 2014 | Santa Clara, CA

### Technical Yahoo @ Yahoo!
Jan 2008 – Jan 2012

- Tracked Yahoo! user viewing history for all users of www.yahoo.com, news.yahoo.com, and other Yahoo! sites; the component was responsible for a 10% lift in click-through rate, leveraging Avro, HBase, Hadoop, and Pig.
- Developed a Java modeling API to represent, maintain, and store the various outputs of Core's user- and item-grid-based modeling engine; provided a framework for projecting model data to other optimized formats for use in content ranking.
- Built a user-profile and history publishing system for Core's geographically distributed ranking systems.
- Built a grid-based rate-limiting system for throttling access to HBase and external storage systems.
- Responsible for content ingestion for Yahoo!'s Core team: provided content de-duplication and normalization for Core's modeling and recommendation engines, using ZooKeeper, HBase, Java, and Perl to develop both distributed and file-based systems.
- Worked extensively with Yahoo! customer-facing teams to integrate Core's modeling capabilities into their internet real estate, e.g. www, finance, news, answers, messenger, omg, and others.

### Programmer and Analyst @ Center for Animal Disease Modeling and Surveillance
Jan 2007 – Jan 2007 | Davis, California

Used R, a statistical programming language, to develop a package for the analysis of Foot-and-Mouth Disease epidemics. The package enables numerical and graphical analysis of a user-extensible set of classification models, including Support Vector Machines (libsvm), Random Forest, and Linear Regression used as a classifier.

### Systems Development Manager @ Lawson Software
Jan 2001 – Jan 2005

- Designed and developed a hierarchical system of Budget Approvals (BAP), using Hibernate (hibernate.org) as the persistence layer; supported a budget-approval cycle involving thousands of departments and cost centers.
- Performance-tuned the Lawson Budgeting and Planning system, hitting customer and product scalability targets via Hibernate query tuning, algorithm analysis, and refactoring.
- Acted as project manager for the Budgeting and Planning application; merged the efforts of two teams of 15 developers each to produce a solid budgeting tool still in use today at Lawson customer sites.

### Senior Software Engineer @ Apexion Software
Jan 2001 – Jan 2003

- Created a Java- and XML-based CRUD integration framework for various ERP systems (Lawson, McKesson, PeopleSoft), used to power supply-chain-management business processes with barcode-scanning handheld mobile devices.
- Managed a team of four developers through several six-month software release cycles.
- Oversaw and led a team of offshore developers in Amman, Jordan.

### Software Engineer @ Pharos Technology
Jan 1998 – Jan 2001

### Network Operations Engineer @ DG
Jan 1995 – Jan 1998

## Education

### MSCS in Computer Science
University of San Francisco

### Computer Science
San Francisco State University

### BS in Mathematics & Creative Writing
University of Michigan

### Culver Academies

## Contact & Social

- LinkedIn: https://linkedin.com/in/deegsca

---

Source: https://flows.cv/douglascampbell
JSON Resume: https://flows.cv/douglascampbell/resume.json
Last updated: 2026-04-11