# Robert Kolb

> Founding Engineer @ Axion | ex-McKinsey | ex-Microsoft | 2021 Microsoft Global Hackathon Challenge Winner

Location: New York City Metropolitan Area, United States
Profile: https://flows.cv/robertkolb

I am a data architect and Founding Engineer with a track record of building scalable data platforms across startups, top-tier consulting, and big tech.

Currently, I am a Founding Engineer at Axion Ray, where we are building the world's best AI platform for engineering and quality leaders. I led the initial 0-to-1 data architecture and built the foundational pipelines that took us from pre-seed to a $300M Series B valuation. My day-to-day is centered entirely on technical architecture, core execution, and building the data-linking engines that help global manufacturers catch critical issues before they escalate.

Before Axion, I engineered high-ROI data solutions for enterprise manufacturing and energy clients at McKinsey & Company and modernized critical security infrastructure at Microsoft, where I led my team to a win in Microsoft's 2021 Global Hackathon Challenge. I am deeply passionate about AI and about building world-class analytics platforms.

## Work Experience

### Founding Engineer @ Axion Ray
Jan 2023 – Present | Chicago, IL

Transitioned from VP of Data Engineering to a Founding Engineer role, centering entirely on technical architecture and core execution. This deliberate pivot allowed me to build the high-impact data infrastructure required to scale the company to a $300M Series B valuation.

Core Architecture & Execution:
• Architected and built the core data-linking engine, establishing a fundamental product value proposition for integrating, resolving, and mapping complex client datasets.
• Engineered the foundational data pipeline architecture from the ground up, providing the scalable infrastructure that currently powers all client-facing workflows and deployments.
### Vice President of Data Engineering @ Axion Ray
Jan 2022 – Jan 2023 | Chicago, IL

Joined as one of the first engineering hires to bootstrap the data engineering function from pre-seed through Series A. Led the initial architecture, built the foundational data pipelines from scratch, and helped establish the early engineering culture.

• Designed and deployed the V1 architecture using Kedro, Dagster, MongoDB, Django, Cloud Run, and Terraform, establishing the company's core ingestion framework.
• Delivered the initial client platforms that proved product-market fit, directly supporting the successful Series A fundraise.

### Software Engineer @ Microsoft
Jan 2021 – Jan 2022 | Redmond, WA

• Led a cross-functional team of 7 to win a 2021 Microsoft Global Hackathon Challenge (out of 70,000+ participants), rapidly prototyping and delivering an award-winning technical solution.
• Architected and implemented a CMMC L3 compliance solution for the Federal Business Intelligence platform, engineering automated delivery and monitoring for over 130 critical security controls.
• Spearheaded the modernization of the Federal Reporting platform by fully transitioning legacy infrastructure to Infrastructure-as-Code (IaC) using Azure Bicep.
• Engineered and deployed an optimized CI/CD pipeline that cut system deployment times by over 50% while improving overall release reliability.

### Senior Consultant – Data Engineering @ McKinsey & Company
Jan 2020 – Jan 2021 | Chicago, IL

• Served as Lead Data Engineer for an internal analytics accelerator in the energy sector, architecting modular, reusable data pipelines with the Kedro framework to significantly reduce project delivery timelines.
• Engineered an automated data profiling and validation engine leveraging Great Expectations to enforce multi-dimensional data quality standards across diverse datasets.
• Architected an extensible framework for the profiling system that allowed custom logic and automated remediation strategies when data quality violations were detected, ensuring high-fidelity inputs for downstream models.

### Data Engineering Consultant @ McKinsey & Company
Jan 2019 – Jan 2020 | Greater Chicago Area

Oil and Gas Client (Technical Leadership)
• Architected the solution and led a team of 3 internal engineers and 3 client engineers to deliver a full-stack optimization platform for a major O&G client, generating $40M in annual revenue through real-time refinery sensor data analysis.
• Engineered a recommendation engine using Azure Data Factory, Databricks (PySpark), and App Services to optimize plant operations and throughput.
• Acted as a strategic technical advisor, designing robust solution architectures and presenting deliverables to the client's steering committee and executive leadership.
• Drove project sustainability by upskilling client engineering teams in Python, Git, and Azure cloud best practices, ensuring long-term architectural integrity after hand-off.

Mining Client (Predictive Engineering & Optimization)
• Designed and deployed a predictive maintenance pipeline on Google Cloud Platform (GCP) to forecast critical mining component failures with hourly precision, delivering $2M in annual cost savings.
• Orchestrated complex data workflows using GCP Composer (Airflow), Cloud Functions, and Pandas to ensure high-availability model inference in production.
• Developed an automated anomaly detection system using ADTK to monitor sensor drift, establishing a robust feedback loop for model retraining and continuous performance stability.
### Data Engineer – Fellow @ McKinsey & Company
Jan 2018 – Jan 2019 | Chicago, Illinois

F100 Manufacturing Client
• Developed an advanced analytics pipeline using Kedro, Databricks, and Docker to optimize worker lineups and increase remanufactured product throughput, delivering $9M in recurring annual revenue.
• Built a recursive inventory mapping tool using the AnyTree Python library to decompose remanufactured units into their subcomponents.
• Enabled more precise inventory valuations for the accounting department by providing a granular breakdown of complex part hierarchies, replacing less accurate legacy methods.

### Data Engineer @ Shiftgig
Jan 2018 – Jan 2018 | Chicago, Illinois

• Created a real-time sentiment analysis pipeline using Lambda and SQS that ingested customer feedback, ranked each review's positive/negative sentiment, and forwarded flagged reviews to the relevant CSM representatives.
• Built a worker recommendation system using a combination of Lambda, SNS, DynamoDB, and a graph database.
• Productionized a machine learning model that predicted the likelihood of a worker picking up different positions.

### Business Intelligence Analyst @ Shiftgig
Jan 2017 – Jan 2018 | Greater Chicago Area

• Constructed an automated, transaction-based SQL model that combined an activity log of CSMs' actions with financial information to classify client profitability.
• Developed a worker churn model that ultimately yielded customer LTV, broken out by vertical and position.
• Worked with the Data Science team to create a functional model of worker and position demand pools by location.
### Finance & Statistics Tutor @ University of Illinois at Chicago
Jan 2017 – Jan 2017 | Greater Chicago Area

## Education

### Bachelor of Science - BS in Finance
University of Illinois Chicago

## Contact & Social

- LinkedIn: https://linkedin.com/in/robert-kolb-553511159

---

Source: https://flows.cv/robertkolb
JSON Resume: https://flows.cv/robertkolb/resume.json
Last updated: 2026-04-05