# Nick Weimer

> Senior Software Engineer at Twilio

Location: United States
Profile: https://flows.cv/nickweimer

I'm happy to be in a position where I can constantly learn. While working closely with software, I've learned how to be patient and persistent. As an engineer you are constantly figuring out "why". Some of the best engineers I've worked alongside approached problems with a relentless sense of calm, which is something I've grown to adopt.

Some things I've been focused on over the past year:

- Setting up dbt for our team and leading dbt adoption by making tutorial videos
- Rewriting our Airflow pipeline of 200+ ingestion scripts to use multiprocessing, and running it on a distributed Celery architecture (MWAA) instead of a single self-managed EC2 instance
- Building out our "Analytics Command Center", a custom React app that displays pipeline health of tables, schema changes, column lineage, KPIs for execs, longest runtimes across models, and more

Apart from backend work, I also build custom visualizations (React, D3) and web applications.

Skills include the AWS ecosystem (including DynamoDB, Lambda, API Gateway, ECS, EKS, MWAA, CDK), Airflow, Spark, dbt, Python (Flask, Django), Scala, JavaScript (React, Redux, Next.js, Node, Express), SQL, Docker, and Kubernetes.

Professional portfolio: https://www.nickweimer.com/
Small side project (6 weeks): https://www.classifeyeanimals.com/

## Work Experience

### Senior Software Engineer @ Twilio

Jan 2022 – Present | Corporate Data and Analytics Infrastructure team (R&D) - Remote

- Serve 60% of the company's data needs (Marketing, Finance, Sales, etc.) by maintaining 200+ Spark & Python ETLs on a 4-person team, using technologies such as Airflow (AWS MWAA), Spark (AWS EMR), Buildkite, Terraform, Presto, Glue, and Lake Formation
- Led a project to remove PII from transcribed sales calls (data from Gong) across ~1 billion words, using the AWS Comprehend API, Spark (Scala), and EMR
- Led a project to create an internal chatbot that responds with column-level lineage and high-level explanations of data tables, using GPT-4 and Chroma (vector DB)
- Go-to partner for the Finance team each month for end-to-end calculation of revenue, COGS, and gross margin
- On-call rotation member every 4 weeks (24/7) for the data infrastructure team

### Lead Data Engineer @ ClickUp

Jan 2021 – Jan 2022

Airflow, Snowflake, Python, SQL, JavaScript, dbt, AWS (CDK, Lambda, MWAA), DynamoDB, Postgres

- Broadened our analytics team's scope from a single marketing ML model to serving Finance, Marketing, Customer Success, Sales, and Operations; built our Snowflake data warehouse from scratch, which helped the team expand from 4 to 30 people
- Created and maintained the Airflow (MWAA) and dbt codebases, including the initial setup of each
- Employee of the Month; Core Value Award for October 2021
- Increased data ingestion speed by over 200% by transitioning ClickUp's Airflow server from self-managed EC2 to MWAA, using AWS CDK (Python) and multiprocessing
- Created an office visualization (React, Flask API, Airflow, AWS EB), a rotating globe of customers, for display on office TVs
- Produced video tutorials covering dbt Cloud to onboard analysts and data scientists
- Created multiple "scalable webhooks" using AWS CDK (TypeScript, Python): API Gateway → Lambda → SQS queue → Lambda → Snowflake
- Built a custom integration that automatically refreshes Tableau extracts after a successful dbt job run, by creating a custom Airflow sensor task and calling the dbt and Tableau APIs
- Created a way to dynamically account for schema changes on each day's ingestion, output any schema changes to a central table, then take scheduled snapshots of these changes over time
- Handled most of the API ingestions for the company's data lake
- Happy to help teammates with common issues such as understanding dbt, fixing Git issues, creating Slack alerts, creating new Snowflake users/roles/permissions, merging PRs, mapping data relationships, and reviewing SQL logic

### Data Engineer @ SAS

Jan 2017 – Jan 2021 | Cary, NC

Global Hosting and US Professional Services (client-facing) - Advanced Analytics Lab

Worked in the following groups: Automotive Manufacturing, Trade Finance, State Tax Fraud, HVAC System Optimization, Credit/Debit Card Fraud

### Data Analytics Analyst @ Charlotte Hornets

Jan 2016 – Jan 2016 | Charlotte, North Carolina

## Education

### Master of Science (M.S.) in Analytics

Institute for Advanced Analytics

### Bachelor's degree in Chemistry

The University of North Carolina at Chapel Hill

## Contact & Social

- LinkedIn: https://linkedin.com/in/nick-weimer-2837b477

---

Source: https://flows.cv/nickweimer
JSON Resume: https://flows.cv/nickweimer/resume.json
Last updated: 2026-03-22
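The schema-change tracking described under the ClickUp role (detect new, removed, or retyped columns at ingestion time and record them in a central table) can be sketched roughly as below. This is an illustrative sketch only, assuming schemas are represented as column-name-to-type mappings; the function and field names are hypothetical, not the production implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SchemaChange:
    # One row destined for the hypothetical central schema-changes table.
    table: str
    column: str
    change: str   # "added", "removed", or "type OLD -> NEW"
    seen_at: str  # UTC ISO timestamp of when the change was observed

def diff_schemas(table: str,
                 previous: dict[str, str],
                 current: dict[str, str]) -> list[SchemaChange]:
    """Compare the last recorded schema with today's incoming columns."""
    now = datetime.now(timezone.utc).isoformat()
    changes = []
    # Columns that appeared since the last ingestion
    for col in current.keys() - previous.keys():
        changes.append(SchemaChange(table, col, "added", now))
    # Columns that disappeared
    for col in previous.keys() - current.keys():
        changes.append(SchemaChange(table, col, "removed", now))
    # Existing columns whose type changed
    for col in previous.keys() & current.keys():
        if previous[col] != current[col]:
            changes.append(
                SchemaChange(table, col, f"type {previous[col]} -> {current[col]}", now))
    return changes

# Example: since yesterday, "plan" was added and "id" changed type.
previous = {"id": "NUMBER", "email": "VARCHAR"}
current = {"id": "VARCHAR", "email": "VARCHAR", "plan": "VARCHAR"}
changes = diff_schemas("users", previous, current)
```

In a pipeline, each `SchemaChange` row would be appended to the central table, and a scheduled job would snapshot that table over time, per the bullet above.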