# Steve Bronder

> Software Engineer at Flatiron Institute

Location: New York, New York, United States
Profile: https://flows.cv/stevebronder

Feel free to contact me or view my website, stevebronder.com, if you have any questions about my current interests or research. If you would like to inquire about freelance R, C++, or Python tool creation or analysis, send me a proposal at the email available in my contact info.

## Work Experience

### Software Engineer @ Simons Foundation

Jan 2023 – Present | New York, New York, United States

### Quantitative Software Engineer @ Department of Statistics, Columbia University

Jan 2020 – Jan 2022 | New York, New York, United States

- Led a team in designing a new memory pattern for matrix automatic differentiation that allows for better cache efficiency and the use of SIMD instructions. The new pattern is applied automatically by an optimization routine I wrote in Stan's OCaml-based compiler. Stan models that use these new matrices can see a 20%–60% reduction in runtime.
- Rewrote Stan's Markov chain Monte Carlo sampler API to allow multiple chains to run in parallel. Running each chain in parallel lets the program share data across threads, heavily reducing its memory footprint. Models with large datasets can see a 10%–30% decrease in runtime.
- Worked with a small group to build simpler abstractions for creating new reverse-mode automatic differentiation functions, promoting a composition style over the previous inheritance pattern.
- Extended Stan's documentation to ease onboarding of new users and developers of Stan's automatic differentiation library.
- Created a "legacy C++14 requires" scheme that allows developers to write specialized overloads in a way similar to C++20 `requires`.
### Principal Data Scientist @ Capital One

Jan 2018 – Jan 2020 | New York City

- Developed risk models in R and Python to assess the loss given default of leveraged lending collateral pools, covering both current pools and synthetic worst-case pools. An underwriter submits a draft contract's collateral covenants to a web-based app; assuming the borrower will max out those covenants, we construct the riskiest loan pool the contract allows. The model then performs risk analysis on that structure and gives the underwriter summary information, loss-curve and default-rate plots, and the synthetic loan pool.
- Automated FIG's reporting, replacing a manual monthly PowerPoint process with a website that surfaces the most important content on a daily basis. The site is hosted on AWS, modularized into separate applications, and uses OAuth authentication managed by NGINX with the Lua module. Senior team leaders can now assess book quality and make data-informed decisions daily instead of monthly. One of the biggest wins was the data-generated headers, which give business leaders the most relevant information for each piece of the portfolio.
- Built machine learning models to predict Moody's risk ratings on unrated loans with 93% accuracy, using a random forest tuned with model-based optimization over 10-fold cross-validation. The results allow us to impute loan ratings for collateral that is missing official scores, giving our group a much better understanding of the risk across our contracts and overall book.
### Senior Data Analyst @ Capital One

Jan 2017 – Jan 2018 | Greater New York City Area

- Built a collateral visualization and querying platform that gives underwriters and portfolio managers point-and-click access to the underlying collateral for collateralized deals. Ad-hoc data requests have nearly disappeared now that the business team can access and visualize the data through the self-service portal.
- Wrote a Python package to automate the processing and validation of collateralized loan data. Leveraged lending collateral data often arrives in an incredibly messy format and previously required hours of manual manipulation to transform into the correct form for uploading to a database. After performing empathy interviews with several colleagues to find the worst choke points in the process, I developed a package to automate the upload and validation steps. Turnaround times for uploading data went from potentially weeks to a day.
- Automated reports, reducing reporting time by 12 hours a month. Several quarterly and monthly reports that once required a full business analyst are now fully automatic.

### Predictive Analyst Intern @ Zurich North America

Jan 2016 – Jan 2016 | Greater New York City Area

- Created new validation metrics to assess predictive performance on corporate insurance contracts
- Developed predictive models for corporate auto contracts that beat industry standards by 12%
- Used Hive and SQL for data summarization, querying, and analysis of large datasets
- Created web applications using D3.js and Shiny for managers to track timelines and costs of analytics projects

### Marketing Analyst I @ Management Science Associates, Inc.
Jan 2015 – Jan 2015 | Greater Pittsburgh Area

- Constructed Excel tools to compare structural equation models
- Developed a Bayesian model for a client-requested study on price elasticity
- Authored an R tutorial on data visualization and web scraping GPS coordinates
- Created an automated process to generate interactive reports in R with JavaScript
- Gave in-house tutorials on data visualization, web scraping, and data mining in R, D3.js, and Shiny

### Research Assistant - Economics Department @ Duquesne University

Jan 2014 – Jan 2014

- Leveraged R Enterprise to analyze a terabyte of point-of-sale data
- Executed out-of-memory algorithms to sort and merge customer account data with point-of-sale data
- Developed an R script to automate the data cleaning process

### Intern @ Onorato for Governor

Jan 2009 – Jan 2010

- Trained new interns on database management
- Promoted and set up events
- Helped coordinate door-to-door and call campaigns to constituents
- Developed and utilized advanced communication skills

## Education

### Master's Degree in Quantitative Methods in the Social Sciences

Columbia University

### BSBA in Economics

Duquesne University

## Contact & Social

- LinkedIn: https://linkedin.com/in/stevebronder
- GitHub: https://github.com/stevebronder
- Portfolio: https://SteveBronder.com

---

Source: https://flows.cv/stevebronder
JSON Resume: https://flows.cv/stevebronder/resume.json
Last updated: 2026-04-13