# Salil Shenoy

> Software Engineer at Alation

Location: Santa Clara, California, United States
Profile: https://flows.cv/salilshenoy

Master's in Computer Science graduate from San Jose State University (SJSU), CGPA: 3.67.

Worked as a Software Engineer at Guidewire Software, responsible for implementing and maintaining ETL pipelines handling structured and unstructured data, using Postgres, MongoDB, and Redshift, and for automating the pipelines — including data collection, data aggregation, and data testing — using Jenkins. Primarily used Python but also helped with R, developing modular, packaged code, and used Docker to simulate a production-like environment for testing individual units of the pipeline.

Worked as a Software Engineering Intern at 23andMe on ETL pipelines using Big Data tools such as MapReduce, Spark, HBase, Hive, and AWS. Used Ansible to build an automated deployment pipeline, with Maven for resolving dependencies, Jenkins for building the code and creating the artifacts (jars), and Apache Archiva, an open-source repository, for storing the artifacts.

Before pursuing a master's degree, worked as a Software Developer at Persistent Systems, using C++, MFC, and the Windows SDK to build and enhance features for the Quicken desktop application. Responsibilities included feature design, implementation, and defect fixing.

GitHub: https://github.com/SalilShenoy

## Work Experience

### Software Engineer @ Alation

Jan 2021 – Present | San Francisco Bay Area

- Working on Compose, the Alation Data Catalog SQL editor
- Implementing new features and bug fixes to enhance and optimize Compose based on user feedback, using Python (Django), React (TypeScript), and Java
- Following a Test-Driven Development approach using the unittest and pytest frameworks in Python and Jest in React

Catalog Search

- Working on data catalog search, which is based on Elasticsearch
- Built observability and monitoring around the feature to help troubleshoot issues
- Improved the security posture and accessibility of deployments by configuring OIDC SSO via Okta

Drive Cloud / Secure Cloud

- Bringing historical knowledge of Compose to help the team with support escalations and bug fixes
- Helped the team analyze existing E2E tests and convert them to unit and integration tests with AI tools, speeding up the build pipelines
- Building product enhancements that help drive the organization's goal of migrating customers from on-prem installations to Cloud deployments, using Python (Django), Rust, React, and TypeScript
- Using AI tools (Claude Code, GitHub Copilot) to improve productivity and reduce time to feature delivery

### Software Engineer 2 @ Guidewire Software

Jan 2020 – Jan 2021 | San Francisco Bay Area

- Implemented microservices for data collection, aggregation, and modeling of cyber data sources using Python, SQLAlchemy, EKS, and Aurora Postgres
- Migrated models from R to Python, implementing them in PySpark on EMR to reduce turnaround time from 10+ hours to 2–3 hours
- Drove monthly and daily release cycles, conducted code reviews, and pair programmed
### Software Engineer @ Guidewire Software

Jan 2018 – Jan 2020 | San Francisco Bay Area

- Worked with cyber data sources, implementing ETL pipelines (data ingestion, aggregation) in Python, with Postgres, Redshift, and MongoDB as storage, following test-driven development
- Used Jenkins to build the pipelines and automate processes

### Software Engineer @ ABBYY

Jan 2018 – Jan 2018 | United States

- Worked with the data capture products FineReader Engine, FlexiCapture Engine, and the FlexiCapture application
- Helped users implement modules/projects against the FineReader and FlexiCapture Engine APIs using C++, C#, and Java
- Helped users implement business logic/scripts to extract data using the FlexiCapture application

### Software Engineer @ 23andMe

Jan 2016 – Jan 2017 | San Francisco Bay Area

- Worked on a big data ETL pipeline using tools such as Apache Spark and HBase, in Python and Java
- Used Spark to implement the ETL pipeline, stored data as Parquet in S3, and queried it with Spark SQL from Python
- Implemented scripts to convert CSV to Parquet and vice versa using the Spark, fastparquet, and pyarrow Python APIs
- Implemented a logging framework for HBase and YARN using log4j and logback in Java
- Developed a continuous integration and deployment pipeline using Archiva, Ansible, Maven, and Jenkins

### Senior Software Engineer @ Persistent Systems

Jan 2014 – Jan 2015 | Hinjewadi

- Helped implement One Intuit Password, a single sign-on mechanism for users to manage all their Intuit products, using .NET and C++ for UI and feature implementation
- Developed a feature that lets users edit their budget from a single screen, implementing the UI in C++
- Implemented and demoed a new UI design for the Quicken installer using the .NET Framework and new Win32 SDK libraries
### Software Engineer @ Persistent Systems

Jan 2011 – Jan 2014 | Hinjewadi

- Developed UI and features in C++ and the .NET Framework for Quicken, Intuit's flagship product helping over 1000 customers with budgeting, bill pay, loan management, and tax planning
- Developed UI for the loan management module and the installer to make them more user friendly, using C++ and the MFC SDK

## Education

### Master's degree in Computer Science

San José State University

### BE in Computer Engineering

Savitribai Phule Pune University

## Contact & Social

- LinkedIn: https://linkedin.com/in/salilshenoy

---

Source: https://flows.cv/salilshenoy
JSON Resume: https://flows.cv/salilshenoy/resume.json
Last updated: 2026-04-11