# Khandu Shinde

> Staff Software Engineer at Chime

Location: San Francisco Bay Area, United States
Profile: https://flows.cv/khandu

20+ years of experience as a Sr. Data Engineer working with different ETL tools and building data warehouse solutions for large-scale enterprises across the Fintech, Manufacturing, Banking & Agro domains. Developer, architect and technical lead in data warehousing and databases with a strong record of results. Competent in all aspects of data warehousing, including project planning, requirements gathering, data modeling, development, deployment, maintenance, and enhancements. Works collaboratively, quickly understands system architecture and business needs, and is committed to achieving corporate goals.

## Work Experience

### Staff Software Engineer @ Chime
Jan 2022 – Present | San Francisco Bay Area

### Sr. Data Engineer @ Chime
Jan 2020 – Jan 2022 | San Francisco Bay Area

### Sr. Data Engineer @ Bayer Crop Science
Jan 2016 – Jan 2020 | San Francisco Bay Area

Responsibilities:
• Built ETL pipelines accelerating the extraction, transformation, and loading of massive volumes of structured and unstructured data.
• Managed all activities necessary to take an ETL project from concept to production, including system architecture definition, data modeling (star and snowflake schema) design, prototyping, development and testing (SCD Types I, II, III), and scheduling (dependencies, backward chaining, forward firing).
• Excellent analytical, organizational, management, technical and creative skills, with a strong understanding of data warehousing technologies, data analysis, and OLTP vs. OLAP flows.
• Configured a streaming pipeline to perform real-time analytics, helping growers and dealers maintain good farming practices.
• Loaded the aggregated data into a relational database for reporting, dashboarding and ad-hoc analyses, which revealed ways to lower operating costs and offset the rising cost of programming.
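For context on the SCD handling mentioned above: a Type II slowly changing dimension preserves history by closing the current row and inserting a new one whenever a tracked attribute changes. A minimal, illustrative sketch in plain Python follows; the table, keys and field names (`customer_id`, `city`, `effective_from`, etc.) are hypothetical, not taken from any specific pipeline described here:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, load_date):
    """Apply a Type II slowly-changing-dimension update.

    dimension: list of dicts carrying 'effective_from', 'effective_to',
               and 'is_current' bookkeeping fields
    incoming:  list of source records keyed by `key`
    tracked:   attributes whose change should create a new history row
    """
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # brand-new key: insert an open-ended current row
            dimension.append({**rec, "effective_from": load_date,
                              "effective_to": None, "is_current": True})
        elif any(old[attr] != rec[attr] for attr in tracked):
            # tracked attribute changed: close the old row, open a new one
            old["effective_to"] = load_date
            old["is_current"] = False
            dimension.append({**rec, "effective_from": load_date,
                              "effective_to": None, "is_current": True})
        # unchanged records leave the dimension untouched
    return dimension

# Example: a customer moves, so the old row is closed but kept for history.
dim = [{"customer_id": 1, "city": "Pune",
        "effective_from": date(2020, 1, 1), "effective_to": None,
        "is_current": True}]
apply_scd2(dim, [{"customer_id": 1, "city": "San Francisco"}],
           key="customer_id", tracked=["city"], load_date=date(2021, 6, 1))
```

Type I handling, by contrast, would simply overwrite the changed attribute in place; Type III would keep the prior value in a dedicated "previous" column.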
• Experienced in parsing high-level design specs into simple ETL coding and mapping standards.
• Designed Mapping Documents, Detail Design Documents, and Install and Release documents that serve as guidelines for ETL coding.
• Designed a large data warehouse using data/dimensional modeling.
• Responsible for scheduling ETL pipelines in Airflow, plus error checking, production support, maintenance and testing of ETL pipelines using Airflow logs.
• Designed and developed a Big Data analytics platform for processing grower (farmer) information from the FieldView product and other application logs using Python, Spark, Hive, etc.

Environment: Spark, Hive, Flink, AWS (EMR, S3, Lambda, Kinesis, ECS, EC2), YARN, Zeppelin Notebook, Python, Scala, Docker, Airflow, Kettle, Vertica, Git.

### Sr. Data Engineer / Designer @ Salesforce
Jan 2015 – Jan 2016 | San Francisco Bay Area

Description:
Salesforce.com (stylized as salesƒorce) is a cloud computing company headquartered in San Francisco, California. Though its revenue comes primarily from a customer relationship management (CRM) product, Salesforce also capitalizes on commercial applications of social networking through acquisitions. As of 2015, it is one of the most highly valued American cloud computing companies, with a market capitalization of $50 billion, although the company has never turned a GAAP profit since its inception in 1999.

Responsibilities:
• Planned data integration from the various source systems to the Hadoop system.
• Experienced in parsing high-level design specs into simple ETL coding and mapping standards.
• Designed, developed, integrated, tested, troubleshot and debugged the applications.
• Designed Mapping Documents, Detail Design Documents, and Install and Release documents that serve as guidelines for ETL coding.
• Managed smooth implementation within deadlines and deployment of the application at the client location.
• Designed and developed Informatica mappings and sessions based on user requirements and business rules to load data from source flat files and RDBMS tables into target tables.

Environment: Spark, Hive, Sqoop, Scala, Python, Informatica PowerCenter 9.5, Oracle 11g, TIDAL, Apache Flink, AWS (EMR, Kinesis, S3, Zeppelin Notebook).

### ETL Architect @ Ingersoll Rand
Jan 2011 – Jan 2015 | Charlotte, North Carolina Area

Description:
The client is a $17 billion global diversified industrial company founded in 1871. Ingersoll Rand is a global provider of products, services, and integrated solutions to industries as diverse as transportation, manufacturing, construction, and agriculture. The IR BTP BI program allows the business to focus on analyzing information to make better business decisions, and provides improved management reporting and analytics with data from disparate order entry and order management systems.

Responsibilities:
• Parsed high-level design specs into simple ETL coding and mapping standards. Designed Mapping Documents, Detail Design Documents, and Install and Release documents that serve as guidelines for ETL coding.
• Performance-tuned various ETL components and database SQL queries, managed indexes, and applied advanced Oracle tuning concepts.
• Worked on the complete life cycle of extraction, transformation and loading of data using Informatica.
• Used Informatica's features to implement Type I and II changes in slowly changing dimension tables, and developed complex mappings to facilitate daily, weekly and monthly loading of data.
• Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator and Stored Procedure.
• Prepared high-level design documents for extracting data from complex relational database tables, data conversions, transformation and loading into specific formats.
• Designed and developed mappings using various transformations to suit business user requirements and business rules, loading data from Oracle, SQL Server, DB2 and flat files.
• Developed standard, reusable mappings and mapplets using transformations such as Expression, Lookup, Joiner, Filter, Source Qualifier, Sorter, Update Strategy and Sequence Generator.

### Senior Informatica Developer @ BNY Mellon
Jan 2010 – Jan 2011 | Jersey City, NJ

Description:
Bank of New York is one of the leading financial services groups and one of the largest financial institutions in the combined USA and overseas market, with total assets of over $250 billion. It has an extensive network of 300 branches and representative offices in 25 countries, and offers a wide range of loans, whether for an HDB home, a private home, a renovation, a commercial property, a car, a secured overdraft or just a credit line. This DSS system was implemented mainly to integrate data from multiple source systems and provide one view of solutions to business managers and shareholders. DSS provides revenue-based information on a daily, monthly, quarterly and annual basis. The client also uses this system to receive complaints and orders coming from different customers.

Responsibilities:
• Extracted, transformed and loaded data from flat file and Oracle sources into an Oracle database.
• Created Informatica mappings and mapplets using different transformations.
• Imported data from various sources and loaded it into the target database; created transformations using Informatica PowerCenter Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer).
• Involved in analysis of business requirements, keeping track of data available from various data sources, and transforming and loading the data into target tables using Informatica PowerCenter.
• Prepared documentation for ETL strategies and source-to-destination system mappings.
• Involved in building and supporting the extraction flows.
• Identified and documented various data sources, including detailed table/file/field-level mappings between the source and target systems.
• Created mappings using various transformations such as Lookup, Expression, Update Strategy, Filter, Router, Joiner and Sequence Generator.
• Worked extensively with the AutoSys scheduling tool.

### Informatica Developer @ Emerson
Jan 2008 – Jan 2009 | Pune, Maharashtra, India

Description:
Emerson (NYSE: EMR) is a diversified global manufacturing and technology company. It offers a wide range of products and services in the areas of network power, process management, industrial automation, climate technologies, and tools and storage. Recognized widely for its engineering capabilities and management excellence, Emerson has approximately 127,700 employees and 240 manufacturing locations worldwide.

Responsibilities:
• Created Informatica mappings and mapplets using different transformations.
• Extensively used the Informatica client tools: Repository Manager, Designer, Workflow Manager and Workflow Monitor.
• Worked with Informatica partitioning when dealing with huge volumes of data, parallelizing loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range and Pass-through partitions.
• Prepared documentation for the mappings and workflows.
• Used transformations such as Connected and Unconnected Lookup, Aggregator, Update Strategy, Expression, Router and Sequence Generator.
• Developed and scheduled workflows using the Task Developer, Worklet Designer and Workflow Designer in Workflow Manager, and monitored the results in Workflow Monitor.
• Performed performance tuning at the source, transformation, target, and workflow levels.
• Used PMCMD, PMREP and UNIX shell scripts for workflow automation and repository administration; used PVCS for version control and AutoSys for job scheduling.
• Created ETL processes to extract data from mainframe DB2 tables and load it into various Oracle staging tables.
• Developed ETL into the data mart for phase II data elements from staging.

### Member of Technical Staff @ CDAC
Jan 2005 – Jan 2008 | Pune, Maharashtra, India

Description:
Pradhan Mantri Gram Sadak Yojana (PMGSY) was launched on 25th December 2000 as a fully funded Centrally Sponsored Scheme to provide all-weather road connectivity in rural areas of the country. The programme envisages connecting all habitations with a population of 500 persons and above in the plain areas, and 250 persons and above in hill states and the tribal and desert areas. Handled modules such as Master Data, Core Network, Proposal, Tendering, Quality Monitoring, Payments and the Receipt Module (Accounting Module).

Responsibilities:
• Created stored procedures, functions, views, packages, triggers and tables.
• Modified existing stored procedures to resolve functional and performance issues.
• Developed Oracle PL/SQL, DDLs and stored procedures, and worked on performance tuning of SQL & PL/SQL stored procedures.
• Experienced in working with SAP developers to create function modules or transparent tables (Z tables in SAP) for migration/loading of data from SAP R/3 and SAP BW to Oracle, and then loading to external targets such as SQL Server and flat files.
• Worked with SQL tools such as PL/SQL Developer, SQL*Plus and MS SQL Visual Studio to run SQL queries and validate the data.
• Worked along with the SAP functional team on the SAP Solution Manager and Service Manager tools for transporting programs from one level to another in the SAP landscape (Development, UAT, Regression Testing and Production).
• Worked with TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
• Designed the data entry screens; modified and designed reports.
• Handled and resolved all client queries.

Environment: Oracle 9i, ASP.Net & ASP.
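The SQL-based data validation described in this role can be illustrated with a small sketch: comparing row counts and a column checksum between a source and a target table after a load. This is a minimal illustration only; the tables and columns (`stg_orders`, `dw_orders`, `amount`) are hypothetical, and SQLite stands in for the Oracle/SQL Server databases actually used:

```python
import sqlite3

def validate_load(conn, source_table, target_table, amount_col):
    """Compare row count and a column SUM between source and target tables.

    A lightweight post-load check: matching counts and totals do not prove
    a load is perfect, but any mismatch reliably flags a problem.
    """
    cur = conn.cursor()
    checks = {}
    for name, table in (("source", source_table), ("target", target_table)):
        cur.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        checks[name] = cur.fetchone()
    return {
        "row_count_match": checks["source"][0] == checks["target"][0],
        "sum_match": checks["source"][1] == checks["target"][1],
    }

# Demo: load two staging rows into a target table and validate the load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 100.0), (2, 250.5);
    INSERT INTO dw_orders  SELECT * FROM stg_orders;
""")
result = validate_load(conn, "stg_orders", "dw_orders", "amount")
```

In practice such checks are usually extended with per-partition counts or column-level hash comparisons, but count-plus-sum is the common first gate.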
## Education

### DAC in Computer Science
ACTS, CDAC

### Bachelor’s Degree in Computer Science
All India Shri Shivaji Memorial Society College Of Engineering

## Contact & Social

- LinkedIn: https://linkedin.com/in/khandu-shinde-4a559ba8

---

Source: https://flows.cv/khandu
JSON Resume: https://flows.cv/khandu/resume.json
Last updated: 2026-04-12