Experience in application development for business intelligence, big data, data analytics, and DevOps integration on on-premises, Hadoop distributed cluster, private, and public cloud infrastructure.
Experience
2023 — Now
Mountain House, CA
Designed and developed cloud-based data integration and analytics platforms across AWS and Azure environments.
Built ETL pipelines using Databricks, Azure Data Factory, and AWS Glue for enterprise applications and legacy migrations.
Automated infrastructure and code deployments with Terraform, CloudFormation, and GitLab CI/CD pipelines.
Implemented SaaS frameworks using AWS Glue, Step Functions, and Airflow for large-scale data processing.
Developed real-time event-driven services with AWS Lambda, SQS, SNS, and API integrations.
Migrated and synchronized data using AWS DMS and optimized S3 data storage with partitioning and cataloging.
Ensured compliance and security (PCI, USGCI) and managed DevOps automation for credentials, access, and policies.
Tech Stack: AWS (Glue, Lambda, S3, EMR, EC2, DMS, RDS), Azure (Data Factory, Databricks), Terraform, CloudFormation, Python, Java, SQL, GitLab, Jenkins, Linux, Oracle, PostgreSQL, and SQL Server.
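The S3 partitioning and cataloging approach above can be sketched as a small key builder (a minimal illustration; the `partitioned_key` helper and layout are assumptions, not the actual production code):

```python
from datetime import date

def partitioned_key(prefix: str, table: str, event_date: date, file_name: str) -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=) so a Glue
    crawler can catalog the data and queries can prune partitions.
    Hypothetical sketch, not production code."""
    return (
        f"{prefix}/{table}/"
        f"year={event_date.year}/month={event_date.month:02d}/day={event_date.day:02d}/"
        f"{file_name}"
    )

key = partitioned_key("raw", "orders", date(2023, 7, 4), "part-0001.parquet")
# "raw/orders/year=2023/month=07/day=04/part-0001.parquet"
```

Keys in this `key=value` layout let Glue and Athena skip whole date ranges at query time instead of scanning the full bucket.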
2022 — 2023
Mountain House, CA
Designed and developed ETL data pipelines and event-based processing for the employee, dealer, and Customer Care sales commission management system.
• Integrated Customer Care commission payouts with the Sprint and Metro systems using Informatica, PL/SQL, and AWS RDS for Oracle
• Implemented end-to-end payroll file processing automation in Python and shell scripts for employees, Customer Care, and dealers
• Extracted subscriber and feature data from the SAMSON billing system and integrated it with the Scorecard system
• Implemented PGP/GPG encryption and decryption processes for internal and external data feeds
• Designed ETL frameworks for audit, control, and balance processing, file load/generation, and external code execution
• Migrated a 32 TB Oracle database from on-premises to AWS RDS for Oracle with a Multi-AZ setup
• Provided 24/7 production support for 1,200+ active production jobs, enhanced legacy applications, and performed system upgrades and maintenance
• Maintained 11 non-production environments and performed Informatica administration activities
• Configured ODBC/JDBC driver connections in Informatica ETL
• Designed and implemented delta replication for 300+ objects to reporting and analysis systems
• Supported multiple Scrum teams and delivered tasks in an Agile model with two-week sprints
• Mentored the team on new and legacy applications and supported the offshore team
Environment: Java, Python, shell scripting, Informatica, AWS (S3, EMR, EC2, SQS, SNS, Secrets Manager, Lambda), Deep.io, Splunk, Git, Bitbucket, GitLab, Jenkins, SQL, PL/SQL, Linux, Oracle, Teradata, SQL Server, Tidal, SFG (file transfer)
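The audit/control/balance framework mentioned in the bullets above can be illustrated with a minimal balance rule (the `balance_check` function is a hypothetical sketch, not the production framework):

```python
def balance_check(source_count: int, loaded_count: int, rejected_count: int) -> dict:
    """Audit/control/balance rule: every source record must be accounted for
    as either loaded or rejected; anything else is an out-of-balance load.
    Illustrative sketch only."""
    accounted = loaded_count + rejected_count
    return {
        "source": source_count,
        "loaded": loaded_count,
        "rejected": rejected_count,
        "in_balance": accounted == source_count,
    }

result = balance_check(source_count=1000, loaded_count=990, rejected_count=10)
# result["in_balance"] is True: 990 loaded + 10 rejected == 1000 sourced
```

A real framework would persist these counts per job run so an out-of-balance load can halt downstream jobs and trigger an alert.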
2017 — 2022
Kirkland, Washington, United States
• Integrated the Care commission payout system
• Designed data pipelines for the integration layer handling structured and semi-structured data formats
• Implemented DevOps CI/CD pipelines and automated deployments
• Migrated a 30 TB database from on-premises to AWS RDS for Oracle with a Multi-AZ setup
• Built an event-based commissions system for near real-time processing of POS and hard-goods events
• Developed Spark jobs on an EMR cluster to process 80 million transactions every two hours
• Designed event-based data flows that push JSON objects to SQS and SNS using Java and Python APIs
• Designed frameworks for S3 storage and processed data into RDS Oracle and PostgreSQL databases
• Implemented publisher/subscriber payload events on SQS and enhanced Spark Streaming for data aggregations
• Implemented event-based, near real-time data integration for IoT data using Kafka and Deep.io
• Implemented an audit, control, and balance framework for ETL job logging
• Implemented a reusable test framework for integration testing
• Implemented a delta replication process from the central commissions database to reporting and analysis systems
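The delta replication described above can be sketched as a snapshot comparison via row hashing (illustrative only; `detect_deltas` and `row_hash` are hypothetical helpers, not the actual implementation):

```python
import hashlib

def row_hash(row: dict) -> str:
    """Deterministic fingerprint of a row, used to detect changed values."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode()).hexdigest()

def detect_deltas(source: dict, target: dict) -> dict:
    """Compare source and target snapshots keyed by primary key and
    classify each key as an insert, update, or delete to replicate.
    Hypothetical sketch of a delta-replication pass."""
    inserts = [k for k in source if k not in target]
    deletes = [k for k in target if k not in source]
    updates = [k for k in source
               if k in target and row_hash(source[k]) != row_hash(target[k])]
    return {"insert": inserts, "update": updates, "delete": deletes}
```

Only the classified keys are shipped to the reporting systems, so each replication cycle moves the changed rows rather than the full table.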
2013 — 2017
Phoenix, Arizona, United States
Designed and developed end-to-end data warehouse, scorecard metrics, data analytics, and data lake solutions for policy, payment central, billing, and payment systems.
• Implemented an ODS and an IBM data lake platform for marketing and policy data sources on a Hadoop cluster
• Designed data models and database objects and architected ETL data pipelines
• Analyzed the OLTP policy source system and mapped attributes, assigning code and type values
• Developed general-insurance IBM IIW subject-area models for policy, billing, and payment enterprise data warehouse systems
• Implemented the presentation layer for policy analytics and KPI metrics
• Implemented a data governance system for policy and payment attributes
• Developed a scorecard KPI reporting system for organization-wide data quality by leveraging big data technologies
• Implemented near real-time, event-based systems for device usage data using Kafka and Spark Streaming
• Leveraged the Hortonworks Hadoop platform for data storage and Spark as the compute engine
• Contributed to the design and development of the Customer 360 master data management application
• Implemented Jenkins CI/CD build pipelines and code deployment automation
• Delivered performance improvements, enhancements, and defect resolution in existing applications
• Documented and maintained runbooks for all changes and issues
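The data-quality scorecard KPIs above can be illustrated with a minimal completeness metric (the `completeness` helper is a hypothetical sketch of one such KPI, not the actual reporting system):

```python
def completeness(records: list, required: list) -> float:
    """Scorecard KPI sketch: percent of records where every required
    attribute is populated (not None and not empty string)."""
    if not records:
        return 100.0
    ok = sum(
        1 for r in records
        if all(r.get(field) not in (None, "") for field in required)
    )
    return round(100.0 * ok / len(records), 2)

score = completeness(
    [{"policy_id": "P1", "premium": 100}, {"policy_id": "P2", "premium": None}],
    ["policy_id", "premium"],
)
# score == 50.0: one of two records has all required attributes populated
```

Rolling a metric like this up per source system is what lets a scorecard rank data quality organization-wide.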
2010 — 2013
Seattle, Washington, United States
Built an operational data store, Teradata data warehouse, and fact schemas for T-Mobile subscriber and campaign management database systems.
• Analyzed and developed applications for call data, campaign-management business processes, customer data, bridge-to-value, customer opt-out, and subscriber fact and dimension models
• Designed data models and functional specifications based on business requirements
• Designed and developed cross-reference tables for the latest customer snapshot
• Designed and developed ETL, ELT, pushdown-optimized, and parallel data processing jobs for the Teradata Data Warehouse (TDW) and data marts
• Designed BusinessObjects universes and complex SQL queries to publish ad hoc and canned reports
• Spun up non-production environments with production-scale data volumes and validated end-to-end data behavior for new products
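The "latest customer snapshot" cross-reference tables above can be illustrated in miniature (the `latest_snapshot` helper, field names, and timestamps are all hypothetical):

```python
def latest_snapshot(rows: list, key: str = "customer_id", ts: str = "updated_at") -> dict:
    """Keep only the most recent row per customer, mimicking a
    cross-reference 'latest snapshot' table. Hypothetical sketch;
    ISO-8601 timestamp strings compare correctly as plain strings."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    return latest

snap = latest_snapshot([
    {"customer_id": "c1", "updated_at": "2012-01-01", "plan": "A"},
    {"customer_id": "c1", "updated_at": "2012-06-01", "plan": "B"},
    {"customer_id": "c2", "updated_at": "2012-03-01", "plan": "A"},
])
# snap has one row per customer; c1 resolves to its newest row (plan "B")
```

In the warehouse the same effect is typically achieved in SQL with a ranked window over the history table, materialized as the cross-reference table.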
Education
University of Wisconsin-La Crosse
Master's degree
Anna University Chennai
Master of Science in Information Technology
Acharya Nagarjuna University