Data Analytics major with strong skills in implementing statistics and machine learning algorithms; experienced software engineer with a proven ability to learn quickly and deliver as a team player in fast-paced Agile environments.
◦ Led the design and implementation of a SaaS platform incorporating a high-volume, distributed data processing system, which improved operational efficiency by 60% and enhanced data governance and discoverability.
◦ Engineered and scaled a large-scale ETL pipeline for healthcare data, enhancing the throughput and reliability of data workflows, and implementing comprehensive data quality checks and monitoring routines.
◦ Designed and developed a monitoring framework for distributed microservices using Kafka and Elasticsearch, which cut error resolution times by 80% and improved system reliability.
◦ Mentored and supported a distributed team across various time zones, applying Agile methodologies to maintain high efficiency and quick adaptability.
◦ Developed a visual analytics dashboard for mobile user data, supporting strategic business decisions with enhanced user engagement metrics through a robust batch ETL process.
• Engineered a backend API to integrate real-time claims data with the operations data pipeline, ensuring efficient data flow across both batch and streaming data.
• Refined the event routing process within the data pipeline, significantly reducing data ingestion time and improving overall system responsiveness.
Analytics Pipeline: Developed a web-based, configurable system to launch projects with built-in data pipelines offering batch and stream processing for files (CSV, JSON) and Twitter streams; this enabled the big data lab's research projects to migrate to automated analytics pipelines.
• Research in the field of decentralized Artificial Intelligence (AI):
• Limitations of current blockchain technologies in integrating with AI;
• Framework for implementing AI/ML models on latest-generation blockchain (3.0) technologies (IOTA, Cardano, ICON)
Microservice Development: Improved the efficiency of the website by migrating a core model from a monolithic backend to a microservice architecture, achieving response times within 10 ms;
• Data Model Redesign: Enhanced website performance by redesigning the entire "guest" member workflow, improving the response time of all transactions by 20%.
Data Engineering: Developed data pipelines to support the establishment of the data warehouse by building scalable ETL workflows handling both batch data (50 GB/day) and online clickstreams (1M clicks/min);
• Automation: Solved the promotion claims deficit problem by developing an automated computation engine for the complex data model, increasing claims by INR 1 million/month.