Software engineer with a record of modernizing data management: implemented an efficient data validator, optimized system performance, and migrated legacy codebases. Expertise spans diverse programming languages, environments, and database applications, with a focus on improving data accessibility and visualization and on streamlined deployment pipelines using Azure DevOps and Jenkins.
• Data Transfer and Visualization: Built and deployed a Docker-based data transfer system to Apache Solr servers using Azure Messaging. Enhanced Terminal Sales Reps' ability to analyze sales data through custom, dynamic dashboards in .NET and TypeScript/Node.js, accelerating decision-making and increasing revenue by 20%.
• Legacy System Migration & Cloud Integration: Migrated Solaris codebases to Linux using C++, Python, and C# across diverse environments, and adopted Azure serverless architecture, optimizing performance and reducing costs.
• Database Optimization: Optimized SQL Offline running as a TIDAL job, reducing space utilization from 1 GB to 60 MB and query times from 2 hours to 20 minutes. Enhanced data accessibility and scalability using Kafka for efficient data streaming.
• Data Validator on Cloud: Architected and implemented a unified data validator using Trino and ASP.NET, reducing manual data validation costs by 90% and enhancing scalability and efficiency using Azure web apps.
• Secure RESTful APIs: Guided a Python project using FastAPI and IAM authentication for secure API integrations. Delivered scalable workflows using Amazon S3 and Lambda, streamlining collaboration.