• Designed a Deep Learning model (CNN using Keras and TensorFlow) integrated with OpenCV for automated defect detection in manufacturing images, achieving 95% accuracy and decreasing product inspection time by 50%.
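As an illustrative sketch only (the actual pipeline used a trained CNN with Keras/TensorFlow and OpenCV preprocessing), the core detection idea — sliding a convolution filter over an image and thresholding the response to flag anomalies — can be shown with plain NumPy; the images, kernel, and threshold here are hypothetical:

```python
import numpy as np

# Naive 2D convolution (valid padding) — stand-in for a CNN's conv layer
def conv2d(img, kernel):
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Toy grayscale "product images": defect = one bright pixel on dark background
clean = np.zeros((8, 8))
defective = clean.copy()
defective[3, 4] = 1.0

kernel = np.ones((3, 3)) / 9.0  # simple averaging filter as the detector

def has_defect(img, thresh=0.05):
    # Flag the image if any filter response exceeds the threshold
    return bool(conv2d(img, kernel).max() > thresh)
```

A real CNN learns many such kernels from labeled defect images instead of hand-setting them.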
• Fine-tuned a compact transformer model (BERT-based) with LoRA via Hugging Face for domain-specific text generation and summarization, improving internal knowledge search efficiency by 40%.
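The core of LoRA — freezing the pretrained weight matrix and training only a low-rank update — can be sketched in NumPy; the dimensions and rank below are hypothetical, and a real fine-tune would use Hugging Face's PEFT library on actual transformer layers:

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 64, 4          # layer size and LoRA rank (illustrative)

W = rng.normal(size=(d_out, d_in))  # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))            # trainable up-projection, zero-initialized

x = rng.normal(size=(d_in,))
y_base = W @ x                      # frozen forward pass
y_lora = W @ x + B @ (A @ x)        # adapter is a no-op at init (B is zero)

# Only r*(d_in + d_out) parameters train, versus d_in*d_out for full tuning
trainable = A.size + B.size
```

Zero-initializing B guarantees the adapted model starts exactly at the pretrained behavior, which is why LoRA fine-tuning is stable from the first step.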
• Developed production-ready REST APIs (FastAPI) for inference, containerized services using Docker, and orchestrated deployment on GCP (Vertex AI, Kubernetes), handling 10,000+ daily requests.
• Built an ensemble classification model (combining logistic regression and tree-based classifiers) using scikit-learn and Python to predict customer churn, achieving an AUC-ROC of 0.88 and guiding retention efforts that saved the client an estimated $22K annually.
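A churn-style ensemble of this kind can be sketched with scikit-learn's `VotingClassifier`; the synthetic dataset and model choices below are illustrative, not the client's actual data or configuration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for churn features (hypothetical)
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Soft voting averages the predicted probabilities of both base models
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ],
    voting="soft",
)
ensemble.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, ensemble.predict_proba(X_te)[:, 1])
```

AUC-ROC is the natural metric here because churn datasets are typically imbalanced, and it measures ranking quality independently of any classification threshold.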
• Implemented MLflow for tracking all model experiments, parameters, and metrics, creating a centralized Model Registry that reduced experimental setup time for new engineers by 25%.
• Led A/B Testing and Statistical Analysis for model performance and business impact validation in a live production environment, ensuring only models with a statistically significant performance lift of 5% or more were deployed.
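The deployment gate described above — requiring both a lift of at least 5% and statistical significance — can be sketched with a one-sided two-proportion z-test using only the standard library; the conversion counts are hypothetical:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """One-sided z-test: is variant B's rate significantly higher than A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 0.5 * math.erfc(z / math.sqrt(2))    # upper-tail probability
    return z, p_value

# Hypothetical experiment: control vs candidate model
z, p = two_proportion_z(conv_a=480, n_a=5000, conv_b=560, n_b=5000)
lift = (560 / 5000 - 480 / 5000) / (480 / 5000)    # relative lift vs control

ship_it = (lift >= 0.05) and (p < 0.05)            # the two-part gate
```

Requiring a minimum lift on top of significance guards against shipping changes that are statistically real but too small to matter.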
• Designed and automated ETL/ELT workflows using Airflow to manage data transfer and Data Wrangling of over 2TB of data from PostgreSQL & MySQL to a central analytics platform.
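At multi-terabyte scale the key pattern inside each Airflow task is streaming data in chunks rather than loading whole tables; the sketch below illustrates that with SQLite standing in for the PostgreSQL/MySQL sources, and the table names and wrangling rule are hypothetical:

```python
import sqlite3

# Source database (SQLite stands in for PostgreSQL/MySQL here)
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, float(i) * 1.5) for i in range(10)])

# Destination (stand-in for the central analytics platform)
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL)")

cur = src.execute("SELECT id, amount FROM orders")
while True:
    chunk = cur.fetchmany(4)          # stream in bounded chunks, never all at once
    if not chunk:
        break
    # Minimal wrangling step: drop negatives, normalize precision
    cleaned = [(i, round(a, 2)) for i, a in chunk if a >= 0]
    dst.executemany("INSERT INTO orders_clean VALUES (?, ?)", cleaned)
dst.commit()

n_rows = dst.execute("SELECT COUNT(*) FROM orders_clean").fetchone()[0]
```

In the real workflow this logic lives inside an Airflow task, with the DAG handling scheduling, retries, and dependencies between the extract, transform, and load steps.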