I’m an MLOps / Machine Learning Engineer with 4+ years of experience designing, building, and deploying production-grade AI systems across enterprise environments.
Experience
2025 — Now
• Driving GenAI enterprise enablement and automation across multiple business units through bot services, data pipeline integration and Agentic AI workflows.
• Enabled DeFi risk analytics and automated Web3 workflows for Coinbase and Kraken, leveraging blockchain infrastructure with Generative and Agentic AI.
• Developed end-to-end machine learning pipelines, fine-tuned Large Language Models (LLMs), and applied GenAI for real-world automation and insight generation.
• Built GenAI-powered APIs integrating OpenAI GPT and HeyGen avatars for personalized customer messaging and interactive support.
• Worked with ChatGPT and the ChatGPT API, leveraging their capabilities across several applications.
• Prototyped lightweight RAG-like context chaining with BERT embeddings and NER output before full-stack GenAI frameworks became standardized.
• Leading the Generative AI, Agentic AI, Data, Analytics, and Digital Transformation portfolio with a global team, driving enterprise AI adoption and supporting strategic initiatives as a Solution Architect/Engineer alongside the sales and solution team.
• Integrated ChatGPT API into client-facing platforms, automating 86% of manual tasks and reducing support load through contextual AI interactions.
• Integrated Generative AI tools (Hugging Face Transformers, OpenAI APIs) to support demand forecasting and supply chain risk analysis, enhancing decision-making processes with LLM-generated insights.
• Built production-grade, dynamic-response chatbots using the LangChain Framework, leveraging LLMs for real-time, context-aware answers across high-volume environments.
• Developing enablers and accelerator tools using Generative AI, enhancing data pipelines, insights, and analytics to drive better business decisions.
• Developed AI-driven product enhancements as part of broader digital transformation efforts to address customer pain points, resulting in a 30% increase in adoption rates.
2025 — 2025
Virginia, United States
• Assisting in designing, coding, testing, and debugging software applications under the guidance of senior developers.
• Supporting the development of reports, dashboards, and forecasting models using business intelligence tools.
• Collaborating with software engineers, product managers, and designers.
• Participating in team meetings, brainstorming sessions, and code reviews.
• Learning to perform unit testing and troubleshoot software issues.
• Documenting technical processes and software functionality for future reference.
• Researching new technologies and assisting with improvements to existing systems.
• Helping resolve user-reported issues and gathering feedback for enhancements.
2024 — 2025
• Integrated machine learning models into production environments; deployed models for payroll, risk/fraud, identity verification, and other use cases.
• Developed an AI-powered migration tool that transforms Java Struts applications into FastAPI and React using OpenAI’s real-time API and Hugging Face models.
• Developed LLM-based assistants using GPT-3 (Azure OpenAI) and LangChain, integrated into Microsoft Teams for HR and IT support automation.
• Created FastAPI microservices to serve LLM outputs and integrated them into enterprise dashboards and CRMs.
• Integrated LLM-enhanced workflows into legacy systems using REST APIs and message brokers (Kafka/Azure Queues).
• Delivered GenAI solutions for Fraud/Waste/Abuse detection, code conversion, image analysis, document search, hallucination mitigation, token optimization, prompt engineering, and fine-tuning.
• Led GenAI solution design to automate campaign targeting, content generation, and funnel tracking for GTM teams, increasing conversion rates by 25%.
• Delivered GenAI solutions with Databricks AI (Mosaic AI, LLMOps, Model Serving) and RAG pipelines for real-time inferencing.
• Delivered AI-powered solutions for fraud detection, predictive analytics, and automation, improving efficiency.
• Developed internal AI guidelines and secured LLM usage for sensitive data prompts.
• Implemented AI-enhanced chatbot workflows for support escalation and ticket classification. Integrated solutions with internal CRM via API and conducted A/B testing to improve customer experience.
• Analyzed AI-generated responses and iteratively improved prompt structures, enhancing model accuracy and reducing hallucinations by 30%.
• Leveraged OpenAI API and Hugging Face Transformers to integrate NLP into reporting pipelines, automating sentiment analysis and summarization tasks for customer feedback.
2020 — 2022
Hyderabad, Telangana, India
• Involved in gathering, analyzing and translating business requirements into analytic approaches.
• Implemented MLOps pipelines using MLflow, Docker, Kubernetes, and CI/CD tools (GitHub Actions, Bitbucket, Jenkins) to automate training, deployment, monitoring, and versioning of ML models.
• Developed reusable ML/AI models for forecasting and operational risk mitigation across nodes in the supply chain, deployed via Vertex AI Pipelines and Kubeflow.
• Performed data analysis, visualization, feature extraction, feature selection, and feature engineering using Python.
• Applied core NLP techniques (tokenization, stemming, lemmatization, sentiment scoring, and POS tagging) to evaluate rhetorical structure, sentiment intensity, and public reaction trends.
• Developed and tested prompt engineering strategies using various instruction formats (e.g., Q&A, classification, summarization) to assess LLM consistency and hallucination rate.
• Worked on the SAGE project, applying Retrieval-Augmented Generation (RAG) pipelines, LLMs, GenAI, and prompt engineering techniques to enhance the system's capabilities.
• Orchestrated data and ML pipelines using TensorFlow Extended and MLflow, leveraging container registries (ECR, ACR, GCR) for containerization to ensure reproducibility and scalability across cloud environments.
• Containerized ML components using Docker and utilized BigQuery as the data warehouse for seamless data management and integration.
Education
VNR Vignana Jyothi Institute of Engineering and Technology (VNRVJIET)
Bachelor of Technology
Webster University
Master's degree, Computer/Information Technology Administration and Management
The University of Texas at Dallas