Results-oriented and innovative Senior Software Engineer with 8 years of experience. Remarkable relationship-building, decision-making, and communication skills. Works at a fast pace to meet tight deadlines. Enthusiastic team player ready to contribute to company success.
Developing Python-based APIs (RPC web services) and a test-driven outage communication tool to track Google Cloud incidents and outages.
• Designed Google Protocol Buffers (protobuf) IDL messages to transfer data between systems.
• Used the pandas, NumPy, Pydantic, dataclasses, and attrs libraries for data engineering (see the validation sketch after this list).
• Developing a fully automated, continuous ServiceNow integration using Git (Gerrit git-on-borg at Google), deployed through a CI/CD pipeline across dev, test, and production environments.
• Built and deployed across various environments, including gLinux/UNIX; configured application memory and cache parameters based on performance.
• Extracting ServiceNow data and creating pipelines to ingest it into Google Cloud Storage buckets; storing pipeline metadata in a Spanner database to track pipeline runs.
• Gathering, analyzing, testing, scripting, and documenting requirements to strengthen the quality and functionality of business-critical applications.
• Implementing test- and data-driven development methodologies to deliver essential software requirements (functions, performance, design constraints, attributes) and their external interfaces.
• Developing general system designs, including internal and external information flows, current and future system requirements, and integration points.
• Creating technical documentation, including high-quality code documentation.
• Conferring with systems analysts and project and delivery managers to design systems and to obtain information on project limitations, capabilities, performance requirements, and interfaces.
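As one illustration of the Pydantic-based validation mentioned in the list above, here is a minimal sketch of how an incident record might be modeled; it assumes Pydantic v2, and the Incident schema and its field names are hypothetical, not the production schema:

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, field_validator


class Incident(BaseModel):
    """One outage record from the tracking feed (hypothetical schema)."""
    incident_id: str
    service: str
    severity: int                        # e.g. 1 (critical) through 4 (low)
    opened_at: datetime                  # ISO-8601 strings are coerced
    resolved_at: Optional[datetime] = None

    @field_validator("severity")
    @classmethod
    def severity_in_range(cls, v: int) -> int:
        if not 1 <= v <= 4:
            raise ValueError("severity must be between 1 and 4")
        return v


# A raw payload as it might arrive from an upstream API; Pydantic
# coerces the timestamp string and runs the severity validator.
raw = {
    "incident_id": "INC-1234",
    "service": "cloud-storage",
    "severity": 2,
    "opened_at": "2023-05-01T12:30:00Z",
}
incident = Incident(**raw)
print(incident.service, incident.opened_at)
```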
Enhanced a Data Quality Management (DQM) framework (named iDAA) that supports data cleansing, processing, and metadata quality maintenance for multiple applications.
• Wrote several Python packages and Java modules to verify data recency, completeness, reconciliation, and accuracy by connecting to various platforms and environments.
• Worked in a Python Flask service framework and a Scala application to bring data onto the platform and create ETL pipelines for data preparation and enrichment.
• Created a wrapper Python script to run the iDAA codebase on different clusters.
• Created tables and keyspaces in Cassandra to store events from the streaming data pipeline.
• Wrote Bash shell scripts to run the Python scripts on the server and through a scheduler.
• Analyzed and created views on the Oracle database to meet Tableau reporting requirements.
• Created events and sent streaming messages to Kafka, using a Kafka producer to publish events that Druid consumes for real-time ingestion; a Kafka consumer reads the messages and drives segment ingestion into Druid (see the producer sketch after this list). Wrote PySpark code for data processing and aggregation.
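A minimal sketch of the producer side of the Kafka-to-Druid flow described above, assuming the kafka-python client; the broker address, topic name, and event fields are hypothetical placeholders:

```python
import json

from kafka import KafkaProducer  # kafka-python client

# Broker and topic are placeholders, not the actual deployment values.
producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

event = {
    "event_type": "inventory_update",
    "record_id": 42,
    "ts": "2023-05-01T12:30:00Z",
}

# Publish to the topic that the Druid Kafka indexing service reads from;
# Druid then ingests these messages into segments in near real time.
producer.send("druid-ingest-events", value=event)
producer.flush()  # block until the message is actually delivered
```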
Collaborated with stakeholders on the Local Inventory Ads (LIA) team to understand business analytics requirements and to design and develop tools that analyze, monitor, and visualize key business performance metrics.
• Steered dashboard requirement gathering and created a prototype that gives a clear picture of the assortment on Google Express, helping capitalize on previously lost opportunities.
• Analyzed and verified data quality for inventory, product, and store information from the LIA feed provider through predictive modeling and analysis.
• Saved 10+ person-hours biweekly by creating an automated report generator and reusable templates using JavaScript and Python (see the sketch after this list).
• Worked with partner solutions teams to develop tools that address their technological and business needs and to identify opportunities to grow Google’s partner business.
• Investigated and troubleshot issues and bugs, and provided technical support for LIA operations.
• Presented the dashboard through a web UI to share it with external vendors.
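A minimal sketch of the kind of template-driven report generator described above, using only the Python standard library; the report fields and metric names are illustrative, not the actual templates:

```python
from datetime import date
from string import Template  # stdlib templating; jinja2 would work similarly

# Reusable report template; the field names here are hypothetical.
REPORT = Template(
    "LIA Assortment Report - $report_date\n"
    "Stores covered: $store_count\n"
    "Products in feed: $product_count\n"
    "Feed error rate: $error_rate%\n"
)


def build_report(metrics: dict) -> str:
    """Fill the template from a metrics dict (e.g. aggregated feed stats)."""
    return REPORT.substitute(report_date=date.today().isoformat(), **metrics)


if __name__ == "__main__":
    print(build_report({"store_count": 120,
                        "product_count": 45210,
                        "error_rate": 0.7}))
```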