# Priyanka P.

> Sr. Staff Software Engineer @ Experian | Data Engineer | AI/ML Engineer | Backend Engineer

Location: San Francisco Bay Area, United States
Profile: https://flows.cv/priyankap

Results-oriented and innovative Senior Software Engineer with 8 years of experience. Strong relationship-building, decision-making, and communication skills. Works at a fast pace to meet tight deadlines. Enthusiastic team player ready to contribute to company success. Known for best-in-class development and collaboration.

SKILLS:

• Expertise in Python, SQL, Scala, JavaScript, Unix shell scripting, Django, Flask, FastAPI, Vue.js.
• Experience in creating APIs using gRPC, GraphQL, and REST.
• Databases: Oracle DB, Cassandra, PostgreSQL, Cloud Spanner, Bigtable, GCS, Amazon S3, Druid.
• Apache Spark: Spark Core, DataFrames, Spark SQL, Spark Streaming.
• Methodology: Agile (Scrum, Kanban, JIRA), SDLC.
• Automation tools: APScheduler, Autosys, Borg, cron jobs.
• Visualization tools: Tableau, Google Plx dashboards.

EXPERIENCE SUMMARY:

• Hands-on experience in Python web application development using Python, HTML/CSS, JavaScript, TypeScript, Vue.js, Django, Flask, and the Serverless framework.
• Experience architecting data warehouses and Extraction, Transformation, and Loading (ETL) processes for complex businesses, bringing data from various sources into the data warehouse.
• Good experience with Amazon Cloud (EC2, S3) and Google Cloud (GCS, Cloud Run, Scheduler, etc.).
• Expertise in working with server-side technologies including databases (using ORMs), gRPC, GraphQL, and RESTful APIs using MVC design patterns.
• Experience optimizing queries and writing subqueries, stored procedures, triggers, cursors, and functions on MySQL, Dremel SQL, and NoSQL (MongoDB, Cassandra) databases.
• Good knowledge of bash shell scripts to automate routine activities.
• Experience in data analysis and machine learning techniques using Python (Pandas, NumPy, SciPy, TensorFlow), Tableau, R, SAS, and advanced MS.
• Good understanding of machine learning algorithms: customer churn prediction, predictive analytics, natural language processing, text mining, A/B testing, statistical modeling, and forecasting.
• Excellent programming, debugging, problem-solving, optimization, and testing skills.
• Results-oriented team player with quick grasping ability and problem-solving capability.

## Work Experience

### Sr. Staff Software Engineer @ Experian
Jan 2024 – Present
AI/ML Engineer

### Staff Software Engineer @ Experian
Jan 2023 – Jan 2024 | United States
Data / Machine Learning Engineer

### Senior Software Engineer | Backend Engineer @ Google
Jan 2020 – Jan 2023 | Sunnyvale, California, United States

• Developed Python-based APIs (RPC web services) and a test-driven outage communication tool to track Google Cloud incidents and outages.
• Designed Google Protocol Buffers (IDL) messages to transfer data between systems.
• Used the Pandas, NumPy, Pydantic, dataclasses, and attrs libraries for data engineering.
• Developed a fully automated continuous ServiceNow integration using Git (Gerrit git-on-Borg at Google), deployed via CI/CD pipeline across dev, test, and production environments.
• Deployed and built various environments, including gLinux/UNIX; configured memory and cache parameters in the application based on performance.
• Extracted ServiceNow data and created pipelines to ingest it into Google Cloud Storage buckets; stored pipeline metadata in Spanner to track pipelines.
• Gathered requirements and needs; analyzed, tested, scripted, and documented them to strengthen the quality and functionality of business-critical applications.
• Implemented test- and data-driven development methodologies to deliver on essential software requirements (functions, performance, design constraints, attributes) and their external interfaces.
• Developed general system designs, including internal and external information flows, current and future system requirements, and integration points.
• Created technical documentation, including high-quality documentation of code.
• Conferred with systems analysts and with project and delivery managers to design systems and to obtain information on project limitations and capabilities, performance requirements, and interfaces.

### Sr. Python Engineer | Data Engineer @ Apple
Jan 2019 – Jan 2020 | Sunnyvale, California

• Enhanced a framework (named iDAA) for DQM (Data Quality Management) that supports data cleansing, processing, and maintaining metadata quality for multiple applications.
• Wrote several Python packages and Java code to verify data recency, completeness, reconciliation, and accuracy by connecting various platforms and environments.
• Worked in a Python Flask service framework and a Scala application to bring data onto the platform and created ETL pipelines for data preparation and enrichment.
• Created a wrapper Python script to run the iDAA codebase on different clusters.
• Created tables and keyspaces in Cassandra to store events from the streaming data pipeline.
• Wrote bash shell scripts to run the Python scripts on the server and on a scheduler.
• Analyzed and created views on the Oracle database for Tableau requirements.
• Created events and sent streaming messages to Kafka: a Kafka producer sent events to Druid for real-time ingestion, and a Kafka consumer consumed the messages and ran ingestion to load segments into Druid. Wrote PySpark code for data processing and aggregation.

### Python & SQL Engineer | Data Analytics Engineer @ Google
Jan 2018 – Jan 2019 | San Jose, California

• Collaborated with stakeholders on the Local Inventory Ads (LIA) team to understand business analytics requirements and to design and develop tools to analyze, monitor, and visualize key business performance metrics.
• Steered dashboard requirement gathering and created a prototype to give a clear picture of the assortment on Google Express, capitalizing on lost opportunities.
• Analyzed and verified data quality for inventory, product, and store information from LIA feed providers through predictive modeling and analysis.
• Saved 10+ man-hours bi-weekly by creating an automated report generator and reusable templates using JavaScript and Python.
• Worked with partner solution teams to develop tools that address their technological and business needs and to identify opportunities to grow Google's partner business.
• Investigated and troubleshot issues/bugs and provided technical support for LIA operations.
• Presented the dashboard via a web UI to externalize it to outside vendors.

### Software Engineer Intern @ Zebo Technologies Inc
Jan 2018 – Jan 2018 | Palo Alto, California

• Integrated the application with third-party APIs (Google, Facebook, Stripe, PayPal, Google's Natural Language API & UIs) via REST calls in Python 3.6, parsing the JSON responses and making Ajax calls through Vue.js for the Zebo platform.
• Used RESTful APIs with JSON and XML to extract network traffic information and wrote several ORM queries to extract and load data into a PostgreSQL DB.
• Built the web app using the Vue.js framework and the Bootstrap plugin for Vue (including JavaScript/CSS/HTML); created the API with the Serverless framework.

### Software Engineer @ Tata Consultancy Services
Jan 2015 – Jan 2017 | Bangalore, India
Role: Python Developer

• Collaborated on and extracted structured and unstructured data from various systems and performed EDA for statistical modeling.
• Conducted data cleansing, variable identification, univariate analysis, outlier detection and missing-value treatment, and variable transformation, creating analytical datasets for further analysis.
• Developed various Python code for automating model results.
• Wrote Python object-oriented design code for manufacturing quality, monitoring, logging, debugging, and code optimization.
• Installed and maintained Tomcat and Apache HTTP web servers.
• Worked on automation, setup, and administration of build and deployment tools such as Jenkins.
• Used RESTful APIs with JSON to extract network traffic and memory performance information.
• Created a database using MySQL and wrote several queries and a Django API to extract data from the database.
• Built a relevant business story out of every model and sold it in a presentable form.
• Understood the business drivers and their relationship with the category and the divisions, and built business sense out of every forecast.

### Assistant Software Engineer @ Tata Consultancy Services
Jan 2013 – Jan 2014 | Bangalore, India
Role: Python Developer

• Involved in the design, development, and testing phases of applications using Agile methodology, with experience building reusable code and libraries for future use.
• Designed the web application using Python on the Django web framework pattern to make it extensible and flexible; implemented code in Python to retrieve and manipulate data.
• Used the MVC framework to build modular and maintainable web applications.
• Created and executed various MySQL database queries from Python using the Python MySQL connector and the MySQLdb package.
• Maintained and improved the security level of data; responsible for security standards implementation and data protection.
• Used Python and Django for creating graphics, XML processing of documents, data exchange, and business logic implementation between servers.
## Education

### AI & Machine Learning
Caltech
Jan 2025 – Jan 2025

### Master of Science - MS in Business Analytics (Data Science and Analytics)
Saint Mary's College of California
Jan 2017 – Jan 2018

### Bachelor of Technology - BTech in Computer Science and Engineering
Biju Patnaik University of Technology, Odisha
Jan 2008 – Jan 2012

## Contact & Social

- LinkedIn: https://linkedin.com/in/priyanka-padhy07

---

Source: https://flows.cv/priyankap
JSON Resume: https://flows.cv/priyankap/resume.json
Last updated: 2026-03-22