Experience
2018 — Now
Working on Stream Processing Platform team
2014 — 2018
San Francisco, California, United States
Data Platform Engineer - Stream Processing Platform (April 2017 - Current)
• Built a new data warehouse platform for stream processing on top of HDFS to power real-time data use cases (Spark, Hive Metastore, Presto)
• Developed a Spark Streaming job to persist data from Kafka into the data warehouse (HDFS, Hive Metastore) in real-time
• Implemented a disaster recovery pipeline for backing up critical data in HDFS to AWS S3 using Spark and HDFS snapshots
• Optimized and tuned various parts of the data warehouse (Spark, HDFS, Hive Metastore, Presto) to improve performance and efficiency of the platform
Data Platform Engineer - Ads Data Engineering (October 2016 - April 2017)
• Implemented various Spark SQL workflows for the next generation Pinterest Ads data pipeline to compute ad metrics, spend, and conversions
Tech Lead, Full Stack Engineer - Advertiser Products Team (June 2014 - October 2016)
As one of the first few engineers on the advertiser products team, I worked across every layer of the stack to build and ship Pinterest Analytics, Pinterest Ads Manager, the Ads API, and Bulk Editor from scratch. This included:
• Developing front-end components using HTML (Jinja2, React.js), JavaScript (Backbone.js, React.js), and CSS.
• Writing application code in Python (Django, Flask, SQLAlchemy).
• Maintaining and debugging databases such as MySQL and HBase.
As the tech lead of the advertiser products team, I worked on various projects which improved developer velocity, reduced technical debt, and improved the quality of the code and the products that my team shipped. On top of the engineering work, my responsibilities included setting the technical direction for the team and mentoring engineers on the team.
2013 — 2013
San Francisco, California, United States
Worked on the Twitter Cards Analytics team, implementing a portion of the Twitter Cards Analytics platform.
• Used Scala, JavaScript (Flight.js), HTML, and LESS to implement features such as the Top Metrics (URL, Tweets, Accounts) tables.
• https://blog.twitter.com/2014/introducing-analytics-for-twitter-cards
2013 — 2013
Mountain View, California, United States
Worked on the Google AdWords Reports team, implementing Auction Insights v2.
• Implemented new views in AdWords custom query engine to fetch and organize the data in the back-end
• Modified the Ads API, which issues queries to the query engine on behalf of the front-end, to support the new views
• Implemented the UI for the new feature in Java, using Google Web Toolkit
• Optimized the UI using split points to download the new feature's components asynchronously instead of on page load
2012 — 2012
Mountain View, California, United States
Worked on the LinkedIn Groups team for the first half of my internship. My project was the Group Skills feature, which surfaces the unique skills of a group based on its members.
• Wrote and optimized Hadoop jobs in Pig Latin to compute and organize the relevant data
• Pushed the computed data into Voldemort (distributed key-value storage) for real-time access
• Wrote the back-end code in Java, which fetches the unique skills of a group from Voldemort storage and serves it to the front-end using REST.li (LinkedIn's RESTful service invocation framework)
• Wrote unit tests for back-end code using TestNG and EasyMock
Worked on a team responsible for creating a new product (to be announced in the future) for the second half of my internship. While I was on this team, I worked on a variety of features, from back-end service code to user-facing web pages.
• Wrote the back-end code in Java and Scala to communicate requests made from the front-end to ESPRESSO (LinkedIn's distributed database system)
• Wrote the front-end code in Java and Scala to implement the business logic using Unicorn (LinkedIn's front-end framework) and to make necessary service calls to the back-end over REST.li
• Created AJAX endpoints, callable from the web page, for mapping and mutating data
• Developed features on the web page which called the AJAX end points using JavaScript (jQuery, YUI)
• Worked on user-facing web pages using HTML, dust.js (templating framework), and SASS
Education
University of Waterloo