2021 — Now
New York City Metropolitan Area
2019 — 2019
Bengaluru Area, India
Worked for a consultancy serving Katerra India, a leading startup in the construction space, building key additions to the suite of microservices behind their product, Apollo Construct.
Wrote a bulk-downloads service for large files in Azure Blob Storage and S3: each request was offloaded to a Redis queue, with a worker processing the jobs sequentially. The service ran on an EC2 instance with a 30GB EBS volume attached; transfers from S3 to the instance were exceptionally fast since both sat in the same VPC. The customer requesting the download received an email with a link to the zipped files.
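The queue-worker flow above can be sketched as follows. This is a minimal in-process illustration of the pattern only: a Python deque stands in for the Redis queue, and a stub stands in for the actual S3/Blob-to-EBS transfer (all names here are hypothetical, not the service's real code):

```python
import json
from collections import deque

def enqueue_download(queue, request_id, file_keys):
    """Producer: push a bulk-download job onto the queue (stand-in for LPUSH)."""
    queue.append(json.dumps({"request_id": request_id, "files": file_keys}))

def process_jobs(queue, transfer_fn):
    """Worker: drain the queue sequentially, one job at a time (stand-in for a BRPOP loop)."""
    results = []
    while queue:
        job = json.loads(queue.popleft())
        # In the real service this step downloaded each file to the EBS
        # volume, zipped the batch, and emailed the customer a link.
        archived = [transfer_fn(key) for key in job["files"]]
        results.append((job["request_id"], archived))
    return results

queue = deque()
enqueue_download(queue, "req-1", ["reports/a.pdf", "reports/b.pdf"])
enqueue_download(queue, "req-2", ["drawings/c.dwg"])
done = process_jobs(queue, transfer_fn=lambda key: f"/mnt/ebs/{key}")
```

Serializing jobs as JSON keeps the queue contents language-agnostic, which matters when producers and workers are separate processes.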
Built a serverless setup to generate thumbnails on the fly whenever a PDF or an image is uploaded to an S3 bucket: an S3 event notification triggers a Lambda function that generates the thumbnails, using Ghostscript for both images and PDFs.
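The event-driven flow can be sketched by the handler's event parsing. A minimal sketch assuming the standard S3 event notification shape; the bucket and key names are hypothetical, and the Ghostscript invocation is left as a comment since it needs the gs binary:

```python
import os
import urllib.parse

def thumbnail_key(source_key):
    """Derive an output key, e.g. docs/plan.pdf -> thumbnails/plan.png."""
    base, _ext = os.path.splitext(os.path.basename(source_key))
    return f"thumbnails/{base}.png"

def handle_s3_event(event):
    """Extract (bucket, key, thumb_key) triples from an S3 event notification."""
    tasks = []
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys in S3 event notifications are URL-encoded.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # The real Lambda shelled out to Ghostscript at this point, roughly:
        #   gs -sDEVICE=png16m -dFirstPage=1 -dLastPage=1 -o /tmp/thumb.png <input>
        tasks.append((bucket, key, thumbnail_key(key)))
    return tasks

event = {"Records": [{"s3": {"bucket": {"name": "uploads"},
                             "object": {"key": "docs/site+plan.pdf"}}}]}
tasks = handle_s3_event(event)
```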
Wrote a new microservice, construct-analytics, exposing an authenticated Flask endpoint that generated keywords from the project description entered by the user, via the nlp_rake library. Dockerized the app so it could be deployed to Kubernetes.
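The extraction step can be illustrated with a simplified RAKE-style scorer. This is not the nlp_rake library itself, only a sketch of the underlying idea: split the text on stopwords, then rank candidate phrases by summed word frequency:

```python
import re

# A deliberately tiny stopword list, for illustration only.
STOPWORDS = {"a", "an", "the", "of", "for", "and", "to", "in", "on", "with", "is"}

def rake_keywords(text, top_n=3):
    """Split on stopwords, then rank candidate phrases by the summed
    frequency of their words (a simplified RAKE-style scoring)."""
    words = re.findall(r"[a-z]+", text.lower())
    freq = {}
    for w in words:
        if w not in STOPWORDS:
            freq[w] = freq.get(w, 0) + 1
    # Candidate phrases are maximal runs of consecutive non-stopwords.
    phrases, current = [], []
    for w in words:
        if w in STOPWORDS:
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    scored = {" ".join(p): sum(freq[w] for w in p) for p in phrases}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

keywords = rake_keywords("construction of a residential tower with modular construction methods")
```

Longer phrases built from frequent words score highest, which is why "modular construction methods" outranks the single-word candidates here.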
Reduced the runtime of the microservice responsible for splitting PDF files: the endpoint took around 14 minutes to split files of 400MB, and running the task in parallel on AWS Spot instances brought the split time down to about 3 minutes.
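The parallel split can be sketched by the page-range partitioning each worker would receive. A minimal sketch under the assumption that the document is divided into contiguous ranges, one per Spot instance (worker count and page totals are illustrative):

```python
def partition_pages(total_pages, workers):
    """Divide pages 1..total_pages into contiguous (start, end) ranges,
    one per worker, handing any remainder to the first workers."""
    base, extra = divmod(total_pages, workers)
    ranges, start = [], 1
    for i in range(workers):
        count = base + (1 if i < extra else 0)
        if count == 0:
            break  # more workers than pages
        ranges.append((start, start + count - 1))
        start += count
    return ranges

# e.g. a 1000-page document split across 4 Spot instances
ranges = partition_pages(1000, 4)
```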
Wrote a weather service that fetched daily and historical weather details from the Dark Sky weather API. Added a cron job, scheduled by a CloudWatch event on AWS, to cache the daily weather in MongoDB and reduce the number of API calls made to Dark Sky.
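The caching amounts to a cache-aside lookup keyed by date. A minimal sketch, with a dict standing in for the MongoDB collection and a stub for the Dark Sky call (function and field names are hypothetical):

```python
def get_daily_weather(date, cache, fetch_from_api):
    """Return cached weather for `date`, calling the upstream API only on a miss."""
    if date in cache:
        return cache[date]
    # Cache miss: hit the rate-limited weather API and store the result.
    weather = fetch_from_api(date)
    cache[date] = weather
    return weather

calls = []
def fake_darksky(date):
    calls.append(date)  # record upstream calls to show the cache working
    return {"date": date, "temp_c": 24}

cache = {}
first = get_daily_weather("2019-06-01", cache, fake_darksky)
second = get_daily_weather("2019-06-01", cache, fake_darksky)  # served from cache
```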
2018 — 2019
Bangalore
Contributed to the Node, Python and Scala backends, which were major integration points in a multi-layer microservice architecture. The app generated events based on user behaviour on a website; these events were pushed to Kafka and then used for stateful event aggregation with Apache Flink.
Worked on a client SDK written in TypeScript that could be embedded in a customer's website. The SDK was responsible for generating the click-stream events and rendering video recommendations.
Launched Recotap as a Shopify integration, providing product recommendations to e-commerce sites built on the Shopify platform. All internal reporting and recommendation services communicated with the Node backend through gRPC.
Wrote a gRPC reporting service in Python that fetched key data such as the number of visits and the conversion rate (as defined by the user) for a particular event on the website. The service queried Druid, which we used internally to store the processed events from Kafka.
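The reporting queries were essentially aggregations over the event stream. A hedged sketch of the kind of SQL such a service might build for Druid; the table and column names are hypothetical, and the query is assembled as a parameterized string rather than sent over a live connection:

```python
def build_conversion_query(table, event_name, start, end):
    """Build a parameterized SQL query counting total visits and
    conversions for one event over a time window."""
    sql = (
        "SELECT COUNT(*) AS visits, "
        "SUM(CASE WHEN event = ? THEN 1 ELSE 0 END) AS conversions "
        f"FROM {table} WHERE __time >= ? AND __time < ?"
    )
    return sql, [event_name, start, end]

sql, params = build_conversion_query("events", "purchase", "2019-01-01", "2019-02-01")
```

Keeping the user-defined event name in the parameter list rather than the SQL string avoids injection, since that value comes from outside the service.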
Bangalore
Was assigned the task of generating PDFs containing large amounts of tabular data without any loss of fidelity. Evaluated several libraries, including pdfmake, gofpdf and html2pdf, and finally settled on a Node wrapper around LaTeX, running the PDF generation in a subprocess.
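The LaTeX approach boils down to rendering rows into a tabular environment and compiling the source in a subprocess. A minimal sketch of the source generation (in Python rather than the Node wrapper used in the project, and with the compile step left as a comment since it needs a LaTeX toolchain):

```python
def latex_table(headers, rows):
    """Render headers and rows as a LaTeX tabular environment,
    escaping characters that commonly appear in tabular data."""
    def esc(cell):
        return str(cell).replace("&", r"\&").replace("%", r"\%").replace("#", r"\#")
    spec = "|".join("l" for _ in headers)
    lines = [r"\begin{tabular}{" + spec + "}",
             " & ".join(esc(h) for h in headers) + r" \\ \hline"]
    for row in rows:
        lines.append(" & ".join(esc(c) for c in row) + r" \\")
    lines.append(r"\end{tabular}")
    return "\n".join(lines)

# The real service compiled the generated source in a subprocess, roughly:
#   subprocess.run(["pdflatex", "-interaction=batchmode", "table.tex"])
src = latex_table(["Item", "Qty"], [["Cement", 40], ["Steel & Rebar", 12]])
```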
Maintained Leucinetech's infrastructure, from deployments for new clients to shell scripts that generated database backups and stored them in S3. Contributed to Node on the backend and React on the frontend, using components from Ant Design as our UI library, with nginx as a reverse proxy routing traffic to the backend.
Set up a load balancer attached to an auto-scaling group on AWS to scale the number of VMs based on the number of requests.
Played a part in refactoring the entire application to use TypeScript and TypeORM and to follow TDD. The testing setup used Jest as the test runner and Enzyme for testing React components, with Prettier as our code formatter.
Migrated the leucinetech domain from Wix to AWS Route 53 and added HTTPS to all our client servers using Let's Encrypt and certbot. Also added a crontab entry to auto-renew the certificates.
Set up a testing framework using Nightwatch.js, based on the Selenium WebDriver, so that we could start writing integration tests for our frontend React app.
Bangalore
My work focused mainly on developing the company's APIs. The tech stack largely involved Node.js, MongoDB and a number of npm packages. Had a wonderful time learning through the internship, and also automated a few workflows essential for marketing campaigns. The latter part of my internship focused on revamping the website for Wishfie.
Education
2019 — 2021
University of Colorado Boulder
Master's degree
2014 — 2018
Vellore Institute of Technology
Bachelor of Technology - BTech