# Ramyak Singh

**Software Engineer**
Location: San Francisco Bay Area, United States
Profile: https://flows.cv/ramyak

I am a senior software engineer at Qualys, and previously a software engineer at Araali Networks, where I worked with large-scale distributed databases, stream processing pipelines, and concurrent backend servers and APIs built with gRPC in Kotlin. I hold a bachelor's degree in computer science and statistics from Purdue University, where I gained valuable experience in software development, machine learning, and research.

At Araali Networks, I contributed to the design and development of a multi-stage stream processing pipeline that stores and consumes data from sources such as Kubernetes clusters and cloud services. I redesigned the backend to reduce read-side latencies by setting up a distributed caching layer with Redis and creating data pipeline operators in Kotlin to populate the cache. I also built a custom Kubernetes controller in Go to relay and store the state of services, pods, and controllers inside the cluster, and gained DevOps experience containerizing and deploying applications with Docker and Kubernetes.

I am passionate about data engineering and gRPC, and always eager to learn new technologies and skills. I enjoy working in a collaborative, innovative environment where I can bring diverse perspectives and experiences to the team and help solve complex, impactful problems.
## Work Experience

### Senior Software Engineer @ Qualys
Jan 2023 – Present | San Francisco Bay Area

### Software Engineer @ Araali Networks
Jan 2020 – Jan 2023 | San Francisco Bay Area
- Worked with large-scale distributed databases such as Cassandra and DynamoDB; helped design and develop a multi-stage stream processing pipeline to store the data, plus concurrent backend servers/APIs built with gRPC in Kotlin to consume it efficiently
- Redesigned the backend to reduce read-side latencies by setting up a distributed caching layer with Redis and creating data pipeline operators in Kotlin to populate the cache
- Built a custom Kubernetes controller in Go to relay and store information about the state of services, pods, and controllers inside the cluster
- Gained DevOps experience, including writing Dockerfiles to containerize applications, Kubernetes manifests to deploy them, and Gradle build files for Java/Kotlin projects
- Set up GitOps on Kubernetes clusters by integrating ArgoCD with Jenkins pipelines, and helped build a monitoring infrastructure using open-source tools such as Prometheus for scraping metrics from hosts and pods, together with Grafana for dashboards and alerting
- Created an end-to-end automated testing suite for a distributed system in Python with the PyTest framework, using configuration tools such as Ansible and AWS CloudFormation
- Built front-end applications with Vue.js and TypeScript

### Web Developer @ Purdue C Design Lab
Jan 2019 – Jan 2019 | West Lafayette
- Created an interactive, curated search engine for wearable technologies
- Worked full stack as the sole software developer, using React and Redux on the frontend and Python, Flask, and MySQL on the backend, gaining valuable design experience

### Software Development Intern @ Facebook
Jan 2018 – Jan 2018 | Menlo Park, California
- Worked on a team building the interfaces for resolving advertiser issues, in a web development role using PHP, React, Redux, and RxJS
- Developed and integrated mechanisms for version control and user-permission configuration in a collaborative editing platform

### Student Research Intern @ Purdue CAM2
Jan 2017 – Jan 2017 | West Lafayette, Indiana
- Researched neural-network-driven approaches to object tracking and detection
- Worked with convolutional neural network architectures such as SSD and YOLO in deep learning frameworks like Caffe, optimized for image detection
- Contributed to the 1 billion images project, which aims to process one billion images in under a day using the frameworks above

### Summer Intern @ KPMG
Jan 2016 – Jan 2016
- Learned the basics of the Business Intelligence and Data Warehousing framework
- Worked mostly in policy consulting and gained insight into the processes involved in business transactions

## Education

### Bachelor's Degree in Computer Science and Statistics
Purdue University
Jan 2015 – Jan 2019

## Contact & Social
- LinkedIn: https://linkedin.com/in/ramyaksingh

---
Source: https://flows.cv/ramyak
JSON Resume: https://flows.cv/ramyak/resume.json
Last updated: 2026-03-22