Highlighted Coursework: Foundations of Computer Science, Introduction to Operating Systems, Introduction to Computer Architecture, Big Data, Network Security, Introduction to Java, Design and Analysis of Algorithms, Principles of Database Systems, Virtual and Augmented Reality, Cloud Computing
Led and mobilized a high-performing team of engineers focused on scaling, improving, and maintaining high-throughput streaming applications and batch data jobs for the supply side of the platform.
• Developed the team into subject matter experts in the reporting systems, enabling them to support external teams with their reporting needs.
• Built the team up from a single direct report with limited knowledge of streaming and batch data processing into subject matter experts through independent learning and engagement with external teams.
• Applied lessons learned from building the team to help other teams learn the systems going forward.
• Drove adherence to customer SLAs for data freshness, response time, and system availability by adding rigorous monitoring tooling, strict policies, and 24/7 pager rotation coverage to batch data jobs, Kafka pipelines, and APIs.
• Created and executed quarterly Scrum plans covering the scope to be developed and released.
• Removed impediments, managed conflict, and inspired team collaboration to help achieve sprint objectives.
Key Achievements
• Improved data freshness from minutes to seconds and reduced round-trip frontend response time by 60% by partnering with database administrators, product managers, and other dependent cross-functional teams to migrate analytical databases from Vertica to SingleStore.
• Reduced load on Kafka streaming pipelines by 80% and eliminated monthly client-impacting incidents caused by pipeline volume by designing and implementing load-reduction strategies with product managers and client services teams.
• Delivered data-driven insights through monitoring and troubleshooting tools used daily by international clients by designing and implementing new SQL materialized views that combine data from relational and analytical databases into a single OLAP database.
• Clearly communicated lifetime-aggregated and time-series data for terabyte-scale datasets by building batch data jobs with Hadoop.
• Propelled effective real-time advertising strategies for clients by providing high-granularity data through a new Apache Kafka pipeline, written in Java, that inserts high volumes of streaming data into Vertica at hundreds of MB/s.
• Optimized query-time joins for the fastest possible client-side retrieval by configuring multi-terabyte OLAP tables.
Course providing a comprehensive overview of Computer Science fundamentals to prepare students with a non-technical undergraduate degree for a master’s degree in computer science.