# Bohdan Vitomskyi

> Staff Software Engineer at Samsung Electronics

Location: Walnut Creek, California, United States
Profile: https://flows.cv/bohdanvitomskyi

Software Engineer with 20+ years of experience specializing in backend, data, and platform engineering. Skilled in independently delivering end-to-end projects, from design and infrastructure setup on cloud providers to coding, deployment, and monitoring. Proven ability to take ownership of the entire development lifecycle and ensure successful project completion. Holds a Master's degree with honors in Computer Science and Mathematics, earned in Ukraine.

## Work Experience

### Staff Software Engineer @ Samsung Electronics America
Jan 2021 – Present

- Implemented robust and scalable backend services for Samsung TV and phone apps.
- Designed and led backend services for Samsung's Highlights app, including microservices for the mobile app and CMS, a serverless video-encoding system, data ingestion, infrastructure, and CI/CD (NestJS/TypeScript, AWS, Terraform, k8s).
- Implemented and owned one of the backend services of the cloud gaming platform, which lets end users play games on a TV without a gaming console (NestJS/TypeScript, AWS, Terraform, k8s).
- Participated in building a backend service to support NFTs in the TV app (NestJS/TypeScript, AWS, Terraform, k8s).
- Helped other teams with infrastructure and CI/CD for their projects (AWS, Terraform, k8s).
### Staff Software Engineer @ Samsung Next
Jan 2020 – Jan 2021

- Implemented the Video Annotation and Training AI Pipeline (GCP, NodeJS, ReactJS, Python, Kubernetes).
- Implemented the Video On Demand service for uploading and streaming videos (GCP: GCS, Transcoder API, BigQuery, GKE; NodeJS, Kubernetes).

### Senior Software Engineer @ Samsung Next
Jan 2019 – Jan 2020

- Designed and implemented the Data Platform on GCP for the Data Engineering team (Pub/Sub, GCS, BigQuery, Dataproc/Spark, Composer, Python, Terraform).

### Senior Data Engineer @ Anki
Jan 2018 – Jan 2019 | San Francisco Bay Area

- Designed the new version of the Anki Data Platform.
- Implemented the Anki Data Lake (AWS S3, Glue Data Catalog, Spark/Parquet) and access to it (AWS Athena, Redshift Spectrum, EMR/Spark).
- Added EMR/Spark as a processing engine (run on demand by Airflow) to the Data Platform to offload resource-intensive parts of the ETL from Redshift.
- Designed and implemented ETL, data modeling, and data quality for a new product, the Vector robot, to support product analytics and business needs.
- Migrated the legacy ETL for the Cozmo robot to the new EMR/Spark/Airflow framework.

### Data Engineer @ NerdWallet
Jan 2016 – Jan 2017 | San Francisco Bay Area

- ETL: created automated data-processing and monitoring pipelines (Bash, Python, Airflow); extracted data from various sources (mostly APIs), transformed it, and loaded it into the database (Amazon Redshift).
- Data modeling: contributed to data modeling (e.g. SCD Type 2) for new tables in the existing DWH.
- Data quality assurance: created data-quality-check scripts for data profiling to discover inconsistencies and other anomalies, improve data quality, and send email alerts when data issues occur.
### Backend/Data Engineer @ Accuen
Jan 2013 – Jan 2016 | Greater Chicago Area

- Data Warehouse: built and maintained the DWH infrastructure (S3, EC2, PostgreSQL, Redshift, BigQuery, Spark, Python, Airflow, Ansible).
- ETL: created tools and services for automated data-pipeline processing and monitoring (Python, Airflow); extracted data from various sources (APIs, SFTP servers, S3, and Google Storage), transformed it, and loaded it into DBs (PostgreSQL, Amazon Redshift, Google BigQuery, Spark), adding >100 GB per day.
- Accuen Platform: a programmatic media-buying marketplace for publishers, agencies, and advertisers (Python, Django, PostgreSQL, Amazon Redshift, RQ, Supervisor, etc.); participated in building the working prototype and in the early stage of development.
- Analytic Products (AP) Console: a web console for creating, executing, listing, and editing database queries (Amazon Redshift, Google BigQuery, PostgreSQL) for data analysis, and for viewing/downloading query results; developed parts of the AP Console website (Django) and the services (Python/RQ/Supervisor) that execute queries created in the web console against the DBs and return the results to it.
- DB administration: PostgreSQL databases, an Amazon Redshift cluster (53 dc1.8xlarge nodes), and Google BigQuery; managed schemas, tables, users, and permissions; wrote SQL code.
- Daily AWS/Linux administration.
- DevOps (Ansible).

### Full Stack Engineer @ American Health Service
Jan 2012 – Jan 2013

- ETL (Python): extracted data (products, orders, prices, etc.) from e-commerce websites, eBay, and Amazon Marketplace, transformed it, and loaded it into the warehouse database (Sybase); developed services that monitor orders, prices, possible fraudulent purchases, and low margins, and send alerts via email.
- Developed tools (Python) that create shopping listings (TheFind, Google Shopping, Bing Shopping): extracted data from the warehouse database, transformed it, loaded it via web service or CSV files, and uploaded feeds via FTP; also built other business and financial analytics programs.
- Windows administration.
- Website support: changes to the web design of the company's e-commerce websites (HTML, CSS, JavaScript).

### Software Engineer @ Self-Employed
Jan 2000 – Jan 2012

## Education

### Master's degree (with honors) in Mathematics and Computer Science
Ternopil National Pedagogical University

## Contact & Social

- LinkedIn: https://linkedin.com/in/bohdanv

---

Source: https://flows.cv/bohdanvitomskyi
JSON Resume: https://flows.cv/bohdanvitomskyi/resume.json
Last updated: 2026-03-23