An experienced Software Engineer with a B.Sc. in Computer Engineering who focuses mainly on deep learning model deployment, diverse backend solutions, and architecture optimization.
Experience
2024 — Now
San Francisco, California, United States
2023 — 2024
San Francisco Bay Area
2021 — 2022
One Concern's goal is to build the most accurate, insightful 'Digital Twin' of the world's infrastructure, revealing hidden risks posed by natural disasters, extreme weather, and climate change. Customers then use these insights to make better financial decisions regarding their investments.
• Designed, built, and deployed production ETL pipelines for our flagship feature, which analyzes the effects of natural hazards on power network infrastructure, specifically in the US and Japan.
• Collaborated with data scientists, product, and other SCRUM teams to design and build the power network infrastructure; this also required our sub-team to coordinate between two of the larger engineering teams.
• Designed and optimized multiple 'v2' power network pipelines focused on faster, cheaper workflows for producing the analyses our clients requested, resulting in at least a 30% reduction in production costs for running the pipelines. To achieve this, we adopted a more modular design, leveraged SnowSQL while reducing transaction costs, and rewrote scripts to produce the same results with less memory.
• Used Datadog to optimize and sanity-check our new, cost-effective workflow. Guided by these workflow metrics, I reduced overall memory usage, maximized our allotted CPU/memory budget, and cut container running times through multithreading and multiprocessing.
• Developed and maintained database schemas, in both PostgreSQL and SnowSQL, for cross-team consumption.
• Onboarded newer team members, reviewed their data output for debugging, and created instructional documentation for running our pipelines.
Languages: Python (Main), SQL, Bash
Technology: Kubernetes, Argo, Helm, PostgreSQL, SnowSQL, Docker, Datadog, Make, Unittest/Pytest, GitHub
2019 — 2020
San Francisco Bay Area
Edison AI is a product built by our team to aid data scientists, both within and outside GE Healthcare, in developing, training, testing, and deploying their deep learning models.
My responsibilities and accomplishments:
• Primarily developed the backend cloud platform using AWS Lambda, S3, and API Gateway, while also optimizing existing modules for a cleaner, faster workflow.
• Collaborated with multiple SCRUM teams to transfer capabilities of the cloud platform to an on-prem environment.
• Completely redesigned the cloud workflow for on-prem capabilities, including redesigning and implementing the architecture of our sub-team's portion of the product.
• Delivered extra features ahead of the given deadline so they could be showcased to customers. The main deliverable used orchestration tools to enable parallel multi-GPU model training and testing.
• Built various customer-facing modules for our orchestration tool, along with Bash scripts, enabling an easier, 'hands-off' approach for customers.
Languages used: Python (Main), Java, Perl
Technology used: Kubernetes, Helm, Airflow, Docker, MinIO, AWS (SageMaker, S3, ECS, Lambda), Unittest/Pytest, GitHub
Non-technical Accomplishments:
• Planned and taught a two-week beginner AI course for incoming EEDP students.
• Documented, tested, and led brown-bag sessions to educate other team members on the aforementioned components that were added to the platform.
• Worked with international teams to leverage our technology, which required strict scheduling and precise communication to attain our goals.
2017 — 2019
Greater Milwaukee Area
The Edison Engineering Development Program consists of a small group of entry-level engineers across multiple disciplines who go through four rigorous 6-month rotations across different modalities within GE Healthcare. To graduate, one must complete a semester-long project, professional seminars, and 2 years of post-graduate courses focused on the inner mechanics and theories surrounding healthcare technology.
1st Rotation (MKE): PET/CT (Jul 17 - Dec 17)
• Enhanced an existing application used on the PET/CT scanner, based on input gathered from our customers (mainly research labs)
• Designed the revamped application with customers, created and logged validation procedures for FDA approval, and gave regular debriefings to the PET/CT team
Tech: Eclipse, PuTTY, Java, Bash scripts, Red Hat (via VirtualBox), Perl
2nd Rotation (MKE): X-Ray Tubes (Jan 18 - Jul 18)
• Created a tool using JSON as the protocol to bridge an existing Python application, automating testing for an existing X-ray tube device monitor (Java)
• Drastically reduced the hours of human intervention needed for testing and produced cleaner, more reliable data; the tool is currently used on the manufacturing floor
Tech: VS Code, Eclipse, Java, Python, Bash, LabVIEW
3rd Rotation (Bay Area): Edison AI Device (Jul 18 - Dec 18)
• Collaborated on a SCRUM team to develop an application placed on medical devices to make analytical predictions using AI models
• Created security protections for the input API and environment variables, wrote unit tests for various modules, and produced detailed documentation
Tech: VS Code, Python, Pytest, Unittest, Postman, GitHub
4th Rotation (Bay Area): Edison AI Cloud (Jan 19 - Jul 19)
• Integrated deep learning models into our platform for use across different modalities, using Docker images for deployment; developed cross-functionally with program/product managers and data scientists
• Implemented different forms of storage (S3/MinIO) that interact with Docker for scalability across environments
Tech: VS Code, Python, Docker, Pytest, virtualenv, Postman, GitHub
Education
University of Maryland, Baltimore County