Hi, this is Min! I am currently working as a Software Engineer at NMI, where I focus on integrating new payment processors and maintaining existing ones. As an engineer, I am constantly seeking opportunities to expand my knowledge and expertise.
Experience
2021 — Now
New York, United States
• Actively working on integrating new card payment processors into our system to boost company revenue. This involves assessing whether a processor's features and capabilities align with our company's requirements, implementing the necessary code from initial boarding of the processor through sending payment information to the processor, and securing required certifications before launch.
• Maintain existing payment processors by implementing new client feature requests, continuously assessing integration compliance with processor requirements, and updating systems as needed.
• Consistently provide production support by monitoring, investigating, and resolving issues that could impact our system's performance.
• Created a reporting analysis tool to capture and analyze log statistics, aiming to prevent server overloads and issues going unnoticed due to excessive logging.
• Supported QA by adding required attributes for their Cypress automation testing project and making processor responses more readable and directly accessible in Postman.
• Developed a comprehensive guide to common support issues, helping team members navigate our logging system and serving as a reference for future issue resolution.
2021 — Now
Pittsburgh, Pennsylvania, United States
• Working on the crawler development team to build new crawlers that collect information on Department of Defense spending and research, with all collected data uploaded to an Amazon S3 bucket.
• Continuing to develop and improve features for the crawler systems, such as implementing a file upload parser that uploads messages to the S3 bucket.
• Refactored and redesigned the existing code base by decoupling the Selenium web driver from the parsers, so parsers remain focused purely on extracting information from websites. Interactions with the web driver were moved into reusable action class components.
• Leveraged Docker to provide a better way to run local tests on the crawler system consistently across different environments.
• Created a round-robin proxy server pool for the MITMProxy upstream proxy to get past rate limits placed on targeted websites.
• Created a more efficient caching option for the MITMProxy hot cache using Redis, determining when requests should be stored or removed based on their retrieval frequency and recency.
• Led a multi-team effort with HR, beginning with gathering their feature requirements and then designing and implementing a web crawler and additional features aimed at identifying viable candidates for hire.
2016 — 2020
Boston, MA
• Worked with fellow tutors to create interactive activities that helped students learn math and literacy
• Assisted students who were struggling with classwork and homework
• Updated and communicated with tutors on events that had taken place at the work site
2018 — 2019
Boston, MA
• Assisted with administrative tasks and projects around the lab.
• Interacted with other students by providing information and assistance.
2019 — 2019
Manhattan, NY
• Created a bonding curve through the ZAP curation market.
• Wrote an auxiliary market smart contract in Solidity using the ZAP API, in which users can choose to invest in either the main or auxiliary market.
Education
Boston University