# Qian L.

Senior Software Engineer at AlertMedia

Location: Katy, Texas, United States
Profile: https://flows.cv/qianl

A self-motivated problem solver with 7+ years of professional software engineering experience in big-data processing applications for both the Cerner cloud-based HealtheIntent platform and the HealtheCRM solution. Served as a key software engineer in the infrastructure design and software development of the HealtheCRM solution, including its front-end HI-console tool.

Main Skills: Java, Spark, Scala, Ruby, React, AWS Lambda/Glue/AppFlow, Hadoop, Hive, HBase, Oozie, Crunch, Splunk, Docker, Apex

## Work Experience

### Senior Software Engineer @ AlertMedia

Jan 2022 – Present | Remote

Key Achievements:

**Optimized Contract Synchronization Workflow**

- Replaced an unstable workflow syncing contract information between Salesforce and the AlertMedia database with a reliable daily Django task.
- Achieved synchronization for 4,000+ clients without requiring any changes to the existing database schema, ensuring minimal disruption to the current infrastructure.

**Pivotal Role in AlertMedia Travel Risk Management**

- Led the design and implementation of database schemas and RESTful API endpoints for the new AlertMedia Travel Risk Management product.
- Integrated 7+ third-party travel systems, implementing AWS Lambda functions and APIs to handle incoming data.
- Managed the ingestion of thousands of travel records daily, contributing to the product generating millions in revenue last year.

**Innovative AI Solutions Using LangChain**

- Spearheaded a hackathon project using LangChain to develop an AI-powered dynamic group filter.
- Enabled users to generate custom filters by simply describing their requirements, improving user experience and functionality.
- Authored effective AI prompts and demonstrated an in-depth understanding of LangChain's application within the project.
### Sr Software Engineer / Software Engineer III @ Cerner Corporation

Jan 2018 – Jan 2022 | Kansas City, Missouri Area

Played a key senior software engineer role in the infrastructure design and software development of the HealtheCRM solution, including its front-end client configuration tool component.

- Designed and developed batch data processing infrastructure comprising 14 pipelines for 16+ clients.
- Led the design and implementation of the data processing integration-test infrastructure for the HealtheCRM solution.
- Developed an HBase-based data service with 10 endpoints that fetch millions of records.
- Developed RESTful APIs in Ruby on Rails for 5 services; one of them pushes millions of records from the Cerner HealtheIntent Platform to 20+ Salesforce Health Cloud objects.
- Developed a Ruby API engine and React.js UI for the HealtheCRM solution in the Cerner HealtheIntent console tool, used by 16+ clients to configure settings for 7 entities.
- Hosted solution-infrastructure knowledge sessions for new HealtheCRM team members and mentored new software engineers in the Cerner Dev Academy Program.
- Developed AWS Glue jobs to convert Avro data outputs to CSV-formatted data.

### Software Engineer II @ Cerner Corporation

Jan 2016 – Jan 2018 | Kansas City, Missouri, United States

- Broke down a giant pipeline into several small, entity-specific pipelines, reducing cluster memory usage from 14 GB to 4 GB. Personally developed 3 of the 7 entity pipelines, including generic Apache Crunch DoFns and Processors shared across pipelines.
- Modified a data processing pipeline to integrate a 3M third-party API JAR and developed the Potentially Preventable Readmissions (PPR) grouping function. Optimized the algorithm for invoking PPR logic, generating 10% more PPR outcomes.
- Designed the data model for the Operational Data Warehouse module and developed a configurable pipeline to extract the required operational data in CSV format.
Integrated it with the main Apache Oozie workflows and created weekly reports for 80+ clients.

### Software Engineer I @ Cerner Corporation

Jan 2013 – Jan 2016 | Kansas City, Missouri Area

- Developed a configurable command to delete redundant data in the Hadoop Distributed File System (HDFS), cleaning up gigabytes of data monthly across different environments and regions.
- Migrated Java RESTful API calls from running ETL pipelines to a Ruby script usable in the release process, reducing data processing time for 80+ clients.
- Split 1 Jenkins job into 4 parallel Jenkins jobs, cutting total running time by more than 50%.
- Developed 20+ Apache Crunch DoFns and MapFns that modify data frames in the ETL process.
- Onboarded 30+ clients with customized configuration settings. Created a checklist template and wiki covering all steps for developers and consultants, which was adopted in the subsequent onboarding process.

### Hadoop Research Assistant @ Carnegie Mellon University

Jan 2013 – Jan 2013

- Advised a professor on selecting a programming environment, implementing a Hadoop cluster, and explaining sample applications.
- Documented instructions for installing a Hadoop programming environment locally and on Amazon Web Services (AWS).
- Designed and created Java-based MapReduce programming assignments.

### Hadoop Researcher (Intern) @ Excellence.com.cn Research Institute

Jan 2012 – Jan 2012 | Guangzhou, China

- Prepared a comparative analysis of Hadoop and alternative technologies.
- Set up a Hadoop cluster with Secure Shell (SSH) configuration across 10 Red Hat servers.
- Ran the sample "Word Count" MapReduce application on the experimental Hadoop cluster.
- Wrote an installation guide and trained three new interns.
## Education

### Master's in Management Information Systems

Carnegie Mellon University
Jan 2011 – Jan 2013

### Bachelor's in Management Information Systems

Beijing Jiaotong University
Jan 2007 – Jan 2011

## Contact & Social

- LinkedIn: https://linkedin.com/in/ansonlq

---

Source: https://flows.cv/qianl
JSON Resume: https://flows.cv/qianl/resume.json
Last updated: 2026-03-22