# Kaustubh S.

> Software Engineer 2 @ SoFi | Python, AWS, Kubernetes

Location: San Francisco Bay Area, United States
Profile: https://flows.cv/kaustubhs

At SoFi, I work as a Software Engineer 2, focusing on API development using Python, AWS, Kubernetes, Docker, and microservices. My contributions include leading the BillPay Switch, developing multiple APIs using AWS technologies, and enhancing backend services for financial transactions such as Wire Transfers and ACH simulations. I earned a Master of Science in Computer Science from Arizona State University and have honed my skills in REST APIs, the SDLC, and machine learning. My previous experience includes building cloud-based solutions with Azure at MAQ Software and creating DevOps pipelines during my Insight Data Science Fellowship. I am committed to advancing the software engineering field by solving real-world challenges through innovative and scalable solutions.

## Work Experience

### Software Engineer 2 @ SoFi
Jan 2022 – Present | Seattle, Washington, United States

At SoFi, I played a pivotal role in API development, leveraging technologies such as Python, AWS, and Docker. My contributions included leading the BillPay Switch project and creating essential APIs like the Token Service and Events API. I also enhanced backend services for Wire Transfer and ACH Transaction Simulation, focusing on security improvements.

### Software Engineer @ MAQ Software
Jan 2020 – Jan 2022 | United States

- Used the Azure cloud to build solutions for customers.
- Completed the Azure DP-200 certification and a Power BI certification.
- Interacted with clients to understand their needs and provide useful solutions and insights.

### Fellowship @ Insight Data Science
Jan 2020 – Jan 2020 | San Francisco Bay Area

Fellow, Insight Data Science, San Francisco

- Created a DevOps pipeline for databases using EKS, ECR, CircleCI, SQLAlchemy, and PostgreSQL to manage databases from development to QA, and from QA to production.
- Implemented database migrations with traceability of database changes and the ability to roll back, by creating tests in the dev, QA, and production environments.

### Enterprise Applications Intern @ ON Semiconductor
Jan 2020 – Jan 2020 | Phoenix, Arizona

Responsible for regular data analytics and report generation of key performance metrics, assisting project teams with one-off tasks when available, and assisting with continuous delivery.

### SDE Intern @ Amazon Web Services (AWS)
Jan 2019 – Jan 2019 | East Palo Alto

Amazon Web Services, East Palo Alto: Software Development Engineer Intern (TensorFlow team), May–August 2019

1. Worked on SageMaker-Debugger (https://github.com/awslabs/sagemaker-debugger), which helps analyze the tensors generated during a neural network training job.
2. Created an Index Writer class for the project so that tensors saved for analysis can be fetched directly from the exact location where they are stored. Also created utility functions that can later be used to fetch files for an exact step, and integrated the Index Writer with the Index Reader. Tensor-fetch speed improved dramatically because of index writing.
3. Fixed several bugs in the project, such as: detecting the end of a training job, raising a runtime error if the same directory structure is reused, and updating tests for more generalized usage.
4. Created a CI/CD system to run all tests for each PR and upload reports and wheels to S3. It runs integration and unit tests for each pull request, publishes the results back to the corresponding pull request, creates pip wheel packages from the alpha and master branches, and publishes the pip packages to the corresponding S3 locations if the tests succeed. The build sends a Chime notification, via a Lambda function, to the deep-engine group with the warnings, errors, and info generated during the build, as well as a link to the build logs.
5. Learned a lot about TensorFlow and about AWS tools like CloudWatch, Lambda, CodeBuild, CodePipeline, and S3.

### Intern @ Drivedge
Jan 2016 – Jan 2016 | Pune

Project Management Portal (Drivedge Infosolutions, Pune, 13 June – 11 July 2016)

Worked on a Project Management Portal that lets employees submit project reports and their daily work logs online, while the employer/manager can manage project details, add and manage employees on a project, and see each employee's project progress.

## Education

### Master of Science - MS in Computer Science
Arizona State University

### B.Tech in Information Technology
Vellore Institute of Technology

## Contact & Social

- LinkedIn: https://linkedin.com/in/kaustubh-sardar
- GitHub: https://github.com/kaustubhsardar

---

Source: https://flows.cv/kaustubhs
JSON Resume: https://flows.cv/kaustubhs/resume.json
Last updated: 2026-03-29