Hello! I'm CJ, a software engineer at Meta!
My career began in iOS development. I had the privilege of interning at a tech agency, where I designed and implemented features using Apple's RealityKit framework in Swift.
MTIA (1 year) - Meta Training and Inference Accelerator - Meta's custom AI accelerator, built from scratch and tailored to our ML and AI workloads. Our domain is the software stack sitting underneath PyTorch and on top of chip firmware: we specialize in model lowering with graph and compiler optimizations, smooth runtime integration for fast inference and training jobs, and hardware-specific kernel writing in a variety of styles (manual C kernels, Triton kernels, and Inductor-generated kernels).
All of this leads to 2-3x better perf/cost running on our chips versus competitors', creating savings that scale and opening the door to new innovation in the AI chip space.
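To give a flavor of what "graph optimization" means in a stack like this (this is a toy sketch, not MTIA's actual compiler): a common pass is operator fusion, where adjacent elementwise ops are collapsed into one kernel so the intermediate tensor never hits memory.

```python
def fuse_mul_add(graph):
    """Toy fusion pass: a graph is a list of ("op", ...) tuples.

    Collapses a ("mul",) immediately followed by an ("add",) into a single
    ("fused_mul_add",) node, so the multiply's output is never materialized.
    """
    fused = []
    i = 0
    while i < len(graph):
        if i + 1 < len(graph) and graph[i][0] == "mul" and graph[i + 1][0] == "add":
            fused.append(("fused_mul_add",))
            i += 2  # consumed both nodes
        else:
            fused.append(graph[i])
            i += 1
    return fused

print(fuse_mul_add([("load",), ("mul",), ("add",), ("store",)]))
```

Real compilers do this over typed IR with dependence analysis, but the payoff is the same idea at every scale: fewer trips to memory per unit of compute.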
Wearables (current) - With the release of the Ray-Ban Display glasses, there is a whole new element of interaction with technology. We now ship EMG-controlled devices that allow a seamless, non-disruptive way to interact with technology. This input modality brings a unique set of problems not solved before, and I specifically work on the handwriting aspect.
I spent 1 year on MTIA before making the exciting switch to the Meta Neural Band for the Ray-Ban Display glasses. Going from maximum compute in large data centers to a heavily resource-constrained device is all about scaling down our models and writing lightweight, performant code. It's been very cool to work on model optimizations for both very large models and very small ones: the concepts overlap heavily, but the applications differ greatly. Excited to see all the future use cases of EMG tech (and to help make them happen)!
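One concrete example of an optimization that shows up at both scales is quantization: storing weights in low-precision integers instead of floats. Here's a minimal pure-Python sketch of symmetric int8 quantization (a generic illustration of the concept, not the on-device pipeline):

```python
def quantize_int8(weights):
    """Symmetric per-tensor quantization of a list of floats to int8 range."""
    qmax = 127  # int8 symmetric range is [-127, 127]
    scale = max((abs(w) for w in weights), default=0.0) / qmax or 1.0
    # Round each weight to the nearest step of `scale`, clamped to the range.
    q = [max(-qmax, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the quantized values."""
    return [v * scale for v in q]

q, scale = quantize_int8([0.5, -1.0, 0.25])
print(q, scale)
```

On a datacenter accelerator this buys throughput; on a wearable it buys battery and memory. Same trick, very different reasons to reach for it.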