I lead software development for next-generation BCI applications, including the Blindsight and Convoy projects. I write code that demonstrates what visual cortex stimulation can achieve through brain-computer interfaces, and I create the tools our Animal Care Team relies on every day to collect data and run interactive games with our non-human primates (NHPs).
A major focus is building desktop apps with TypeScript, React, and Vite for PRIME study participants. These let them control a virtual arm in 3D space in a straightforward, natural way while capturing clean, labeled data we use to train models for accurate arm and hand representation. I work closely with participants almost daily, incorporating their feedback immediately and shipping updated software every day.
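To give a concrete flavor of the data-capture side, here is a minimal TypeScript sketch of recording labeled samples each frame while a participant steers the virtual arm. Every type and name is illustrative, assumed for this example rather than taken from the actual PRIME codebase:

```typescript
// Hypothetical shapes: a decoded neural feature vector paired with the
// ground-truth virtual arm pose at the moment it was captured.
interface ArmPose {
  jointAngles: number[]; // e.g. shoulder/elbow/wrist angles in radians
  timestampMs: number;
}

interface LabeledSample {
  neuralFeatures: number[]; // decoded feature vector for this frame
  targetPose: ArmPose;      // the pose the participant was asked to reach
  actualPose: ArmPose;      // the pose the virtual arm actually held
}

// Collects one labeled sample per render frame while a trial is active.
class TrialRecorder {
  private samples: LabeledSample[] = [];

  record(features: number[], target: ArmPose, actual: ArmPose): void {
    this.samples.push({
      neuralFeatures: features,
      targetPose: target,
      actualPose: actual,
    });
  }

  // Export the finished trial as JSON for a downstream training pipeline.
  serialize(): string {
    return JSON.stringify(this.samples);
  }
}
```

Pairing each decoded frame with both the instructed target and the arm's actual pose is what makes the dataset "labeled": the target gives the supervision signal, and the actual pose records what the model produced.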
For the Convoy trials, I also write the hardware control code that safely operates the assistive robotic arm, along with the participant-facing software that translates their neural signals into real-time robot control. The work balances fast, iterative, user-centered apps that adapt to direct participant input against reliable, safety-focused systems that command physical hardware.
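On the hardware side, the core safety idea can be sketched as a clamp-and-watchdog layer sitting between decoded intent and robot commands. The limits and names below are assumptions for illustration, not Convoy's actual parameters:

```typescript
// Hypothetical velocity command in the robot's base frame (m/s per axis).
interface VelocityCommand {
  vx: number;
  vy: number;
  vz: number;
}

const MAX_SPEED_MPS = 0.25;  // assumed per-axis velocity limit
const STALE_SIGNAL_MS = 100; // halt if no fresh decode arrives in this window

function clamp(v: number, limit: number): number {
  return Math.max(-limit, Math.min(limit, v));
}

// Translate a decoded intent into a command the arm is allowed to execute.
function toSafeCommand(
  decoded: VelocityCommand,
  lastDecodeMs: number,
  nowMs: number,
): VelocityCommand {
  // Watchdog: a stale neural signal means stop immediately, never coast.
  if (nowMs - lastDecodeMs > STALE_SIGNAL_MS) {
    return { vx: 0, vy: 0, vz: 0 };
  }
  return {
    vx: clamp(decoded.vx, MAX_SPEED_MPS),
    vy: clamp(decoded.vy, MAX_SPEED_MPS),
    vz: clamp(decoded.vz, MAX_SPEED_MPS),
  };
}
```

The design choice this illustrates is that safety lives in a thin, easily audited layer: no matter how the decoder or the participant-facing app evolves, every command to the physical arm passes through the same bounded, fail-stop gate.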