Research Assistant on using Virtual and Mixed Reality Humans for healthcare soft skills training
Completed PhD in Summer 2017.
* Designed, developed, and conducted two user studies involving over 150 healthcare professionals, including nurses, surgical technicians, and anesthesia residents. The studies investigated how to improve communication skills in team-based environments using interactive mixed reality humans.
* Developed a new team training system that enables mixed reality humans to respond intelligently to user voice input, using Google Cloud Speech Recognition and the Microsoft Language Understanding Intelligent Service (LUIS).
* Collaborated with UF Health faculty to design team training scenarios.
* Led weekly conference calls with an interdisciplinary team of computer scientists and healthcare professionals.
* Assisted with porting the mixed reality human technology from the Ogre3D rendering engine to Unity.
* Maintained and improved the mixed reality human codebase.
* Designed and developed a Wizard of Oz interface that aided the proctor in deploying and operating mixed reality team training scenarios.
* Assisted with transferring the mixed reality human technology to UF Health Science Library IT.
* Led the initial investigation into the development of a personalized virtual reality training system.
Conducted research on virtual humans and virtual patients in the context of medical training.
* Led the initial investigation into rendering 3D virtual humans on the web using WebGL.
* Developed a redesigned text-based chat interface for interacting with virtual humans on the web. This interface has been used by hundreds of medical students across the United States.
* Collaborated with faculty at the Medical College of Georgia to design a user study of 70 first-year medical students that investigated how virtual patients influence empathy.