James

I’m a PhD student in the field of Human-Computer Interaction (HCI), part of the GEMINI project team at Lancaster University, supervised by Prof. Hans Gellersen. In GEMINI, we investigate the coordination of eye, head and body to design multimodal interfaces that better reflect the interplay of gaze and movement, and that let users interact more naturally in extended reality (XR), using their eyes and body in concert.

In my research, I apply machine learning (ML) and signal processing to model human behaviour as context for interaction, with a focus on developing novel eye-head-based interaction techniques. I aspire to a future in which ML-enabled devices enhance human capacities, interfacing in intuitively human ways.

Biography

Baosheng, also known as James, has a background in biomedical engineering (CV). He earned his Bachelor’s from the University of Auckland, modelling the fluid dynamics of the human vocal tract under Prof. Richard Clarke and Prof. John Cater. He interned at the Auckland Bioengineering Institute, where he used machine learning to predict soft tissue deformation for cancer imaging under Dr. Duane Malcolm, and evaluated motion capture technologies for the shoulder complex, supervised by Dr. Kumar Mithraratne and Dr. Ted Yeung.

James completed his Master’s at the Technical University of Denmark (DTU), researching hybrid brain-computer interfaces under Prof. Sadasivan Puthusserypady and Prof. John Paulin Hansen; their paper received the Best Paper Award. He also contributed as a bioinformatics programmer on a gold medal-winning iGEM team, advised by Prof. Christopher Workman.

After his MSc, James worked as a research assistant at DTU under Dr. Fiona Bríd Mulvey and Prof. Per Bækgaard, where, in collaboration with occupational therapists, he helped develop an XR-headset-based smart visual aid for visually impaired patients. Currently, as a PhD student in Prof. Hans Gellersen’s lab at Lancaster University, he applies machine learning and signal processing to develop classifiers for eye-head movements. During his PhD, he interned at Google AR with Dr. Mar Gonzalez-Franco and Lucy Abramyan.

Using an experimental approach, James explores various interfacing technologies, including eye tracking, virtual reality and XR headsets, EOG glasses, and EEG. He has co-authored publications at CHI, INTERACT, ETRA, and COGAIN. Across James’s diverse projects and interdisciplinary endeavours, a consistent theme prevails: harnessing technology and the power of computational modelling to enhance people’s lives.