Physiological computing, artificial intelligence and empowering our capability
Physiological computing is an emerging research area that can help to boost disability technology innovation. Dr Youngjun Cho is a pioneer in this field, and is also helping to connect ideas and information to push forward innovation in accessible assistive technology and interaction (AATI), in turn empowering our capability.
Artificial intelligence (AI)-powered physiological computing looks at technology that can help us listen to our bodily functions and psychological needs. Dr Youngjun Cho is a world leader in this area of research, which starts with physiological sensing: measuring cardiovascular, respiratory, cortical, perspiratory or pupillary patterns. Heart rate monitoring, for example, is one of the most powerful features of smartwatches and fitness trackers.
With AI and computer vision technologies, such physiological activities can also be measured without wearable devices. This is good for people with certain chronic conditions who often find wearable devices uncomfortable. One way of doing this is remote photoplethysmography (remote-PPG), a contactless way to measure certain aspects of the human body with just a webcam or smartphone camera. To measure cardiovascular activity through a video camera, for example, remote-PPG tracks colour and brightness variations in the skin, which reflect haemoglobin absorptivity levels; physiological factors and parameters can then be estimated from these variations. Infrared thermal imaging is another promising remote physiological sensing channel, and Cho has pioneered an approach to mobile imaging-based physiological sensing and stress recognition.
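To illustrate the principle, the sketch below estimates heart rate from a sequence of cropped face frames. It is a minimal illustration only: it assumes the skin region has already been located by some face detector, and the frame rate, detrending window and frequency band are illustrative choices, not a description of Cho’s remote-PPG method.

```python
# Minimal remote-PPG sketch (illustrative assumptions, not Cho's method):
# average the green channel of each cropped skin frame, remove the slow
# baseline, and pick the dominant frequency in a plausible heart-rate band.
import numpy as np

def estimate_heart_rate(frames, fps):
    """Estimate heart rate (BPM) from a sequence of HxWx3 face-ROI frames."""
    # 1. Spatial average of the green channel, which carries the strongest
    #    pulse-related colour/brightness variation in skin.
    signal = np.array([frame[:, :, 1].mean() for frame in frames], dtype=float)

    # 2. Subtract a one-second moving average to remove lighting drift.
    baseline = np.convolve(signal, np.ones(int(fps)) / fps, mode="same")
    signal = signal - baseline

    # 3. Dominant frequency between 0.7 and 4 Hz (roughly 42-240 BPM).
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Quick check with a synthetic 1.2 Hz (72 BPM) skin-tone fluctuation:
fps = 30
t = np.arange(10 * fps) / fps
pulse = 2.0 * np.sin(2 * np.pi * 1.2 * t)
frames = [np.full((32, 32, 3), 128.0) + p for p in pulse]
print(round(estimate_heart_rate(frames, fps)))  # ~72
```

In practice, real systems add face tracking, motion compensation and more robust signal separation, but the core idea is the same: tiny periodic colour changes in the skin encode the cardiac pulse.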
Once physiological factors have been measured, machine learning plays a pivotal role in estimating psychological aspects such as mental stress, anxiety, feelings of comfort, or even the moments when a person becomes distracted. This is done by using AI techniques to extract features from the measurements, learn from them, and classify them into such states. The purpose is for the technology eventually to provide feedback to an individual. For example, it could alert an individual that they are feeling stressed, and provide support on how to handle this.
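The sketch below illustrates this feature-extraction-and-classification step under simplifying assumptions: the heart-rate-variability features, the synthetic training data and the binary stressed/relaxed labels are hypothetical, and a random forest stands in for whatever model a real system would use. It is not Cho’s actual pipeline, only an indication of the shape such a pipeline can take.

```python
# Hypothetical stress-recognition sketch: summarise physiological windows
# into simple features, then train a classifier to map features to a state.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def extract_features(rr_intervals):
    """Summarise a window of beat-to-beat (R-R) intervals, in seconds."""
    rr = np.asarray(rr_intervals, dtype=float)
    diffs = np.diff(rr)
    return [
        60.0 / rr.mean(),              # mean heart rate (BPM)
        rr.std(),                      # SDNN: overall variability
        np.sqrt(np.mean(diffs ** 2)),  # RMSSD: short-term variability
    ]

# Synthetic training data built on a simplifying assumption: stressed windows
# tend to show a higher heart rate and lower variability than relaxed ones.
rng = np.random.default_rng(0)
windows, labels = [], []
for _ in range(200):
    stressed = rng.random() < 0.5
    mean_rr = 0.65 if stressed else 0.85   # seconds between heartbeats
    jitter = 0.02 if stressed else 0.06
    rr = rng.normal(mean_rr, jitter, size=60)
    windows.append(extract_features(rr))
    labels.append(int(stressed))

X_train, X_test, y_train, y_test = train_test_split(windows, labels, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

A real system would of course be trained on labelled data from people, validated across individuals and contexts, and would feed its predictions back to the user as described above.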
Although Cho’s research in the field of physiological computing is ongoing, its real-world impact could be significant. In the first instance, information collected in this field could be used as a scalable research tool to understand the barriers that people with physical impairments or mental disabilities (e.g. anxiety disorders) experience in their daily activities. This could involve understanding the reality of accessibility experiences in public places as well as online, and how intelligent prosthetics could adapt their movement and functionality to users’ psychophysiological states. This type of technology could also help healthcare professionals better understand the physiological and psychosocial needs of assistive technology (AT) users when finding the right AT product.
AI helping connect people
Cho’s interest in the role that AI can play in both improving assistive technology and improving access to it is applied more broadly in the work he does on the AT2030 project and at the GDI Hub.
“There is a pressing need to improve awareness about the relative lack of access to AT. We currently expect more than two billion people across the world to need at least one assistive product in their life, but less than 10% actually have access to one. That is a real issue.”
Dr Youngjun Cho
Associate Professor at UCL Computer Science, Director of MSc DDI and Lead of the GDI Physiological Computing group
This is why Cho is actively involved in various projects to improve information seeking for AT stakeholders, and to connect ideas and people so that information exchange is made possible through AI.
The first is a project in which the team is developing an AI-powered platform to help people search for and find high-quality information about AT. The team has developed an up-to-date AT taxonomy to aid this process, and the idea is that policymakers and other decision-makers will be able to source information that will help them plan for future AT needs.
Another is the development of a digital survey tool, which will enable organisations to run surveys that include AT questions in an automated, easy-to-communicate way. Cho’s interest extends to how AI can collaborate with humans to improve our AT information-seeking and sense-making journeys, contributing to human-centred AI.
Next steps
Towards Accessible Assistive Technology and Interaction (AATI)
Although AT solutions have made a lot of progress in recent years, Cho feels there is still some way to go. One example is data visualisation. Graphs and pie charts are widely used across the world – even to share trends and statistics about AT requirements – yet there is no fully accessible way for visually impaired people to engage with this kind of material.
“The main challenge is the limited access to AT and the limited accessibility of many mainstream technologies,” explained Cho. “‘Access’ means many things in the field of AT. There is access to information and resources, from which some people are excluded due to design and limited interaction in the design stage. And there is the fact that many people across the world – particularly in less developed countries – cannot get hold of AT solutions that exist in other regions.”
Cho feels that a better way to describe the field of AT he works in is accessible assistive technology and interaction (AATI). He highlights that AT cannot resolve the key issues without accessibility and user interaction being factored in. “We are here to address many of the challenges and problems in the field,” said Cho. “I really want to be part of making things better, empowering our capability and improving different facets of peoples’ lives.”