IOP Publishing Ltd., 2023. — 226 p. — (IPEM–IOP Series in Physics and Engineering in Medicine and Biology). — ISBN 978-0-7503-5180-5.
Affective Computing in Healthcare: Applications Based on Biosignals and Artificial Intelligence
The field of affective computing has progressed tremendously in the past few years and has gained wide acceptance in various applications that involve e-learning, marketing, customer service, smart transportation, healthcare, and many more.
Affective computing takes advantage of different modalities such as facial expressions, biosignals, gestures, and speech to develop technology that can detect users' emotions (happiness, anger, sadness, fear, disgust, and surprise).
Over the past few years, it has become increasingly common in healthcare to employ this technology for clinical diagnosis. Affective computing systems have also been developed to automatically identify emotional disturbances or impairments in patients. To date, the technology has primarily been used as a tool for diagnosing emotional disturbances associated with a variety of conditions, including depression, stress, anxiety, bipolar disorder, attention deficit disorder, stroke, Parkinson's disease (PD), autism spectrum disorders (ASD), mood disorders, and others.

The majority of existing affective computing applications rely on facial expressions and physiological signals such as electrocardiograms (ECGs), electromyograms (EMGs), electroencephalograms (EEGs), electrooculograms (EOGs), skin temperature (ST), galvanic skin response (GSR), and photoplethysmograms (PPGs) to assess changes in a subject's emotional state. Researchers have proposed numerous methods and algorithms to extract emotion-related information from biosignals and facial images, and decisions are typically made using artificial intelligence (AI) methods based on machine learning and deep learning algorithms. Although facial expression-based emotion recognition is a simple, cost-effective, and computationally efficient means of assessing emotions, researchers have primarily chosen biosignals because they are highly robust and capture fine-grained information about physiological changes that are not under the subject's conscious control.
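As a minimal sketch of the kind of biosignal pipeline described above (feature extraction followed by a machine learning classifier), the example below trains a simple arousal classifier on synthetic signal windows. The signal choice (GSR/PPG-style windows), the time-domain features, and the binary labels are illustrative assumptions, not the methods used in any particular chapter of this book.

```python
# Illustrative sketch only: biosignal feature extraction + machine learning
# classification, using synthetic data. Not the book's actual methodology.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def extract_features(window):
    """Simple time-domain statistics from one biosignal window (e.g. GSR or PPG)."""
    return np.array([
        window.mean(),                  # baseline level
        window.std(),                   # variability
        np.abs(np.diff(window)).mean()  # mean absolute first difference
    ])

# Synthetic stand-in data: 200 one-second windows sampled at 128 Hz,
# with binary arousal labels (0 = low, 1 = high).
windows = rng.normal(size=(200, 128))
labels = rng.integers(0, 2, size=200)

X = np.vstack([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

# Standardize features, then classify with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In practice, the features, signal modalities, and classifiers (including deep learning models) vary widely across the studies surveyed in this book; the sketch only shows the overall structure of such a pipeline.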
Preface
Anxiety recognition using a new EEG signal analysis approach based on sample density in a Chebyshev chaotic map
Evaluating cognitive load during lexical decision tasks for monolinguals and bilinguals using EEG
Detection of psychological stress using principal component analysis of phonocardiography signals
Affective computational advertising based on perceptual metrics
Machine-learning-based emotion recognition in arousal–valence space using photoplethysmogram signals
EEG-based human emotion classification from channel-wise feature extraction and feature selection
Detection of physiological body movements in affective disorder patients using EEG signals and deep neural networks
Voice-enabled real-time affective framework for negative emotion monitoring
Differential diagnosis tool in healthcare application using respiratory sounds and convolutional neural network
Virtual reality and augmented reality based affective computing applications in healthcare: challenges and future directions