Vital Sign Sensors for Artificial Intelligence and Deep Learning

Kirchner J, Fischer G (2018)


Publication Language: English

Publication Type: Conference contribution, Abstract of lecture

Publication year: 2018

Event location: Aachen, DE

Abstract

Up to now, vital data of patients have mostly been acquired during clinical or outpatient examinations. Here, the sensors are carefully placed by trained personnel and the patient's movements are well controlled, e.g. during an exercise ECG on a recumbent ergometer. In this way, misplacement of the sensors as well as motion artefacts are avoided and the physical load is well defined, so that high-quality data are ensured that can be used for diagnostic purposes.
Such a clinical or outpatient examination, however, captures only a short moment in time and is not representative of daily life. From the perspective of the treating physician, data collected regularly during the subjects' daily routine would be preferable. In such a setting, however, the sensors may not be placed properly by the patient and movements are no longer well controlled, so data quality can be compromised.
Hence, the validity of the sensor data has to be assessed and taken into account during data analysis in order to derive reasonable conclusions. A wearable sensor system for daily life must therefore be multimodal, so that machine learning algorithms can clean the acquired sensor data from artefacts. Artificial intelligence and deep learning systems can then identify corrupt data segments, subtract artefact patterns and thereby clean up the sensor data.
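As a simple illustration of how multimodal data supports this kind of artefact handling, the following Python sketch flags ECG windows as potentially corrupt whenever a simultaneously recorded acceleration magnitude indicates strong motion. The threshold rule, the sampling rate and all parameter values are illustrative assumptions and only a simplistic stand-in for the learned models mentioned above, not the method of the talk.

```python
import numpy as np

def flag_corrupt_windows(ecg, accel_mag, fs=250, win_s=2.0, accel_thresh=0.5):
    """Flag ECG windows whose simultaneous acceleration magnitude is high.

    Simplistic stand-in for the learned artefact-detection models described
    in the abstract; all names and parameter values are assumptions.
    """
    win = int(win_s * fs)                 # samples per window
    n_win = len(ecg) // win
    flags = np.zeros(n_win, dtype=bool)
    for i in range(n_win):
        seg_accel = accel_mag[i * win:(i + 1) * win]
        rms_motion = np.sqrt(np.mean(seg_accel ** 2))   # RMS of motion signal
        flags[i] = rms_motion > accel_thresh            # True = likely corrupt
    return flags

# Example with synthetic signals (hypothetical 250 Hz sampling rate)
fs = 250
t = np.arange(0, 60, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t)         # crude ECG stand-in
accel = 0.1 * np.random.randn(len(t))     # baseline sensor noise
accel[10 * fs:20 * fs] += 1.0             # simulated movement episode
mask = flag_corrupt_windows(ecg, np.abs(accel), fs=fs)
```

In a real system the hand-tuned threshold would be replaced by a trained classifier or deep network operating on features from all sensor channels.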
The talk will highlight the trade-off between the usability of a sensor system and how well it can be integrated into daily life on the one hand, and the validity of the obtained data on the other. This challenge will be illustrated by a multimodal sensor platform for elderly and dementia care that was developed within the Medical Valley excellence cluster funded by the German Federal Ministry of Education and Research (BMBF).
Another example discussed in the talk is ECG circuit concepts that make the sensor more robust and compatible with daily life, with special focus on capacitive ECG. Here, movement artefacts can be removed if the movement during ECG acquisition is known, which can be determined, e.g., with EMG or acceleration/gyroscope sensors. Apart from that, these sensors allow the captured ECG data to be interpreted with respect to the current physical load on the body and thus provide context information.
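One common way to exploit such a motion reference, shown here only as an assumed illustration and not as the circuit-level approach of the talk, is adaptive noise cancellation: an LMS filter estimates the artefact from the accelerometer signal and subtracts it from the ECG. The sketch assumes the artefact is approximately a linear filtering of the reference; signal names and parameters are hypothetical.

```python
import numpy as np

def lms_artifact_cancellation(ecg, accel, n_taps=16, mu=0.01):
    """Remove motion artefacts from an ECG channel using an accelerometer
    channel as noise reference (LMS adaptive filter).

    Assumes the artefact is approximately a linear, slowly varying
    filtering of the reference motion signal (illustrative sketch only).
    """
    w = np.zeros(n_taps)                  # adaptive filter weights
    cleaned = np.zeros_like(ecg)
    for n in range(n_taps, len(ecg)):
        x = accel[n - n_taps:n][::-1]     # most recent reference samples
        artefact_estimate = w @ x
        e = ecg[n] - artefact_estimate    # error = cleaned ECG sample
        w += 2 * mu * e * x               # LMS weight update
        cleaned[n] = e
    return cleaned

# Illustration with synthetic signals (hypothetical 250 Hz sampling rate)
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg_true = np.sin(2 * np.pi * 1.2 * t)                 # crude ECG stand-in
motion = np.random.randn(len(t))                       # accelerometer reference
artefact = np.convolve(motion, np.ones(8) / 8, mode="same")
ecg_measured = ecg_true + 0.5 * artefact
cleaned = lms_artifact_cancellation(ecg_measured, motion)
```

The same reference channels can also serve the context purpose mentioned above, since they characterize the physical load under which the ECG was recorded.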
In conclusion, a combination of advancements in both sensor hardware/circuitry and multimodal sensor processing by artificial intelligence is needed to make the sketched vision a reality.

How to cite

APA:

Kirchner, J., & Fischer, G. (2018). Vital Sign Sensors for Artificial Intelligence and Deep Learning. Paper presentation at 52nd Annual Conference of the German Society for Biomedical Engineering, Aachen, DE.

MLA:

Kirchner, Jens, and Georg Fischer. "Vital Sign Sensors for Artificial Intelligence and Deep Learning." Presented at 52nd Annual Conference of the German Society for Biomedical Engineering, Aachen 2018.
