The Convergence of Visual Motion and Wearable Sensors in Human Activity Recognition
April 5 @ 11:00 am – 12:00 pm EDT
The sMAP CREATE program webinar series invites scientists from different disciplines whose commitment and research relate to mobility in aging populations. Join us for the following sMAP CREATE webinar and forward it to colleagues who may be interested.
Microsoft Teams
Join the meeting now
Meeting ID: 268 952 113 103
Passcode: yqNoDQ
ABSTRACT
Human activity recognition (HAR) is one of the core topics in wearable and ubiquitous computing, as it allows intelligent systems to be aware of their surrounding context and to anticipate the intentions of humans in the environment. Our physical activities often cause specific patterns in sensor signals, and sensor-based HAR focuses on inferring the activity context from those signal patterns. One of the greatest challenges in sensor-based HAR is the lack of well-annotated data to establish the link between signal characteristics and semantic contexts. Unlike vision and natural language, where data can be annotated easily because our language was largely invented to describe what we see, sensor signals (from IMUs to smart textiles) form obscure patterns that even experts struggle to annotate directly. Annotation therefore typically relies on another synchronized source, such as experiment video recordings and manual transcripts. In this talk we will explore several recent works that address this challenge with the help of recent machine learning advances in visual perception.
SPEAKER BIOGRAPHY
Dr. Bo Zhou is the deputy head and a senior researcher of the Embedded Intelligence department at the German Research Center for Artificial Intelligence (DFKI). He is also the Steering Committee co-chair and the 2023/2024 TPC co-chair of the renowned conference series in wearable computing, the ACM International Symposium on Wearable Computers (ISWC). His main research interests include the cross-over of hardware and machine learning, sensor-based HAR, multi-modal information fusion, biomedical engineering, sustainable machine learning and electronics, and smart textiles. He leads research in several EU-funded and German nationally funded projects in these directions, including SustainML, SimpleSkin, VidGenSense, SocialWear, and HeadSense. He has a diverse educational background, spanning control, instrumentation, navigation, system-on-chip design, and sensing on the hardware side, and machine learning and AI on the software side.