Talk 1 Title: Robust Inertial Motion Tracking through Deep Sensor Fusion across Smart Earbuds and Smartphone
Talk 1 Time: September 6, 2021, 3:20 PM
Talk 1 Venue: Room 409, Shenghua Rear Building / Tencent Meeting: 717 381 1916
Speaker: 弓箭
IMU-based inertial tracking plays an indispensable role in many mobility-centric tasks, such as robotic control, indoor navigation, and virtual reality gaming. Despite its mature application in rigid machine mobility (e.g., robots and aircraft), tracking human users via mobile devices remains a fundamental challenge due to intractable gait/posture patterns. Recent data-driven models have tackled sensor drift, one key issue that plagues inertial tracking. However, these systems still assume the devices are held or attached to the user's body in a relatively fixed posture. In practice, natural body activities may rotate/translate the device, which can be mistaken for whole-body movement. Such motion artifacts remain the dominant factor that causes existing inertial tracking systems to fail in practical, uncontrolled settings.
Inspired by the observation that the human head undergoes far less intense movement relative to the body during walking than other body parts do, we propose a novel multi-stage sensor fusion pipeline called DeepIT, which realizes inertial tracking by synthesizing IMU measurements from a smartphone and an associated earbud. DeepIT introduces a data-driven, reliability-aware attention model, which assesses the reliability of each IMU and opportunistically synthesizes their data to mitigate the impact of motion noise. Furthermore, DeepIT uses a reliability-aware magnetometer compensation scheme to combat the angular drift caused by unrestricted motion artifacts. We validate DeepIT on the first large-scale inertial navigation dataset involving both smartphone and earbud IMUs. The evaluation results show that DeepIT achieves multi-fold accuracy improvements in challenging uncontrolled natural walking scenarios, compared with state-of-the-art closed-form and data-driven models.
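The core idea of reliability-aware attention fusion can be sketched in a few lines. This is a minimal illustrative example, not DeepIT's actual model: the real system uses a learned deep network, whereas here the reliability score of each IMU is a hypothetical dot product with a fixed weight vector, normalized by softmax to produce fusion weights.

```python
import numpy as np

def softmax(scores):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

def reliability_fusion(phone_feat, earbud_feat, w_phone, w_earbud):
    """Fuse two IMU feature vectors by reliability-weighted attention.

    phone_feat, earbud_feat : feature vectors from each device's IMU
    w_phone, w_earbud       : (hypothetical) learned scoring weights
    Returns the fused feature vector and the attention weights.
    """
    # Score each stream's reliability (stand-in for a learned scorer).
    scores = np.array([phone_feat @ w_phone, earbud_feat @ w_earbud])
    alpha = softmax(scores)
    # Weighted combination: noisier stream contributes less.
    fused = alpha[0] * phone_feat + alpha[1] * earbud_feat
    return fused, alpha

# Example: the earbud stream scores higher, so it dominates the fusion.
phone = np.array([0.2, 1.5, -0.3])
earbud = np.array([0.1, 1.4, -0.2])
w = np.array([1.0, 0.5, 0.5])
fused, alpha = reliability_fusion(phone, earbud, w, w)
```

In the full pipeline such weights would be produced per time window, so the model can shift trust between devices as motion artifacts come and go.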
Talk 2 Title: RF Vital Sign Sensing Under Free Body Movement
Talk 2 Time: September 6, 2021, 3:00 PM
Talk 2 Venue: Room 409, Shenghua Rear Building / Tencent Meeting: 717 381 1916
Speaker: 弓箭
Radio frequency (RF) sensors such as radar are instrumental for continuous, contactless sensing of vital signs, especially heart rate (HR) and respiration rate (RR). However, decades of related research have mainly focused on static subjects, because motion artifacts from other body parts can easily overwhelm the weak reflections of vital signs. This paper marks a first step toward enabling RF vital sign sensing under ambulant daily living conditions. Our solution is inspired by existing physiological research that revealed the correlation between vital signs and body movement. Specifically, we propose to combine direct RF sensing for static instances with indirect vital sign prediction based on movement power estimation. We design customized machine learning models to capture the sophisticated correlation among RF signal patterns, movement power, and vital signs. We further design an instant calibration and adaptive training scheme to enable cross-subject generalization without any explicit data labeling from unknown subjects. We prototype and evaluate the framework using a commodity radar sensor. Under a variety of moving conditions, our solution demonstrates an average estimation error of 5.57 bpm for HR and 3.32 bpm for RR across multiple subjects.
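The two-branch idea of the abstract, direct spectral HR estimation when the subject is static and indirect prediction from movement power otherwise, can be sketched as follows. This is a simplified illustration under stated assumptions: the threshold, the 0.8-2.0 Hz heart band, and the linear movement-to-HR model are all hypothetical stand-ins for the paper's learned models.

```python
import numpy as np

def estimate_hr(rf_window, fs, movement_power,
                power_threshold=0.5, model=(30.0, 60.0)):
    """Estimate heart rate (bpm) from a radar displacement window.

    rf_window      : 1-D array of demodulated radar samples
    fs             : sampling rate in Hz
    movement_power : estimated body-movement power for this window
    model          : (slope, intercept) of a hypothetical pre-fit
                     linear movement-power -> HR regressor
    """
    if movement_power < power_threshold:
        # Static branch: pick the dominant spectral peak in the
        # heart band (assumed 0.8-2.0 Hz, i.e., 48-120 bpm).
        x = rf_window - rf_window.mean()
        spec = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
        band = (freqs >= 0.8) & (freqs <= 2.0)
        return 60.0 * freqs[band][np.argmax(spec[band])]
    # Moving branch: fall back to indirect prediction from
    # movement power (stand-in for the learned model).
    slope, intercept = model
    return slope * movement_power + intercept

# Example: a static 10 s window with a 1.2 Hz (72 bpm) heartbeat tone.
fs = 50.0
t = np.arange(0, 10, 1.0 / fs)
window = 0.1 * np.sin(2 * np.pi * 1.2 * t)
hr_static = estimate_hr(window, fs, movement_power=0.1)
hr_moving = estimate_hr(window, fs, movement_power=1.0)
```

The threshold-based switch is the simplest realization of the idea; the paper's adaptive training scheme would instead learn when and how much to trust each branch.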