CRII: RI: mmWave-based Human Motion and Activity Sensing Under Unseen Scenarios

  • Xue, Hongfei (PI)

Project Details

Description

Perceiving and understanding human activities has become increasingly important for intelligent systems that support everyday life, from gesture control in virtual reality devices to sports analysis for athletes. Researchers have recently shown that wireless signals, such as those from WiFi, can be used to sense and understand human activities. Unlike traditional camera-based solutions, sensing systems built on wireless signals work in the dark or when the view is blocked, and they do not capture privacy-sensitive imagery. They can therefore enable advanced applications such as round-the-clock monitoring for the elderly without compromising privacy, helping rescue teams see through walls during fires, and tracking passengers in foggy weather. Despite their usefulness, the adoption of wireless sensing systems is limited by their degraded performance in new environments or on new tasks. This project aims to develop generalized wireless human sensing frameworks that improve performance in such unseen scenarios and reduce the effort needed to deploy these systems in the real world. The project will produce algorithms that are broadly applicable to wireless human sensing and enable a wide range of applications in smart homes, healthcare, robotics, smart cities, entertainment, and beyond. Additionally, the project will provide opportunities for students of diverse backgrounds to get involved in this research area and will contribute to a new course on wireless sensing at the University of North Carolina at Charlotte.

This research improves the generalizability of wireless sensing models by taking advantage of other modalities and comprises two thrusts. In the first thrust, data from other modalities is used to enhance generalization: wireless signals are augmented from video or mesh data using a proposed physical simulator, and the sensing model is then trained with a cycle-consistent structure on this enriched dataset to achieve generalized pose estimation under unseen scenarios. In the second thrust, models from other modalities are leveraged to improve generalization: by aligning wireless signal embeddings with activity-related text or image embeddings from pre-trained vision-language models, the wireless model can perform open-class activity recognition. This research effort will result in generalized solutions for wireless human sensing systems under unseen scenarios and in shared code and data disseminated to the research community.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
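The two thrusts above are described only at the level of an award abstract; the project's actual implementation is not published here. As a rough illustration of the first thrust, the sketch below shows what a cycle-consistent pose-estimation objective with a differentiable signal simulator might look like. Every name in it (cycle_consistent_loss, the linear stand-ins for the pose estimator and the physical simulator) is hypothetical, not the project's code.

```python
# Illustrative sketch only: a cycle-consistency objective of the kind Thrust 1 describes,
# assuming a differentiable, physics-inspired signal simulator. All names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

def cycle_consistent_loss(signal, pose_net, simulator):
    """signal -> pose -> simulated signal -> pose; the two passes should agree."""
    pose = pose_net(signal)            # estimate pose from the (possibly augmented) signal
    sim_signal = simulator(pose)       # re-synthesize the wireless signal from the pose
    pose_cycle = pose_net(sim_signal)  # re-estimate pose from the synthesized signal
    return F.l1_loss(sim_signal, signal) + F.l1_loss(pose_cycle, pose)

# Toy usage with linear stand-ins for the pose estimator and the physical simulator.
pose_net = nn.Linear(128, 51)          # e.g., 17 joints x 3 coordinates
simulator = nn.Linear(51, 128)         # placeholder for the physics-based simulator
loss = cycle_consistent_loss(torch.randn(4, 128), pose_net, simulator)
loss.backward()
```

For the second thrust, a common way to align signal embeddings with a pre-trained vision-language model is a CLIP-style contrastive objective followed by nearest-text classification over arbitrary activity names. The sketch below assumes that setup; random tensors stand in for real radar data and for frozen vision-language text embeddings, and all identifiers are illustrative.

```python
# Illustrative sketch only: CLIP-style alignment of mmWave embeddings with text embeddings
# from a pre-trained vision-language model, enabling open-class (zero-shot) recognition.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MmWaveEncoder(nn.Module):
    """Toy encoder mapping radar features into the shared embedding space."""
    def __init__(self, in_dim=256, embed_dim=512):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 1024), nn.ReLU(),
                                 nn.Linear(1024, embed_dim))

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)  # unit-norm embeddings

def alignment_loss(radar_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE loss pulling paired radar/text embeddings together."""
    logits = radar_emb @ text_emb.t() / temperature
    targets = torch.arange(radar_emb.size(0), device=radar_emb.device)
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

def open_class_predict(radar_emb, class_text_emb):
    """Zero-shot prediction: choose the class whose text embedding is closest."""
    return (radar_emb @ class_text_emb.t()).argmax(dim=-1)

# Toy usage: random tensors stand in for radar data and frozen VLM text embeddings.
encoder = MmWaveEncoder()
radar = torch.randn(8, 256)
text_emb = F.normalize(torch.randn(8, 512), dim=-1)   # paired caption embeddings
loss = alignment_loss(encoder(radar), text_emb)
loss.backward()
preds = open_class_predict(encoder(radar), F.normalize(torch.randn(5, 512), dim=-1))
```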
Status: Active
Effective start/end date: 15/4/24 to 31/3/26

Funding

  • National Science Foundation: US$175,000.00

ASJC Scopus Subject Areas

  • Signal Processing
  • Computer Networks and Communications
  • Engineering(all)
  • Computer Science(all)
