Fostering the development of AI-enhanced technologies to support healthy aging at home for older adults and individuals with Alzheimer’s disease.
Featured Pilots
Utilizing the Druid Impairment App to Assess and Enhance Senior Adults' Driving Performance
Michael Milburn and William DeJong, Impairment Science, Inc.
Anuj Pradhan and Shannon Roberts, UMass Amherst
Protecting Patients against Phishing Attacks using AI-enabled Agents
Gang Wang, University of Illinois at Urbana-Champaign
Roopa Foulger, OSF
Correlations Between Light Exposure Inputs and Sleep Quality Outputs
News & Events
- Next Webinar (Tue 11/19, 4 p.m. EST) – Unlocking Success: FDA Regulatory Strategies for AgeTech Devices and AI (November 6, 2024)
Featured Webinar
Past Webinar – Comprehending Human Behaviors using Wireless Sensing on Everyday Wearables – Cheng Zhang
- High-quality behavior data in the wild will revolutionize how AI understands and supports humans.
- Low-power, privacy-aware, and minimally-obtrusive everyday wearables will play a central role in sensing and interpreting human behaviors.
- The lack of a killer application has prevented wearables from larger-scale adoption; I think health sensing in the wild is the killer application for the next generation of wearables.
Talk Abstract: Despite the rapid advancement of AI, computers’ ability to comprehend human behaviors remains limited. For instance, commodity computing devices still face challenges in understanding even basic human daily activities such as eating and drinking. The primary obstacle lies in the absence of suitable sensing technologies capable of capturing and interpreting high-quality behavioral data in everyday settings. In this presentation, I will share my research on the development of everyday wearables that are minimally-obtrusive, privacy-aware, and low-power, yet capable of capturing and comprehending various body movements and poses that humans employ in their everyday activities. First, I will show how these sensing technologies can empower various everyday wearable form factors, including wristbands, necklaces, earphones, headphones, and glasses, to track essential body postures, such as facial expressions, gaze, finger poses, limb poses, as well as gestures on teeth and tongue. Then, I will demonstrate how, when paired with state-of-the-art AI, these everyday wearables can revolutionize how computers comprehend human behaviors. Specifically, I will focus on applications related to activity recognition, accessibility, and health sensing. Finally, I will discuss the prospects and challenges associated with the integration of AI and wearables to support users in the future of everyday computing.
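To make the kind of pipeline the abstract describes more concrete, here is a minimal illustrative sketch of windowing wearable accelerometer data and classifying activities with a lightweight model. This is not the speaker's method or code; the synthetic signals, window length, feature set, and classifier are all assumptions chosen for illustration.

```python
# Illustrative sketch only: a generic activity-recognition pipeline for wearable
# accelerometer data (synthetic here), not the speaker's actual system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

RNG = np.random.default_rng(0)
FS = 50          # assumed sampling rate (Hz)
WINDOW = 2 * FS  # 2-second windows

def synthetic_imu(label: int, n_samples: int) -> np.ndarray:
    """Generate fake 3-axis accelerometer data for one of two 'activities'."""
    t = np.arange(n_samples) / FS
    freq = 1.0 if label == 0 else 2.5  # e.g. slower vs. faster repetitive motion
    base = np.sin(2 * np.pi * freq * t)
    noise = RNG.normal(scale=0.3, size=(n_samples, 3))
    return base[:, None] * np.array([1.0, 0.5, 0.2]) + noise

def window_features(signal: np.ndarray) -> np.ndarray:
    """Slice a (n_samples, 3) signal into windows and compute simple statistics."""
    n_windows = signal.shape[0] // WINDOW
    feats = []
    for i in range(n_windows):
        w = signal[i * WINDOW:(i + 1) * WINDOW]
        mag = np.linalg.norm(w, axis=1)  # per-sample acceleration magnitude
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                     [mag.mean(), mag.std()]]))
    return np.array(feats)

# Build a small labeled dataset from the synthetic signals.
X, y = [], []
for label in (0, 1):
    feats = window_features(synthetic_imu(label, n_samples=60 * FS))
    X.append(feats)
    y.append(np.full(len(feats), label))
X, y = np.vstack(X), np.concatenate(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice, systems like those described in the talk replace the synthetic signals with real sensor streams from wristbands, earphones, or glasses, and the hand-crafted window statistics with learned representations; the sketch is only meant to show the window-then-classify structure common to such pipelines.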
About the Speaker: Cheng Zhang is a tenure-track Assistant Professor in Information Science and Computer Science (Field) at Cornell University, where he leads the Smart Computer Interfaces for Future Interactions (SciFi) Lab. His research focuses on designing, developing, and evaluating intelligent sensing systems, particularly minimally-obtrusive wearables, to seamlessly comprehend and predict human behaviors and intentions, thereby supporting users in areas such as accessibility, health sensing, and activity recognition. Cheng earned his Ph.D. in Computer Science from the Georgia Institute of Technology. He has published at top venues such as Ubicomp/IMWUT, UIST, CHI, ISWC, MobiCom, IUI, and MobileHCI, and holds numerous patents. His research has been recognized with honors such as the NSF CAREER Award, the 10-Year Impact Award at Ubicomp, and several Best Paper awards and honorable mentions. His work has been featured by the BBC, Forbes, New Scientist, DigitalTrends, CNET, Fast Company, Popular Science, Engadget, Gizmodo, NowThis, and Mashable. Cheng is the steering committee co-chair for the International Symposium on Wearable Computers (ISWC) and a member of the steering committee of the International Joint Conference on Pervasive and Ubiquitous Computing (Ubicomp). He served as technical program committee (TPC) co-chair for ISWC 2023 and Ubicomp 2024.