The broader/commercial impact of this Small Business Innovation Research (SBIR) Phase II project will result from providing remote, accessible fall risk assessments and exercise programs that empower older adults to maintain independence and improve their quality of life. Falls are a major health risk for older adults, with significant physical, psychological, and economic consequences, costing the U.S. an estimated $50 billion annually. Current fall prevention methods are costly, inconsistent, or difficult to access, particularly in rural communities. This project introduces an AI-based video assessment app that delivers routine fall risk assessments and personalized exercises to older adults using common smartphones or tablets. Beyond improving individual health outcomes, the project has the potential to significantly lower healthcare costs by reducing fall-related hospitalizations, rehabilitation expenses, and long-term care admissions. By providing an affordable, scalable, and privacy-preserving alternative to traditional fall assessments, the results of this project benefit healthcare systems, insurance providers, and senior living communities alike. Additionally, the project raises awareness of fall prevention and of the role of technology in elderly care while fostering interdisciplinary collaboration and innovation.
This Small Business Innovation Research (SBIR) Phase II project develops innovative AI and computer vision technologies for a novel AI-based video assessment system that performs accurate, real-time fall risk assessment across a variety of smart devices. Unlike conventional monitoring systems, this innovation is camera-agnostic, lighting-independent, and privacy-focused: it operates without continuous video recording and never stores or shares raw video content, a critical feature for user trust and data security. Inspired by recent advances in Large Language Models and Self-Supervised Learning, the proposed technology introduces a novel autoregressive encoder framework for real-time human motion prediction, analysis, and personalized exercise coaching, using only the built-in camera and computational power of existing smartphones and tablets. The innovation surpasses existing vision transformer models by operating on human motion heatmaps, which capture the spatial and temporal structure of human movement. This approach learns and interprets human body movements contextually, regardless of the camera's perspective, field of view, and environmental noise. The transition from pixel-level processing to heatmap-based representation significantly reduces model complexity and computational load. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
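The abstract does not specify how the motion heatmaps are constructed; as an illustration only, one common convention in pose-analysis work is to render each detected body joint as a per-channel Gaussian bump, which replaces full-resolution video frames with a compact spatial representation. The function name, grid size, and `sigma` below are assumptions, not the awardee's actual method:

```python
# Hypothetical sketch: encoding 2D body keypoints as Gaussian heatmaps,
# one channel per joint, instead of processing raw pixels.
import numpy as np

def keypoints_to_heatmaps(keypoints, height=64, width=64, sigma=2.0):
    """Render normalized [0, 1] (x, y) joint coordinates as Gaussian heatmaps.

    Returns an array of shape (num_keypoints, height, width), with each
    channel peaking at its joint's location. A 64x64 grid per joint is far
    smaller than a raw video frame, illustrating the reduced input size.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    maps = np.empty((len(keypoints), height, width), dtype=np.float32)
    for i, (x, y) in enumerate(keypoints):
        cx, cy = x * (width - 1), y * (height - 1)  # scale to pixel grid
        maps[i] = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return maps

# Example: three joints (head, left hand, right hand) in normalized coords.
joints = [(0.5, 0.1), (0.2, 0.5), (0.8, 0.5)]
heatmaps = keypoints_to_heatmaps(joints)
print(heatmaps.shape)  # (3, 64, 64)
```

A sequence of such per-frame heatmap stacks could then be fed to an autoregressive encoder as a motion representation; the original frames are never retained, which is consistent with the privacy goal described above.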
Source: https://seedfund.nsf.gov/awardees/phase-2/details/?company=foresightcares-inc
