HUMAN ACTIVITY RECOGNITION: INTEGRATING SENSOR FUSION AND ARTIFICIAL INTELLIGENCE
Date
2024-12
Publisher
Florida Atlantic University
Abstract
Human Activity Recognition (HAR) plays a crucial role in various applications, including healthcare, fitness tracking, security, and smart environments, by enabling the automatic classification of human actions based on sensor and visual data. This dissertation presents a comprehensive exploration of HAR using machine learning, sensor-based data, and fusion approaches. HAR involves classifying human activities over time by analyzing data from sensors such as accelerometers and gyroscopes. Recent advancements in computational technology and sensor availability have driven significant progress in this field, enabling the integration of these sensors into smartphones and other devices. The first study outlines the foundational aspects of HAR and reviews the existing literature, highlighting the importance of machine learning applications in healthcare, athletics, and personal use. The second study addresses the challenges of handling large-scale, variable, and noisy sensor data in HAR systems. The research applies machine learning algorithms to the KU-HAR dataset, revealing that the LightGBM classifier outperforms others on key performance metrics such as accuracy, precision, recall, and F1 score. This study underscores the continued relevance of optimizing machine learning techniques for improved HAR systems and highlights the potential for future research to explore more advanced fusion techniques that fully leverage different data modalities. The third study focuses on overcoming common challenges in HAR research, such as varying smartphone models and sensor configurations, by employing data fusion techniques. Experiments were conducted on the KU-HAR and UCI HAR datasets using popular machine learning classifiers, including Decision Trees, Random Forest, Gradient Boosting, and XGBoost. XGBoost achieved the highest accuracy of 96.83%, demonstrating its effectiveness in classifying fundamental human activities, with decision-level fusion methods further improving the results. The fourth study delves into the implementation of multimodal fusion techniques for HAR by combining wearable sensor data with visual data, investigating the performance of the late fusion method in integrating the sensor and visual modalities.
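The decision-level (late) fusion described in the third and fourth studies can be illustrated with a minimal sketch: two independently trained classifiers (e.g., one on inertial sensor features, one on visual features) each emit per-class probabilities, and their outputs are combined by weighted averaging before the final activity label is chosen. The function name, the weight parameter, and the example scores below are hypothetical and not taken from the dissertation.

```python
import numpy as np

def late_fusion(sensor_probs, visual_probs, w_sensor=0.5):
    """Decision-level (late) fusion: combine the per-class probability
    outputs of two independently trained classifiers by weighted
    averaging, then select the most probable activity class.
    The weight w_sensor is a hypothetical tuning parameter."""
    sensor_probs = np.asarray(sensor_probs, dtype=float)
    visual_probs = np.asarray(visual_probs, dtype=float)
    fused = w_sensor * sensor_probs + (1.0 - w_sensor) * visual_probs
    return fused.argmax(axis=1), fused

# Hypothetical per-class scores for two samples over three activity
# classes (e.g., walking, sitting, standing), one row per sample.
sensor = [[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]]
visual = [[0.6, 0.3, 0.1], [0.1, 0.7, 0.2]]
labels, fused = late_fusion(sensor, visual)
```

In this sketch the fused score for the first sample favors class 0 and for the second favors class 1; in practice the per-modality weights would be tuned on validation data, and more elaborate schemes (e.g., majority voting or learned meta-classifiers) fall under the same decision-level fusion umbrella.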
Keywords
Human Activity Recognition (HAR), Inertial Sensor, Accelerometer, Gyroscope