21 - 25 April 2024
National Harbor, Maryland, US
Conference 13036 > Paper 13036-8

Human-based gait authentication using multi-modal sensors at the tactical edge device: smartphone

22 April 2024 • 2:20 PM - 2:40 PM EDT | Potomac 2

Abstract

The Department of Defense (DoD), including the US Air Force, has seen increased physical security risks, such as gate runners, active shooter situations, and other use cases (such as Drone ISR, Rapid Deployment, Force Protection, and Security Forces) requiring increased coordination and data cooperation. Additionally, technology and operational environments are becoming more complex, interconnected, and diverse. Recently, a team of developers within the US Air Force developed a plug-in that analyzes sensor data from a tactical edge device's (e.g., a cell phone's) onboard accelerometer and gyroscope to determine the movement of a person as they walk. The plug-in uses machine learning (ML) algorithms to create a model of that person's gait and then sends pertinent data through the associated human gait model to authenticate a user. The novelty of our effort lies in enhancing this human gait authentication by using different features extracted from the spectral information of the smartphone's accelerometer and gyroscope signals, with a public human activity recognition dataset (WISDM) serving as a proof of concept, marking a previously unexplored approach. By leveraging spectral data, we seek to enhance the accuracy and robustness of authentication systems in military contexts. After feature extraction, Kernel Discriminant Analysis (KDA) is used to reduce the spectral features to 50 dimensions; after including the non-spectral features, the final feature count is 65. We then perform feature-level fusion using ML algorithms, and the performance is promising for authentication across 51 users. The SVM-RBF classifiers achieved a mean Equal Error Rate (EER) of 2% and a mean accuracy (ACC) of 97.3%, the GBM classifiers achieved a mean EER of 0.4% and ACC of 99.1%, and the CNN classifiers achieved a mean EER of 10% and ACC of 90.4%.
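The Equal Error Rate (EER) reported above is the operating point at which the false accept rate (FAR) equals the false reject rate (FRR). As an illustration only (not the authors' implementation), a minimal NumPy sketch of computing EER from genuine and impostor match scores might look like:

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Estimate EER by sweeping a threshold over all observed scores.

    genuine_scores: scores for same-user comparisons (higher = more similar)
    impostor_scores: scores for different-user comparisons
    """
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    # FAR: fraction of impostor scores accepted at each threshold.
    far = np.array([(impostor_scores >= t).mean() for t in thresholds])
    # FRR: fraction of genuine scores rejected at each threshold.
    frr = np.array([(genuine_scores < t).mean() for t in thresholds])
    # EER is taken where the two curves are closest.
    idx = np.argmin(np.abs(far - frr))
    return (far[idx] + frr[idx]) / 2.0
```

With perfectly separated score distributions this returns 0.0; overlapping distributions push the value toward 0.5 (chance).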

Presenter

Shageenth Sandrakumar
Air Force Research Lab. (United States)
Shageenth Sandrakumar is a Research Engineer at the Air Force Research Lab. He holds dual Bachelor's degrees in Electrical Engineering (B.S.) and Mathematics (B.A.) from the University at Buffalo, as well as a Master's degree in Engineering Sciences with a focus in Data Science (M.E.) from the University at Buffalo.
Application tracks: AI/ML
Presenter/Author
Shageenth Sandrakumar
Air Force Research Lab. (United States)
Author
Simon Khan
Air Force Research Lab. (United States)