The advent of Artificial Intelligence (AI) and Machine Learning (ML) has enabled smart devices such as smartphones and tablets to detect and track moving objects in three-dimensional (3D) space. AI/ML frameworks such as Google's ML Kit Pose Detection API and Apple's ARKit allow a mobile camera to capture a person's motion and collect robust, repeatable data for functional tasks by accurately estimating multidimensional kinematics across joints. Using this technology, a cost-effective mobile gait analysis application was developed that requires only the single camera of a commercial smart device. The Lower-Body Motion Tracking version 1.0.1 (LGait) application is designed to support clinical decision-making in quantifying mobility by calculating and analyzing the 3D kinematics of walking. The LGait app runs on Apple (Apple Inc., USA) iOS mobile devices (iPhone and iPad). Body motion kinematics are captured with Apple ARKit 3, whose body-tracking machine learning models run on the Apple Neural Engine; the app was built in the Xcode 11 IDE using the Swift programming language. The LGait application provides the key features needed to support lower-limb gait analysis; its two main features are real-time 3D motion capture and 3D gait joint angle calculation. Tests comparing kinematics acquired from a Vicon motion capture system with those from the LGait application show comparable measurements. Proposed applications of LGait include classifying mobility for clinical diagnosis and patient monitoring.
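As a rough sketch of the joint angle calculation described above: given the 3D positions of three adjacent joints (e.g. hip, knee, ankle, as reported by ARKit's body-tracking skeleton), the angle at the middle joint can be computed from the vectors toward its neighbors. The function below is an illustrative assumption, not the LGait implementation; the sample coordinates are synthetic.

```swift
import Foundation
import simd

// Compute the angle (in degrees) at a center joint, e.g. the knee,
// from the 3D positions of the adjacent proximal (hip) and distal
// (ankle) joints. Positions are assumed to be in meters, in the same
// coordinate frame (e.g. ARKit world space).
func jointAngleDegrees(proximal: SIMD3<Float>,
                       center: SIMD3<Float>,
                       distal: SIMD3<Float>) -> Float {
    // Unit vectors from the center joint toward each neighbor.
    let u = simd_normalize(proximal - center)
    let v = simd_normalize(distal - center)
    // Clamp to [-1, 1] so floating-point drift cannot make acos return NaN.
    let cosine = max(-1.0 as Float, min(1.0, simd_dot(u, v)))
    return acos(cosine) * 180 / .pi
}

// A fully extended leg (hip, knee, ankle collinear) gives ~180 degrees.
let hip   = SIMD3<Float>(0, 1.0, 0)
let knee  = SIMD3<Float>(0, 0.5, 0)
let ankle = SIMD3<Float>(0, 0.0, 0)
print(jointAngleDegrees(proximal: hip, center: knee, distal: ankle))
```

In a live ARKit session, the joint positions would come from an `ARBodyAnchor`'s skeleton (model-space joint transforms multiplied by the anchor transform); tracking knee flexion over successive frames then yields a joint-angle trajectory across the gait cycle.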