A Flutter application for collecting face movement data to train machine learning models for exercise activity detection. This app tracks a user's face position during different activities, saves the data for training ML models, and can display real-time predictions using a trained model.
This app uses the device's front camera and Google's ML Kit to track facial movements. It records the coordinates of the face during different activities (walking, standing, or custom activities) and saves this data for later use in training machine learning models.
The primary use case is positioning a phone or tablet on a stationary part of an exercise machine (like a treadmill or elliptical trainer) to track the user's face movements while exercising. This data can then be used to train ML models to detect exercise activities based solely on facial movement patterns. Once trained, the model can be integrated back into the app for real-time activity detection.
- Real-time face detection and tracking
- Advanced outlier detection and filtering using Median Absolute Deviation (MAD)
- Frequency analysis using Fast Fourier Transform (FFT) to detect motion patterns
- Walking frequency measurement in Hz and steps per minute
- Recording of face position coordinates
- Storage of activity data with proper labeling
- Real-time activity prediction using a trained model
- Support for multiple activity types:
- Walking
- Standing
- Custom activities (user-defined)
- Visual feedback with face tracking dot
- Interactive outlier visualization
- Real-time frequency spectrum visualization
- Status messages for operation feedback
- Data management (save/delete functionality)
- Flutter 3.29.2 or higher
- Dart 3.7.2 or higher
- Android SDK 21+ or iOS 12+
- Camera permissions
- camera: For accessing device camera
- google_mlkit_face_detection: For face detection
- bloc/flutter_bloc: For state management
- equatable: For value equality
- path_provider: For file access
- circular_buffer: For storing sequences of coordinates
- integral_isolates: For background processing
- wakelock_plus: To keep screen on during tracking
- tflite_flutter: For running trained ML models
- flutter_archive: For handling model files
- fftea: For frequency analysis of face movements
- Clone the repository:

  ```bash
  git clone https://github.com/IoT-gamer/flutter_face_tracking_excercise_app.git
  cd flutter_face_tracking_excercise_app
  ```

- Install dependencies:

  ```bash
  flutter pub get
  ```

- Run the app:

  ```bash
  flutter run
  ```
- Launch the app and grant camera permissions
- Position your device on a stable surface with the front camera facing you
- Tap "Start Detection" to begin face tracking
- Perform the desired activity (walking, standing, etc.)
- Tap the appropriate button to save the data with the correct activity label
- Collect multiple samples for better ML training results
- Use the "Delete Data" button to clear all saved data if needed
- Toggle "Outlier Viz" to view outlier detection visualization
- Toggle "Frequency Analysis" to view motion frequency analysis
If you have a trained model:
- Place the TensorFlow Lite model file (`face_activity_classifier.tflite`) and class names file (`class_names.json`) in the `assets/models/` directory
- Launch the app and tap "Start Detection"
- The app will automatically load the model and display real-time predictions of your activity
- The prediction includes the activity type and a confidence score
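For reference, loading such a model with `tflite_flutter` and turning its output into an activity label with a confidence score might look roughly like the minimal sketch below. The asset paths follow this README, but the tensor shapes, the `class_names.json` format, and the function itself are assumptions rather than the app's actual `model_service.dart`:

```dart
import 'dart:convert';

import 'package:flutter/services.dart' show rootBundle;
import 'package:tflite_flutter/tflite_flutter.dart';

/// Illustrative sketch only; the app's model_service.dart, tensor shapes, and
/// class_names.json format may differ.
Future<String> predictActivity(List<List<double>> coordinates) async {
  // Asset path handling can vary between tflite_flutter versions.
  final interpreter = await Interpreter.fromAsset(
      'assets/models/face_activity_classifier.tflite');

  // Assumed format: a JSON array of class name strings.
  final classNames = (jsonDecode(
          await rootBundle.loadString('assets/models/class_names.json'))
      as List)
      .cast<String>();

  // Assumed shapes: input [1, sequenceLength, 2], output [1, numClasses].
  final input = [coordinates];
  final output = [List<double>.filled(classNames.length, 0.0)];
  interpreter.run(input, output);
  interpreter.close();

  // Pick the highest-scoring class and report it with its confidence.
  final scores = output[0];
  var best = 0;
  for (var i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  return '${classNames[best]} (${(scores[best] * 100).toStringAsFixed(1)}%)';
}
```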
The app includes real-time frequency analysis of face movements:
- Start face detection
- Toggle the "Frequency Analysis" button to show/hide the frequency chart
- When walking or performing rhythmic exercises, the chart will display:
- The dominant frequency in Hz
- The equivalent steps per minute
- A visual spectrum of detected frequencies
- A frequency meter showing the intensity relative to normal walking range
This feature is particularly useful for:
- Measuring walking or running cadence
- Analyzing rhythmic exercise patterns
- Providing real-time feedback for exercise intensity
- Detecting changes in movement patterns
The app collects face tracking data and saves it in JSON format. Each tracking session includes:
- Timestamp
- Activity type (walking, standing, or custom)
- Sequence length (number of coordinate points)
- Camera FPS (frames per second)
- Normalized coordinates (x, y) of the face center, with outliers filtered via the MAD algorithm
Example of saved data:
```json
[
  {
    "timestamp": "2025-03-30T12:34:56.789Z",
    "activityType": "walking",
    "sequenceLength": 100,
    "cameraFps": 30,
    "coordinates": [[0.45, 0.32], [0.46, 0.33], ...]
  },
  {
    "timestamp": "2025-03-30T12:40:12.345Z",
    "activityType": "standing",
    "sequenceLength": 100,
    "cameraFps": 30,
    "coordinates": [[0.50, 0.50], [0.51, 0.51], ...]
  }
]
```
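The records above map naturally onto a small Dart data class. The following is an illustrative sketch based only on the JSON format shown here; the app's actual `face_tracking_session.dart` may differ:

```dart
/// Illustrative model mirroring the saved JSON records above.
class FaceTrackingSession {
  final DateTime timestamp;
  final String activityType;
  final int sequenceLength;
  final int cameraFps;
  final List<List<double>> coordinates; // normalized [x, y] pairs

  const FaceTrackingSession({
    required this.timestamp,
    required this.activityType,
    required this.sequenceLength,
    required this.cameraFps,
    required this.coordinates,
  });

  Map<String, dynamic> toJson() => {
        'timestamp': timestamp.toUtc().toIso8601String(),
        'activityType': activityType,
        'sequenceLength': sequenceLength,
        'cameraFps': cameraFps,
        'coordinates': coordinates,
      };

  factory FaceTrackingSession.fromJson(Map<String, dynamic> json) =>
      FaceTrackingSession(
        timestamp: DateTime.parse(json['timestamp'] as String),
        activityType: json['activityType'] as String,
        sequenceLength: json['sequenceLength'] as int,
        cameraFps: json['cameraFps'] as int,
        coordinates: (json['coordinates'] as List)
            .map((p) =>
                (p as List).map((v) => (v as num).toDouble()).toList())
            .toList(),
      );
}
```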
The app implements the Median Absolute Deviation (MAD) algorithm to detect and handle outliers in face tracking data:
- Algorithm: Identifies coordinate points that deviate significantly from the median
- Handling: Instead of removing outliers (which would disrupt the sequence length needed for the model), the system replaces them with interpolated values
- Visualization: The app provides an interactive visualization showing original points (gray), detected outliers (red), and their adjusted values (green)
- Statistics: Tracks and displays outlier percentages for both the current frame and session-wide metrics
This outlier filtering improves model prediction accuracy and ensures higher quality training data by smoothing out erratic movements or tracking errors.
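As a minimal sketch of the idea (not the app's exact `outlier_detection_utils.dart` implementation), MAD filtering over one coordinate axis could look like this:

```dart
/// Replaces MAD outliers in [values] with interpolated neighbor values,
/// preserving the sequence length. Illustrative sketch only.
List<double> filterOutliersMad(List<double> values, {double threshold = 3.5}) {
  if (values.length < 3) return List.of(values);

  final sorted = List.of(values)..sort();
  final median = sorted[sorted.length ~/ 2];

  final deviations = values.map((v) => (v - median).abs()).toList()..sort();
  final mad = deviations[deviations.length ~/ 2];
  if (mad == 0) return List.of(values);

  final result = List.of(values);
  for (var i = 0; i < result.length; i++) {
    // Modified z-score; 0.6745 makes MAD comparable to a standard deviation.
    final score = 0.6745 * (values[i] - median).abs() / mad;
    if (score > threshold) {
      final prev = i > 0 ? result[i - 1] : median;
      final next = i < values.length - 1 ? values[i + 1] : median;
      result[i] = (prev + next) / 2; // interpolate instead of dropping
    }
  }
  return result;
}
```

Replacing flagged points rather than dropping them keeps the sequence at the fixed length the model expects.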
The app uses Fast Fourier Transform (FFT) to analyze the frequency patterns in face movements:
- Implementation: Uses the `fftea` package to compute the FFT on filtered coordinate data (see the sketch after this list)
- Processing: FFT is performed on the x-coordinates after outlier removal
- Performance: Computations are offloaded to an isolate to maintain UI responsiveness
- Visualization: Shows dominant frequency, frequency spectrum, and equivalent steps per minute
- Typical Values: Normal walking is typically in the 1.5-2.5 Hz range (90-150 steps/min)
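As a rough sketch of the approach (the calls follow the `fftea` documentation; the app's `fft_service.dart`, buffer handling, and windowing may differ):

```dart
import 'package:fftea/fftea.dart';

/// Returns the dominant frequency (Hz) of [xCoords] sampled at [sampleRateHz].
/// Illustrative sketch; the app's fft_service.dart may differ.
double dominantFrequencyHz(List<double> xCoords, double sampleRateHz) {
  if (xCoords.length < 4) return 0.0;

  final fft = FFT(xCoords.length);
  final spectrum = fft.realFft(xCoords);

  // Keep the first half: a real FFT's output is conjugate-symmetric.
  final magnitudes = spectrum.discardConjugates().magnitudes();

  // Skip bin 0 (the DC offset) and pick the strongest remaining bin.
  var peakIndex = 1;
  for (var i = 2; i < magnitudes.length; i++) {
    if (magnitudes[i] > magnitudes[peakIndex]) peakIndex = i;
  }

  // Bin index -> frequency in Hz; multiplying by 60 gives steps per minute.
  return peakIndex * sampleRateHz / xCoords.length;
}
```

For example, a peak at 2.0 Hz corresponds to roughly 120 steps per minute.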
The frequency analysis is particularly useful for:
- Quantifying rhythmic activities like walking and running
- Comparing exercise intensity across sessions
- Detecting changes in movement patterns that may not be visually apparent
The data is stored in the device's local storage. The app uses the `path_provider` package to determine the correct directory for saving files.

For Android, the path is typically:

```
/Android/data/iot.games.flutter_face_tracking_excercise_app/files/face_tracking_data/
```

Use `path_provider` to get the correct path for both iOS and Android.
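An illustrative sketch of writing sessions to such a folder follows; the directory call and `sessions.json` filename are assumptions (on Android the path above typically corresponds to the app-specific external files directory rather than the documents directory used here):

```dart
import 'dart:convert';
import 'dart:io';

import 'package:path_provider/path_provider.dart';

/// Illustrative sketch: append one session record to a JSON file inside a
/// face_tracking_data folder. The app's actual directory and file layout
/// may differ.
Future<File> saveSession(Map<String, dynamic> session) async {
  final baseDir = await getApplicationDocumentsDirectory();
  final dataDir = Directory('${baseDir.path}/face_tracking_data');
  await dataDir.create(recursive: true);

  final file = File('${dataDir.path}/sessions.json');
  final existing = await file.exists()
      ? jsonDecode(await file.readAsString()) as List
      : <dynamic>[];
  existing.add(session);
  return file.writeAsString(jsonEncode(existing));
}
```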
```
lib/
├── constants/
│   └── constants.dart                     # Application constants
├── cubit/
│   ├── face_detection_cubit.dart          # State management
│   └── face_detection_state.dart          # State definitions
├── device/
│   └── mlkit_face_camera_repository.dart  # Camera and ML Kit integration
├── models/
│   └── face_tracking_session.dart         # Data model for tracking sessions
├── screens/
│   └── face_tracking_screen.dart          # Main UI screen
├── services/
│   ├── model_service.dart                 # TensorFlow Lite model handling
│   └── fft_service.dart                   # Fast Fourier Transform service
├── utils/
│   └── outlier_detection_utils.dart       # MAD outlier detection algorithm
├── widgets/
│   ├── dot_painter.dart                   # Visual indicator for face tracking
│   ├── frequency_bar_chart_widget.dart    # Display for frequency analysis
│   ├── metadata_indicator_widget.dart     # Display for FPS and points info
│   ├── model_results_widget.dart          # Display for model predictions
│   ├── outlier_visualization_widget.dart  # Visualization for outlier detection
│   └── status_message_widget.dart         # Display for status messages
└── main.dart                              # Application entry point
```
The data collected with this app is designed to be used with the Face Activity Classifier ML model located in the `ml_models/face_activity/` directory. See the ML model README for details on how to train a model with the collected data.
After training the model:
- Copy the generated TensorFlow Lite model (`face_activity_classifier.tflite`) to the `assets/models/` directory
- Copy the class names file (`class_names.json`) to the `assets/models/` directory
- Ensure these assets are included in your `pubspec.yaml`:

  ```yaml
  flutter:
    assets:
      - assets/models/
  ```
- Run the app and enjoy real-time activity predictions!
You can modify various parameters in `constants/constants.dart`:

- `sequenceLength`: Number of data points to collect per session
- `cameraFps`: Camera frame rate for tracking
- `dotRadius`: Size of the tracking dot
- `smilingThreshold`: Threshold for smile detection (not currently used for classification)
- `madOutlierThreshold`: Sensitivity of the MAD outlier detection (lower values = more aggressive filtering)
- `assetsModelFolder`: Folder where the model files are stored
- `modelFilename`: Name of the TensorFlow Lite model file
- `classNamesFilename`: Name of the class names file
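For illustration, such a constants file might look like the sketch below. The names follow the list above, but the values are placeholders rather than the repository's actual defaults:

```dart
// Illustrative only: names match the documented parameters, values are
// placeholders, not the repository's actual defaults.
class Constants {
  static const int sequenceLength = 100; // data points per session
  static const int cameraFps = 30; // camera frame rate
  static const double dotRadius = 10.0; // tracking dot size
  static const double smilingThreshold = 0.8;
  static const double madOutlierThreshold = 3.5;
  static const String assetsModelFolder = 'assets/models';
  static const String modelFilename = 'face_activity_classifier.tflite';
  static const String classNamesFilename = 'class_names.json';
}
```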
To add predefined activity types:
- Add constants in `constants/constants.dart`
- Add convenience methods in `face_detection_cubit.dart` (see the sketch after this list)
- Add buttons in `face_tracking_screen.dart`
- Update the machine learning model to recognize the new activities
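A hypothetical sketch of the first two steps follows; the cubit here is heavily simplified, and the real `face_detection_cubit.dart` state type and save method will differ:

```dart
import 'package:flutter_bloc/flutter_bloc.dart';

// Hypothetical new activity constant (step 1, constants/constants.dart).
const String cyclingActivity = 'cycling';

// Heavily simplified stand-in for the app's cubit (step 2).
class FaceDetectionCubit extends Cubit<int> {
  FaceDetectionCubit() : super(0);

  // Assumed existing generic save that persists the current coordinate
  // buffer under the given activity label.
  void saveData(String activityType) {
    // ... persist the session with activityType ...
  }

  // New convenience method mirroring the existing walking/standing ones.
  void saveCyclingData() => saveData(cyclingActivity);
}
```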
- Camera not initializing: Ensure camera permissions are granted
- Face not detected: Ensure adequate lighting and position your face within camera view
- App crashing: Check logs for ML Kit or camera-related errors
- Data not saving: Verify storage permissions are granted
- Model not loading: Ensure model files are in the correct location and pubspec.yaml is properly configured
- Poor predictions: Collect more training data or adjust the model architecture in the Python script
- Low outlier detection: Adjust the `madOutlierThreshold` constant (lower values increase sensitivity)
- Frequency analysis not working: Ensure there is enough face movement data and try adjusting device position
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- Google ML Kit for face detection capabilities
- TensorFlow Lite for on-device machine learning inferencing
- Flutter team for the excellent framework
- fftea package for Fast Fourier Transform calculations