Home
The ESP (Example-based Sensor Prediction) system leverages expert examples to support users in applying machine learning to a wide range of real-time sensor-based applications. Machine learning pipelines are specified in code using the Gesture Recognition Toolkit (GRT). From this code, ESP generates a user interface for iteratively collecting and refining training data and for tuning the pipeline for a particular project. (We use openFrameworks to render the interface.)
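To give a sense of what such a pipeline specification looks like, here is a minimal sketch using the GRT C++ API. The pre-processing and classifier choices below are illustrative assumptions, not taken from a specific ESP example:

```cpp
// Minimal sketch of a GRT pipeline specification (illustrative; assumes a standard GRT install).
#include <GRT/GRT.h>

using namespace GRT;

GestureRecognitionPipeline buildPipeline() {
    GestureRecognitionPipeline pipeline;

    // Smooth the raw 3-axis accelerometer signal before classification.
    pipeline.addPreProcessingModule( MovingAverageFilter(5, 3) );

    // Classify time-series gestures with Dynamic Time Warping.
    pipeline.setClassifier( DTW() );

    return pipeline;
}
```

In ESP, a specification along these lines drives the generated interface: the labeled examples collected through the UI train the pipeline, which then makes real-time predictions on the incoming sensor stream.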
We've built some applications to demonstrate the possibilities of the ESP system:
- gesture recognition using an accelerometer
- speaker identification, i.e. using a microphone to tell who is talking
- color recognition, e.g. for recognizing different objects based on their color
- a Touché-like example for detecting the way someone is touching or holding an object
- walk detection using an accelerometer
You can learn more about the project from the Arduino blog post or the talk.
For installation instructions, see the main README. Also, check out the API documentation.