Edge Impulse makes things smarter using embedded Machine Learning. In this video we'll show how you can use any smartphone to collect accelerometer and microphone data, and how to deploy the trained machine learning model back to your device. Afterwards the model runs completely locally in your web browser, without needing an internet connection.

A written version of this tutorial can be found here:
The guide to building the gesture recognition model is here:

In the Edge Impulse docs you'll find plenty more information, including how to use the WebAssembly build as part of your own project ( ), and tutorials on recognizing sounds in your house ( ).
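If you're curious how a web page can read the phone's sensors in the first place, here is a minimal TypeScript sketch using the standard DeviceMotionEvent browser API to capture accelerometer samples. It illustrates the idea only; the buffering shown here is simplified and is not Edge Impulse's own collection code.

// Collect raw [x, y, z] accelerometer samples from the phone's browser.
const samples: number[][] = [];

function onMotion(ev: DeviceMotionEvent): void {
  const a = ev.accelerationIncludingGravity;
  if (a && a.x !== null && a.y !== null && a.z !== null) {
    samples.push([a.x, a.y, a.z]); // acceleration in m/s^2 per axis
  }
}

// iOS 13+ requires an explicit permission request from a user gesture
// (e.g. a button tap) before motion events are delivered.
async function startCapture(): Promise<void> {
  const dme = DeviceMotionEvent as unknown as {
    requestPermission?: () => Promise<string>;
  };
  if (typeof dme.requestPermission === 'function') {
    const state = await dme.requestPermission();
    if (state !== 'granted') return;
  }
  window.addEventListener('devicemotion', onMotion);
}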

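And for the deployment side, a hedged sketch of calling the exported WebAssembly model from the browser. The EdgeImpulseClassifier wrapper and its init()/classify() methods follow the pattern in the Edge Impulse WebAssembly examples, but treat the exact names and result shape as assumptions and verify them against the files in your own export.

// The exported bundle ships the classifier as a script; declare its shape
// here rather than implement it (check your export for the actual API).
declare class EdgeImpulseClassifier {
  init(): Promise<void>; // loads the .wasm module
  classify(rawData: number[]): {
    results: { label: string; value: number }[];
  };
}

// Run one inference over a window of [x, y, z] accelerometer samples.
async function runInference(samplesWindow: number[][]): Promise<void> {
  const classifier = new EdgeImpulseClassifier();
  await classifier.init();

  // Flatten the samples into one interleaved array, x0,y0,z0,x1,y1,z1,...
  const raw = samplesWindow.flat();
  const res = classifier.classify(raw);

  // Print the confidence score for each trained class.
  for (const r of res.results) {
    console.log(`${r.label}: ${r.value.toFixed(2)}`);
  }
}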