Controlling Espruino from TensorFlow on the Desktop

In this tutorial we'll teach an AI to recognise features in video from a webcam, then send a signal from the PC over Bluetooth to a Bangle.js (or other Bluetooth Espruino device) to make something happen.

In this case we'll make a website that reminds you not to pick your nose by buzzing a Bangle.js when it thinks it sees your finger near your nose.

Google provides a website called Teachable Machine (https://teachablemachine.withgoogle.com/) that lets us train our own neural network in a matter of seconds, and it even generates the code we'll need to use the trained model.
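
If you export the trained image model, Teachable Machine gives you some setup code along these lines (a rough sketch - your exported version may differ slightly, "./my_model/" is just a placeholder for wherever you host the exported model.json and metadata.json, and the tmImage global comes from the @teachablemachine/image library the export also tells you to include). It loads the model, starts the webcam, and calls predict() on every frame:

    // Load the Teachable Machine image model and start the webcam
    let model, webcam, labelContainer, maxPredictions;
    const URL = "./my_model/"; // placeholder - point at your exported model files

    async function init() {
        // load the model and its metadata with the tmImage library
        model = await tmImage.load(URL + "model.json", URL + "metadata.json");
        maxPredictions = model.getTotalClasses();
        // set up a 200x200 mirrored webcam feed and start it
        webcam = new tmImage.Webcam(200, 200, true);
        await webcam.setup();
        await webcam.play();
        window.requestAnimationFrame(loop);
        // show the webcam view and add one label element per class
        document.getElementById("webcam-container").appendChild(webcam.canvas);
        labelContainer = document.getElementById("label-container");
        for (let i = 0; i < maxPredictions; i++)
            labelContainer.appendChild(document.createElement("div"));
    }

    async function loop() {
        webcam.update(); // grab a fresh frame from the webcam
        await predict(); // defined below
        window.requestAnimationFrame(loop);
    }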

    // run the webcam image through the image model
    async function predict() {
        // predict can take in an image, video or canvas html element
        const prediction = await model.predict(webcam.canvas);
        for (let i = 0; i < maxPredictions; i++) {
            const classPrediction =
                prediction[i].className + ": " + prediction[i].probability.toFixed(2);
            labelContainer.childNodes[i].innerHTML = classPrediction;
        }
        // Check if it's likely the nose was picked for a while
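        // (prediction[1] is the model's second class - assumed here to be the 'finger near nose' one)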
        if (prediction[1].probability > 0.5) {
          pickCounter++;
          // After ~20 consecutive frames of detection, send some JS
          // over Bluetooth to make the Bangle.js buzz (once, not repeatedly)
          if (pickCounter == 20 && connected)
            Puck.write("Bangle.buzz()\n");
        } else {
          pickCounter = 0;
        }
    }
    // Web Bluetooth connections can only be started in response
    // to a user's action, so this boilerplate just puts something
    // over the whole screen that must be clicked first
    let connected = false;
    let pickCounter = 0;
    Puck.modal(function() {
      Puck.write("\n", function() { // force a connection
        connected = true;
      });
    });
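
The Puck.write() and Puck.modal() calls come from Espruino's Puck.js Web Bluetooth library, so the page also needs to load https://www.puckjs.com/puck.js with a script tag (see the Espruino Web Bluetooth documentation for details). And because Puck.write() just sends JavaScript text for the watch to evaluate, you can make it do more than buzz - for example (a sketch, adjust to taste):

    // Buzz for half a second and put a reminder on the Bangle's screen
    Puck.write("Bangle.buzz(500);E.showMessage('Hands off!');\n");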

Now, if you start to raise your finger to pick your nose, TensorFlow will detect it and the Bangle.js will buzz to warn you!
