The PyBadge has a built-in accelerometer (LIS3DH) which you can use to detect tilt and motion. The accelerometer outputs 3 axes of acceleration data, and we can use that to train and infer gestures using TensorFlow!
If you want to try the demo on your PyBadge immediately, here is a UF2 file you can 'drag-n-drop' onto your BADGEBOOT disk drive to load the example (follow the instructions here on how to load UF2 files if you've never done it before).
Serial out gesture demo compile & upload
Let's start with the plain Arduino TensorFlow demo. Don't forget you have to perform all the steps on the previous page for installing the Arduino IDE, Adafruit SAMD support, libraries, and board/port selection!
We adapted the default gesture demo to use the LIS3DH, so you cannot use the example in the Arduino TensorFlowLite library. Instead, use the one in the Adafruit TensorFlow Lite library, called magic_wand.
Compile & upload this example!
Upon success, you may see the LED on the board pulsing. The best way to see the output is to select the Serial Monitor
You'll see streaming data coming out on the Serial Monitor. This is the 3-axis accelerometer data. We output it so that you have some extra debugging data, which can be really handy when training/debugging gestures. You can also plot it with the Serial Plotter if you like (close the Monitor first).
Move and twist the badge to see the red/green/blue lines change.
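If you want to poke at the raw sensor outside of the TensorFlow demo, a bare-bones sketch like the one below streams the same three values for the Serial Plotter. This is just an illustrative sketch using the Adafruit_LIS3DH library; the 0x19 I2C address is what we'd expect for the PyBadge's on-board LIS3DH, so adjust it if your board differs:

#include <Adafruit_Sensor.h>
#include <Adafruit_LIS3DH.h>

Adafruit_LIS3DH lis = Adafruit_LIS3DH();   // I2C

void setup() {
  Serial.begin(115200);
  while (!Serial) delay(10);
  // 0x19 is the usual I2C address for the PyBadge's on-board LIS3DH (assumption)
  if (!lis.begin(0x19)) {
    Serial.println("Could not find LIS3DH!");
    while (1) delay(10);
  }
  lis.setRange(LIS3DH_RANGE_4_G);           // same range the demo uses
  lis.setDataRate(LIS3DH_DATARATE_25_HZ);   // same data rate the demo uses
}

void loop() {
  sensors_event_t event;
  lis.getEvent(&event);                     // acceleration in m/s^2
  Serial.print(event.acceleration.x); Serial.print(", ");
  Serial.print(event.acceleration.y); Serial.print(", ");
  Serial.println(event.acceleration.z);
  delay(40);                                // roughly 25 Hz
}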
Close the Plotter and re-open the Monitor to see the streaming data again. This time, with the screen facing you and the USB port pointing to the ceiling, perform one of three gestures:
Wing
This gesture is a W: starting at your top left, go down, up, down, and up to your top right.
When that gesture is detected you'll see the front NeoPixels turn yellow, and the following print out on the Serial Monitor:
Ring
This gesture is an O: starting at top center, move clockwise in a circle to the right, then down, then left, and back up to where you started at top center.
When that gesture is detected you'll see the front NeoPixels turn purple, and the following print out on the Serial Monitor:
Slope
This gesture is an L: starting at your top right, move diagonally to your bottom left, then straight across to your bottom right.
When that gesture is detected you'll see the front NeoPixels turn light blue, and the following print out on the Serial Monitor:
Arcada display output gesture demo compile & upload
Arcada is our library for handling displays and input - we have so many different boards and displays that we needed a unifying library to handle displays, filesystems, buttons, etc. For many boards, you don't need to do anything special to figure out the pinouts or part numbers!
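As a rough idea of what Arcada does for you, a minimal sketch looks something like the following. This is an illustrative sketch of typical Arcada calls, not part of the demo; see the library's examples for the exact API:

#include <Adafruit_Arcada.h>

Adafruit_Arcada arcada;

void setup() {
  Serial.begin(115200);
  // arcadaBegin() sets up the buttons, NeoPixels, speaker, etc.
  // for whichever supported board you compiled for
  if (!arcada.arcadaBegin()) {
    Serial.println("Failed to initialize Arcada!");
    while (1) delay(10);
  }
  arcada.displayBegin();                    // bring up the TFT
  arcada.setBacklight(255);                 // full backlight
  arcada.display->fillScreen(ARCADA_BLACK);
  arcada.display->setTextColor(ARCADA_WHITE);
  arcada.display->print("Hello PyBadge!");
}

void loop() {
  uint32_t buttons = arcada.readButtons();  // bitmask of pressed buttons
  if (buttons & ARCADA_BUTTONMASK_A) {
    arcada.pixels.fill(arcada.pixels.Color(0, 50, 0));   // green while A is held
  } else {
    arcada.pixels.fill(0);
  }
  arcada.pixels.show();
}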
Load up the Adafruit_TFLite->magic_wand_arcada example
You can upload this sketch to your board. After uploading, the board will show up on your computer as a disk drive called CIRCUITPY (unless you've changed the name).
Click this button to download the gesture images and audio clips
Navigate through the zip file to examples\magic_wand_arcada\badge_files
then drag the files directly onto the CIRCUITPY drive like so:
Click reset on the Badge to restart, and the graphics should display so that you can run the demo untethered!
Setup and configuration of the accelerometer is done in accelerometer_handler.cpp:
/* Copyright 2019 The TensorFlow Authors. All Rights Reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
==============================================================================*/

#include "accelerometer_handler.h"

#include <Arduino.h>
#include "Adafruit_Arcada.h"

extern Adafruit_Arcada arcada;

/* this is a little annoying to figure out, as a tip - when
 * holding the board straight, output should be (0, 0, 1)
 * tilting the board 90* left, output should be (0, 1, 0)
 * tilting the board 90* forward, output should be (1, 0, 0);
 */

#if defined(ADAFRUIT_PYBADGE_M4_EXPRESS)
  // holding up with screen/neopixels facing you
  const int X_POSITION = 1;
  const int Y_POSITION = 2;
  const int Z_POSITION = 0;
  const bool INVERT_X = true;
  const bool INVERT_Y = true;
  const bool INVERT_Z = false;
#endif

#if defined(ARDUINO_NRF52840_CIRCUITPLAY)
  // holding up with gizmo facing you
  const int X_POSITION = 1;
  const int Y_POSITION = 2;
  const int Z_POSITION = 0;
  const bool INVERT_X = true;
  const bool INVERT_Y = true;
  const bool INVERT_Z = false;
#endif

#if defined(ARDUINO_NRF52840_CLUE)
  // holding up with gizmo facing you
  const int X_POSITION = 1;
  const int Y_POSITION = 2;
  const int Z_POSITION = 0;
  const bool INVERT_X = true;
  const bool INVERT_Y = true;
  const bool INVERT_Z = false;
#endif

#include "constants.h"

// A buffer holding the last 200 sets of 3-channel values
float save_data[600] = {0.0};
// Most recent position in the save_data buffer
int begin_index = 0;
// True if there is not yet enough data to run inference
bool pending_initial_data = true;
// How often we should save a measurement during downsampling
int sample_every_n;
// The number of measurements since we last saved one
int sample_skip_counter = 1;

uint32_t last_reading_stamp = 0;

TfLiteStatus SetupAccelerometer(tflite::ErrorReporter* error_reporter) {
  // Wait until we know the serial port is ready
  //while (!Serial) { yield(); }

  arcada.pixels.setBrightness(50);  // Set BRIGHTNESS to about 1/5 (max = 255)

  arcada.accel->setRange(LIS3DH_RANGE_4_G);
  arcada.accel->setDataRate(LIS3DH_DATARATE_25_HZ);
  float sample_rate = 25;

  // Determine how many measurements to keep in order to
  // meet kTargetHz
  sample_every_n = static_cast<int>(roundf(sample_rate / kTargetHz));

  error_reporter->Report("Magic starts!");

  return kTfLiteOk;
}

bool ReadAccelerometer(tflite::ErrorReporter* error_reporter, float* input,
                       int length, bool reset_buffer) {
  // Clear the buffer if required, e.g. after a successful prediction
  if (reset_buffer) {
    memset(save_data, 0, 600 * sizeof(float));
    begin_index = 0;
    pending_initial_data = true;
  }

  // Keep track of whether we stored any new data
  bool new_data = false;

  // Loop through new samples and add to buffer
  while (arcada.accel->haveNewData()) {
    float x, y, z;
    // Read each sample, removing it from the device's FIFO buffer
    sensors_event_t event;
    if (! arcada.accel->getEvent(&event)) {
      error_reporter->Report("Failed to read data");
      break;
    }

    // Throw away this sample unless it's the nth
    if (sample_skip_counter != sample_every_n) {
      sample_skip_counter += 1;
      continue;
    }

    float values[3] = {0, 0, 0};
    values[X_POSITION] = event.acceleration.x / 9.8;
    values[Y_POSITION] = event.acceleration.y / 9.8;
    values[Z_POSITION] = event.acceleration.z / 9.8;
    x = values[0];
    y = values[1];
    z = values[2];
    if (INVERT_X) { x *= -1; }
    if (INVERT_Y) { y *= -1; }
    if (INVERT_Z) { z *= -1; }

    Serial.print(x, 2); Serial.print(", ");
    Serial.print(y, 2); Serial.print(", ");
    Serial.println(z, 2);

    last_reading_stamp = millis();

    // Write samples to our buffer, converting to milli-Gs
    save_data[begin_index++] = x * 1000;
    save_data[begin_index++] = y * 1000;
    save_data[begin_index++] = z * 1000;

    // Since we took a sample, reset the skip counter
    sample_skip_counter = 1;
    // If we reached the end of the circle buffer, reset
    if (begin_index >= 600) {
      begin_index = 0;
    }
    new_data = true;
  }

  // Skip this round if data is not ready yet
  if (!new_data) {
    return false;
  }

  // Check if we are ready for prediction or still pending more initial data
  if (pending_initial_data && begin_index >= 200) {
    pending_initial_data = false;
  }

  // Return if we don't have enough data
  if (pending_initial_data) {
    return false;
  }

  // Copy the requested number of bytes to the provided input tensor
  for (int i = 0; i < length; ++i) {
    int ring_array_index = begin_index + i - length;
    if (ring_array_index < 0) {
      ring_array_index += 600;
    }
    input[i] = save_data[ring_array_index];
  }

  return true;
}
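Note the per-board #if defined() blocks at the top of the file: if you're porting this to another Arcada-supported board, add a block like the ones above and tweak the axis indices and inversion flags until the orientation test described in the comment gives (0, 0, 1) held straight, (0, 1, 0) tilted 90 degrees left, and (1, 0, 0) tilted 90 degrees forward. For example (the board define and values here are hypothetical, yours will differ):

#if defined(ARDUINO_MY_OTHER_BOARD)   // hypothetical example board
  // pick the index each hardware axis lands in, and whether to flip it,
  // until the orientation test in the comment above passes
  const int X_POSITION = 0;
  const int Y_POSITION = 1;
  const int Z_POSITION = 2;
  const bool INVERT_X = false;
  const bool INVERT_Y = false;
  const bool INVERT_Z = true;
#endif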
The LED/display output is done in output_handler.cpp:
/* Copyright 2019 The TensorFlow Authors. All Rights Reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
==============================================================================*/

#include "output_handler.h"

#include "Arduino.h"
#include "Adafruit_Arcada.h"

extern Adafruit_Arcada arcada;

void HandleOutput(tflite::ErrorReporter* error_reporter, int kind) {
  // The first time this method runs, set up our LED
  static int last_kind = -1;
  static bool is_initialized = false;
  if (!is_initialized) {
    pinMode(LED_BUILTIN, OUTPUT);
    is_initialized = true;
  }

  // Toggle the LED every time an inference is performed
  static int count = 0;
  ++count;
  if (count & 1) {
    digitalWrite(LED_BUILTIN, HIGH);
  } else {
    digitalWrite(LED_BUILTIN, LOW);
  }

  // Print some ASCII art for each gesture
  if (kind == 0) {
    error_reporter->Report(
        "WING:\n\r*         *         *\n\r *       * *       "
        "*\n\r  *     *   *     *\n\r   *   *     *   *\n\r    * *       "
        "* *\n\r     *         *\n\r");
    ImageReturnCode stat = arcada.drawBMP((char *)"wing.bmp", 0, 0);
    if (stat != IMAGE_SUCCESS) {
      arcada.display->fillScreen(ARCADA_BLACK);
      arcada.display->setCursor(20, 20);
      arcada.display->setTextColor(ARCADA_YELLOW);
      arcada.display->setTextSize(ceil(arcada.display->width() / 30));
      arcada.display->print("WING");
    }
    arcada.WavPlayComplete("wing.wav");
    arcada.pixels.fill(arcada.pixels.Color(50, 50, 0));
    arcada.pixels.show();
  } else if (kind == 1) {
    error_reporter->Report(
        "RING:\n\r          *\n\r       *     *\n\r     *         *\n\r "
        "   *           *\n\r     *         *\n\r       *     *\n\r      "
        "    *\n\r");
    ImageReturnCode stat = arcada.drawBMP((char *)"ring.bmp", 0, 0);
    if (stat != IMAGE_SUCCESS) {
      arcada.display->fillScreen(ARCADA_BLACK);
      arcada.display->setCursor(20, 20);
      arcada.display->setTextColor(ARCADA_PURPLE);
      arcada.display->setTextSize(ceil(arcada.display->width() / 30));
      arcada.display->print("RING");
    }
    arcada.WavPlayComplete("ring.wav");
    arcada.pixels.fill(arcada.pixels.Color(50, 0, 50));
    arcada.pixels.show();
  } else if (kind == 2) {
    error_reporter->Report(
        "SLOPE:\n\r        *\n\r       *\n\r      *\n\r     *\n\r    "
        "*\n\r   *\n\r  *\n\r * * * * * * * *\n\r");
    ImageReturnCode stat = arcada.drawBMP((char *)"slope.bmp", 0, 0);
    if (stat != IMAGE_SUCCESS) {
      arcada.display->fillScreen(ARCADA_BLACK);
      arcada.display->setCursor(20, 20);
      arcada.display->setTextColor(ARCADA_BLUE);
      arcada.display->setTextSize(ceil(arcada.display->width() / 40));
      arcada.display->print("SLOPE");
    }
    arcada.WavPlayComplete("slope.wav");
    arcada.pixels.fill(arcada.pixels.Color(0, 50, 50));
    arcada.pixels.show();
  } else {
    if (last_kind <= 2) {
      // re-draw intro
      ImageReturnCode stat = arcada.drawBMP((char *)"howto.bmp", 0, 0);
      if (stat != IMAGE_SUCCESS) {
        arcada.display->fillScreen(ARCADA_BLACK);
        arcada.display->setCursor(0, 0);
        arcada.display->setTextColor(ARCADA_WHITE);
        arcada.display->setTextSize(ceil(arcada.display->width() / 180.0));
        arcada.display->println("With screen facing");
        arcada.display->println("you, move board in");
        arcada.display->println("the shape of a");
        arcada.display->println("W, clockwise O or L");
      }
      arcada.pixels.fill(0);
      arcada.pixels.show();
    }
  }
  last_kind = kind;
}
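For context on how these two handlers fit together, the sketch's main loop follows the structure of the upstream magic_wand example: pull accelerometer samples into the model's input tensor, run inference, then pass the winning gesture index to HandleOutput(). Roughly (variable and function names as in the upstream example; check the .ino in the library for the exact code):

void loop() {
  // Fill the model's input tensor from the accelerometer ring buffer
  bool got_data = ReadAccelerometer(error_reporter, model_input->data.f,
                                    input_length, should_clear_buffer);
  if (!got_data) {
    return;   // not enough samples yet, try again next time around
  }

  // Run the TensorFlow Lite Micro model on the window of samples
  if (interpreter->Invoke() != kTfLiteOk) {
    error_reporter->Report("Invoke failed");
    return;
  }

  // Pick the gesture (0 = wing, 1 = ring, 2 = slope, 3 = no gesture)
  int gesture_index = PredictGesture(interpreter->output(0)->data.f);

  // Start a fresh window of samples after a confident detection
  should_clear_buffer = gesture_index < 3;

  // Light the NeoPixels / draw the BMP / play the WAV for that gesture
  HandleOutput(error_reporter, gesture_index);
}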