Install the Required CircuitPython Libraries

You will need to have a few libraries installed before the script will run on your Raspberry Pi.

Install the required CircuitPython libraries with the terminal:

pip3 install adafruit-circuitpython-seesaw
pip3 install adafruit-circuitpython-vl53l4cd
pip3 install adafruit-circuitpython-simpleio
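Once the installs finish, you can sanity-check that Python can find each library before running the script. This short helper is our own illustration, not part of the guide's code; the module names match what the pip packages above install:

```python
import importlib.util

def check_missing(modules):
    """Return the subset of module names that Python cannot import."""
    return [m for m in modules if importlib.util.find_spec(m) is None]

# module names provided by the pip packages above
required = ["adafruit_seesaw", "adafruit_vl53l4cd", "simpleio"]
missing = check_missing(required)
if missing:
    print("Still missing:", ", ".join(missing))
else:
    print("All CircuitPython libraries found")
```

If anything prints as missing, re-run the corresponding pip3 install command.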

Download the Code and Image Files

Once you've finished setting up your Raspberry Pi with Blinka and the library dependencies, you can access the Python code, Processing code, and two .png image files by downloading the Project Bundle.

To do this, click on the Download Project Bundle button in the window below. It will download as a zipped folder.

# SPDX-FileCopyrightText: 2022 Liz Clark for Adafruit Industries
# SPDX-License-Identifier: MIT

import socket
import time
import board
import simpleio
import adafruit_vl53l4cd
from adafruit_seesaw import seesaw, rotaryio, neopixel
from adafruit_seesaw.analoginput import AnalogInput

#  VL53L4CD setup
vl53 = adafruit_vl53l4cd.VL53L4CD(board.I2C())

# rotary encoder setup
encoder = seesaw.Seesaw(board.I2C(), addr=0x36)
encoder.pin_mode(24, encoder.INPUT_PULLUP)
rot_encoder = rotaryio.IncrementalEncoder(encoder)

#  neoslider setup - analog slide pot and neopixel
# 0x30 = red control
# 0x31 = green control
# 0x32 = blue control
red_slider = seesaw.Seesaw(board.I2C(), 0x30)
red_pot = AnalogInput(red_slider, 18)
r_pix = neopixel.NeoPixel(red_slider, 14, 4)

g_slider = seesaw.Seesaw(board.I2C(), 0x31)
green_pot = AnalogInput(g_slider, 18)
g_pix = neopixel.NeoPixel(g_slider, 14, 4)

b_slider = seesaw.Seesaw(board.I2C(), 0x32)
blue_pot = AnalogInput(b_slider, 18)
b_pix = neopixel.NeoPixel(b_slider, 14, 4)

#  rotary encoder position tracker
last_position = 0

#  neoslider position trackers
last_r = 0
last_g = 0
last_b = 0

#  VL53L4CD value tracker
last_flight = 0

#  rotary encoder index
x = 0

#  VL53L4CD start-up
vl53.inter_measurement = 0
vl53.timing_budget = 200
vl53.start_ranging()


#  TCP socket setup
s = socket.socket()
print("socket check")

port = 12345

s.bind(('', port))
print("socket bound to %s" % (port))

#  put the socket into listening mode
s.listen(5)
print("socket is listening")

#  wait for the Processing sketch to connect
c, addr = s.accept()
print('got connected', addr)

while True:
    #  reset the VL53L4CD interrupt
    vl53.clear_interrupt()

    #  rotary encoder position read
    position = -rot_encoder.position

    #  VL53L4CD distance read
    flight = vl53.distance

    #  mapping neosliders to use 0-255 range for RGB values in Processing
    r_mapped = simpleio.map_range(red_pot.value, 0, 1023, 0, 255)
    g_mapped = simpleio.map_range(green_pot.value, 0, 1023, 0, 255)
    b_mapped = simpleio.map_range(blue_pot.value, 0, 1023, 0, 255)

    #  converting neoslider data to integers
    r_pot = int(r_mapped)
    g_pot = int(g_mapped)
    b_pot = int(b_mapped)

    #  set neopixels on neosliders to match background color of Processing animations
    r_pix.fill((r_pot, g_pot, b_pot))
    g_pix.fill((r_pot, g_pot, b_pot))
    b_pix.fill((r_pot, g_pot, b_pot))

    #  rotary encoder position check
    if position != last_position:
        #  rotary encoder is ranged to 0-3
        if position > last_position:
            x = (x + 1) % 4
        if position < last_position:
            x = (x - 1) % 4
        #  send rotary encoder value over socket
        #  identifying string is "enc"
        c.send(str.encode(' '.join(["enc", str(x)])))
        #  reset last_position
        last_position = position
    #  sliders only update data for changes > 2 to avoid flooding the socket
    #  red neoslider position check
    if abs(r_pot - last_r) > 2:
        #  send red neoslider data over socket
        #  identifying string is "red"
        c.send(str.encode(' '.join(["red", str(r_pot)])))
        #  reset last_r
        last_r = r_pot
    #  green neoslider position check
    if abs(g_pot - last_g) > 2:
        #  send green neoslider data over socket
        #  identifying string is "green"
        c.send(str.encode(' '.join(["green", str(g_pot)])))
        #  reset last_g
        last_g = g_pot
    #  blue neoslider position check
    if abs(b_pot - last_b) > 2:
        #  send blue neoslider data over socket
        #  identifying string is "blue"
        c.send(str.encode(' '.join(["blue", str(b_pot)])))
        #  reset last_b
        last_b = b_pot
    #  VL53L4CD value check
    #  clamping distance readings to a max value of 45
    if flight > 45:
        flight = 45
        last_flight = flight
    if abs(flight - last_flight) > 2:
        #  send VL53L4CD data over socket
        #  identifying string is "flight"
        c.send(str.encode(' '.join(["flight", str(flight)])))
        #  reset last_flight
        last_flight = flight

After downloading the Project Bundle, unzip the folder. Your Raspberry Pi should have the following files in the Raspberry_Pi_Video_Synth folder:

  • Raspberry_Pi_Video_Synth.pde
  • cat.png
  • pizza.png

Downloading with wget

You can also use the terminal to access the files. You can create a new directory called Raspberry_Pi_Video_Synth and use wget to access the files individually from GitHub. 

mkdir Raspberry_Pi_Video_Synth
cd Raspberry_Pi_Video_Synth

Running the Python and Processing Code

Open Thonny, or your preferred Python IDE, on your Raspberry Pi and open the Python script.

Open Processing 3 on your Raspberry Pi. Then, open the Raspberry_Pi_Video_Synth.pde file with the Processing IDE.

The Python and Processing files are communicating with each other through a socket. A socket is a connection endpoint opened between two nodes that allows data to be sent back and forth.
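If sockets are new to you, the round trip can be sketched with Python's standard socket module alone. In this toy example (localhost only, with the server thread standing in for the Python script and the client standing in for the Processing sketch), the same bind/listen/accept/send flow from the script above is exercised end to end:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 12345  # same port number the script above uses
ready = threading.Event()

def server():
    # same setup as the Python script: bind, listen, accept, send
    s = socket.socket()
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((HOST, PORT))
    s.listen(1)
    ready.set()  # signal that the server is ready for a connection
    conn, _addr = s.accept()
    conn.send(str.encode(' '.join(["enc", "2"])))  # same "label value" format
    conn.close()
    s.close()

t = threading.Thread(target=server)
t.start()
ready.wait()

# minimal client standing in for the Processing sketch
c = socket.socket()
c.connect((HOST, PORT))
message = c.recv(1024).decode()
print(message)  # -> enc 2
c.close()
t.join()
```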

In this case, the Python script is sending data from the NeoSliders, rotary encoder and VL53L4CD to the Processing script. The Processing script's animations are then affected by these inputs.
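Each message is a plain space-separated string such as "red 128" or "flight 12.5", so the receiving side only needs to split the label from the value. A minimal parser (the function name here is our own, not from the guide's code) might look like:

```python
def parse_message(msg):
    """Split a 'label value' message into its label and numeric value."""
    label, value = msg.split(' ', 1)
    return label, float(value)

print(parse_message("red 128"))      # -> ('red', 128.0)
print(parse_message("flight 12.5"))  # -> ('flight', 12.5)
```

The Processing sketch does the equivalent split on its end to decide which animation parameter to update.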

Run the Python script in the IDE. When the script begins running, the socket is opened and waits for a connection.

You should see listening print out to the REPL. This lets you know that the socket has been opened and is waiting for the Processing script to connect.

Then, run the Raspberry_Pi_Video_Synth.pde script in the Processing IDE.


After the socket connection is established, you'll see the Processing animations running full screen on your Raspberry Pi. You should be able to change parameters with the various STEMMA board inputs.

This guide was first published on Mar 29, 2022. It was last updated on 2022-03-29 22:34:39 -0400.

This page (Coding the Raspberry Pi Video Synth) was last updated on May 19, 2022.
