We picked up a Nintendo R.O.B. robot from our local online auction site, and when it arrived we decided to figure out how to get it working. There are three motors inside, and the R.O.B. already comes with motor drivers and end-stops, so instead of driving the robot directly, we decided to control the R.O.B. using Circuit Playground Express (CPX) and Crickit!
The code is all in CircuitPython.
We use the Crickit for amplified audio effects (we snagged some audio from gameplay to give it that authentic chiptune sound), for driving an IR LED at 500mA burst current so we could signal the robot from a few feet away, and for its capacitive touch inputs, which make up our desk controller.
With the addition of a D battery for the gyro turner, we had a fun live-action game without the need for a CRT!
Wiring Diagram
The IR LED can handle up to 1 Amp of peak current, so don't use a resistor; just wire it up to Drive 1 directly!
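If you want to sanity-check that connection before loading the full program, here's a minimal sketch that pulses Drive 1 on and off. It uses the same seesaw pin number and analog_write() calls as the full listing further down:

import time
import board
from busio import I2C
from adafruit_seesaw.seesaw import Seesaw
from adafruit_seesaw.pwmout import PWMOut

i2c = I2C(board.SCL, board.SDA)
seesaw = Seesaw(i2c)

INFRARED_LED_SS = 13                        # Crickit Drive 1 is seesaw pin 13
ir_drive = PWMOut(seesaw, INFRARED_LED_SS)
ir_drive.frequency = 1000                   # 1 KHz default PWM frequency

while True:
    seesaw.analog_write(INFRARED_LED_SS, 65535)  # IR LED full on
    time.sleep(0.5)
    seesaw.analog_write(INFRARED_LED_SS, 0)      # IR LED off
    time.sleep(0.5)

Since the IR flash is invisible to the naked eye, pointing a phone camera at the LED is an easy way to confirm it's actually blinking (most phone cameras pick up at least some IR).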
We use 4 capacitive touch sensors from the Crickit and 2 from the CPX for 6 total (there are more capacitive touch inputs available on the Crickit Signal pins, but we wanted to use plain alligator pads!)
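Here's a minimal sketch of how those six inputs can be read, using the same pad assignments as the full listing below: CPX pads A2 and A3 via touchio, and the four Crickit touch pads via the seesaw touch_read() calls.

import time
import board
import touchio
from busio import I2C
from adafruit_seesaw.seesaw import Seesaw

i2c = I2C(board.SCL, board.SDA)
seesaw = Seesaw(i2c)

touch2 = touchio.TouchIn(board.A2)   # CPX pad A2
touch3 = touchio.TouchIn(board.A3)   # CPX pad A3

while True:
    cpx_vals = (touch2.raw_value, touch3.raw_value)
    crickit_vals = tuple(seesaw.touch_read(pad) for pad in range(4))  # Crickit touch pads 1-4
    print(cpx_vals, crickit_vals)
    time.sleep(0.25)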
Code!
Save to your CPX as code.py and touch the alligator clips to control your R.O.B.
The IR LED should be 1-2 feet away and pointed at the R.O.B.'s left eye (that is, the right-most eye when you are looking at R.O.B.).
It will calibrate when first starting up, and play some tunes.
Flip the slide switch on the CPX to turn capacitive touch detection and command sending on or off (handy if you need to adjust your cables without having the robot turn around on you!).
To help you know what's going on, the NeoPixels on the CPX will glow to match the colors of the alligator clips shown above, so use those same colors! The only exception is black, which shows up as purple on the LEDs.
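For reference, here are the fill() colors the code below uses for each command. This mapping isn't a table in the code itself, it's just collected here so you know which glow means what (purple stands in for the black clip):

COMMAND_COLORS = {
    "Open":  (50, 50, 0),   # yellow
    "Close": (0, 50, 0),    # green
    "Up":    (50, 0, 50),   # purple
    "Down":  (50, 50, 50),  # white
    "Left":  (50, 0, 0),    # red
    "Right": (0, 0, 50),    # blue
}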
You may need to tweak the capacitive touch thresholds. Try uncommenting
#touch_vals = (touch2.raw_value, touch3.raw_value, seesaw.touch_read(0), seesaw.touch_read(1), seesaw.touch_read(2), seesaw.touch_read(3))
#print(touch_vals)
and watching the REPL to see what values are being read.
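An untouched pad should read noticeably lower than a touched one, so pick a threshold somewhere in between. For example (these readings are hypothetical, yours will differ):

# Hypothetical example: if an untouched Crickit pad reads around 600 and a touched
# one jumps past 1100, a value between the two makes a good threshold.
CAPTOUCH_THRESH = 850   # used for the four Crickit pads; the CPX pads (A2/A3) are compared against 3000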
# SPDX-FileCopyrightText: 2018 Limor Fried for Adafruit Industries
#
# SPDX-License-Identifier: MIT

import time
import gc
from digitalio import DigitalInOut, Direction, Pull
from busio import I2C
from adafruit_seesaw.seesaw import Seesaw
from adafruit_seesaw.pwmout import PWMOut
import touchio
import audioio
import audiocore
import neopixel
import board

pixels = neopixel.NeoPixel(board.NEOPIXEL, 10, brightness=1)
pixels.fill((0, 0, 0))

# Create seesaw object
i2c = I2C(board.SCL, board.SDA)
seesaw = Seesaw(i2c)

# switch
switch = DigitalInOut(board.SLIDE_SWITCH)
switch.direction = Direction.INPUT
switch.pull = Pull.UP

# We need some extra captouches
touch2 = touchio.TouchIn(board.A2)
touch3 = touchio.TouchIn(board.A3)

# LED for debugging
led = DigitalInOut(board.D13)
led.direction = Direction.OUTPUT

# Create drive (PWM) object
INFRARED_LED_SS = 13
my_drive = PWMOut(seesaw, INFRARED_LED_SS)  # Drive 1 is on s.s. pin 13
my_drive.frequency = 1000                   # Our default frequency is 1KHz

CAPTOUCH_THRESH = 850

# Commands, each 8 bit command is preceded by the 5 bit Init sequence
Init = [0, 0, 0, 1, 0]                # This must precede any command
Calibrate = [1, 0, 1, 0, 1, 0, 1, 1]  # the initial calibration
Up = [1, 0, 1, 1, 1, 0, 1, 1]         # Move arms/body down
Down = [1, 1, 1, 1, 1, 0, 1, 1]       # Move arms/body up
Left = [1, 0, 1, 1, 1, 0, 1, 0]       # Twist body left
Right = [1, 1, 1, 0, 1, 0, 1, 0]      # Twist body right
Close = [1, 0, 1, 1, 1, 1, 1, 0]      # Close arms
Open = [1, 1, 1, 0, 1, 1, 1, 0]       # Open arms
Test = [1, 1, 1, 0, 1, 0, 1, 1]       # Turns R.O.B. head LED on

print("R.O.B. Start")

def IR_Command(cmd):
    print("Sending ", cmd)
    gc.collect()  # collect memory now, timing specific!
    # Output initialization and then command cmd
    for val in Init + cmd:  # For each value in initial+command
        if val:             # if it's a one, flash the IR LED
            seesaw.analog_write(INFRARED_LED_SS, 65535)  # on
            seesaw.analog_write(INFRARED_LED_SS, 0)      # off 2ms later
            time.sleep(0.013)  # 17 ms total
        # pylint: disable=useless-else-on-loop
        else:
            time.sleep(0.015)  # 17 ms total

a = audioio.AudioOut(board.A0)

startfile = "startup.wav"
loopfile = "loop.wav"

with open(startfile, "rb") as f:
    wav = audiocore.WaveFile(f)
    a.play(wav)
    for _ in range(3):
        IR_Command(Calibrate)
        time.sleep(0.5)
    while a.playing:
        IR_Command(Open)
        time.sleep(1)
        IR_Command(Close)
        time.sleep(1)

f = open(loopfile, "rb")
wav = audiocore.WaveFile(f)
a.play(wav, loop=True)

while True:  # Main Loop poll switches, do commands
    led.value = switch.value  # easily tell if we're running
    if not switch.value:
        continue

    #touch_vals = (touch2.raw_value, touch3.raw_value, seesaw.touch_read(0), seesaw.touch_read(1),
    #              seesaw.touch_read(2), seesaw.touch_read(3))
    #print(touch_vals)

    if touch2.raw_value > 3000:
        print("Open jaws")
        pixels.fill((50, 50, 0))
        IR_Command(Open)   # Button A opens arms
    elif touch3.raw_value > 3000:
        print("Close jaws")
        pixels.fill((0, 50, 0))
        IR_Command(Close)  # Button B closes arms
    elif seesaw.touch_read(0) > CAPTOUCH_THRESH:
        print("Up")
        pixels.fill((50, 0, 50))
        IR_Command(Up)
    elif seesaw.touch_read(1) > CAPTOUCH_THRESH:
        print("Down")
        pixels.fill((50, 50, 50))
        IR_Command(Down)
    elif seesaw.touch_read(2) > CAPTOUCH_THRESH:
        print("Left")
        pixels.fill((50, 0, 0))
        IR_Command(Left)
    elif seesaw.touch_read(3) > CAPTOUCH_THRESH:
        print("Right")
        pixels.fill((0, 0, 50))
        IR_Command(Right)

    time.sleep(0.1)
    pixels.fill((0, 0, 0))