A variant of this project uses an Adafruit Feather M0 Adalogger (instead of the M0 WiFi) to play back prerecorded animation sequences from a microSD card. This is great for environments where WiFi contention is an issue, or if you just don’t want the Processing-side computer around. You won’t get the cool multiple-synchronized-devices effect, but for a single device it’s dandy.
The circuit is identical; just swap the Adafruit Feather M0 WiFi for an M0 Adalogger. Pinouts are the same, including the battery options (e.g. the optional DPDT switch).
Processing Code Adjustments
The required code changes are minimal. Rather than creating an OPC object with a network address and port…
OPC opc = new OPC(this, "192.168.0.60", 7890);
…instead, specify an integer frame rate (frames per second) and an output file as a string (absolute paths are best):
OPC opc = new OPC(this, 30, "/Volumes/4GB/anim01.opc");
You can leave out the frames-per-second argument to use the default value of 30:
OPC opc = new OPC(this, "/Volumes/4GB/anim01.opc");
Avoid calling the frameRate() method elsewhere in your code…the change won’t be noted in the output file, and playback will occur at a different rate. The OPC constructor sets your program’s frame rate and records it in the file.
The Processing library doesn’t much care about the output filename (as long as it’s a valid location and has write permission)…but the Arduino code will scan the card for files with the extension “.opc”, so it’s recommended you use that.
There’s one more step…recording to a file will not commence until you call the enable() method:
opc.enable();
This can be used to make sure recording doesn’t actually begin until a video file is loaded…for example, in the OPCvideo example sketch, enable() is called inside the movieEvent() method. This prevents several seconds of solid black from being recorded to the file while the user navigates to and selects a video file. In other examples that don’t require user input, enable() is called within setup() so recording can begin immediately.
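Here’s a minimal sketch of that pattern…enable() deferred until the first video frame actually arrives. This assumes the Processing Video library, an 8x8 grid layout, and a placeholder movie filename (“example.mov”)…none of these specifics come from the guide, so adapt them to your own installation:

```processing
import processing.video.*;

OPC opc;
Movie movie;

void setup() {
  size(320, 240);
  // File-output constructor, as described above
  opc = new OPC(this, 30, "/Volumes/4GB/anim01.opc");
  opc.ledGrid8x8(0, width / 2, height / 2, width / 12, 0, false);
  movie = new Movie(this, "example.mov");  // placeholder filename
  movie.loop();
  // Note: no opc.enable() here -- nothing worth recording yet
}

void movieEvent(Movie m) {
  m.read();
  opc.enable();  // first frame has arrived; safe to start recording
}

void draw() {
  image(movie, 0, 0, width, height);
}
```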
When run, the Processing sketch will output pixel data to this file (once enable() is called). It will continue until the program is terminated, either manually (e.g. with the Escape key) or through the exit() method.
On the Arduino Side…
In the Arduino IDE…instead of the OPCserver sketch, open OPCstreamSD from the same repository and upload that to the board.
OPCstreamSD scans the root directory of an SD card; it does not look in subfolders. Any file ending in “.opc” (and that appears to contain content generated by the OPC Processing library) is added to the playlist, which is sorted alphabetically. Each file is played in turn; at the end of the list, playback returns to the first file. As written, this is limited to 50 files maximum, but that’s easily increased in the code if needed.
Good to Know
The files generated by the Processing library are specific to a given installation (e.g. whatever pixel layout you defined using opc.ledGrid() or other methods). Because the OPC library can place pixels anywhere, the file doesn’t convey any data regarding things like matrix size…it’s not inherently tied to matrices (nor any other specific topology).
So, for example, if you render an animation for a 16x16 DotStar matrix with a zig-zag order, it will not play back correctly on an 8x8 matrix, or 32x8, or a progressive pixel order or anything else. That file will work only for the installation for which it was designed, or one with an identical topology.
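To make the topology point concrete, here’s a sketch (an illustration, not from the guide) mapping a 16x16 zig-zag matrix. A file recorded with this layout encodes pixels in exactly this order, so it plays back correctly only on a matrix wired the same way…the spacing and drawing code are placeholders:

```processing
OPC opc;

void setup() {
  size(320, 320);
  opc = new OPC(this, 30, "/Volumes/4GB/anim01.opc");
  // index 0, 16 pixels per row, 16 rows, centered in the window,
  // 20-pixel spacing, no rotation, zig-zag (serpentine) row order
  opc.ledGrid(0, 16, 16, width / 2, height / 2, 20, 20, 0, true);
  opc.enable();
}

void draw() {
  background(0);
  fill(255);
  // Simple moving dot, just to have something to record
  ellipse(width / 2 + 100 * cos(frameCount * 0.05), height / 2, 60, 60);
}
```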