This is a minor and could-have-been-simple project, but it makes for an amusing story about obsessiveness. Before getting into computers, I'd planned to follow in my dad’s footsteps in graphic design. The vestiges of this interest sometimes flare up at the oddest times…

A commercial product called FakeTV simulates the changing light cast by a television set, making a home appear occupied at night and thus (in theory) a less appealing target to burglars.

As FakeTV involves microcontrollers and RGB LEDs, it was inevitable that fake FakeTV-like projects would appear on various DIY web sites.

Having a one-time need for such an item, and with scads of microcontroller boards and NeoPixels already on hand, I gave a few of these DIY projects a try.

All suffered from extreme…well…fakeness. Some cycled predictably through a canned sequence of colors. Others switched among random RGB values. I don’t know if burglars would even notice such a thing, but it immediately struck me as “off.” Way off.

Movies are not random! Whether animated or live-action, they have Art Directors and Colorists and a specific color palette. Color sets the mood and defines locales and characters and is very un-random.

For example, the cinematic gem Thunderpants focuses almost exclusively on shades of green and brown.

The FakeTV web site claims their product — unlike DIY clones — began by analyzing actual TV programming, then developing an algorithm that realistically approximates these sorts of color changes mathematically.

Pretty cool idea, but I lacked both the time and inclination to develop a TV simulation algorithm. An idea struck me though…microcontrollers have a fair amount of flash memory…suppose we just stored color palettes of some actual art-directed scenes from actual movies? We don’t need to store whole image frames, just the average screen color over time.

Okay then…but how to acquire this kind of data? I was still in a hurry and didn’t want to process tons of source video.

Turns out there are folks who study just this sort of thing. This Washington Post article highlights the work of Dillon Baker in creating color timelines from films, and links to other sites with similar visualizations. “Scraping” these sites might be one option.


The Disney Animated app for iPad includes color timelines for 54 Disney films. Most are animated, but some have live-action segments, others more “natural” tones.

A couple of screen captures later and I had dozens of hours of source data to work with…

But wait a minute…if we’d simply used random RGB colors…is any would-be thief really gonna stop and say, “Hang on…no director would cut between those two colors…that’s a fake TV!” It no longer mattered at this point. The seal was broken; both the artist and engineer sides of the brain had been tickled. It’s an unstoppable force of nature. We’re doomed.

Some folks obsess over audio. I obsess over images.

Crunching the Numbers

The iPad retina screen capture contains over nine megabytes of data.

There’s 32 kilobytes of flash memory on an ATmega328P-based microcontroller board. Leaving room for the playback code and bootloader, that’s more like a maximum of 26K or so for our data. There are other boards with more flash space, or we could have used an SD card, but I wanted to keep it simple and use a Metro 328 board…a 40-pixel NeoPixel Shield fits neatly atop it for a FakeTV-like form factor…but it works just as well with NeoPixel strip, if that’s what you’ve got.

To make the data fit, something will have to go. Several orders of magnitude…

Looking closely at the screen capture, you’ll see these aren’t simple frame averages. The app describes their use of “clusters” in the visualization. To change the clusters into averages we’ll do some image editing with Photoshop.

First, a small black band at the bottom of the screen was cropped off.

There are 54 films in the visualization, each with one horizontal band. To produce an average for each one, the image is simply resized to 54 pixels tall (keeping the same width), specifically using bilinear interpolation (which averages all the pixels in each band; bicubic assigns different pixel weights and would “bleed over” from adjacent rows).

Strictly for the sake of illustration here…not done to the source data…the image was re-stretched to the original size in nearest neighbor mode, so you can see the color stripes.

Scaling down the image vertically this way provides a massive 28X reduction. But the iPad screen is huge (2048 pixels across), and this still represents about 330 kilobytes of color data. A few more reductions are needed…
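
Photoshop isn’t mandatory for this, by the way…the same two steps can be scripted with Pillow. This is only a sketch: “capture.png” and the 20-pixel band height are placeholders for whatever your own screen grab needs.

from PIL import Image

# Sketch of the crop + bilinear-resize steps above (filename and band height are placeholders):
img = Image.open("capture.png").convert("RGB")
img = img.crop((0, 0, img.width, img.height - 20))   # Trim the black band at the bottom
bands = img.resize((img.width, 54), Image.BILINEAR)  # Bilinear: one averaged row per film
bands.save("bands54.png")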

Culling the herd: Some color timelines aren’t as “lively” as others…they might dwell on one color for a very long time, or are just very dark overall…or too bright, or saturated, or desaturated. Three out of four timelines were discarded, keeping the 13 most visually interesting. These were selected manually.

Another 4X reduction. Now about 80K of data.

Representing each film in its entirety didn’t seem necessary…titles and end credits, for example, were usually very plain…so I cropped just a 1,000-pixel-wide section from the larger image. If we assume each film has about a 100-ish minute average running time, this represents a roughly 45-minute chunk from each. That’s about 2.7-ish seconds per pixel.
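
If you scripted the earlier steps, the culling and cropping can be done the same way. Again just a sketch…the row numbers and horizontal offset below are placeholders for whichever 13 films and 1,000-pixel section you actually keep:

from PIL import Image

KEEP_ROWS = [1, 4, 7, 12, 16, 20, 25, 29, 33, 38, 42, 47, 52]  # Hypothetical picks
X_OFFSET  = 500                                                 # Hypothetical start column

bands = Image.open("bands54.png")
out   = Image.new("RGB", (1000, len(KEEP_ROWS)))
for i, y in enumerate(KEEP_ROWS):
    out.paste(bands.crop((X_OFFSET, y, X_OFFSET + 1000, y + 1)), (0, i))
out.save("crop.png")  # This is the image the conversion script below reads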

Slightly over 2X reduction, now at 39 kilobytes. Here’s the resulting image:

This is still too large to fit in the Metro’s flash space, but one more reduction is going to take place…

A Python script will process this image into an Arduino header file. Along the way, it quantizes the 24-bit color data down to 16 bits per pixel (5 bits red, 6 bits green, 5 bits blue)…trimming the data to two-thirds of its size, or 26 kilobytes total.

Though we lose some color fidelity, these are the least-significant bits…and there are plenty of other factors (LED color balance, diffuse interreflection with a room’s walls and other objects) that will tinge the resulting colors anyway. The playback code will do some shenanigans that should recover some intermediary shades.
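
To make the tradeoff concrete, here’s a tiny stand-alone snippet (purely illustrative, not part of the conversion script) that packs one 24-bit color into two 5/6/5 bytes and expands it back the same way the playback code will:

r, g, b = 0x37, 0x9A, 0xE5             # An arbitrary 24-bit color
hi = (r & 0xF8) | (g >> 5)             # High byte: RRRRRGGG
lo = ((g & 0x1C) << 3) | (b >> 3)      # Low byte:  GGGBBBBB
# Expand back to 8 bits per channel (same math the Arduino sketch uses):
r8 = (hi & 0xF8) | (hi >> 5)
g8 = ((hi & 0x07) << 5) | ((lo & 0xE0) >> 3) | ((hi & 0x06) >> 1)
b8 = ((lo & 0x1F) << 3) | ((lo & 0x1F) >> 2)
print(hex(r), hex(g), hex(b), "->", hex(r8), hex(g8), hex(b8))

The round trip comes back as 0x31, 0x9A, 0xE7 instead of 0x37, 0x9A, 0xE5…only the last few bits differ, which is exactly the sort of error that LED color balance and room reflections will swamp anyway.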

Here’s the Python script that converts the image (named crop.png) to a .h file. It’s written for Python 3 and requires the Pillow imaging library (the maintained successor to the Python Imaging Library):

from PIL import Image
import sys

# Output one hex byte with C formatting & line wrap ------------------------

numBytes = 0  # Total bytes to be output in table
byteNum  = 0  # Current byte number (0 to numBytes-1)
cols     = 12 # Current column number in output (force indent on first one)

def writeByte(n):
    global cols, byteNum, numBytes

    cols += 1                      # Increment column #
    if cols >= 12:                 # If max column exceeded...
        print()                    # end current line
        sys.stdout.write("  ")     # and start new one
        cols = 0                   # Reset counter
    sys.stdout.write("{0:#0{1}X}".format(n, 4))
    byteNum += 1
    if byteNum < numBytes:         # Not the last byte in the table?
        sys.stdout.write(",")      # Separate values with commas
        if cols < 11:
            sys.stdout.write(" ")

# Mainline code ------------------------------------------------------------

# Output 8-bit gamma-correction table:
sys.stdout.write("const uint8_t PROGMEM gamma8[] = {")
numBytes = 256
for i in range(256):
    base     = 1 + (i // 3)  # LCD, CRT contrast is never pure black
    overhead = 255.0 - base
    writeByte(base + int(pow(i / 255.0, 2.7) * overhead + 0.5))
print(" },")  # Comma, not semicolon -- colors[] continues the same declaration

# Output color data (2 bytes per pixel), continuing the const uint8_t PROGMEM declaration:
sys.stdout.write("colors[] = {")
image    = Image.open("crop.png").convert("RGB")
pixels   = image.load()
numBytes = image.size[0] * image.size[1] * 2
byteNum  = 0
cols     = 12
for y in range(image.size[1]):
    for x in range(image.size[0]):
        r, g, b = pixels[x, y]
        # Convert 8/8/8 (24-bit) RGB to 5/6/5 (16-bit):
        writeByte((r & 0xF8) | (g >> 5))
        writeByte(((g & 0x1C) << 3) | (b >> 3))
print(" };")

The output of this program is redirected to a file (e.g. data.h) and can then be #included by the Arduino code on the next page…

Arduino Sketch

In addition to the code below, you’ll need the output from the Python program on the prior page (or you can use the ready-made copy included later on this page).

#include <Adafruit_NeoPixel.h>
#include "data.h" // Output of Python script

#define NUM_LEDS 40
#define PIN       6
Adafruit_NeoPixel strip = Adafruit_NeoPixel(NUM_LEDS, PIN, NEO_GRB + NEO_KHZ800);

#define  numPixels (sizeof(colors) / 2) // colors[] stores 2 bytes per pixel
uint32_t pixelNum;
uint16_t pr = 0, pg = 0, pb = 0; // Prev R, G, B

void setup() {
  strip.begin();                // Initialize NeoPixels (all off)
  randomSeed(analogRead(A0));   // Unconnected analog pin = semi-random seed
  pixelNum = random(numPixels); // Begin at random point
}

void loop() {
  uint32_t totalTime, fadeTime, holdTime, startTime, elapsed;
  uint16_t nr, ng, nb, r, g, b, i;
  uint8_t  hi, lo, r8, g8, b8, frac;

  // Read next 16-bit (5/6/5) color
  hi = pgm_read_byte(&colors[pixelNum * 2    ]);
  lo = pgm_read_byte(&colors[pixelNum * 2 + 1]);
  if(++pixelNum >= numPixels) pixelNum = 0;

  // Expand to 24-bit (8/8/8)
  r8 = (hi & 0xF8) | (hi >> 5);
  g8 = (hi << 5) | ((lo & 0xE0) >> 3) | ((hi & 0x06) >> 1);
  b8 = (lo << 3) | ((lo & 0x1F) >> 2);
  // Apply gamma correction, further expand to 16/16/16
  nr = (uint8_t)pgm_read_byte(&gamma8[r8]) * 257; // New R/G/B
  ng = (uint8_t)pgm_read_byte(&gamma8[g8]) * 257;
  nb = (uint8_t)pgm_read_byte(&gamma8[b8]) * 257;

  totalTime = random(250, 2500);    // Semi-random pixel-to-pixel time
  fadeTime  = random(0, totalTime); // Pixel-to-pixel transition time
  if(random(10) < 3) fadeTime = 0;  // Force scene cut 30% of time
  holdTime  = totalTime - fadeTime; // Non-transition time

  startTime = millis();
  for(;;) {
    elapsed = millis() - startTime;
    if(elapsed >= fadeTime) elapsed = fadeTime;
    if(fadeTime) {
      r = map(elapsed, 0, fadeTime, pr, nr); // 16-bit interp
      g = map(elapsed, 0, fadeTime, pg, ng);
      b = map(elapsed, 0, fadeTime, pb, nb);
    } else { // Avoid divide-by-zero in map()
      r = nr;
      g = ng;
      b = nb;
    }
    for(i=0; i<NUM_LEDS; i++) {
      r8   = r >> 8; // Quantize to 8-bit
      g8   = g >> 8;
      b8   = b >> 8;
      frac = (i << 8) / NUM_LEDS; // LED index scaled to 0-255
      if((r8 < 255) && ((r & 0xFF) >= frac)) r8++; // Boost some fraction
      if((g8 < 255) && ((g & 0xFF) >= frac)) g8++; // of LEDs to handle
      if((b8 < 255) && ((b & 0xFF) >= frac)) b8++; // interp > 8bit
      strip.setPixelColor(i, r8, g8, b8);
    }
    strip.show();                  // Push the updated colors to the LEDs
    if(elapsed >= fadeTime) break; // Transition complete
  }

  delay(holdTime); // Hold the new color for the remainder of totalTime

  pr = nr; // Prev RGB = new RGB
  pg = ng;
  pb = nb;
}

Points of interest in the Arduino sketch:

  • The colors are from actual films, but the timing is semi-random…the goal isn’t to match specific scene tempos, just to produce believable color sequences. Some transitions are abrupt cuts, others fade (implying a camera pan or something moving in or out of frame).
  • Color interpolation takes place in 16-bit space…the LEDs are only 8-bit, but boosting some fraction of the LEDs by one step recovers in-between shades (see the short illustration after this list).
  • The NeoPixel library disables interrupts while updating the LEDs, which is known to make timekeeping functions like millis() run slightly slow. That’s okay for this application…we’re not precisely beat-matching any source material, just measuring relative time.
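
To make that interpolation trick concrete, here’s a small stand-alone Python illustration (not part of the sketch; NUM_LEDS and the sample value are just examples) of how one 16-bit channel value gets spread across the 8-bit LEDs:

NUM_LEDS = 40
r = 0x80C0                      # A 16-bit interpolated red value
r_hi, r_lo = r >> 8, r & 0xFF   # 8-bit base value + leftover fraction
levels = []
for i in range(NUM_LEDS):
    frac = (i << 8) // NUM_LEDS # LED index scaled to 0-255, as in the sketch
    levels.append(r_hi + 1 if (r_hi < 255 and r_lo >= frac) else r_hi)
print(levels.count(r_hi + 1), "of", NUM_LEDS, "LEDs get the boosted value")

With a low byte of 0xC0 (192), about three-quarters of the LEDs land on 0x81 and the rest on 0x80, so the room-averaged brightness falls between the two 8-bit steps.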

Upload the sketch to a Metro board (or Arduino Uno or compatible) with the NeoPixel shield installed. Power it from a quality USB supply; that supply can then be plugged into a basic lamp timer. The sketch starts at a random point in the color data, so it doesn’t repeat the same sequence every time it powers up.

Aim it at the ceiling to wash a room in color, similar to the glow from a TV, or directly toward curtains if you need more brightness.

If Python is unavailable on your system, or if it’s just easier this way, here are the entire gamma and color tables. Create a new tab in the Arduino sketch, name it “data.h”, then cut and paste this whole thing: