Arduino Library

The Arduino library is pretty minimal right now but handles the most important low-level ugly stuff.

The examples show all the vital steps, but they’re mixed in with a lot of TFT display-specific code. Let’s look at just the camera bits…

Sketches should begin by #including the Wire and Adafruit_OV7670 libraries:

#include <Wire.h>            // I2C comm to camera
#include "Adafruit_OV7670.h" // Camera library

The Wire library is used for camera control over I2C, while image data comes through the PCC data pins.

In the globals section of the sketch…outside the setup() and loop() functions…we set up some structures and call the OV7670 constructor.

The arch structure contains values specific to the SAMD51 hardware. In principle, in the future, there might be different arch structures for different hardware. If using Grand Central, you can just copy this line to your own code. For other M4 boards, you need to specify what timer/counter peripheral (PWM out) connects to the camera’s XCLK input, and, if the timer is a TCC peripheral, whether special pin multiplexing is required for it (super esoteric; you probably won’t need to change this).

The pins structure specifies Arduino pin numbers where the camera’s enable, reset and XCLK pins are connected. On the SAMD51, the PCC pins are set in stone and can’t be assigned to other locations, but these few pins are OK being routed to other locations. The examples are set up for the Grand Central header.

Then the constructor is invoked…we’ll call our camera object “cam,” and it expects, in this order:

  1. An I2C address (pass OV7670_ADDR, the standard address for an OV7670).
  2. A pointer (&) to a pins structure (previously declared).
  3. A pointer (&) to a Wire (I2C) instance. Grand Central has several…one of those, Wire1, is conveniently on pins 24 (SCL) and 25 (SDA), which aligns with the camera board, almost like it was planned this way.
  4. A pointer (&) to an arch structure (previously declared).
OV7670_arch arch = {.timer = TCC1, .xclk_pdec = false};
OV7670_pins pins = {.enable = PIN_PCC_D8, .reset = PIN_PCC_D9,
                    .xclk = PIN_PCC_XCLK};
Adafruit_OV7670 cam(OV7670_ADDR, &pins, &Wire1, &arch);

Later, inside the setup() function, we initialize the camera by calling its begin() function. This expects at least three arguments:

  1. A color mode, either OV7670_COLOR_RGB or OV7670_COLOR_YUV. RGB is best for showing color images on TFTs, but some applications such as object tracking may want grayscale data, which is more easily extracted from YUV.
  2. An initial image size, from one of the values #defined in Adafruit_OV7670.h. In most situations the library will attempt to allocate a buffer large enough for an image this size. Possible values include:
    • OV7670_SIZE_DIV1: 640x480 pixels. Don’t bother using this right now, because there’s not enough RAM to buffer a full image this size on current SAMD51 chips.
    • OV7670_SIZE_DIV2: 320x240 pixels (a division-by-two of 640x480). This is the largest size that most M4s can handle.
    • OV7670_SIZE_DIV4: 160x120 pixels (division by 4 of 640x480).
    • OV7670_SIZE_DIV8: 80x60 pixels (ditto, 8).
    • OV7670_SIZE_DIV16: 40x30 pixels (16).
  3. A desired frame rate, as a floating-point value. The actual frame rate may differ from this, but the library will do its best to match. The maximum supported by this camera is 30.0 frames/second.

It is IMPORTANT to check the return value from the begin() function — this tells you whether it initialized successfully and the camera is working. Possible return values include:

  • OV7670_STATUS_OK on success.
  • OV7670_STATUS_ERR_MALLOC if the image buffer couldn’t be allocated (insufficient or fragmented RAM).
  • OV7670_STATUS_ERR_PERIPHERAL if an invalid timer peripheral was passed to the constructor earlier.
OV7670_status status = cam.begin(OV7670_COLOR_RGB, OV7670_SIZE_DIV2, 30.0);
if (status != OV7670_STATUS_OK) {
  Serial.println("Camera begin() fail");
  for(;;);
}

An optional fourth argument to begin() takes an image buffer size, in bytes. There are some situations (as in the “selfie” example) where the camera might change between small and large image resolutions. It’s best in these cases to pre-allocate the image buffer to the largest anticipated image size, because it might not be possible to change later. Here we’re initially using a DIV4 (160x120 pixel) image…but allocating enough for a DIV2 (320x240) image (each pixel is 2 bytes, whether RGB or YUV):

OV7670_status status = cam.begin(OV7670_COLOR_RGB, OV7670_SIZE_DIV4, 30.0, 320 * 240 * 2);

After begin() returns an OK status, the camera continuously dumps frames into a section of RAM. We can query the address of the image buffer, and the image dimensions in pixels, using:

uint16_t *pixel_data = cam.getBuffer();
uint16_t width = cam.width();
uint16_t height = cam.height();

However…because the camera is continually dumping data, it’s possible you and the camera might overtake one another while accessing this buffer, resulting in a visible “tear” across the image.

So, before accessing the image, it’s recommended to first pause the camera:

cam.suspend();

(This is not a sleep mode, it just pauses the data spigot.)

Read what you need from the image buffer, then resume camera streaming with:

cam.resume();

You can also keep the camera paused and read individual frames with:

cam.capture();

The image data will then be in the same location as returned by getBuffer().

suspend() has slightly less latency than capture(), since it returns as soon as the current in-flight frame is received, rather than waiting for the next frame to start.

Pixel Format

The OV7670 is “big endian” — for each 16-bit pixel, the most significant byte is at the lower address in memory.

This is the opposite of the SAMD51 and most other 32-bit microcontrollers, which are “little-endian.” If you need to dismantle and process individual pixels, a byte swap is often necessary:

uint16_t le_pixel = __builtin_bswap16(be_pixel);

Most TFT displays are also big-endian, so the examples don’t need to do this byte swapping…they can just move data directly from the camera to the display; it’s super smooth and buttery.

In OV7670_COLOR_RGB mode, each 16-bit pixel has 5 bits of red, 6 bits of green and 5 bits of blue. In OV7670_COLOR_YUV mode, 8 bits are brightness and 8 are color…you can work with just the brightness byte for higher-quality grayscale than RGB mode provides.

This guide was first published on Jul 28, 2020. It was last updated on Aug 03, 2020.
This page (Arduino Library) was last updated on Jul 28, 2020.