# OpenClaw on Raspberry Pi

## Overview

Danger: Here be dragons
Running an LLM-based AI agent with shell access is dangerous. There is a risk of prompt injection or adversarial input reaching the bot via the enabled communications channels or other text it finds online. If proper safeguards are not implemented and maintained, the bot could run malicious software or expose API tokens and other private information it has access to. Read and follow the advice on the [OpenClaw docs security page](https://docs.openclaw.ai/gateway/security), and be warned that this is cutting-edge technology that is rapidly changing. LLMs are notorious for hallucinating and behaving unexpectedly; even if you follow all of the best practices, there is still some risk with this type of tool.

Pink: ladyada here! openclaw basically lets an LLM become the 'brains' of a computer, in essence like Claude Code --super-dangerous-living. we're using this to experiment with giving LLMs access to hardware & firmware development without needing a lot of babysitting, but also not giving it full run of a computer. we recommend NOT giving it any keys or tokens or access, for example we made ours its own github account so it can submit PRs or comment on issues but not with elevated privs. your mileage may vary, this is very cutting-edge and interesting but not necessarily useful. also, it looooves to burn through tokens, so do not give it full leash on billing!

https://www.youtube.com/watch?v=9zjehXlaCnY

~~Clawdbot~~ err ~~Moltbot~~, or maybe [OpenClaw](https://openclaw.ai/), is a project that aims to make LLM-based AI agents *do things*. It allows you to hook up Claude Opus or your preferred model and give it free rein over a computer. It's like Claude Code with memory, intended to be a more generalized personal assistant rather than only a coding helper. It can run commands, write and run programs, and install modules to expand its own capabilities. It hooks into third-party communications channels like Telegram, Discord, email and more. It integrates with other systems like Home Assistant, Trello, Apple Notes & Reminders, Notion, GitHub and others. You can talk to it through your preferred chat app, ask it to do things, and have it get back to you when it's done or when it encounters an issue and needs more direction. Scheduled tasks can be set up to carry out specified prompts repeatedly.

Its recent release (January 2026) has been going viral and gaining lots of attention. It supports integrations with several different Apple programs, and many people have reported running it on its own dedicated Mac Mini. With the "thinking" being sent off remotely to Opus, we thought surely this could run on lower-powered hardware, so we started experimenting with the Pi 5. Running on a Pi 5 also makes it easy to connect more interesting bits of hardware to give the bot the ability to sense and interact with the physical world in different ways.

This guide will document the setup of OpenClaw on a Raspberry Pi 5 w/ 8GB RAM, as well as how to give it access to a TFT display, a temperature/pressure sensor, a USB camera, and NeoPixels. On the software side, it will use [eSpeak text-to-speech](https://espeak.sourceforge.net/) and [Whisper Small speech-to-text](https://huggingface.co/openai/whisper-small) to enable speech-based communication with the bot.

![PiTFT showing Pipbot robot face design made by open claw. Text underneath says "PIPBOT" and "Hello World"](https://cdn-learn.adafruit.com/assets/assets/000/142/290/medium640/temperature___humidity_pipbot_face.png?1770230051)

## Parts
### Raspberry Pi 5 - 8 GB RAM

[Raspberry Pi 5 - 8 GB RAM](https://www.adafruit.com/product/5813)
The Raspberry Pi 5 is the newest Raspberry Pi computer, and the Pi Foundation knows you can always make a good thing _better_! And what could make the Pi 5 better than the 4? How about a _faster_ processor, USB 3.0 ports, and an updated Gigabit Ethernet chip...

[Buy Now](https://www.adafruit.com/product/5813)
[Related Guides to the Product](https://learn.adafruit.com/products/5813/guides)
![Angled shot of green Raspberry Pi microcontroller.](https://cdn-shop.adafruit.com/640x480/5813-01.jpg)

### Official Raspberry Pi 27W PD Power Supply 5.1V 5A with USB C

[Official Raspberry Pi 27W PD Power Supply 5.1V 5A with USB C](https://www.adafruit.com/product/5814)
The official Raspberry Pi PD USB-C power supply is here! Superfast with just the right amount of cable length to get your Pi 5 projects up and running! With true Power Delivery support, this supply can easily supply 5 A of current!

Comes in gothy Adafruit Black! Great for use with Pi 4...

[Buy Now](https://www.adafruit.com/product/5814)
[Related Guides to the Product](https://learn.adafruit.com/products/5814/guides)
![Angled shot of black, square-shaped power supply block with a coiled black cable.](https://cdn-shop.adafruit.com/640x480/5814-02.jpg)

### Adafruit PiTFT Plus 320x240 2.8" TFT + Capacitive Touchscreen

[Adafruit PiTFT Plus 320x240 2.8" TFT + Capacitive Touchscreen](https://www.adafruit.com/product/2423)
Is this not the cutest little display for the Raspberry Pi? It features a 2.8" display with 320x240 16-bit color pixels and a capacitive touch overlay.&nbsp;That's right, instead of a resistive touchscreen, which requires a fingernail or stylus, you can now use a fingerpad. The screen...

[Buy Now](https://www.adafruit.com/product/2423)
[Related Guides to the Product](https://learn.adafruit.com/products/2423/guides)
![Red polished nail touching the screen of a Adafruit PiTFT Plus 320x240 2.8" TFT + Capacitive Touchscreen. ](https://cdn-shop.adafruit.com/640x480/2423-14.jpg)

### Adafruit BME680 - Temperature, Humidity, Pressure and Gas Sensor

[Adafruit BME680 - Temperature, Humidity, Pressure and Gas Sensor](https://www.adafruit.com/product/3660)
The long awaited BME680 from Bosch gives you _all the environmental sensing you want_ in one small package. This little sensor contains **temperature**, **humidity**, **barometric pressure**, and **VOC...**

[Buy Now](https://www.adafruit.com/product/3660)
[Related Guides to the Product](https://learn.adafruit.com/products/3660/guides)
![Angled shot of black, square-shaped pressure-humidity-temperature-gas sensor breakout.](https://cdn-shop.adafruit.com/640x480/3660-08.jpg)

### Assembled Pi Cobbler Plus - Breakout Cable

[Assembled Pi Cobbler Plus - Breakout Cable](https://www.adafruit.com/product/2029)
The Raspberry Pi B+ / Pi 2 / Pi 3 / Pi 4&nbsp;has landed on the Maker World like a 40-GPIO pinned, quad-USB ported, credit card sized bomb of DIY joy. And while you can use most of our great Model B accessories by hooking up our [downgrade...](https://www.adafruit.com/product/1986)

[Buy Now](https://www.adafruit.com/product/2029)
[Related Guides to the Product](https://learn.adafruit.com/products/2029/guides)
![Angled shot of blue, rectangular, GPIO breakout board.](https://cdn-shop.adafruit.com/640x480/2029-04.jpg)

### GPIO Ribbon Cable for Raspberry Pi Model A+/B+/Pi 2/Pi 3/Pi 4/5

[GPIO Ribbon Cable for Raspberry Pi Model A+/B+/Pi 2/Pi 3/Pi 4/5](https://www.adafruit.com/product/1988)
That new Raspberry Pi® Model B+ computer you just got has a row of 2x20 pin headers soldered on - those are the GPIO (general purpose input/output) pins, and for those of us who like to hack electronics, they are where the real fun is. By programming the Pi, you can twiddle those pins high...

[Buy Now](https://www.adafruit.com/product/1988)
[Related Guides to the Product](https://learn.adafruit.com/products/1988/guides)
![Angled shot of a GPIO Ribbon Cable for Raspberry Pi Model.](https://cdn-shop.adafruit.com/640x480/1988-00.jpg)

### Half-Size Breadboard with Mounting Holes

[Half-Size Breadboard with Mounting Holes](https://www.adafruit.com/product/4539)
This cute 3.2″ × 2.1″ (82 × 53mm) solderless half-size breadboard has four bus lines&nbsp;and 30 rows of pins, our favorite size of solderless breadboard for projects. You get a whoppin' **400 tie points**! [Plug in...](http://adafruit.com/feather)

[Buy Now](https://www.adafruit.com/product/4539)
[Related Guides to the Product](https://learn.adafruit.com/products/4539/guides)
![Angled shot of Half-Size Breadboard with Mounting Holes.](https://cdn-shop.adafruit.com/640x480/4539-03.jpg)

### Micro HDMI to HDMI Cable - 2 meter

[Micro HDMI to HDMI Cable - 2 meter](https://www.adafruit.com/product/1322)
Connect an HDMI device with a micro HDMI port to one with a regular-size HDMI port using this basic HDMI cable. It has nice molded grips for easy installation and is 2 meters long (about 6 feet). This is an HDMI 1.3 cable. Works great with a BeagleBone or Raspberry Pi 4
  

[Buy Now](https://www.adafruit.com/product/1322)
[Related Guides to the Product](https://learn.adafruit.com/products/1322/guides)
![Coiled, 2 meter long, black, micro HDMI to HDMI cable.](https://cdn-shop.adafruit.com/640x480/1322-05.jpg)


## Installing OpenClaw


Use the [Raspberry Pi Imager](https://www.raspberrypi.com/software/) to load a fresh image of Raspberry Pi OS (64-bit) onto a micro SD card. The Raspberry Pi 5 with 8GB RAM is the only model of Pi that has been tested and known to work. Once the SD card is ready, insert it into the Pi 5 and boot up.

Run these commands to update all of the pre-loaded software.

```terminal
sudo apt update
sudo apt upgrade
```

OpenClaw is installed by downloading the install script and running it in bash. The "Quick Start" section of [https://openclaw.ai/](https://openclaw.ai/) shows a command for this, as well as a few alternative installation methods. This is the one-line install command:

```terminal
curl -fsSL https://openclaw.ai/install.sh | bash
```

Yellow: Piping a shell script from curl directly into the shell is generally considered poor security practice. If you have concerns, you can download the **install.sh** file first and scan it for security issues. Since this guide is generally "off the beaten path", follow whatever security practices you think best.

The script will set up node.js and all of the other requirements that OpenClaw needs. The install time will depend on your network connection; it took about 2-4 minutes over WiFi during testing.

![The final output of the openclaw install script showing the application version, title bar, and the beginning onboarding message.](https://cdn-learn.adafruit.com/assets/assets/000/142/213/medium800/temperature___humidity_install_complete_onboarding_begin.png?1769807213 )

Once the install is complete, the onboarding process will start automatically.

The first step is acknowledging the security risk inherent to using the software.

![OpenClaw security warning shown as the first step of the onboarding process.](https://cdn-learn.adafruit.com/assets/assets/000/142/214/medium800/temperature___humidity_onboarding_security_warning.png?1769807345 )

Read the warning and make sure you understand it. It is also recommended to read the security information at [https://docs.openclaw.ai/gateway/security](https://docs.openclaw.ai/gateway/security).

Danger: If you do not understand these risks or do not feel comfortable accepting them, you should skip this project.

To move forward, use the **arrow keys** to select **Yes** and then press **enter**.

![Open Claw onboarding mode prompt with QuickStart selected.](https://cdn-learn.adafruit.com/assets/assets/000/142/215/medium800/temperature___humidity_onboarding_quickstart.png?1769807966 )

Leave the default **QuickStart** onboarding mode selected and press **enter**.

Select the model/auth provider that you wish to use. Anthropic with the Opus model is the only one that I tested on the Pi, but there are several other choices if you use a different LLM agent service. This guide documents Anthropic only. If you are using something different, you will need to follow along with the wizard and the docs [here](https://docs.openclaw.ai/providers/models) and [here](https://docs.openclaw.ai/concepts/model-providers) for your provider.

Use **arrow keys** to select **Anthropic** and then press **enter**.

![Open Claw onboarding model/auth provider with Anthropic selected.](https://cdn-learn.adafruit.com/assets/assets/000/142/216/medium640/temperature___humidity_onboarding_model_auth_provider.png?1769808122)

The next prompt is for the Anthropic auth method. Leave the **Anthropic token (paste setup-token)** option selected and press **enter**. If you already have a Claude Code Anthropic token, you can paste it when prompted.

![Open Claw Anthropic auth method step with Anthropic token (paste setup-token) option selected.](https://cdn-learn.adafruit.com/assets/assets/000/142/217/medium800/temperature___humidity_onboarding_anthropic_token.png?1769808681 )

## Using Claude Code CLI To Get an Anthropic Token

If you don't yet have a token, then you'll need to access it using the [Claude Code CLI](https://code.claude.com/docs/en/setup). You can do this on the Pi itself, or any other Linux computer.

To do it on the Pi, open a new terminal window or SSH session so that you can leave the OpenClaw onboarding process paused where it is. You'll return to it once you've got the token.

Install Claude Code CLI with this command.

```terminal
curl -fsSL https://claude.ai/install.sh | bash
```

Once the installation is complete run this to set up your token.

```terminal
~/.local/bin/claude setup-token
```

It will print an authorization URL starting with `https://claude.ai/oauth/authorize?`. Copy the URL and open it in a browser. The browser can be on any PC; it doesn't have to be on the Pi itself. If you are already logged in to Anthropic/Claude in a particular browser, using that one will be easiest.

When you visit the URL in a browser, you'll be prompted to log in if you aren't already. Depending on how your account is set up, you may be asked which organization to connect Claude Code under. Follow the directions on the page to continue the process.

You will be prompted to authorize Claude Code to connect to your account and warned that it will contribute to your plan usage.

Click the **Authorize** button.

After authorizing, you'll be given a code made of random characters to copy. **Paste the code into the Claude Code prompt** then press **enter**. Note that it needs to be pasted into the Claude Code setup-token CLI utility; this is not the token for OpenClaw yet.

![Claude Code authorization page](https://cdn-learn.adafruit.com/assets/assets/000/142/218/medium640/temperature___humidity_claude_code_authorize.png?1769809959)

Warning: OpenClaw could use a significant amount of AI tokens, resulting in fees. If you cannot afford significant costs, consider only giving OpenClaw set amounts of tokens which you are comfortable paying for.

After entering the code you'll be presented with an OAuth token that is valid for 1 year.

The token is a string that starts with `sk-ant-oat01-` and contains many random characters after that.
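If you want a quick sanity check before pasting, a tiny script can verify the prefix and catch a stray space or truncated copy. This is a minimal sketch; the `sk-ant-oat01-` prefix comes from above, but the exact character set of the token body is an assumption:

```python
import re

# Hypothetical helper (not part of OpenClaw or Claude Code): checks the
# documented prefix; the [A-Za-z0-9_-] body character set is an assumption.
def looks_like_setup_token(token: str) -> bool:
    """Return True if the pasted string looks like one complete setup token."""
    return re.fullmatch(r"sk-ant-oat01-[A-Za-z0-9_-]+", token) is not None
```

A stray leading or trailing space, or a partial copy, makes the check fail, which is exactly the kind of paste mistake that trips up the wizard.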

## Finish OpenClaw Onboarding
Warning: Keep your OAuth token secret. Store it in a password manager or other secure storage system (not on the Pi 5 itself). If someone else were to get this token, they could use Claude on your account and drive up your usage bill!

Copy the token, being careful to get the full token with no extra spaces or other characters around it. Return to the OpenClaw onboarding wizard, **paste the Anthropic OAuth token** at the prompt, then press **enter**.

![OpenClaw onboarding Paste Anthropic setup-token step with a partially redacted token shown after being pasted.](https://cdn-learn.adafruit.com/assets/assets/000/142/219/medium800/temperature___humidity_onboarding_paste_anthropic_token.png?1769810806 )

The next step prompts you to name the token if desired. Leave the "default" name and press **enter**.

![Open Claw onboarding Token name step with the "default" name showing](https://cdn-learn.adafruit.com/assets/assets/000/142/220/medium800/temperature___humidity_onboarding_token_name.png?1769810970 )

The next prompt is for which model to use. As of 1/30/26, the default model is anthropic/claude-opus-4-5. As newer models come out, the default may change.

Leave the default option selected if you are unsure, or select your model of choice if you have experience with them and prefer one over the others.

Use **up/down arrow keys** to select a model and press&nbsp; **enter** once the desired model is selected.

![Open Claw onboarding model selection step with default model anthropic/claude-opus-4-5 selected](https://cdn-learn.adafruit.com/assets/assets/000/142/221/medium640/temperature___humidity_onboarding_default_model.png?1769811073)

Next is channel selection. The channels are communications services that OpenClaw supports. Telegram, WhatsApp, Discord, and many others are available.

Use **arrow down** to select the **Skip for now** option and press **enter**.

Channels are optional. It's possible to communicate with the bot over its local web-based management portal or the terminal.

You can always connect channels later with the `openclaw channels add` command. Each channel requires slightly different steps, but the wizard walks you through them. You can also reference the [OpenClaw channels docs](https://docs.openclaw.ai/cli/channels#channels) and search the OpenClaw docs for your specific channel to get more detailed info.

![Open Claw onboarding Select channel step with "Skip for now" selected.](https://cdn-learn.adafruit.com/assets/assets/000/142/222/medium640/temperature___humidity_onboarding_channels.png?1769811636)

The next prompt is for configuring skills now.

Select **"Yes"** and then press **enter**.

![Open Claw onboarding Configure skills step with "Yes" selected.](https://cdn-learn.adafruit.com/assets/assets/000/142/223/medium640/temperature___humidity_onboarding_configure_skills.png?1769812268)

The first skill prompt is for Homebrew. Since Homebrew is a Mac utility and we're running OpenClaw on a Raspberry Pi, it is not needed.

Select **"No"** and then press **enter**.

![Open Claw onboarding skills Homebrew step with "No" selected](https://cdn-learn.adafruit.com/assets/assets/000/142/224/medium800/temperature___humidity_onboarding_homebrew_no.png?1769812390 )

Next is the preferred node manager for skill installs.

Leave the default option **npm** selected and press **enter**.

![Open Claw onboarding node manager for skill installs step with "npm" selected](https://cdn-learn.adafruit.com/assets/assets/000/142/225/medium800/temperature___humidity_onboarding_node_manager.png?1769813071 )

Next is a huge list of skills that are available for OpenClaw to integrate with other applications and services. Many things are possible without any of these tools, and you can always install them later if you want to use some.

Select the **"Skip for now"** option, press **space bar** to check the box for it, then&nbsp;press **enter**.

![Open Claw onboard skill dependencies step with "Skip for now" selected](https://cdn-learn.adafruit.com/assets/assets/000/142/226/medium640/temperature___humidity_onboarding_skill_list.png?1769813190)

Select **"No"** and press **enter** when prompted for an API key for goplaces.

![Open Claw onboarding google places API key step with "No" selected](https://cdn-learn.adafruit.com/assets/assets/000/142/227/medium800/temperature___humidity_onboarding_goplaces_no.png?1769814117 )

Continue doing the same for the remaining API key prompts. Select **"No"** and then press **enter**.

All of these can be configured later if you decide to add the skills associated with them, but none of them are required just to use OpenClaw.

![Open Claw onboarding API keys step showing "No" entered for all of the keys.](https://cdn-learn.adafruit.com/assets/assets/000/142/229/medium640/temperature___humidity_onboarding_apikeys_no.png?1769814814)

Next is a prompt for Hooks setup.

Use the **arrow keys** and the **space bar** to check the boxes for **"command-logger"** and **"session-memory"** then press **enter**.

![Open Claw onboarding hooks step with "command-logger" and "session-memory" options checked](https://cdn-learn.adafruit.com/assets/assets/000/142/230/medium800/temperature___humidity_onboarding_hooks.png?1769815267 )

Once everything completes, it will output some useful information in three panels.

**Control UI** - Contains the local IP address URLs for OpenClaw's web-based control panel. You can copy/paste the link, with the full token included, into a browser on the Pi to open the control panel. The token in the screenshot is obscured because it's a private key, like a password. Keep it secure; anyone with the token who can reach the Pi on the local network could issue commands to the bot.

**Start TUI** - Noted as the best option. It launches a session to chat with the bot right there in the terminal.

**Token** - Contains information about the Gateway token that is included in the long URL (obscured by the red box in the screenshot), with reminders of where it is stored and other ways to access it.

![Open Claw onboarding complete outputs with information panels about Control UI, TUI startup advice, and Gateway Token information](https://cdn-learn.adafruit.com/assets/assets/000/142/231/medium640/temperature___humidity_onboarding_almost_complete.png?1769818702)

Finally! The very last thing it will do is ask how to hatch the bot.

Since this whole setup process has been in the terminal, go ahead and select **Hatch in TUI** and press **enter**. This is the simplest option for the initial test to make sure things are working.

Once confirmed to be working, you can then access the web based control panel for a more graphical based UI to configure and interact with the bot.

The bot will wake up and ask you about yourself, as well as what name to call it.

![Open Claw hello world message sent after being powered up for the first time](https://cdn-learn.adafruit.com/assets/assets/000/142/232/medium800/temperature___humidity_openclaw_hello_world.png?1769821156 )

I told it a bit about myself and that I was experimenting with OpenClaw on the Pi 5 to write a learn guide. I let it choose a name for itself and we settled on using the name Pipbot.


## Sensors & NeoPixels

Adding new capabilities and functionality to the bot is a new kind of computing experience. There are many things you can simply ask the bot to install and set up. Since it has the ability to download things from the internet, run commands, and install programs, it can get lots of stuff running all on its own.

In some cases, it may be quicker or easier to do a little bit of the setup manually before turning the bot loose to utilize the new capability. I used a mix of these techniques while adding various hardware and software components to the system.

This guide will document the steps that I took, but there are other more or less hands-on approaches that could be used to achieve the same ends.

## Wiring
To connect a BME680 temperature, humidity, pressure and gas sensor, make the following wiring connections:

- Pi **SDA** to BME680 **SDA** (blue wire)
- Pi **SCL** to BME680 **SCL** (yellow wire)
- Pi **3V** to BME680 **VIN** (red wire)
- Pi **GND** to BME680 **GND** (black wire)

To connect a NeoPixel LED strand, make the following wiring connections:

- Pi **5V** to NeoPixel **5V** power (red wire)
- Pi **GND** to NeoPixel **GND** (black wire)
- Pi **26** to NeoPixel **DIN** (green wire)

![Wiring diagram showing Rpi 5 connected to a BME680 sensor via I2C and a NeoPixel strand connected to 5V, GND, and pin 26](https://cdn-learn.adafruit.com/assets/assets/000/142/275/medium640/temperature___humidity_sensor_and_neopixel_fritzing.png?1770136195)

## Sensors

A BME680 Temperature, Humidity, Pressure and Gas Sensor is the first piece of external hardware I connected to the bot. Any other I2C sensor could be used instead, of course, but the BME680 gives it a mix of different types of data relevant to human comfort to work with and made for a good first test.

Start by installing [Adafruit Blinka](https://circuitpython.org/blinka) on the Raspberry Pi. I did this step manually following the [instructions in the Blinka Learn Guide](https://learn.adafruit.com/circuitpython-on-raspberrypi-linux/installing-circuitpython-on-raspberry-pi). I chose manual for this step because the script takes a few minutes and can output some warnings that the bot might interpret as problems to try to fix when it would be unnecessary.

When making the Python virtual environment, make it inside of the OpenClaw workspace so that the bot will have easy access to use and install more things into it later.

```terminal
python3 -m venv ~/.openclaw/workspace/pipbot_venv --system-site-packages
```

When the Blinka install script is finished, it will prompt you to reboot the Pi.

After it comes back up, it's time to ask the bot to install the BME680 library and see if it can successfully get data from the sensor. I linked the bot to the pypi.org page for the CircuitPython BME680 driver, as well as the simpletest example from the same repo. Here is how the conversation started:

```auto
Me:
I have made a python venv for you in your workspace called `pipbot_venv` and I installed Adafruit Blinka inside of it. I have also connected a BME680 sensor to the I2C bus. Can you install this library in the venv https://pypi.org/project/adafruit-circuitpython-bme680/ and try to connect to the sensor and get data from it. Here is an example script that shows how to access the sensor: https://github.com/adafruit/Adafruit_CircuitPython_BME680/blob/main/examples/bme680_simpletest.py

Bot:
The I2C scan found a device at address **0x38**, but that's not a typical BME680 address. The BME680 usually appears at:
- **0x77** (default, SDO pin high/floating)
- **0x76** (SDO pin connected to GND)

Address 0x38 is commonly used by **AHT20/AHT21** sensors instead.

Could you double-check:
1. Is the sensor definitely a BME680? (0x38 suggests it might be an AHT20)
2. Are the SDA/SCL wires connected to the correct pins?
3. Is the sensor powered (VIN to 3.3V or 5V, GND to GND)?

If you want, I can try connecting to address 0x76 instead, or if you have an AHT20, I can grab that library instead!
```

The bot successfully installed the library and wrote a test script, but it didn't work when run. It started troubleshooting by doing an I2C scan and found no device at the expected address `0x77` or the alternate `0x76`. I confirmed those findings manually and then double-checked all of the wiring, including unplugging and re-plugging the STEMMA QT connector on the sensor. It turned out the connector wasn't fully seated at first, and everything worked fine once it was plugged in all the way.
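The bot's triage of the scan results can be captured in a tiny lookup table. This is just an illustrative sketch built from the addresses mentioned in the conversation, not an exhaustive device database:

```python
# Addresses mentioned in the conversation above; names are hints, not proof
# of what is actually on the bus.
KNOWN_I2C = {
    0x76: "BME680 (SDO to GND)",
    0x77: "BME680 (default)",
    0x38: "AHT20/AHT21",
}

def identify(addr: int) -> str:
    """Map a scanned I2C address to a likely device name."""
    return KNOWN_I2C.get(addr, f"unknown device at 0x{addr:02x}")
```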

```auto
Me:
I think I just didn't have the connector seated properly. Try looking again for the default address 0x77.

Bot:
🎉 The BME680 is working!

**Sensor readings:**
- **Temperature:** 19.8 °C (~68°F)
- **Gas resistance:** 35,910 ohm (air quality indicator)
- **Humidity:** 16.8%
- **Pressure:** 1006.5 hPa
- **Altitude:** ~56 meters (calculated from pressure)

The test script is saved at `~/workspace/bme680_test.py`. Note that the temperature has a -5°C offset applied (common for BME680s since the chip runs warm) — you may want to calibrate that against a known thermometer.

Also curious about that other device at 0x38 — that's typically an AHT20 humidity sensor. Do you have one of those on the same bus?
```

Success! Now the bot can get data from the sensor and relay it to us via any of the communications channels or take action based upon it.
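The offset and derived-altitude figures in the bot's report are simple formulas you can sanity check yourself. A minimal sketch, assuming the -5 °C self-heating offset the bot applied and the standard sea-level pressure of 1013.25 hPa (calibrate both against known references):

```python
def corrected_c(raw_c: float, offset_c: float = 5.0) -> float:
    """Apply the self-heating offset the bot mentioned (calibrate yours)."""
    return raw_c - offset_c

def c_to_f(c: float) -> float:
    """Convert Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

def altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """Hypsometric approximation commonly used in BME680 examples."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1 / 5.255))
```

With the readings above, `c_to_f(19.8)` is about 68 °F and `altitude_m(1006.5)` comes out near 56 m, matching the bot's report.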

## NeoPixels

To get the NeoPixels going, I gave the bot even less information than for the sensor: only the PyPI link to the CircuitPython NeoPixel library, with no example code. It still completed the task successfully on the first try. Here is the conversation:

```auto
Me:
Cool! next lets work on some rainbowy RGB fun. I have connect a strand of 30 NeoPixels to pin 26. Can you install https://pypi.org/project/adafruit-circuitpython-neopixel/ and then try to use it to control the NeoPixels. Try to set them to pink and brightness `0.1`

Bot:
Done! Are they glowing pink? 🩷✨

Me:
perfect! yes they are now glowing pink. You can add NeoPixel control to your tools file as well.

Bot:
Done! I've added NeoPixel documentation to TOOLS.md including:
- Pin and pixel count
- Code example for basic usage
- Common colors
- Brightness note
- Added the neopixel library to the installed packages list

Added! Now I've got docs for:
- 🌡️ BME680 Sensor
- 🌈 NeoPixels (30 on GPIO 26)

Ready to make some rainbow fun whenever you want! 🤖✨
```
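If you'd rather script some rainbow fun yourself instead of asking the bot, the classic color-wheel helper from Adafruit's NeoPixel examples is pure math and easy to test. The hardware calls are shown as comments, with the pin and pixel count assumed from the wiring above:

```python
def wheel(pos: int) -> tuple:
    """Map 0-255 to a color around the RGB wheel (classic NeoPixel helper)."""
    pos %= 256
    if pos < 85:
        return (255 - pos * 3, pos * 3, 0)
    if pos < 170:
        pos -= 85
        return (0, 255 - pos * 3, pos * 3)
    pos -= 170
    return (pos * 3, 0, 255 - pos * 3)

# On the Pi itself (assumes the library installed above and the wiring shown):
# import board, neopixel
# pixels = neopixel.NeoPixel(board.D26, 30, brightness=0.1)
# for i in range(30):
#     pixels[i] = wheel(i * 256 // 30)
```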


## TFT Display


### Hardware
If you want the TFT display connected at the same time as I2C sensor(s), NeoPixels, or other hardware, use a 2x20 ribbon cable to connect the spare rows of header pins on the underside of the PiTFT to a Pi Cobbler Plus as shown here. You can then connect any desired hardware peripherals using the broken-out pins that the Cobbler provides on a breadboard. The following connections are for the BME680 sensor connected via I2C.

- Cobbler **SDA** to BME680 **SDA** (blue wire)
- Cobbler **SCL** to BME680 **SCL** (yellow wire)
- Cobbler **3V** to BME680 **VIN** (red wire)
- Cobbler **GND** to BME680 **GND** (black wire)

These connections are for the NeoPixel strand.

- Cobbler **5V** to NeoPixel **5V** power (red wire)
- Cobbler **GND** to NeoPixel **GND** (black wire)
- Cobbler **26** to NeoPixel **DIN** (green wire)

![Fritzing wiring diagram showing the Pi Cobbler + connected to a BME680 sensor via I2C and a NeoPixel strip connected to 5V power, GND, and Pin 26](https://cdn-learn.adafruit.com/assets/assets/000/142/270/medium640/temperature___humidity_cobbler_fritzing.png?1770134303)

![Photo of Pi Cobbler + connected to BME680 sensor and a NeoPixel strip](https://cdn-learn.adafruit.com/assets/assets/000/142/272/medium640/temperature___humidity_cobbler_wiring.png?1770134390)

### Software

I used the 2.8" PiTFT with capacitive touch and installed its driver manually using the mirror install commands from the [Easy Install script in the Learn guide](https://learn.adafruit.com/adafruit-2-8-pitft-capacitive-touch/easy-install-2). The guide page documents several variations of the PiTFT, so be sure to run the mirror install command appropriate for the one you are using. For me this was:

```terminal
sudo -E env PATH=$PATH python3 adafruit-pitft.py --display=28c --rotation=90 --install-type=mirror
```

Info: The TFT command mentions "mirror", but on the Pi 5 under Wayland, this will be treated as a second monitor with its own drawing area by default, not a mirror of an existing display.

Once the install is complete, you'll be prompted to reboot the Pi. After it comes up from the reboot, it should be showing part of the boot process, and then the desktop on the TFT display. Next there are a few system settings to change that make it easier to use both the TFT display and an HDMI display at the same time.

Open the Control Centre by clicking **Start** -\> **Preferences** -\> **Control Centre**.

In the **Display** section, set the **On-screen Keyboard** to **Disabled**.

In the **Screens** section, ensure that the layout of the screens shows the small SPI screen in the top left corner and the big HDMI screen to the right of it. If they aren't in this layout, then drag and move them to these positions.

In the **Taskbar** section, set the taskbar **Location** to the **HDMI** display.

![Rpi control centre Display section with on screen keyboard set to disabled](https://cdn-learn.adafruit.com/assets/assets/000/142/266/medium640/temperature___humidity_display_config_disable_onscreen_keyboard.png?1770130323)

![Rpi control centre Screens section showing small SPI screen top left and big HDMI screen to the right of it](https://cdn-learn.adafruit.com/assets/assets/000/142/267/medium640/temperature___humidity_display_config_layout.png?1770130351)

![Rpi control centre Taskbar section showing the taskbar Location set to the HDMI display](https://cdn-learn.adafruit.com/assets/assets/000/142/268/medium640/temperature___humidity_display_config_taskbar_both.png?1770130383)

Now the bot should be able to draw onto the TFT by keeping what it wants to show in the top left 320x240 pixels. I opted to use [pygame-ce](https://pyga.me/docs/) and just asked the bot to try it out.

```auto
Me:
I'm pretty sure it's not an AHT20, I have not connected one of those I don't think. We'll figure that out later. Next lets work on a TFT display for you. I have installed pygame-ce in your venv. And I've connected a TFT display and set it up so that the top left 320x240 pixels of the full system is shown on the TFT display. Can you try to to run a pygame-ce test script to show something on the TFT please?

Bot:
The script ran for 10 seconds and exited cleanly. You should have seen:
- Animated color-shifting background
- "Hello Pipbot!" text at the top
- A bouncing cyan circle
- Rotating red/green/blue rectangles
- FPS and time counters at the bottom

Did it show up on the TFT? 🖥️

Me:
I saw it on the main display. It created a window that looked like the correct size, but it was in the center of the HDMI display instead of on the TFT. Can you try to modify the script to move that window to the very top left (0,0) pixels coordinates?

Bot:
Added `SDL_VIDEO_WINDOW_POS = '0,0'` before init. Did it show up on the TFT this time?

Me:
Yes that time we did see it on the display. Is there a way to make it a borderless window so it won't have a title bar and will fit the 320x240 TFT perfectly?

Bot:
Added `pygame.NOFRAME` flag. Should now be a clean borderless 320x240 filling the TFT perfectly! 🎉

Me:
Awesome! that is working perfectly. Can you go ahead and save that as a tool so that you'll be able to draw more stuff on the display in the future easily.

Bot:
Done! Created `tft_display.py` module with a `TFTDisplay` class. Also updated `TOOLS.md` with docs for:

- **TFT Display** - usage examples and all drawing methods
- **BME680 Sensor** - quick reference
- **I2C Devices** - known addresses
- **Python venv** - installed packages

Now I can easily draw on the TFT anytime:

```python
from tft_display import TFTDisplay
tft = TFTDisplay()
tft.draw_text("🤖", (160, 120), center=True, size=64)
tft.update()
```

As the transcript shows, the first attempt rendered the pygame window on the HDMI display instead of the TFT, which the bot resolved with one more tweak. Next I asked it to remove the window frame. Here is the test script it wrote, running successfully on the TFT display.

![TFT display showing the pygame-ce test script written by open claw bot](https://cdn-learn.adafruit.com/assets/assets/000/142/269/medium800/temperature___humidity_tft_display_test.png?1770132209 )
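For reference, here is a minimal sketch of the approach the bot arrived at (reconstructed from the transcript, not its actual script): the `SDL_VIDEO_WINDOW_POS` environment variable must be set before pygame initializes so the window lands at the TFT's top-left region, and the `NOFRAME` flag removes the title bar so the window fills the 320x240 panel exactly.

```python
import os

# Pin the window to (0,0) -- this must happen BEFORE importing/initing pygame,
# since SDL reads the variable when the window is created.
os.environ.setdefault("SDL_VIDEO_WINDOW_POS", "0,0")
# Headless fallback so the script can also run without any display attached.
if not os.environ.get("DISPLAY") and not os.environ.get("WAYLAND_DISPLAY"):
    os.environ["SDL_VIDEODRIVER"] = "dummy"

import pygame

pygame.init()
# NOFRAME removes the title bar, so the 320x240 surface fills the TFT.
screen = pygame.display.set_mode((320, 240), pygame.NOFRAME)
screen.fill((0, 0, 40))
font = pygame.font.Font(None, 36)
text = font.render("Hello TFT", True, (255, 255, 255))
screen.blit(text, text.get_rect(center=(160, 120)))
pygame.display.flip()
```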

## Dashboard

To validate that the sensor and the TFT display can both work together at the same time, I asked it to create a basic dashboard to show the data from the sensor on the display.

```auto
Me:
can you please make a dashboard for the TFT display that shows live data from the BME680 sensor?

Bot:
The dashboard should be running on the TFT now! You should see:

- **Title:** "🤖 Pipbot Sensors"
- **Temperature** with live sparkline graph
- **Humidity** with live sparkline graph  
- **Pressure** in hPa
- **Air Quality** indicator (Good/OK/Poor based on gas resistance)

The sparklines will build up over time as it collects readings every second.

Can you see it? Press ESC to close when done, or let me know if there's any issues! 📊

Me:
Thanks, yeah that is great I can see the dashboard. Are you able to flip it 180 degrees? The way I have the TFT oriented currently that would be more convenient.

Bot:
Done! Added `rotate_180` option to the TFTDisplay class. The dashboard should now be flipped 180°. 

Is the orientation correct now? 🔄

Me:
perfect, that looks good now. Can you change the temperature to use Fahrenheit units please?

Bot:
Done! Temperature now shows in °F 🌡️
```

Here is a photo of the dashboard it created.

![Sensor dashboard created by open claw displayed on a PiTFT display](https://cdn-learn.adafruit.com/assets/assets/000/142/276/medium800/temperature___humidity_tft_sensor_dashboard.png?1770137078 )
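Two bits of the dashboard logic are easy to show in plain Python (the function names here are my own, not the bot's): the Fahrenheit conversion requested in the transcript, and scaling a window of sensor readings into pixel heights for a sparkline.

```python
def c_to_f(celsius):
    """Convert Celsius to Fahrenheit (the unit change requested above)."""
    return celsius * 9 / 5 + 32

def sparkline_heights(readings, max_height=30):
    """Scale a window of readings to pixel heights for a sparkline graph."""
    lo, hi = min(readings), max(readings)
    span = (hi - lo) or 1  # avoid dividing by zero on flat data
    return [round((r - lo) / span * max_height) for r in readings]
```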


## USB Camera

Yellow: Note that using a camera may degrade your privacy. Always have the camera pointed to a place you don't mind OpenClaw seeing or disable the input.

Allowing the bot to interface with a USB camera is easy. [ffmpeg](https://ffmpeg.org/) comes pre-installed on the latest versions of Raspberry Pi OS and can grab a photo from the camera and manipulate it in various ways, if desired. I researched the command to use myself, but I suspect the bot would have come up with a workable one on its own if asked.

It was able to correctly identify an Adafruit Metro on the first try.

![Adafruit Metro sitting on a light surface](https://cdn-learn.adafruit.com/assets/assets/000/142/278/medium640/temperature___humidity_usb_camera_test.jpg?1770145274)

```auto
Me:
I have connected a camera to your system. You can use this command `ffmpeg -y -f v4l2 -video_size 1280x720 -i /dev/video0 -vf "hflip,vflip" -frames 1 out.jpg` to take a photo with the camera and save it as a jpg with the specified name. Can you try taking a photo now and tell me what you see?

Bot:
The camera works! 📸

I can see an **Adafruit Metro** board sitting on a light-colored surface (looks like a desk or workbench). It's one of Adafruit's Arduino-compatible development boards — I can clearly see:

- The **"Metro"** branding in that distinctive diagonal white text
- The **Adafruit star logo** in the corner
- CE and FCC certification marks
- The classic Arduino form factor with header pins along the edges (power, analog inputs, digital I/O)
- A barrel jack for power, USB port, and various SMD components
- What looks like an ATmega chip on the right side

Pretty fitting subject for my first photo, given where you work! 🦎 The image quality is solid — good lighting, reasonably sharp. The hflip/vflip in the command is orienting things correctly.

Want me to note this camera setup in TOOLS.md for future reference?

Me:
yeah, go ahead and make note of it.

Bot:
Done! Added a Camera section to TOOLS.md with the device path, resolution, orientation flags, and the capture command. Now I'll remember how to use it. 🦎
```
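The capture command can also be wrapped in a small helper so future scripts can reuse it. This sketch simply rebuilds the exact ffmpeg invocation from the transcript; the function names are my own.

```python
import subprocess

def capture_cmd(out_path="out.jpg", device="/dev/video0",
                size="1280x720", flip=True):
    """Build the ffmpeg still-capture command used in the transcript."""
    vf = ["-vf", "hflip,vflip"] if flip else []
    return ["ffmpeg", "-y", "-f", "v4l2", "-video_size", size,
            "-i", device, *vf, "-frames", "1", out_path]

def capture(out_path="out.jpg", **kwargs):
    """Take one photo; requires ffmpeg and a V4L2 camera on the Pi."""
    subprocess.run(capture_cmd(out_path, **kwargs), check=True)
```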


## Speech Input and Output

Yellow: Note that using a microphone may degrade your privacy. Consider not having the microphone active when you want privacy.

Giving the bot the ability to talk and understand speech makes it easier to communicate without hands on the keyboard. Once this is set up, you can talk with the bot using asynchronous voice messages on WhatsApp or your preferred communications app. You can send it a quick voice message asking it to do something, or ask it to send you back the status of something with its voice.

## eSpeak Text To Speech

[eSpeak](https://espeak.sourceforge.net/) is an open source text-to-speech project originally published in 2006. It sounds very robotic compared to more modern TTS solutions that have been trained on countless hours of actual human speech recordings, but the robot vibe fits quite nicely for an LLM agent running inside a Raspberry Pi. eSpeak is included in the Raspberry Pi apt repository by default, so it can be installed with a single command.

```terminal
sudo apt install espeak
```

Once eSpeak is installed, it's easy to use it from the terminal with a command like this.

```terminal
espeak "Hello world"
```
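For scripts, the same command can be wrapped in a tiny Python helper. This is a sketch with my own function names; `-s` (speed in words per minute) and `-v` (voice) are standard eSpeak flags.

```python
import shutil
import subprocess

def build_espeak_cmd(text, wpm=150, voice="en"):
    """Build the argument list for an eSpeak invocation."""
    return ["espeak", "-s", str(wpm), "-v", voice, text]

def say(text, wpm=150, voice="en"):
    """Speak text aloud; raises if eSpeak is not installed."""
    if shutil.which("espeak") is None:
        raise RuntimeError("espeak not found (sudo apt install espeak)")
    subprocess.run(build_espeak_cmd(text, wpm, voice), check=True)
```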

I told the bot this was installed and ready, and asked it to try it out and then add it as a tool to use later.

Here's a recording of what it said, followed by the chat transcript.

[espeak_hello_test.mp3](https://cdn-learn.adafruit.com/assets/assets/000/142/279/original/espeak_hello_test.mp3?1770146078)
```auto
Me:
I installed `espeak` so that you can use it to speak out loud. Go ahead and try to use to say hi.

Bot:
It ran without errors, which means it should have spoken through the audio output. Let me ask the user if they heard it.

Did you hear me? 🔊

Me:
Yes, heard it loud and clear. Perfect! You can go ahead and save that in your tools file so you use your voice later.
```

## Whisper Small Speech To Text

Next up is to enable speech to text so that we can talk to the bot using voice instead of text. There are many remote speech-to-text APIs that could be used, but I wanted to see if it was possible to keep the transcription local using the [Whisper Small model](https://huggingface.co/openai/whisper-small). For this, I decided to be more hands off, passing the reins to the bot to see if it could complete the full installation and use of the whisper-small model.

When it uses the whisper-small model for the first time, it must fetch the model files from the Hugging Face servers. It also takes a few minutes to download and install all of the necessary requirements from pip.

Peeking into the full session logs, it's possible to see the bot's "thinking" steps, where it writes a message to itself about what has happened and what it should do next. The logs of this session show that the bot encountered a few different issues when it wrote the test script, but was able to resolve them on its own.

Once everything was installed and the issues resolved, it was able to successfully transcribe the audio file. As the bot noted in its summary, it takes 30-60 seconds to transcribe a short audio message, so this is not good enough for real-time communication or for long form discussions. But it is impressive that the Pi can handle it locally for free, and it works great for short voice messages.

It did ignore the prompt in the test audio recording to tell a joke during the documented trial, but I ran this step multiple times during testing and it did tell a joke in some of them. Here is one of the jokes:

> "Why do programmers prefer dark mode? Because light attracts bugs!"

[test_audio_input.mp3](https://cdn-learn.adafruit.com/assets/assets/000/142/280/original/test_audio_input.mp3?1770147840)
```auto
Me:
Lets work on speech input. Can you please install the whisper-small model https://huggingface.co/openai/whisper-small and any requirements needed inside of your venv. I have put a test_audio_input.mp3 file in your workspace that you can try to transcribe with it in order to verify whether it's working.

Bot (Thinking):
The installation completed successfully. Now let me create a test script to load the whisper-small model and transcribe the test audio file.

Packages installed successfully. Now let me create a test script to download the whisper-small model and transcribe your audio file:

Bot (Thinking):
There's an error with num_frames. This seems to be a compatibility issue with the newer transformers version and the audio pipeline. Let me try a different approach - using the pipeline directly without the model preloading, or using a simpler API.

There's a compatibility issue with the pipeline. Let me try a simpler approach using the Whisper pipeline directly:

Bot (Thinking):
Same error. This seems to be a bug in transformers 5.0.0. Let me try loading the audio manually and processing it. I'll need to install librosa or soundfile for audio loading.

Same error - this appears to be a bug in the new transformers version. Let me try a different approach with manual audio loading:

Bot (Thinking):
Now let me update the test script to manually load the audio file and use the model directly rather than through the pipeline.

Now let me try a different approach using manual audio loading:

Bot:
**Summary:**

✅ **Installed:**
- `transformers` (5.0.0), `torch` (2.10.0), `torchaudio` (2.10.0)
- `librosa`, `soundfile` for audio loading
- `accelerate` for model optimization

✅ **Model:** `openai/whisper-small` downloaded and cached

✅ **Test transcription successful:**
- Audio file: `test_audio_input.mp3` (5.8 seconds)
- Result: *"Hey PipBot, can you hear this? If you're able to hear it, can you tell us a joke?"*

**Performance note:** Running on CPU, transcription takes ~30-60 seconds for short clips. The Pi 5 handles it well, but it's not real-time. For the learn guide, this should work great for demonstrating voice input capabilities.

The test script is at `whisper_test.py` in your workspace, and I've added documentation to `TOOLS.md` for reference.
```
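For reference, local transcription with the Hugging Face `transformers` pipeline boils down to a few lines. This is a hedged sketch, not the bot's actual `whisper_test.py`; it assumes `transformers`, `torch`, and an audio backend such as `librosa`/`soundfile` are installed in the venv, and the model weights download from Hugging Face on first use.

```python
def transcribe(audio_path, model_id="openai/whisper-small"):
    """Transcribe an audio file locally with Whisper on the CPU.

    The heavy import is done lazily so merely loading this module is cheap.
    Expect roughly 30-60 seconds per short clip on a Pi 5.
    """
    from transformers import pipeline  # requires transformers + torch

    asr = pipeline("automatic-speech-recognition", model=model_id)
    return asr(audio_path)["text"]
```

On the Pi, calling `transcribe("test_audio_input.mp3")` should return the spoken text as a string, as in the transcript above.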


