Let's see how we get input from the touchscreen. We'll use this to light some LEDs on the breadboard.

With the PiTFT and its 4 tactile buttons installed, there aren't many GPIOs left on the Model B Raspberry Pi, so wire your LEDs to #17 and #4. The software renders 4 labels on the screen and then looks for mouse events in the four quarters of the display.

Startup

In this section we set the two GPIOs as outputs so we can control the LEDs. We also define a couple more environment variables for SDL via os.putenv so we can use the touchscreen.

import pygame
from pygame.locals import *
import os
from time import sleep
import RPi.GPIO as GPIO
 
#Setup the GPIOs as outputs - only 4 and 17 are available
GPIO.setmode(GPIO.BCM)
GPIO.setup(4, GPIO.OUT)
GPIO.setup(17, GPIO.OUT)
 
#Colours
WHITE = (255,255,255)
 
os.putenv('SDL_FBDEV', '/dev/fb1')
os.putenv('SDL_MOUSEDRV', 'TSLIB')
os.putenv('SDL_MOUSEDEV', '/dev/input/touchscreen')
 
pygame.init()
pygame.mouse.set_visible(False)
lcd = pygame.display.set_mode((320, 240))
lcd.fill((0,0,0))
pygame.display.update()
 
font_big = pygame.font.Font(None, 50)

UI definition

The touch_buttons dictionary simply holds each button's text and its center position (x,y, with the top left of the display as 0,0). We then loop through this dictionary and draw each "button".

touch_buttons = {'17 on':(80,60), '4 on':(240,60), '17 off':(80,180), '4 off':(240,180)}
 
for k,v in touch_buttons.items():
    text_surface = font_big.render('%s'%k, True, WHITE)
    rect = text_surface.get_rect(center=v)
    lcd.blit(text_surface, rect)
 
pygame.display.update()
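
As an aside: the main loop below treats each quarter of the screen as a button. If you kept the Rect for each label you could instead hit-test the labels themselves with pygame's Rect.collidepoint(). This is just a sketch of that idea, not what test3.py does, and hit_button is a made-up helper name:

# Sketch only - remember each label's Rect so it can be hit-tested later
button_rects = {}
for k,v in touch_buttons.items():
    button_rects[k] = font_big.render('%s'%k, True, WHITE).get_rect(center=v)

def hit_button(pos):
    # Return the name of the label under pos, or None if nothing was touched
    for name, rect in button_rects.items():
        if rect.collidepoint(pos):
            return name
    return None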

Main loop

Here we introduce pygame events. Whenever there's some user interaction, an event is placed on a queue for our program to read. It's our program's responsibility to read this queue quickly enough that events don't get thrown away. Of course, we want to do this anyway to have a reasonably responsive program.

We're only really interested in the MOUSEBUTTONUP event, which is added to the queue when you lift your finger off the display. We then simply get the current mouse position (also available in the event itself as event.pos) and determine which quadrant of the screen was selected.

while True:
    # Scan touchscreen events
    for event in pygame.event.get():
        if event.type == MOUSEBUTTONDOWN:
            pos = pygame.mouse.get_pos()
            print pos
        elif event.type == MOUSEBUTTONUP:
            pos = pygame.mouse.get_pos()
            print pos
            #Find which quarter of the screen we're in
            x,y = pos
            if y < 120:
                # Top half - the "on" buttons, drive the pin high to light the LED
                if x < 160:
                    GPIO.output(17, True)
                else:
                    GPIO.output(4, True)
            else:
                # Bottom half - the "off" buttons
                if x < 160:
                    GPIO.output(17, False)
                else:
                    GPIO.output(4, False)
    sleep(0.1)
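
One small improvement that isn't in test3.py: RPi.GPIO will complain the next time these pins are set up unless you call GPIO.cleanup() on exit, so you could wrap the loop to catch Ctrl-C. A minimal sketch of the structure (the event handling itself is exactly as shown above):

# Sketch only: catch Ctrl-C so the GPIOs are released cleanly
try:
    while True:
        for event in pygame.event.get():
            pass                 # event handling as shown above
        sleep(0.1)
except KeyboardInterrupt:
    GPIO.cleanup()               # free GPIO 4 and 17 on exit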

This is certainly not a perfect UI experience: you could touch the screen on one button, drag to another and then lift off - which button did you hit? The complexity of event handling, even for simple things like clicking buttons, is one of the reasons UI frameworks exist. We'll start using one in the next section.
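
If you wanted to handle the drag case without a full framework, one simple approach (again, not what test3.py does) is to remember the quadrant of the MOUSEBUTTONDOWN event and only act on MOUSEBUTTONUP if the finger comes up in the same quadrant. A sketch, using a made-up quadrant() helper:

def quadrant(pos):
    # Map a screen position to one of the four quarters
    x, y = pos
    return (x < 160, y < 120)

down_quadrant = None
for event in pygame.event.get():
    if event.type == MOUSEBUTTONDOWN:
        down_quadrant = quadrant(event.pos)
    elif event.type == MOUSEBUTTONUP:
        if quadrant(event.pos) == down_quadrant:
            pass                 # act on the button press as before
        down_quadrant = None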

Again, you can run this from the pygamelcd project:

sudo python test3.py
