Use neural signals from your brain’s visual cortex to control electronics. This guide shows you how to use the NextMind Dev Kit, the Unity game engine, and an Adafruit Feather microcontroller board to control a servo motor with your mind and visual focus. The example project translates your area of visual focus into the physical position of a servo with an attached pointer, but it is just that – an example. You’ll likely think of a million more awesome applications after experimenting a bit.
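Here’s the basic data flow: the NextMind SDK running inside a Unity app works out which on-screen target you’re focusing on, the Unity app sends that position to the Feather over USB serial, and the Feather moves the servo pointer to match. As a rough sketch of the Feather side, here’s what that could look like in CircuitPython. This is an illustrative assumption rather than the guide’s finished code – the servo pin (D5) and the plain-text “one angle per line” serial protocol are placeholders.

```python
# Illustrative sketch (not the guide's exact code): the Unity/NextMind app is
# assumed to send a servo angle (0-180) as a plain text line over USB serial,
# and the Feather moves the pointer servo to match. Pin D5 is an assumption.
import board
import pwmio
from adafruit_motor import servo

# 50 Hz PWM on D5 drives a standard hobby servo
pwm = pwmio.PWMOut(board.D5, duty_cycle=2 ** 15, frequency=50)
pointer = servo.Servo(pwm)

while True:
    line = input()              # wait for a line such as "90" from the host
    try:
        angle = int(line.strip())
    except ValueError:
        continue                # ignore anything that isn't a number
    pointer.angle = max(0, min(180, angle))  # clamp to 0-180 and move the pointer
```

On the Unity side, the matching job is simply writing the angle string to the Feather’s serial port whenever the focused target changes.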

The NextMind Dev Kit offers an unusually reliable method for converting neural activity into user input. The device's sensors measure the tiny electrical signals produced by the visual cortex at the back of the brain, so it essentially reads what you’re visually focusing on. That may sound less magical than other brain-computer interfaces, but it makes the NextMind far more practical and dependable than the others I've experimented with.

What you’ll need

  • NextMind Dev Kit
  • Windows or Mac computer with Bluetooth
  • "Craft stick" / tongue depressor (optional)
  • 1 x Feather M4 Express (a Feather M0, QT Py, or similar should work as well)
  • 1 x Micro USB cable, to connect the Feather to your computer
  • 1 x Hook-up wire, set of 6 spools
  • 1 x Half-size breadboard, for the Feather-to-servo connections
  • 1 x Wire strippers, for makin' jumpers
  • 1 x Pocket screwdriver, to install the pointer on the servo
  • 1 x Panavise Jr., an optional way to hold the servo during use

This guide was first published on Jan 20, 2021. It was last updated on Jan 20, 2021.

