AdaBox 005: Break For Pi

This quarter’s AdaBox is a welcome switch away from the Adafruit Feather line of boards. While I’ve created my own retro gaming device with a Raspberry Pi before, this is a nice kit with everything you need except a screen (you could use a TV). It’s a bummer that they didn’t opt for the Pi Zero W.

adabox-005-contents.png

The box came with a set of Hammer Headers, which I’ve been skeptical about since I first saw them. I have no issues soldering (I actually enjoy the task) and prefer it so I have solid connections to the pins. I figured I’d give the headers a try though. Plus it was a good excuse to try out the garage sale hammer I bought. Take a look at the 8x speed time-lapse…

I appreciate the idea and can see that hammer headers would be a good option in schools where they can’t have soldering irons, but I’ll never use them again. It took me 6 minutes and I felt like I was destroying the Pi Zero.

The Adafruit Joy Bonnet is a cute little add-on for the Pi. The first thing I noticed when holding it was how cheap the thumbstick feels and sounds. I wouldn’t expect much out of such a small controller that’s only $15 and snaps on to a Raspberry Pi though.


When I get my 3D printer later this year I’ll make a case for this 7″ screen I bought a couple of years ago, maybe even with a way to clip in the Pi Zero. Or better yet, a Pi 3B, which is better suited for a retro gaming device.

This is now my 8th Raspberry Pi. The 7th was named grasshopper, but what type of pie should I use for the letter H? Comment with your suggestions because the Wikipedia list I usually reference has two “H” pies I’ve never heard of.


A Raspberry Pi HAT

I successfully built the second piece to a large project I’m working on. I’ve essentially built my own XL Raspberry Pi HAT (Hardware Attached on Top). Since I’m not following the specs, I shouldn’t really call it a HAT.

I’m not sure how, but once again I correctly connected everything on the first try. Either I’m extremely lucky, my attention to detail is paying off, or it’s a combination of the two. I’m just waiting for the day I solder something the wrong way and cause a catastrophic failure. Every one of my solder bridges worked. I did run continuity tests on all of the early bridges, which I’m sure was a big factor in my success.

Any guesses on what this board does? Leave your best guess in the comments. It’ll be at least a month before I share more details because I need to finish the entire project first.

Multiplexing 7 Segment Common Cathode Displays on a Raspberry Pi

7-segment-displays
I picked up a 10 pack of these 7 segment red LED displays for less than $5. Since each display requires connecting to a minimum of 8 of the 10 pins (9 if using the decimal point), they aren’t exactly easy to work with. Sure, you can buy these where 2 or 4 displays are already connected in a nice package, controlled with the help of an integrated circuit, but where is the fun in that?

If you need to use more than 1 or 2 displays (at 8-9 pins per display), you’ll quickly run out of pins on your microcontroller or Raspberry Pi. The most common way to work with several of these displays is called multiplexing. It’s a method where you briefly turn on one display, turn it off, turn on the next one, and turn it off. You repeat this through all of your displays and then start over. If you do this fast enough, the human eye thinks all of the displays are on at once. It’s pretty slick!

The advantages of multiplexing are:

  • Fewer wires/pins needed to drive the displays.
  • Lower power consumption since the LEDs on only one display are lit.

common-cathode-7-segment-LED-display-pinout.png
Image source: learningaboutelectronics.com
Let’s get our hands dirty, shall we?

Seven of the pins on one of these displays match up to the 7 segments (labeled a through g), one pin is for the decimal point (DP), and the two remaining pins can be used for the common cathode (cc), though you only need to connect one or the other. Over to the right you can see how all of the pins and LED segments are arranged. Pretty straightforward.

I’m using 6 of these displays in a project, so I needed a lot of wires. It got complex and tangled in a hurry, but amazingly, I connected all the wires without a single mistake on my first try. 🙂 For the most part, I based my circuit design off of this schematic…

multiplexing-7-segement-display-schematic.png
Image source: circuitstoday.com
The end result is something like the Fritzing screenshot below. With so many wires overlapping, it’s not easy to see what’s really going on here. I suggest grabbing wiring.fzz from my GitHub repo and playing around with it in the Fritzing app.

multiplexing-fritzing.png

When I went to write my proof of concept code, I decided to use the gpiozero Python library to simplify working with the LEDs. The library allowed me to set up a couple of arrays for the LED segments and the 6 digits (displays)…

from gpiozero import LED

# segment_pins and digit_pins are lists of the GPIO numbers wired to the
# segment and digit (common cathode) connections; see the full code for the values
segment_leds = []
for i in range( len( segment_pins ) ) :
	segment_leds.append( LED( segment_pins[i] ) )

digits = []
for i in range( len( digit_pins ) ) :
	digits.append( LED( digit_pins[i] ) )

Then I could easily loop through and toggle the LEDs in a display as necessary…

for i in range( len( digits ) ) :
	for j in range( 7 ) :
		if ( numbers[ digit_values[i] ][j] ) :
			segment_leds[j].on()
		else :
			segment_leds[j].off()

To make sure things worked I count up from 999000 and then start back at 000000 after hitting 999999. You can see the full code on GitHub.
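Putting the pieces together, one refresh pass across all 6 displays looks roughly like this. It’s a simplified sketch rather than my exact code: the numbers table of segment patterns is just an illustration, and it assumes the digit pins drive transistors like in the schematic above, so turning a digit pin on enables that display.

import time

# Example segment patterns (a-g) for the digits 0-9; 1 = segment on
numbers = [
	( 1, 1, 1, 1, 1, 1, 0 ),  # 0
	( 0, 1, 1, 0, 0, 0, 0 ),  # 1
	( 1, 1, 0, 1, 1, 0, 1 ),  # 2
	( 1, 1, 1, 1, 0, 0, 1 ),  # 3
	( 0, 1, 1, 0, 0, 1, 1 ),  # 4
	( 1, 0, 1, 1, 0, 1, 1 ),  # 5
	( 1, 0, 1, 1, 1, 1, 1 ),  # 6
	( 1, 1, 1, 0, 0, 0, 0 ),  # 7
	( 1, 1, 1, 1, 1, 1, 1 ),  # 8
	( 1, 1, 1, 1, 0, 1, 1 ),  # 9
]

def refresh( digits, segment_leds, digit_values ) :
	# digits and segment_leds are the gpiozero LED lists from above;
	# digit_values is a list of the 6 numbers currently being shown
	for i in range( len( digits ) ) :
		# Set the segments for this digit while its display is off
		for j in range( 7 ) :
			if ( numbers[ digit_values[i] ][j] ) :
				segment_leds[j].on()
			else :
				segment_leds[j].off()
		digits[i].on()        # enable this display
		time.sleep( 0.0005 )  # keep it lit for 5/10,000th of a second
		digits[i].off()       # turn it off before moving to the next display

Call refresh() in a loop and, as long as each pass is fast, all 6 digits appear to be lit at once.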

Now for some visual proof that I actually got it all working! Here it is running when I keep one digit lit for 5/10,000th of a second before turning it off and lighting the next digit.

You’d never know that only one digit is turned on at a time, would you?

If I change the delay from 0.0005 to 0.05 seconds you can start to see that only one display is on at any point in time.

You may also notice it’s counting up a lot slower due to the way this code increments the counter. Don’t worry about that.

When I keep each digit turned on for half of a second you can really see how this works.

An issue I’m running into on a Pi Zero is that when the processor gets busy doing other tasks, there is a bit of flicker across the displays. You can see this a couple of seconds into the first video. I’m guessing the code would perform much better on a Raspberry Pi 3B. For my project it’s not a concern, but I want to mention it in case you follow this for your own project. You may also pick up what looks like random flickering of a single digit here and there, but that’s due to video timing; the human eye doesn’t see any of that when it’s in front of you.

If necessary, you can take multiplexing a step further and only light up an individual LED on each display at a time, with a method called charlieplexing. It will use even less power, but due to the speed at which you need to switch from one LED to the next, especially across an array of multiple displays, you lose brightness to the human eye.
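To make that concrete, here’s a bare-bones illustration of the idea (not something from my project; the pin numbers are placeholders). With 3 GPIO pins you can drive up to 6 LEDs, one wired in each direction between every pair of pins (with current-limiting resistors), and only one LED is ever lit at a time:

import time
import RPi.GPIO as GPIO

PINS = ( 23, 24, 25 )  # placeholder GPIO pins

GPIO.setmode( GPIO.BCM )

def light_led( high_pin, low_pin ) :
	# The unused pin is left as an input (high impedance) so it doesn't interfere
	for pin in PINS :
		GPIO.setup( pin, GPIO.IN )
	GPIO.setup( high_pin, GPIO.OUT )
	GPIO.output( high_pin, GPIO.HIGH )
	GPIO.setup( low_pin, GPIO.OUT )
	GPIO.output( low_pin, GPIO.LOW )

# Cycle through all 6 possible LEDs very quickly
while ( True ) :
	for high_pin in PINS :
		for low_pin in PINS :
			if ( high_pin != low_pin ) :
				light_led( high_pin, low_pin )
				time.sleep( 0.0005 )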

The MagPi Magazine: Issue 57

This morning when I saw the latest issue of The MagPi Magazine came with an exclusive Google kit, I wanted it. I was up early for a golf meeting and some errands so I stopped at Barnes & Noble to see if they had any copies. I was excited to find a couple on the shelf!

Easy to find in a sea of magazines.

The woman working the register said she had just put them out and there was actually a 3rd copy that was coming apart so she was going to glue it back together.

At $15 for a magazine, I don’t think you can really call this a “free” kit, but it’s still a good value. I don’t think I’ve ever “unboxed” a magazine before…

I love the emphasis on the maker here. Google is in this together with us.
The actual magazine is the first thing you see when you open the box.
A sheet with the typical warnings and instructions.
The fun stuff! Various electronics parts and cardboard pieces to make a case.
I guess the kits come with either a green, yellow, red or blue button.

This will be neat to mess around with. I’ve thought about turning one of my Pis into an Alexa type device to put in my office or bedroom and could easily do it now. If you have any project ideas involving voice, let me know.

Infrared Hacking

Remember last week’s post about tearing apart a component switch to repurpose parts? I spent some time fooling around with IR after that. I thought it would be neat to recreate the basic functionality of switching between 3 devices. For my proof of concept the devices were simply the 3 status LEDs, but you can imagine the possibilities of turning on different devices or triggering processes to run on a computer.

The new microcontroller I got is one I posted about a few months ago, called Puck.js. It has an IR transmitter built-in, so I wanted to use it to mimic a remote. Puck.js is pretty slick. It’s really neat being able to program a device in JavaScript through a web IDE over Bluetooth Low Energy. No wires at all!

Here’s a video of my hacking results.

Time for the geeky stuff…

First I tried recording IR commands from the remote using the method shown in the Infrared Record and Playback with Puck.js tutorial. I ran into two problems.

  1. Propping a component into the GPIO pins like the author did doesn’t work for shit. You can’t get a solid electrical connection, especially if you bump it at all.
  2. I kept getting Out of Memory errors on the device because the array would get really large really quickly.

My next thought was to wire up one of my other microcontrollers that run Arduino, but the popular IR library (IRLib2) doesn’t support the chips used in any of the boards I have. So over to a Raspberry Pi Zero. Pretty much every search result mentioned using Linux Infrared Remote Control (LIRC). Many of the setup instructions I found were incomplete, but I was able to get things running by piecing together instructions from two different sites.

I’ll detail the steps that worked for me. Before I go down the software route though, I wanted to make sure the IR sensor worked. The only markings on the component are "71M4" and I have been unable to find a matching datasheet anywhere. Luckily these IR receivers are pretty standard and I had a pretty good idea of the pins from looking at how it was connected in the old device.

img_8940

I got the idea of hooking up a simple LED test circuit on the data pin from an Adafruit learn guide. Pin 1 is data, going into a GPIO pin on the Raspberry Pi (26 in my case), pin 2 is ground, and pin 3 is power (VCC). You may want to use the 3.3V pin on the Pi for power instead of 5V just to be safe, or consult the datasheet for the IR sensor you’re using. Connect the anode of the LED to power and the cathode to pin 1 of the sensor using a 220 Ω resistor (I used 200 Ω). When you press buttons on an IR remote, the sensor will send data through pin 1 and the LED will light up. Here’s a Fritzing wiring diagram for this test as well.

ir-sensor-test-fritzing.png

My test was successful! Now I was able to move on with some confidence knowing the part worked.

Install LIRC:

sudo apt-get update
sudo apt-get install lirc

Edit the /etc/modules file:

sudo nano /etc/modules

Add to the end:

lirc_dev
lirc_rpi gpio_in_pin=26

Change the pin if you’re using something other than 26. If you’re also going to do IR transmitting, you can add a space and gpio_out_pin=22 to that last line.
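In other words, if you’re using pins 26 and 22 like I am, the end of the file would look like this:

lirc_dev
lirc_rpi gpio_in_pin=26 gpio_out_pin=22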

Press Ctrl + X, hit Y to say you want to save, and then Enter.

Edit the /etc/lirc/hardware.conf file:

sudo nano /etc/lirc/hardware.conf

Look for the DRIVER, DEVICE, and MODULES settings. Set them to match:

DRIVER="default"
DEVICE="/dev/lirc0"
MODULES="lirc_rpi"

Press Ctrl + X, hit Y to say you want to save, and then Enter.

Edit your /boot/config.txt file:

sudo nano /boot/config.txt

Look for this line:

# Uncomment this to enable the lirc-rpi module

If you see it, remove the # from the next line and edit it to look like this (if your file doesn’t have it, add this to the end of the file):

dtoverlay=lirc-rpi,gpio_in_pin=26

If you’re going to do transmitting, also add this to the same line:

,gpio_out_pin=22
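
With both directions configured, the complete line ends up looking like this:

dtoverlay=lirc-rpi,gpio_in_pin=26,gpio_out_pin=22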

Change both pins to match whatever you’re using. Press Ctrl + X, hit Y to say you want to save, and then Enter.

Reboot your Pi:

sudo reboot

Now it’s time to use LIRC to record the codes sent by whatever remote you’re using. First you’ll want to see the list of valid names you can give your buttons. Run:

irrecord --list-namespace

Scroll through the list and make notes on all of the codes you want to use for your buttons. You’ll need the codes in a bit. Here was my list:

KEY_POWER
KEY_1
KEY_2
KEY_3

Stop LIRC:

sudo /etc/init.d/lirc stop

Use irrecord to create a configuration file for your remote. Carefully follow the instructions that come up on your screen. This took me several minutes for my remote with only 4 buttons.

Note: When it says Please enter the name for the next button (press to finish recording), that’s when you’ll need those codes from above.

irrecord -d /dev/lirc0 ~/lircd.conf

When finished you’ll have a new file in your home directory. Take a look at it:

cat ~/lircd.conf

Mine looked like:

begin remote

  name  /home/pi/lircd.conf
  bits           16
  flags SPACE_ENC|CONST_LENGTH
  eps            30
  aeps          100

  header       9004  4474
  one           580  1666
  zero          580   542
  ptrail        578
  repeat       9006  2229
  pre_data_bits   16
  pre_data       0x61D6
  gap          107888
  toggle_bit_mask 0x0

      begin codes
          KEY_POWER                0x7887
          KEY_1                    0x40BF
          KEY_2                    0x609F
          KEY_3                    0x10EF
      end codes

end remote

Make a backup of the default LIRC configuration file:

sudo mv /etc/lirc/lircd.conf /etc/lirc/lircd_original.conf

Move your new configuration file over:

sudo cp ~/lircd.conf /etc/lirc/lircd.conf

Restart LIRC:

sudo /etc/init.d/lirc start

Install the Python LIRC library:

sudo apt-get install python-pylirc

Create a pylirc.conf file:

nano pylirc.conf

You need to set up each button similar to what mine looks like:

begin
  remote = *
  button = KEY_POWER
  prog = pylirc
  config = KEY_POWER
end

begin
  remote = *
  button = KEY_1
  prog = pylirc
  config = KEY_1
end

begin
  remote = *
  button = KEY_2
  prog = pylirc
  config = KEY_2
end

begin
  remote = *
  button = KEY_3
  prog = pylirc
  config = KEY_3
end

Do a simple copy/paste and change the button and config for each entry.

Press Ctrl + X, hit Y to say you want to save, and then Enter.

Create a basic Python test program:

nano pylirc-test.py

Paste in:

#!/usr/bin/python

import pylirc

pylirc.init( 'pylirc', './pylirc.conf', 0 )

while ( True ) :
	s = pylirc.nextcode( 1 )
	command = None
	if ( s ) :
		for ( code ) in s :
			print( code["config"] )

Press Ctrl + X, hit Y to say you want to save, and then Enter.

Run the program:

python pylirc-test.py

Press buttons on your remote and if everything is working you’ll see the special name codes being output for each button you press.

pylirc-test-output.png

Hit Ctrl + C to stop the program.

I already had all of the logic written for the buttons to work and switch LEDs, so it was easy to add in a little more code to take action when the appropriate IR codes were received.
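The gist of that extra code looks something like this. It’s a trimmed-down sketch rather than the exact program in the repo, and select_input() is just a stand-in for the button and LED switching logic I already had written.

#!/usr/bin/python

import pylirc

def select_input( number ) :
	# Placeholder for the existing logic that switches the status LEDs
	print( 'Switching to input {0}'.format( number ) )

pylirc.init( 'pylirc', './pylirc.conf', 0 )

while ( True ) :
	s = pylirc.nextcode( 1 )
	if ( s ) :
		for ( code ) in s :
			if ( code["config"] == "KEY_1" ) :
				select_input( 1 )
			elif ( code["config"] == "KEY_2" ) :
				select_input( 2 )
			elif ( code["config"] == "KEY_3" ) :
				select_input( 3 )
			# KEY_POWER would be handled the same way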

Once I found the correct information, setup on the Pi was quite easy. A lot of steps, but easy stuff. Making the Puck.js duplicate my Infrared remote’s codes was a bit of a challenge. From the Puck.js Infrared tutorial I linked at the beginning I knew I needed to have an array of pulse lengths, but I didn’t have anything like that from the LIRC configuration. All I had was some hex values for each code:

  • KEY_POWER: 0x7887
  • KEY_1: 0x40BF
  • KEY_2: 0x609F
  • KEY_3: 0x10EF

Combined with another hex code for pre_data (0x61D6) from the lircd.conf file, I had more complete codes:

  • KEY_POWER: 0x61d67887
  • KEY_1: 0x61d640bf
  • KEY_2: 0x61d6609f
  • KEY_3: 0x61d610ef

I searched all over for tools to reverse these into pulses or “Pronto Hex” values, which I also found could be used with Puck.js by decoding them. I couldn’t find anything. At some point I came across the Infrared remote control signals repository on GitHub.

Infrared remote control signals from the LIRC remote configurations project, converted to Pronto Hex and Protocol, Device, Subdevice, and Function using lirc2xml

BINGO! It had the Pronto Hex codes. I cloned the repo and started searching for my hex values. I found that power, 1, and 2 matched up with codes used by something called a Gigabyte TV. I plugged the codes into a program and they worked! I was only missing the code for button 3.

Then I spent way too much time searching around. I knew enough about how IR worked and had 3 codes, so I finally realized I should be able to figure out what changes to make in order to get my 4th and final code. I converted the hex values to binary:

  • KEY_POWER: 0x7887 = 0111100010000111
  • KEY_1: 0x40BF = 0100000010111111
  • KEY_2: 0x609F = 0110000010011111
  • KEY_3: 0x10EF = 0001000011101111

Then I started looking at the end of each array of Pronto Hex codes, because every code uses the same pre_data. I quickly determined an ON bit (1) was 003e, OFF (0) was 0013, and they were separated by 0017. I made the necessary adjustments and had all 4 buttons working with IR!
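If you’d rather script that substitution than work it out by hand, the conversion looks roughly like this. It’s only a sketch: the burst values are the ones I pulled out of the matching Pronto codes above (they’ll differ for other remotes), it assumes LIRC recorded the bits most significant bit first, and the Pronto header and pre_data bursts still need to come from one of the known-good codes.

MARK = '0017'        # separator between bits
SPACE_ONE = '003e'   # an ON bit (1)
SPACE_ZERO = '0013'  # an OFF bit (0)

def code_to_bursts( code, bits = 16 ) :
	# Convert a 16-bit LIRC code (like 0x10EF for KEY_3) into Pronto burst pairs
	pairs = []
	for i in range( bits - 1, -1, -1 ) :
		bit = ( code >> i ) & 1
		pairs.append( MARK )
		pairs.append( SPACE_ONE if bit else SPACE_ZERO )
	return ' '.join( pairs )

print( code_to_bursts( 0x10EF ) )  # the KEY_3 code I had to work out by hand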

This IR journey turned out to be quite an adventure. I learned a lot, which was the point. My infrared-3-input-selector project on GitHub has the Python program used in the demo video, my pylirc config file, the simple pylirc test program, the Puck.js code, and even an Arduino sketch with the button and LED logic I created initially before realizing I needed to switch to the Raspberry Pi.

grasshopper

I bought the new Raspberry Pi Zero W along with the official Raspberry Pi Zero case. It’s really nice not having to worry about a wireless adapter. The addition brings my Pi family up to 7, which of course means it’s time for the letter G. Not a lot of pies starting with G. I ended up going with grasshopper pie, which is based off the popular drink and happens to be the first drink I ever made with my Mom. We always used to make the ice cream variation during the holidays, which is effectively a mint milkshake.

HC-SR04 as a Motion Sensor

The HC-SR04 ultrasonic sensor uses sonar to determine distance to an object like bats do. It offers excellent non-contact range detection with high accuracy and stable readings in an easy-to-use package. From 2cm to 400 cm or 1” to 13 feet. Its operation is not affected by sunlight or black material like Sharp rangefinders are (although acoustically soft materials like cloth can be difficult to detect). It comes complete with ultrasonic transmitter and receiver module.
Complete Guide for Ultrasonic Sensor HC-SR04

You can get the HC-SR04 from Amazon or various electronics shops for $3-5 or even under $2 if you buy packs of them. I got 2 of them in a parts kit I bought on Amazon and used one for Blog in a Box Paparazzi.

I was using the sensor to sort of detect motion, or more specifically when someone walked into a room. My prototype sat on a desk at about chest height, 1-2 feet past the doorway. While working on the project I ran into several challenges:

  • Accuracy of readings.
  • Other activity on the Raspberry Pi.
  • Sampling over multiple readings.
  • Rogue readings vs actual motion.
  • Timing between readings.
  • Bailing if an echo takes too long.

I wrote my code in Python and heavily based it on ModMyPi’s blog post HC-SR04 Ultrasonic Range Sensor on the Raspberry Pi. Here are the important pieces…

def read_ultrasonic() :
	# Make sure the trigger pin is clean
	GPIO.output( ULTRASONIC_TRIG_PIN, GPIO.LOW )
	# Recommended resample time is 50ms
	time.sleep( 0.05 )
	# The trigger pin needs to be HIGH for at least 10µs; 20ms is more than enough
	GPIO.output( ULTRASONIC_TRIG_PIN, GPIO.HIGH )
	time.sleep( 0.02 )
	GPIO.output( ULTRASONIC_TRIG_PIN, GPIO.LOW )

	# Read the sensor
	while ( True ) :
		start = time.clock()
		if ( GPIO.input( ULTRASONIC_ECHO_PIN ) == GPIO.HIGH ) :
			break
	while ( True ) :
		diff = time.clock() - start
		if ( GPIO.input( ULTRASONIC_ECHO_PIN ) == GPIO.LOW ) :
			break
		if ( diff > 0.02 ) :
			return -1

	return int( round( diff * 17150 ) )

def is_ultrasonic_triggered() :
	global prev_ultrasonic

	# Take 6 readings
	for i in range( 6 ):
		ultrasonic = read_ultrasonic()
		#Shift readings
		prev_ultrasonic = ( prev_ultrasonic[1], prev_ultrasonic[2], prev_ultrasonic[3], prev_ultrasonic[4], prev_ultrasonic[5], ultrasonic )

		if ( is_light_enough()
				and prev_ultrasonic[0] != -1
				and prev_ultrasonic[3] < ULTRASONIC_DIST and prev_ultrasonic[4] < ULTRASONIC_DIST and prev_ultrasonic[5] < ULTRASONIC_DIST
				and prev_ultrasonic[0] > ULTRASONIC_DIST and prev_ultrasonic[1] > ULTRASONIC_DIST and prev_ultrasonic[2] > ULTRASONIC_DIST ) :
			#print 'Ultrasonic: {0}'.format( prev_ultrasonic )
			return True

	return False

while ( True ) :
	if ( is_ultrasonic_triggered() ) :
		take_picture()

It worked alright, but triggered a little too often. About a week later I came across the Python package gpiozero, which makes it easy to work with a bunch of common Raspberry Pi GPIO components. I wrote an alternate version of BIAB Paparazzi using this package, which worked a bit better. It was so much simpler with gpiozero because it has built-in support for the HC-SR04. All I had to do was initialize the sensor and tell it what code to run when something in range was detected.

ultrasonic = DistanceSensor(
	echo = ULTRASONIC_ECHO_PIN,
	trigger = ULTRASONIC_TRIG_PIN,
	max_distance = ULTRASONIC_MAX,
	threshold_distance = ULTRASONIC_DIST )

ultrasonic.when_in_range = take_picture

The neat thing about the gpiozero package is when you initialize a sensor it automatically starts taking readings, keeps the values in a queue, and does comparisons against an average. My code attempted to do something along those lines, but was much more rudimentary. As nice as this version sounds, it still triggered too often. You can find the complete code for both versions in the BIAB Paparazzi repo on GitHub.

I think I was pushing the limits of what the HC-SR04 is meant for. Most of the examples I’ve seen are people using these to detect approaching walls on a robot. The biggest issue I ran into was the inaccurate readings. For example, I’d be getting readings of about 160 cm and then out of nowhere it would return a distance of 80-90 cm, sometimes even several times in a row.

At the end of the day there are reasons it’s such a cheap sensor. 😉 For a couple of dollars, what do you expect? I’m curious to try my code on a more powerful Raspberry Pi 3 and see if it works any better. Was the less powerful Pi Zero causing problems?

Blog in a Box Paparazzi

wapi-512

Happy Pi Day! I figured I better post something Raspberry Pi related today…

This weekend I played around with Blog in a Box, which was recently released by our Tinker team at Automattic.

A quick and easy way of putting WordPress onto a Raspberry Pi.

BIAB ships with modules to use the Raspberry Pi camera and SenseHAT. I hadn’t used my Pi camera yet and had a fun idea to hack around with.

blog-in-a-box-ssh

The camera module allows you to take a photo on a schedule by setting a period of minutes, hours, or days between each photo. I wanted to have a little more fun, so I wired some other electronics up to a Raspberry Pi Zero and wrote a little Python program.

The first electronic element was a simple button. Press it and a picture is taken. Next up was a photocell (light sensor). When the room quickly changes from dark to light, it’ll take a picture. Since the Pi doesn’t have analog inputs, I went with a neat technique of timing how long the photocell, acting as a resistor, takes to ‘fill up’ a capacitor. The last element was an ultrasonic sensor I hadn’t used yet either. It measures the distance to an object in front of it, so I’m kind of using it as a motion detector. Walk in front of the sensor and a picture is snapped. Due to mismatched voltages between the Pi’s GPIO and the output signal of the rangefinder, I had to use some resistors to create a voltage divider circuit.
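For the curious, the charge-time trick for the photocell looks roughly like this. It’s a minimal sketch, not my exact Paparazzi code, and the pin number is a placeholder.

import time
import RPi.GPIO as GPIO

LIGHT_PIN = 18  # placeholder; whichever GPIO pin the photocell/capacitor junction is on

GPIO.setmode( GPIO.BCM )

def read_light_level() :
	# Drain the capacitor by driving the pin low for a moment
	GPIO.setup( LIGHT_PIN, GPIO.OUT )
	GPIO.output( LIGHT_PIN, GPIO.LOW )
	time.sleep( 0.1 )

	# Switch the pin to an input and count how long the capacitor takes to
	# charge through the photocell; more light = less resistance = lower count
	count = 0
	GPIO.setup( LIGHT_PIN, GPIO.IN )
	while ( GPIO.input( LIGHT_PIN ) == GPIO.LOW ) :
		count += 1
	return count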

To create visual feedback I wired up an LED for each of these 3 components. When one of the components triggers a photo, the associated LED lights up until the process is complete.

I named it Blog in a Box Paparazzi. Of course the code and wiring info are available on GitHub. Should be easy to adjust if you have other sensors, buttons, switches, or whatever you want to trigger photos. Let me know if you try something different.

Home Assistant Pi

With all of the Raspberry Pis I have (now up to 6 after adding “flapper”), I wanted to get a bunch of data in Home Assistant (yes, I’m still working on a larger home automation post) and have an easy way to reboot or shut down each computer.

I wrote a little app which runs as a service on each Pi. Here’s an example of what shows up in Home Assistant.

home-assistant-pi-groups.png

The Python app and sample Home Assistant configurations are in my home-assistant-pi project on GitHub. Of course it’s all Open Source.