My next project. This is going to be home to a Raspberry Pi acting as a home server.

Waking up with an RPi - pt.5: summing up

Part 1: Introduction

Part 2: Lights & Sensors

Part 3: Audio

Part 4: Software

image

The dawn clock is working well. To recap, here’s a basic description of how it works (more info & functions in previous posts):

  • One hour before alarm time (configurable), the first LED is switched on. Subsequent LEDs are switched on at the rate of one each minute. To simulate dawn, the first 8 LEDs are red, the next 8 are yellow, the next 8 alternate red & yellow, the next 8 alternate yellow & white, the remaining 32 are all white.
  • 30 minutes before alarm time, bird song (recorded in our orchard) starts playing.
  • At alarm time, the lights ripple, the remaining 4 LEDs switch on, the LCD backlight comes on, birdsong stops and a song, chosen at random from a folder of MP3 files, plays. The LCD displays the name (where available) or filename of the chosen song.
  • When the song stops, there’s a slight pause and then the clock starts playing the radio, streamed via the Internet.
  • 45mins (configurable) after alarm time, the lights ripple & go off, the radio stops & the LCD backlight goes off.
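
The timing and colour scheme above can be captured in a small sketch (a hypothetical helper, not the clock’s actual code):

```python
def led_colour(index):
    """Colour of the LED at position 0-63 in the dawn ramp."""
    if index < 8:
        return 'red'
    if index < 16:
        return 'yellow'
    if index < 24:                          # alternate red & yellow
        return 'red' if index % 2 == 0 else 'yellow'
    if index < 32:                          # alternate yellow & white
        return 'yellow' if index % 2 == 0 else 'white'
    return 'white'                          # the remaining 32 LEDs

# One LED per minute for the hour before the alarm;
# the last 4 LEDs come on at alarm time itself.
ramp = [led_colour(i) for i in range(60)]
```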

But no project is ever really finished. Here are some ideas for additions and improvements:

  • Have more ambient sounds, in addition to the bird song, and have that morning’s ambient sound selected at random. I recently recorded a local stream, and while that’s liable to make me want to pee, that’s probably not a bad thing, given the whole point of the device is to make me get out of bed.
  • A button on the front that will switch the lights on/off.
  • A button on the front that will start/stop the radio.
  • A button on the front that will start/stop playing random music.
  • Currently, the radio streams BBC Radio 3. Could add extra options in the config settings.
  • Add Twitter functionality - eg, the clock could tweet alarms/errors.
  • A security light mode where it switches the lights on/off at random intervals when the ambient light is below a certain threshold.
  • Add a 4-digit, 7-segment LED display to show the current time in nice large numbers.

Currently, in order to add MP3 files to the folder from which the clock selects its alarm song I SSH into the RPi, then SCP files from my Linux machine where I’ve copied the files. It might be nice to improve this process and maybe even use the clock’s web site to create & select playlists (different songs for Summer & Winter, for example).

Or maybe I’ll just get on with the next project…

Waking up with an RPi - pt.4: software

image

The software for the dawn clock is written in Python. And no, I’m not going to upload it to GitHub, or anywhere else for that matter. There are two reasons for this: first, it needs work still; and second, it would take too much explaining, not least because it uses loads of my hand-rolled libraries and classes for the Raspberry Pi.

So here I’m just going to discuss a few, pertinent details.

Playing sounds

The ambient sound (bird song) and music are played using mpg123. I stuck with this solution, even though I would later use mplayer for other purposes, because it worked and I’d already written a class using it.

The class uses the following line to start playing an MP3 file:

self.subproc = subprocess.Popen(['/usr/bin/mpg123', '-q', self.pathfile])

The string in self.pathfile includes the full path and filename of the audio file to be played. By assigning the object created with subprocess to a class variable, it’s possible to stop the audio playing with:

self.subproc.terminate()

It’s also possible to test whether the song is still playing with the following function:

def isPlaying(self):
    playing = False
    self.subproc.poll()
    if self.subproc.returncode is None:
        # subprocess hasn't terminated yet
        playing = True
    return playing

The volume of the output is set with amixer. From the MP3 class this is set with:

setAudio = subprocess.Popen(['amixer', '-q', 'sset', 'PCM', str(self.volume) + '%'])

where self.volume is an object property - an integer between 0 and 100. The same method is used in the class I wrote to handle streaming radio. This class is always instantiated with the full URL for the stream and is played using the call:

self.subproc = subprocess.Popen(['/usr/bin/mplayer', '-nolirc', '-really-quiet', self.url])

Like the MP3 class, playing can be stopped using the terminate() method.
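
Pulling those calls together, a minimal version of such a class might look like this (the names are mine - the original classes aren’t published):

```python
import subprocess

class MP3Player:
    """Sketch of an mpg123-based player: start, stop and poll a subprocess."""

    def __init__(self, pathfile, volume=80):
        self.pathfile = pathfile    # full path and filename of the MP3
        self.volume = volume        # integer, 0-100
        self.subproc = None

    def play(self):
        # set the output level first, then launch mpg123 quietly
        subprocess.Popen(['amixer', '-q', 'sset', 'PCM',
                          str(self.volume) + '%'])
        self.subproc = subprocess.Popen(['/usr/bin/mpg123', '-q',
                                         self.pathfile])

    def isPlaying(self):
        if self.subproc is None:
            return False
        self.subproc.poll()         # refresh returncode
        return self.subproc.returncode is None

    def stop(self):
        if self.isPlaying():
            self.subproc.terminate()
```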

Settings

Most of the settings for the dawn clock are configured via its web site. The settings are written to a simple config file which is accessed each time the Python script starts up. The web server has read/write access to this file and so I use simple HTML/PHP to manage the configuration. Here’s a screenshot from the dawn clock’s web server:

image

The Python temperature and pressure intervals refer to how often the readings are taken (once a minute for temps, every 10 mins for pressure, with the settings above) and how often they are logged to the MySQL database. All records in the database older than the data retention period are periodically deleted.
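
The “is a reading due?” check boils down to a simple time comparison, something along these lines (a hypothetical helper, not the actual script):

```python
import datetime

def reading_due(last_time, interval_minutes, now=None):
    """True if at least interval_minutes have passed since last_time."""
    if now is None:
        now = datetime.datetime.today()
    if last_time is None:           # never logged yet
        return True
    return now - last_time >= datetime.timedelta(minutes=interval_minutes)
```

The same check, with different intervals, would cover temperature readings, pressure readings and the periodic purge of old database records.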

These records are accessed by the dawn clock’s web server to produce graphs on its home page:

image

The panel at the bottom of that screenshot is a PHP module written out by the Python script once an hour. You’ll see it includes details of the ‘alarm song’ that was played that morning. I often manage to sleep through this, but then - apparently inexplicably - have some music stuck in my head all day. This usually helps explain where that came from.

Times & Dates

The other thing worth mentioning is handling of times and dates. When I wrote the Arduino C code for the previous dawn clock, this caused me all kinds of headaches. Given that I was planning to implement a lot more functionality with the Mk.II clock, I quite expected the handling of times and dates to be the hardest part.

For example, there are many instances in the code where I need to know whether the current time is before or after a designated time (such as the alarm time). The thing is, the alarm time is the same every day, so any given time is always both before and after. Confused? Let’s say the alarm time is 7:15 and it’s now 11:30. The current time is after today’s alarm but before tomorrow’s. There are ways of getting around this, but it gets convoluted, and the example I gave was one of the simpler ones.

All these problems went away in an instant when I discovered Python’s datetime library. For example:

now = datetime.datetime.today()

That gives you an object with the current time and date. The latter is important.

Let’s set the alarm time to 7:15 today:

alarmTime = datetime.datetime(now.year, now.month, now.day, 7, 15, 0)

(This is a kind of paraphrase of how I do it in the code - simplified for clarity. For example, the hour and minute are pulled from the config file in the real code.)

That’s set the alarm time to 7:15 today, but what if, when this code is run, it’s already later than 7:15? That would mean the alarm had been set in the past. No problem:

if alarmTime < now:

    # this is being set in the past

    alarmTime = alarmTime + datetime.timedelta(days=1)

That’s now set the alarm to tomorrow. I could give plenty more examples, but the important thing is that the use of a datetime object, and the ability to modify it with methods like timedelta() makes working with dates and times astonishingly easy.
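
Put together, the whole “next alarm” calculation fits in one small function (a simplified sketch - in the real code the hour and minute come from the config file):

```python
import datetime

def next_alarm(hour, minute, now=None):
    """Next occurrence of hour:minute as a full datetime object."""
    if now is None:
        now = datetime.datetime.today()
    alarm = datetime.datetime(now.year, now.month, now.day, hour, minute, 0)
    if alarm < now:                 # today's slot has already passed
        alarm += datetime.timedelta(days=1)
    return alarm
```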

The display

Finally, a note on the LCD panel. Here’s what it displays:

image

The details are:

Line 1: Current time and date.

Line 2:

  • Current temperature. If the temp is higher or lower than the previous reading, this is followed by an up or down arrow.
  • Lowest/Highest temps in past 24 hours.
  • Current barometric pressure. The arrow preceding the number shows that the pressure has fallen since the previous reading. The bar after the reading shows that this is considered ‘low’ pressure. Normal and high pressures are shown by ascending bars.

Line 3:

  • Current volume settings (in %) for music and ambient sounds (this is temporary: I’ll probably replace this with something more useful).
  • PIR setting. The appearance of ‘pir’ means that PIR mode has been selected. In this case, at any time after the alarm lights have gone off but before a cut-off time (set to 22:30 in the settings screenshot earlier), if the ambient light level drops below a certain point and the PIR sensor detects movement, the LEDs are switched on. The asterisk in this image shows that motion has been detected.
  • Alarm time set.

Line 4: The last line is reserved for messages. In this case, it’s showing a ‘wake-up’ message, set in the configuration screen shown earlier. When the alarm time rolls around, this line shows the name of the song being played.
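
The PIR night-light rule described above reduces to a single decision function; here’s a hypothetical sketch (all names and parameters are mine):

```python
def pir_lights_on(pir_mode, motion, light_level, dark_threshold,
                  now, alarm_off, cutoff):
    """True if the LEDs should come on: PIR mode selected, movement seen,
    ambient light below the threshold, and the current time between the
    alarm lights going off and the evening cut-off."""
    if not (pir_mode and motion):
        return False
    if light_level >= dark_threshold:   # not dark enough
        return False
    return alarm_off < now < cutoff
```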

That’s it for now. My next post will sum things up…

[UPDATE 21/01/2013]: I changed my mind about using mpg123 for playing MP3s. The reason has to do with how I stop playing them. In the classes I created to play MP3s and radio streams, the stop() method used the Python subprocess.terminate() method to stop playing. However, this had the side-effect of leaving zombie processes unless I made sure to then delete the objects from the main program. That was difficult in some circumstances and a bit clunky. I know that zombie processes consume almost nothing in the way of resources, but it’s inelegant.

It then occurred to me that mplayer allows you to send it keyboard commands. Actually, the commands are simply provided via STDIN, so that means they can be sent programmatically.

So, in the base class that is used by both the MP3 and RadioStream classes, the audio is started with:

self.subproc = subprocess.Popen(['/usr/bin/mplayer', '-msglevel', 'all=-1', '-really-quiet', self.source], stdin=subprocess.PIPE)

(where self.source has previously been defined as either a full path and filename for an MP3 file or a URL for a radio stream).

And stopping the audio from playing uses:

self.subproc.communicate(b'q')

This sends the single character ‘q’ to the STDIN of the mplayer subprocess, which causes mplayer to stop playing and exit. The b prefix makes it a bytes literal - communicate() won’t accept a plain string here. So now the mplayer process exits cleanly with no zombie processes left behind.
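
A minimal sketch of the reworked base class (my own naming - the real MP3 and RadioStream classes would derive from something like this):

```python
import subprocess

class AudioSource:
    """Play a file or stream with mplayer; stop it via 'q' on stdin."""

    def __init__(self, source):
        self.source = source        # MP3 path or radio-stream URL
        self.subproc = None

    def play(self):
        self.subproc = subprocess.Popen(
            ['/usr/bin/mplayer', '-msglevel', 'all=-1', '-really-quiet',
             self.source],
            stdin=subprocess.PIPE)

    def stop(self):
        if self.subproc is not None:
            # 'q' makes mplayer quit; no zombie process is left behind
            self.subproc.communicate(b'q')
            self.subproc = None
```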

Part 1: Introduction

Part 2: Lights & Sensors

Part 3: Audio

Part 4: Software

Part 5: Summing up

Waking up with an RPi - pt.3: audio

My first big disappointment with the Raspberry Pi was the audio output from the headphone jack. When I built my first dawn clock, using an Arduino, I had to buy a separate shield for audio output, and even that was severely limited. With the RPi, I thought, life will be easy because all the audio stuff I need is built-in. Didn’t turn out that way.

In my innocence, I tried hooking up a small speaker directly to the headphone jack. The output was barely audible. Then I did some reading on the RPi forum and found many complaints about just how bad this output is. Not only is it weak, requiring amplification, but it’s also subject to loud noises whenever you start and stop playing. (This problem has, allegedly, been reduced with the latest Raspbian builds, but it still seemed pretty bad to me.)

Then I found this topic on the RPi forum and resolved that what I needed was an external, USB sound card. The one suggested in the thread, and which I subsequently bought, is this little beauty from China:

image

The only problem with this board is that it has a nice, bright LED on board to tell you it’s working. As mentioned in my previous post, this is something of a disadvantage for this project, so out with the black gaffer tape again. That’s why the board is unrecognisable in the picture at the bottom of my previous post.

While it boasts optical output, I was still planning to use the headphone jack, which meant that I still needed some kind of amplifier.

It just so happened that I had a pair of Logitech powered PC speakers hanging about. I ripped them apart (they were glued rather than screwed in key places) and removed the amp board. It was a bit of a squeeze getting it into the dawn clock, but offered the advantage of having a volume knob that, through the judicious drilling of a hole, I could access from the outside.

It worked!

But…

I had an inkling that all was not well when I heard that dreadful morse-like stuttering you get when your cellphone interferes with audio gear. Oh well, I thought, I’ll just have to remember not to take my cellphone into the bedroom.

But when I set up the dawn clock in the bedroom I found that the Logitech’s susceptibility to RF interference made it unfit for purpose. Something in that room (I suspect an old electric radiator) was bleeding RF noise like a slaughtered pig. (Yes, pigs do emit RF.)

So, I still needed an amplifier. A bit more searching and I came up with this 3W mini-amplifier board (again from Aliexpress in China):

image

When they said ‘mini’ they weren’t kidding. The actual thing is almost certainly smaller than the image you’re looking at, unless you’re reading this on your phone. The holes for making connections are so tiny, and so close together, that I struggled to find wire from my stock that would work. So soldering was tricky. Also, I wanted to use standard 0.1” headers for connecting the speakers. In the end, I put the headers on a small proto board and - because I didn’t have bolts thin enough to go through the amp board’s mounting notches - glued the amp to the proto board. It ain’t pretty, but it works.

image

The input cable from the amp runs to a 3.5mm stereo jack that plugs into the headphone socket of the audio card. The audio card plugs into the USB port of the RPi. I connected two 4 Ohm speakers to the headers on the proto board and … it worked!

The only issue with this set-up is that there’s no hardware volume control. Maybe I could have been clever with some pots on the proto board (that’s why I left space - in case I needed them). But in my blithely optimistic way, I assumed that I would be able to control volume programmatically, from software. Turned out I was right.

I’ll deal with that side of things more in the next post, on software. But as it’s relevant here, I’ll just give a flavour of the software used:

  • For playing MP3s (ambient sound and music) I use mpg123. [UPDATE: see below]
  • For playing the radio stream, I use mplayer.
  • For setting the volume for either, I use amixer.

All are called from within the Python script, of which more later…

[UPDATE: 21/01/2013] I’m now using mplayer for MP3s too - see Part 4 linked to below. Also, after the audio is played and the dawn clock is silent again, I’m getting a distinct and annoying whine, which I think is coming from the cheapo amplifier board. The only solution I’ve so far found for this is … wait for it …

Turn it off and back on again.

I’ve ordered a different amplifier board, so we’ll see if that solves the problem.

Part 1: Introduction

Part 2: Lights & Sensors

Part 3: Audio

Part 4: Software

Part 5: Summing up

Waking up with an RPi - pt.2: lights & sensors

image

One of the problems with my Arduino-powered dawn clock was that 10 LEDs just don’t put out enough light. I wanted a lot more for the Mk.II, but that then raises the issue of how they are controlled. Cascading shift register ICs would be one way to go. I decided, instead, to go with I2C port expanders.

I selected the MCP23017 which I bought from Adafruit. This has a configurable address so you can use more than one on the same I2C bus.

I thought for a long time about how to mount the LEDs. I’m fairly challenged when it comes to mechanical things, so opted to go for a number - actually four - discrete panels. Each would host 16 LEDs controlled by their own MCP23017.
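
Each MCP23017 exposes its 16 lines as two 8-bit banks, so lighting the first n LEDs on a panel comes down to writing two byte masks. A sketch of the arithmetic (register addresses per the MCP23017 datasheet in BANK=0 mode; the helper itself is hypothetical):

```python
# MCP23017 register addresses (BANK=0)
IODIRA, IODIRB = 0x00, 0x01     # direction registers (0 = output)
GPIOA, GPIOB = 0x12, 0x13       # output registers

def panel_masks(n_lit):
    """GPIOA/GPIOB byte values to light the first n_lit of 16 LEDs."""
    bits = (1 << n_lit) - 1     # e.g. n_lit=3 -> 0b111
    return bits & 0xFF, (bits >> 8) & 0xFF

# On the Pi, the writes would go through smbus, something like:
#   bus.write_byte_data(0x20 + panel, GPIOA, a)   # 0x20-0x23: the four chips
#   bus.write_byte_data(0x20 + panel, GPIOB, b)
```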

image

Back to Adafruit, this time for its lovely Perma-Proto PCBs. If you’re not familiar with these, they’re basically prototyping boards, but laid out in the same pattern and with the same connections as a classic breadboard. They’re not cheap but, from my point of view, had one advantage over normal trackboard in that they’re white, and thus nicely reflective.

As mentioned in the previous post, in order to simulate dawn, the first eight LEDs to come on are red, the next eight yellow, then alternating red and yellow, then alternating yellow and white. The remaining 32 LEDs are all white.

The boards look quite neat from the front. But there’s lots of paddling going on at the back.

image

I used two pieces of aluminium strip to mount the boards on. With all four mounted, the interior of the clock looks like this:

image

I figured I might be using other I2C devices, too, and I wanted to be able to use them at 5V logic, so I created a breakout board providing multiple headers to plug in I2C devices, with a bidirectional level shifter to connect safely to the Raspberry Pi’s 3V3 logic.

I also broke out a couple of the RPi’s GPIO pins. One of these is connected to a PIR motion sensor, mounted just below the clock’s LCD panel. Incidentally, the LCD panel is another I2C device, this time from Devantech.

I also thought I’d want at least a couple of analogue inputs, so I installed an ADC chip on the board. I’d bought a bunch of these 10-channel I2C devices from HobbyTronics, so used one of those. In the end, I’m only using one analogue sensor - a basic light resistor - so a 10-channel ADC is definitely overkill. Oh well, at least there’s plenty of scope for expansion. The light resistor is used to measure ambient light levels - ie, to decide when it’s dark. It’s mounted inside the box, at the top of one of the metal strips on which the LED panels are mounted. That way it’s behind the front panel, which acts as a general diffuser. Of course, the software for the clock doesn’t take ambient light readings when any lights are switched on!

The following pic shows the breakout board. If you value your sanity, don’t try to make sense of it. As an electronics naif, I have a bad habit of designing boards as I solder. There are, therefore, bits of this board that were never used, especially in the lower third…

image

The other analogue device I thought I was going to use was a temperature sensor. In the end, however, I opted for an I2C breakout board that provides both a nicely calibrated thermometer and an atmospheric pressure sensor. It’s not cheap, but what price perfection?

image

I positioned this near the top of the box, at the side, with a hole to the outside world. Once I started using the clock, I found that the measured temperature was rising by about a degree during the period the lights were on. Obviously, the inside of the box was heating up & skewing the measurement. I made the hole to the outside world bigger, moved the sensor closer to it & stuck some foam behind the board as crude thermal insulation. But the problem persists, so there’s more work to be done.

The RPi used is a revision 1 board - so no mounting holes. In order to mount it, I butchered a ModMyPi case. The bottom half of the case was screwed to the box. I cut a big hole in the top of the case, partly to give access to the GPIO pins but also as a cooling measure. This RPi is left running 24/7 so overheating was a concern. To that end, I also mounted heatsinks on the main chips. In the following pic, you can also see the audio amplifier - but more on that in a subsequent post.

image

Network connection is provided by a Netgear N150 USB wifi dongle. This has proven to be remarkably reliable, especially given that the wifi AP is one floor down. It worked without any configuration with the current Raspbian build. I have configured the home router to assign this device a fixed IP (based on its MAC), so that I can easily SSH into the RPi and also connect to its web server. The only problem with the dongle is that it has a built-in LED that flickers to indicate network traffic. This would be very annoying in a darkened bedroom, but some black gaffer tape soon sorted that. (This would become a common theme.)

The next step was to provide power. As we’ll see in the next post, my original plan for the audio output was to use the innards of some Logitech PC speakers. The power supply that came with them output 12V, so I figured I might need a 12V supply for the dawn clock, in addition to the 5V required by the RPi and breakout board. It just so happened that I had a couple of these devices from DFRobot.

image

In addition to providing a switched 5V (or variable) output, this board will also pass through what is supplied to it. With an external 12V, this would easily allow me to power everything in the clock.

In the end, I decided against the Logitech kit and went with a USB-powered sound card. For reasons I’ll explain later, I may replace this power board with a power distribution board of my own, outputting just 5V.

Anyway, with the light panels removed, here’s how the interior of the dawn clock looks:

image

Next time I’ll talk about the audio stuff…

Part 1: Introduction

Part 2: Lights & Sensors

Part 3: Audio

Part 4: Software

Part 5: Summing up

Waking up with an RPi - pt.1: introduction

I don’t like waking up. I need easing into the day, slowly and carefully, especially in Winter. My wife is the same, which is why she bought a ‘dawn clock’. This is an alarm clock with a light that fades up slowly, over the course of 90 minutes, before sounding the alarm. It’s a fine device and we still use it, but I thought “I could do better than that”.

What I did, eventually, is shown below - a Raspberry Pi-powered dawn clock. But that wasn’t my first attempt.

image

One of the things we miss most in Winter is waking up to the sound of birdsong. We’re in the Normandy countryside with trees all around. One Summer morning, my wife went out early and recorded the birds’ dawn chorus. And I thought, wouldn’t it be great to wake up to that each morning?

And so I took an Arduino Uno and coupled it with the Adafruit Wave Shield and Real Time Clock (RTC). I added a 16x2 character LCD panel that displays the time and temperature (thanks to a simple sensor). The Arduino switches on a series of bright white LEDs, one every five minutes. Eight are switched on via a shift register but the first two fade up slowly thanks to PWM directly from Arduino pins. Half-an-hour before wake-up time, the clock starts playing the birdsong recorded in our orchard. At wake-up time, the clock plays a song chosen randomly from the files loaded on to the Wave Shield’s SD card. The whole thing is shoved, rather crudely, into a plastic food container. But it works and has been waking us up, gently, for about the past 18 months.

I knew I could still do better, though.

The Arduino-based project had some issues. First, changing the clock time, or altering the alarm time, involved editing the C code, which in turn meant plugging the clock into the computer. Clumsy.

Second, loading new songs on to the card meant shutting down the clock, digging out the SD card and sticking that into my computer. Also clumsy.

In addition, the Wave files for the shield have to be carefully prepared, which meant taking existing MP3 files and editing them in Audacity. Time consuming.

And there was no way to add any extra functionality. Like most things that come from Adafruit, the Wave Shield is a very fine product. But it does consume a lot of the Uno’s program space. I’d had to squeeze every byte I could out of the code to get it to run.

And then I got a Raspberry Pi…

And so was born the Speculatrix Dawn Clock Mk.II. Here’s a quick run-down of the specs and functions:

  • 20x4 LCD panel displaying time, date, temperature, barometric pressure, alarm time, alarm song played and other stuff.
  • PIR sensor to detect movement.
  • I2C barometric temperature and pressure sensor.
  • Light level sensor.
  • 64 LEDs driven via I2C port expanders.
  • Wifi-enabled.
  • Stereo sound via a USB sound card, amplifier board and two speakers. Ability to play standard MP3 files.
  • Web server for settings, information.

And here’s a description of how it works:

  • One hour before alarm time, the first LED is switched on. Subsequent LEDs are switched on at the rate of one each minute. To simulate dawn, the first 8 LEDs are red, the next 8 are yellow, the next 8 alternate red & yellow, the next 8 alternate yellow & white, the remaining 32 are all white.
  • 30 minutes before alarm time, the bird song starts playing, slowly fading up over the course of a couple of minutes.
  • At alarm time, the lights ripple, the remaining 4 LEDs switch on, the LCD backlight comes on, birdsong stops and a song, chosen at random from a folder of MP3 files, plays. The LCD displays the name (where available) or filename of the chosen song.
  • When the song stops, there’s a slight pause and then the clock starts playing the radio - BBC Radio 3 streamed via the Internet.
  • 45mins after alarm time, the lights ripple & go off, the radio stops & the LCD backlight goes off.
  • If the motion sensor is selected, if the ambient light level falls below a certain threshold and the PIR motion sensor detects movement, the LEDs come on for 30secs. This only happens up until a certain cut-off time (we have this set to 22:30), so that you don’t have the lights coming on in the middle of the night should a cat decide to join us in the bedroom.
  • In addition to displaying the current temperature and barometric readings, the clock logs these readings (at user-configurable intervals) to a MySQL database. Graphs for recent readings (24hrs for temps, 48hrs for pressures) are available via the clock’s web server. The web server also shows highest and lowest temps for the past 36 hours and which song played that morning.
  • The clock is configurable via the web server.

The whole thing is housed in an old SAD lightbox. My wife & I are both affected by SAD and so have bought a number of lightboxes over the years. This one had an annoyingly buzzy transformer, so we rarely used it. I’d long eyed it as a potential project box. So I just ripped out all of its components.

You can already see that the Mk.II has a lot more functionality. So how was this achieved? More information in the next part.

Part 1: Introduction

Part 2: Lights & Sensors

Part 3: Audio

Part 4: Software

Part 5: Summing up

Kinect on the BeagleBoard (and Ubuntu)

[Updated 17 May 2012] A few weeks ago I managed to get my laptop (running Ubuntu 11.04) properly configured to use the Kinect, using OpenNI and whatnot. It took a bit of doing. Now I’m trying to repeat the process with my BeagleBone (also running Ubuntu 11.04).

I’ve got a long way down the road, and have got it working, but I think it’s a little flaky and there are some issues. This post will get updated as & when I sort the problems.

There’s nothing inherently difficult about installing OpenNI. I’m learning about using the Kinect from the very fine book ‘Making Things See’ by Greg Borenstein. Following his installation instructions, I had the device running on my MacBook Pro in a couple of minutes. I probably could have done the same on Windows, if I did Windows. But when it comes to Linux, Borenstein pretty much says, ‘if you use Linux, you’ll know how to do this’ and leaves you to it.

I found a very useful blog post on ‘Daves thoughts on stuff’ [sic]. He detailed a 29-step process to getting the Kinect working on Linux. If you want to follow along with what I do here, you’ll want to have that page open too. I’m not going to cut’n’paste Dave’s copy, just add my own notes where my experience differed from his. The numbers of the steps below correspond to Dave’s steps.

I also found a very handy post by Jatin Sharma about getting OpenNI working on the BeagleBoard. So I’ll make reference to that, too.

Except where specified, all these notes apply equally to the Intel laptop and BeagleBone Arm installations.

So here we go…

[1] First problem is that libglut3-dev has been replaced by freeglut3-dev. Also, Dave doesn’t specify some libraries and apps that you’re going to need installed (possibly because he already had them installed). The following is a more comprehensive list of stuff you’re going to need to install, as detailed by Jatin. These all installed fine on my BeagleBoard:

# apt-get install git-core cmake freeglut3-dev pkg-config gcc g++ build-essential libxmu-dev libxi-dev libusb-1.0-0-dev doxygen graphviz git

It should be noted, perhaps, that I’m logged into the BeagleBone as root. In fact, this is how I use the ‘Bone. It makes life simpler and, frankly, worrying about ‘security’ on my robot is just a little silly. So the hash mark above is my command-line prompt, not part of the command. Also, if you’re not logged in as root, you’ll need to use ‘sudo’ for a lot of this stuff.

You’re also going to need Java JDK, so for my BeagleBone I used:

# apt-get install openjdk-6-jdk

(Yeah, I know OpenJDK 7 is available, but I hit problems with that).

[2-3] Okay.

[4] The repo has changed. Dave says to cd to OpenNI/Platform/Linux-x86/Build, but for Intel platforms ‘Linux-x86’ has been replaced with Linux. So:

# cd OpenNI/Platform/Linux/Build

[5] ~/kinect/OpenNI/Platform/Linux/CreateRedist/RedistMaker doesn’t have execute permissions, so you need to chmod it before doing make && make install.

Jatin’s experience is that trying to use the Arm, rather than Intel, version throws up too many errors, so he suggests using the Intel version but with some modifications to ~/kinect/OpenNI/Platform/Linux-x86/Build/CommonMakefile. But things seem to have changed quite a lot with the repo. Make worked fine on my BeagleBone.

On Intel laptop: ‘make install’ looks for the install.sh script in ~/kinect/OpenNI/Platform/Linux/Redist but in fact it, and other files, are in a further subdir (OpenNI-Bin-Dev-Linux-x64-v1.5.2.23 on my laptop - obviously, your filename may vary). You could just manually go in there, once make install fails, and run install.sh, but I copied all the files in that subdir to ~/kinect/OpenNI/Platform/Linux/Redist/.

On BeagleBone: ‘make install’ was rather more of a problem. It calls the ~/kinect/OpenNI/Platform/Linux/CreateRedist/RedistMaker script (just a wrapper to the Redist_OpenNi.py Python program). But this throws up an error on Arm. Presumably, you need instead to run RedistMaker.Arm (first having chmod’d it to give it execute permissions) instead. Jatin didn’t seem to have this problem (although he had another that involved installing Mono as the fix. I installed Mono as a precautionary measure.) But even running the RedistMaker.Arm script caused a problem. The Python script attempts to calculate the number of compile jobs and passes this to make using the ‘-j <num>’ option. But in my case, it calculated the number as 0, and make requires -j to have a positive integer. I edited the Redist_OpenNi.py script, commenting out the line:

MAKE_ARGS += ' -j' + calc_jobs_number()
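
An alternative to commenting the line out (untested on the BeagleBone - just a sketch) would be to clamp the calculated job count so it never falls below make’s minimum of 1:

```python
import multiprocessing

def safe_jobs_number():
    """Job count for make's -j switch, never less than 1."""
    try:
        return max(1, multiprocessing.cpu_count())
    except NotImplementedError:
        return 1
```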

(Sidenote: In an attempt to get the BeagleBone installation running I’ve gone back and tried installing from SensorKinect. In that case I had to run RedistMaker manually using:

RedistMaker Arm

This threw up a similar problem with the -j switch. In this case RedistMaker itself is the Python script. This has the line:

make -j

I commented out this line then added a replacement. End of sidenote).

Then, while in the ~/kinect/OpenNI/Platform/Linux/CreateRedist directory, I ran:

# python Redist_OpenNi.py Arm

This appeared to work fine.

Next problem: make install looks for the install.sh script in ~/kinect/OpenNI/Platform/Linux/Redist but in fact it, and other files, are in a further subdir - OpenNI-Bin-Dev-Linux-x64-v1.5.2.23 on the laptop and OpenNI-Bin-Dev-Linux-Arm-v1.5.2.23 on the BeagleBone (obviously, your filename may vary). You could just manually go in there and run install.sh, but I copied all the files in that subdir to ~/kinect/OpenNI/Platform/Linux/Redist/. Running make install then worked.

[6] Okay.

[7 - Updated 17 May 2012] Dave says to get Sensor from git clone https://github.com/boilerbots/Sensor.git. This worked fine on my laptop. Jatin says to go for https://github.com/avin2/SensorKinect, which produced fewer errors on the BeagleBoard-XM. I cloned both repos to give me options. The first time I tried this with the BeagleBoard, I went with Sensor. Of course, that didn’t work out so well. So I tried again, this time with SensorKinect. So the comments below refer to Sensor on the Intel laptop and SensorKinect on the Arm-based BeagleBoard.

[8] cd into Sensor or SensorKinect.

[9] It seems as though the ‘checkout’ process was unnecessary (both versions).

[10] With SensorKinect, the subdir name is ‘Linux’, not ‘Linux-x86’.

[11] Intel: CommonMakefile - which the Sensor package looks for - has been renamed CommonCppMakefile. The advice is to cd to /usr/include/ni and create a symbolic link:

# ln -s CommonCppMakefile CommonMakefile

But you might find even CommonCppMakefile is missing from /usr/include/ni, in which case, copy these makefiles from the downloaded OpenNI files:

# cp ~/kinect/OpenNI/Platform/Linux/Build/Common/* /usr/include/ni

I had to edit ~/kinect/Sensor/Platform/Linux-x86/Build/XnFormats/Makefile - the line that says 

LIB_USED = XnCore OpenNI

should read

USED_LIBS = XnCore OpenNI

Actually, I just added that extra line. The make then worked, but it still wouldn’t make install.

That’s because it was looking for a file in the dir ~/kinect/Sensor/Platform/Linux-x86/Bin/Release. But the files it was looking for were actually in dirs called x64-Release on the Intel laptop and Arm-Release on the BeagleBone. So on both machines I created a dir called ~/kinect/Sensor/Platform/Linux-x86/Bin/Release and copied (using cp -vR) all the files from x64-Release/Arm-Release into it. Using a symlink would no doubt have been more elegant.
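Here’s what that symlink approach might look like, sketched against a scratch directory (the library name is just a stand-in for the real build outputs):

```shell
# Point Bin/Release at the arch-specific output dir instead of copying.
tmp=$(mktemp -d)
mkdir -p "$tmp/Bin/x64-Release"            # Arm-Release on the BeagleBone
touch "$tmp/Bin/x64-Release/libXnCore.so"  # stand-in for the real build outputs
ln -s x64-Release "$tmp/Bin/Release"       # relative link, same parent dir
ls -l "$tmp/Bin/Release/"
```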

Arm [updated 17 May 2012]: ‘make’ worked fine, but ‘make install’ complained of an unknown architecture. So here’s what I did:

* cd to ~/kinect/SensorKinect/Platform/Linux/CreateRedist

* ran:

chmod u+x RedistMaker

./RedistMaker Arm

cd ~/kinect/SensorKinect/Platform/Linux/Redist/Sensor-Bin-Linux-Arm-v5.1.0.25

chmod u+x install.sh

./install.sh

(Remember, I’m still doing all of this as root.)

According to Jatin, I should now be able to plug in my Kinect and get it working. That didn’t happen. See the ‘next steps’ section below for more details.

[12] The NITE files you’re looking for are now at: http://www.openni.org/Downloads/OpenNIModules.aspx.

[13-15] Went fine.

[16] There was no MapOutputMode entry in the XML files.

[17-21] Okay.

[22] Are these binaries? No make && make install necessary.

[25] This subdir & file don’t exist in the version I downloaded & installed.

[26] As detailed in the section below, the SensorKinect package installs its own udev rules, so I didn’t need to do this. But I did edit the SensorKinect version.

Next steps - BeagleBone

On the BeagleBone, when trying to run some of the NITE samples I got:

Open failed: USB interface is not supported!

It wasn’t the gspca_kinect module problem, because I’d already taken the steps described below on the BeagleBone.

Using lsusb tells me that the Kinect is connected. The output includes:

Bus 001 Device 009: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
Bus 001 Device 010: ID 045e:02ad Microsoft Corp. Xbox NUI Audio
Bus 001 Device 011: ID 045e:02ae Microsoft Corp. Xbox NUI Camera

And dmesg | tail also showed me that the Kinect had connected properly.

[13498.772869] usb 1-1: new high-speed USB device number 3 using musb-hdrc
[13498.913178] usb 1-1: New USB device found, idVendor=0409, idProduct=005a
[13498.913228] usb 1-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0
[13498.920702] hub 1-1:1.0: USB hub found
[13498.920965] hub 1-1:1.0: 3 ports detected
[13499.502849] usb 1-1.2: new full-speed USB device number 4 using musb-hdrc
[13499.612244] usb 1-1.2: New USB device found, idVendor=045e, idProduct=02b0
[13499.612295] usb 1-1.2: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[13499.612334] usb 1-1.2: Product: Xbox NUI Motor
[13499.612363] usb 1-1.2: Manufacturer: Microsoft
[13501.042841] usb 1-1.1: new high-speed USB device number 5 using musb-hdrc
[13501.145079] usb 1-1.1: New USB device found, idVendor=045e, idProduct=02ad
[13501.145129] usb 1-1.1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[13501.145171] usb 1-1.1: Product: Xbox Kinect Audio, © 2011 Microsoft Corporation. All rights reserved.
[13501.145211] usb 1-1.1: Manufacturer: Microsoft
[13501.145240] usb 1-1.1: SerialNumber: A44887B03785042A
[13502.833003] usb 1-1.3: new high-speed USB device number 6 using musb-hdrc
[13502.936174] usb 1-1.3: New USB device found, idVendor=045e, idProduct=02ae
[13502.936226] usb 1-1.3: New USB device strings: Mfr=2, Product=1, SerialNumber=3
[13502.936265] usb 1-1.3: Product: Xbox NUI Camera
[13502.936294] usb 1-1.3: Manufacturer: Microsoft
[13502.936321] usb 1-1.3: SerialNumber: A00367914376042A

My first thought was to check udev rules. SensorKinect installs its own rules at /etc/udev/rules.d/55-primesense-usb.rules. I noticed that these effectively duplicated the rules I’d set up manually following Dave’s instructions, so I removed the manually installed ones. I also noticed that the SensorKinect rules included the settings OWNER="xxx", GROUP="users". I tried changing OWNER to "root" (as that’s how this BeagleBone is going to be run). I also replaced all instances of the deprecated SYSFS with ATTR. Then I ran:

/etc/init.d/udev restart
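For the record, the edits amount to a couple of substitutions. Sketched here with sed against a scratch copy (the rule line below is illustrative, not the exact PrimeSense rule):

```shell
tmp=$(mktemp -d)
printf '%s\n' 'SUBSYSTEM=="usb", SYSFS{idVendor}=="045e", OWNER="xxx", GROUP="users"' \
  > "$tmp/55-primesense-usb.rules"
# Swap the deprecated SYSFS{} match key for ATTR{} and set OWNER to root.
sed -i -e 's/SYSFS{/ATTR{/g' -e 's/OWNER="[^"]*"/OWNER="root"/' \
  "$tmp/55-primesense-usb.rules"
cat "$tmp/55-primesense-usb.rules"
```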

Then I tried editing /usr/etc/primesense/GlobalDefaultsKinect.ini to uncomment the line:

UsbInterface=2
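The uncomment step itself is trivial; sketched here with sed against a scratch copy (I’m assuming ‘;’ is the comment character in that .ini - check your file):

```shell
tmp=$(mktemp -d)
printf ';UsbInterface=2\n' > "$tmp/GlobalDefaultsKinect.ini"
# Strip the leading comment marker from the UsbInterface line.
sed -i 's/^;\(UsbInterface=\)/\1/' "$tmp/GlobalDefaultsKinect.ini"
cat "$tmp/GlobalDefaultsKinect.ini"
```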

On the Arm platform, UsbInterface defaults to 2, so I thought I’d try it with a value of 1 instead, because I’d read of that working with Android on various Arm-based devices. I then tried running the sample program Sample-NiSimpleRead.

It worked!

Then it stopped working. I Ctrl-C’d out of the program after which any attempt to re-run it gave the message the device wasn’t connected. Unplugging and reconnecting didn’t help. I ran lsusb and just got error messages (invalid pointer). Restarting udev didn’t help and subsequent attempts to run the sample program resulted in a segfault. Soooo….

Rebooted the BeagleBone and it worked several times. Then any further attempts to run the sample program returned the repeated message:

UpdateData failed: A timeout has occurred when waiting for new data!

So I’m going to have to look into that.

Another problem: the Kinect isn’t recognised until I unplug it & plug it back in again. Obviously I don’t want to have to do that on the robot, so it would be useful if I could get the system to see it programmatically.

Also, as of this writing, I’m running the BeagleBone headless, so I can’t see the visual output. However, my DVI cape should arrive tomorrow.

Next steps - Intel laptop

A lot of the Nite stuff had to be altered because of the different version. (Actually, this applied to the BeagleBone installation too - mainly just a problem with some dirs being prefixed with ‘Arm-‘).

I hit the error:

Xlib: extension “GLX” missing on display “:0”

This seems to be a freeglut problem - OpenGL GLX extension not supported by display “:0”

I removed Nvidia drivers (I think this involved apt-get remove nvidia-common or some such, but this is the one step I forgot to document). Then I got a USB problem. Was getting the message:

InitFromXml failed: Failed to set USB interface!

So did:

rmmod gspca_kinect

But to make this more permanent, I cd’d to /lib/modules/<kernel-version>/kernel/drivers/media/video/gspca/

And renamed gspca_kinect.ko to gspca_kinect.ko.BAK. Then I read that this isn’t enough, so edited /etc/modprobe.d/blacklist.conf and added the line:

blacklist gspca_kinect
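If you script this step, an idempotent append avoids stacking duplicate blacklist lines every time it runs. A sketch against a scratch file:

```shell
tmp=$(mktemp -d)
touch "$tmp/blacklist.conf"
add_blacklist() {
  # Append only if the exact line isn't already present.
  grep -qx "blacklist $1" "$tmp/blacklist.conf" || \
    echo "blacklist $1" >> "$tmp/blacklist.conf"
}
add_blacklist gspca_kinect
add_blacklist gspca_kinect   # second call is a no-op
cat "$tmp/blacklist.conf"
```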

That got everything working on the laptop.

Stepping up to the BeagleBone

So far, my experiments with robotics have been Arduino-powered. The Arduino is a superb piece of kit - inexpensive (I now have five), easy to program and (so far) relatively hard to break.

But it is still just a microcontroller board. I’ve been using the Uno, built around the ATmega328, and the Mega 2560, which uses (surprise, surprise) the ATmega2560.

It’s amazing how much you can do with them. The Uno boasts 32KB of Flash memory for programs and 2KB of RAM. That’s not dissimilar to the BBC Micro I had in 1982. The Mega boosts that to a heady 256KB Flash and 8KB RAM. For the stuff I’ve been doing so far - reading sensors, controlling motors and some rudimentary intelligence - that’s been fine. I’ve not run out of space on a Mega yet.

But…

I really want to do some more advanced things, like vision and especially using a Kinect. An Arduino simply isn’t going to hack it.

I briefly considered a Bilibot, which employs an onboard Intel-powered laptop. But the price was prohibitive and I’m not hugely impressed with the iRobot Create as a platform. (It will be interesting to see what they come up with for v2.)

So I started looking at Arm-based platforms. I like the looks of the BeagleBoard-XM and the PandaBoard - both fine and highly capable devices. In the end, though, I went for the newer - and cheaper - BeagleBone.

The ‘Bone is superficially similar to the Arduino in the way it breaks out all those pins, and comes in a handily small form factor. But there are huge differences - not least the fact that the ‘Bone is capable of running a full operating system (with GUI, if you want). That also means you don’t get the integrated IDE of the Arduino. The makers are heading in that direction, with the bundled Node.js running on Angstrom Linux. But my feeling is that, if you’re going to take full advantage of the device, you’ll want to get away from that approach.

So, I’m at the bottom of the learning curve with the ‘Bone, which I’m really enjoying. The next bunch of posts will be all about this particular adventure, although I’ll still find time to talk about a couple of remaining Arduino-based projects I have in train right now.

In fact, my plan is to use both Arduinos and BeagleBones working together, which is why I will shortly blog about how I got them to talk to each other.