The Northbelt

Recently I read this old WIRED article linked on a discussion forum.

It kind of blew my mind. For a while in 2004, a man in Germany wore a belt which continuously gave tactile feedback indicating which direction was north, and he developed a sixth sense of direction.

"On a visit to Hamburg, about 100 miles away, he noticed that he was conscious of the direction of his hometown. Wächter felt the vibration in his dreams, moving around his waist, just like when he was awake."

So obviously, I had to have a magical north-pointing belt myself. There are a bunch of project pages across the web from folks who have made their own, but they were all pretty old and bulky, so I’ve started a little project to build one using slightly more up-to-date parts. It’s currently a work in progress, but here’s a picture of the breadboard prototype.
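The core logic of such a belt is simple: read a compass heading, work out which of the vibration motors around the waist currently faces closest to north, and buzz it. Here's a minimal sketch of that mapping - the motor count and the clockwise-from-front orientation convention are my assumptions, not details from the article:

```python
def north_motor(heading_deg, num_motors=8):
    """Return the index of the motor closest to north.

    heading_deg: compass heading of the wearer (0 = facing north).
    Motor 0 sits at the front of the belt; indices increase clockwise.
    """
    # The motor pointing north sits at (360 - heading) degrees
    # clockwise from the wearer's front.
    angle = (360.0 - heading_deg) % 360.0
    sector = 360.0 / num_motors
    return int((angle + sector / 2) // sector) % num_motors
```

Facing north buzzes the front motor; turn to face east and the buzz walks around to the motor on your left hip.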

You can follow along with my progress in this github repo:

Fitbit for Bonsai Trees

Things have been kind of busy lately, but I’ve been working on-again / off-again on a little project to wirelessly track the temperature, light level and soil moisture of my bonsai tree, so I can monitor its progress and know when to water (and more importantly, when not to).

I’m using a Pololu Wixel as the brains of this device. The picture above is of the first prototype, built out on a permaproto board with a huge enclosure and a wired connection to the sensors. This prompted my previous blog post on the Wixel in general and how to get it to sleep in low-power modes to maximize battery life. I also experimented with a solar panel for power, which worked nicely in direct sunlight but would need to be augmented with a big capacitor and an energy-harvesting setup to cope with San Francisco’s cloudy days (and nights!). So I decided to go in a different direction: operate with a low power draw, in a small package, from a single AA battery, and eventually swap out the Wixel for a CC2511 directly on the device itself.
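To get a feel for whether a single AA is viable, a back-of-the-envelope battery-life estimate helps. The numbers below (AA capacity, sleep and active currents, duty cycle) are illustrative assumptions of mine, not measurements from the project:

```python
def battery_life_days(capacity_mah, sleep_ma, active_ma, active_s_per_hour):
    """Estimate battery life from a simple sleep/wake duty cycle."""
    # Average current is the time-weighted mix of sleep and active draw.
    active_fraction = active_s_per_hour / 3600.0
    avg_ma = active_ma * active_fraction + sleep_ma * (1 - active_fraction)
    return capacity_mah / avg_ma / 24.0

# e.g. a 2000 mAh AA, 5 uA sleep, 25 mA with the radio on,
# waking for a total of 120 s per hour:
print(round(battery_life_days(2000, 0.005, 25, 120)))  # roughly 99 days
```

The takeaway is that the sleep current and duty cycle dominate: stay asleep almost all the time and a single AA lasts months.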

This past weekend I finally got time to work on the project again and managed to lay out a schematic and PCB for prototype number 2, which is essentially a Wixel shield that gets the power supply, board and sensors down into a single small package. I used Sketchup to visualize the 3D layout of components in the little package and to design a case which I’ll be 3D printing this week.

Read on →


I’m working on a project (more soon!) which needs to sample a few sensors and send data wirelessly at low power. After testing out an Arduino Uno + Bluetooth shield (too flaky, too bulky) and deciding that all the BLE solutions out there are not quite ready yet (though some look promising), I discovered the Pololu Wixel. I bought mine from SparkFun at $20 each - you’ll need at least two, since the RF protocol is proprietary.

The Wixel is a tiny dev board based on the TI CC2511F32 system-on-chip. The CC2511 supports wireless comms via a proprietary 2.4 GHz RF protocol and includes an 8051 CPU core and built-in USB. The CC2511F32 is really cheap, and Pololu have done a nice job of packaging it on a 1.5 x 0.75 inch PCB which includes a USB mini connector and RF antenna. They have also created an easy-to-use SDK for developing apps in C and flashing the board.

Read on →

Introducing PiUi: Add a Simple Mobile Phone UI to Your Raspberry Pi Project

I’m excited to introduce you to a project I have been working on for a few weeks in my spare time: PiUi.

A lot of folks asked how to use my RPi Timelapse Controller without the LCD Plate - which is kind of expensive, and not everyone is comfortable soldering one up themselves. The answer, of course, is that it’s possible, but without a UI you’re limited to having the controller run on boot, and it’s difficult to know everything’s working correctly and/or to take control when you know better.

The same is true of many hardware projects, and an HDMI monitor + keyboard is not a feasible method of interaction away from your desk. Wouldn’t it be great if you could add a UI, on a device you already have in your pocket, to any Raspberry Pi project?

PiUi makes it easy to implement a rich mobile UI directly in Python code and access it from your Android phone or iPhone. It’s powered by ratchet.js, so there are lots of UI components available for creating beautiful interfaces.

All you need in addition to a Raspberry Pi is a wifi adaptor (like this one from Adafruit). Your Pi will create a wifi access point to connect your phone to, then simply navigate to http://piui/ in a browser to access your app’s UI. There’s even an Android app to make connecting easy and show useful status info plus an iPhone webapp you can save to your homescreen.

Once the access point is set up (easy with pre-prepared SD card images), installing PiUi takes a single command:


sudo pip install piui



PiUi is open source - fork it on GitHub - and is just getting started, so please use it, let me know what you think, and help improve it.

For detailed setup instructions, read on.

Here’s a little demo of the Timelapse project with a PiUi interface (source at

Read on →

Raspberry Pi Website Hit Tracker

You just made a funky neon sign flash in my living room.

How? I have just completed my latest project: a neon lamp which lights up every time someone visits my website. It’s controlled by a little relay board I built out on an Adafruit permaproto board and connected to a Raspberry Pi. The Pi runs a simple Python script which generates an event every time someone loads this page. I’ve made the part which integrates with the website open for anyone to use, so you can build this out for yourself - have fun!
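The heart of the script is a simple loop: poll a hit counter and pulse the relay once for each new hit. Here's a sketch of that logic with the counter source and the GPIO call stubbed out so it runs anywhere - the function names are mine, not the project's, and the real script would sleep between polls:

```python
def watch_hits(get_hit_count, pulse_relay, polls):
    """Pulse the relay once per new hit, polling `polls` times."""
    seen = get_hit_count()
    for _ in range(polls):
        current = get_hit_count()
        # Fire once for every hit that arrived since the last poll.
        for _ in range(current - seen):
            pulse_relay()  # on real hardware: toggle GPIO pin 18
        seen = current
```

On the Pi, `pulse_relay` would switch the GPIO pin high and low with a short delay, and `get_hit_count` would come from whatever the website integration reports.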

Building the Relay Board

The relay board connects to the Raspberry Pi General Purpose I/O (GPIO) pins via a ribbon cable. Adafruit sells a Pi Cobbler which makes it easy to break out the GPIO pins on a breadboard for prototyping. Once you’re happy that the prototype is working, transferring a breadboard layout to the permaproto board is quite simple (the soldering is easy and you just copy what you had on the breadboard).

Here is the circuit diagram for the relay board - we drive the relay coil from the 5V supply, switched on and off by a transistor controlled by one of the digital out pins (I used pin 18). The diode prevents the reverse voltage generated as the relay switches off from damaging the Pi.

You can see step-by-step instructions on how to assemble the relay board on this spark.

Spark: Raspberry Pi relay board.

Read on →

Raspberry Pi Timelapse Controller

A few weeks ago, I found this beautiful video on YouTube – a timelapse video of stars and the Milky Way. Seeing the stars appear to rotate overhead (due to the rotation of the Earth) and the intricate structure of our own galaxy gave me a profound feeling of the scale of the universe that we move through on spaceship Earth. Of course, I wanted to record my own Milky Way timelapse.

Capturing the Milky Way requires dark skies and long exposures, so this seemed like a great project to build using my fairly old Canon EOS 350D and a Raspberry Pi. I also spent some time exploring what existing timelapse controllers can do. The holy grail of timelapse is to capture sunset (and sunrise) seamlessly, where a wide range of shutter speeds is needed to capture an appealing scene as the ambient light level changes profoundly. You can see at the end of the Milky Way video I linked above that sunrise is not handled so well! There are a number of scripts which can be run in-camera with homebrew firmware (e.g. CHDK), but these cannot choose the best shutter speed based on the images taken - they have to guess the best values once there is too little light for the camera's light meter to judge. Since we can run fully featured image processing software like ImageMagick on the Linux-based Pi, I decided to build a controller which could capture sunset.
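The advantage of running on the Pi is feedback: measure the brightness of each captured frame (e.g. with ImageMagick) and steer the next exposure toward a target. Here's a simplified sketch of that idea - the target brightness, deadband and step size are illustrative values of mine, not the controller's actual settings:

```python
def next_shutter(current_s, mean_brightness, target=0.45,
                 deadband=0.05, step=2 ** (1 / 3)):
    """Pick the next shutter speed from the last frame's brightness.

    mean_brightness is in [0, 1], e.g. the image mean reported by
    ImageMagick. step is one third of a photographic stop.
    """
    if mean_brightness < target - deadband:
        return current_s * step   # too dark: expose longer
    if mean_brightness > target + deadband:
        return current_s / step   # too bright: expose shorter
    return current_s              # close enough: hold
```

Ramping in small fractional stops keeps the brightness change between frames smooth, which is exactly what in-camera scripts can't do once the light meter gives up.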

I also recently got hold of an Adafruit LCD Plate for my Pi so I’ve added a User Interface too.

I haven’t yet been able to make the Milky Way timelapse which is my end goal - I hope to in the coming weeks, next time it’s dark and clear and I’m at Lake Tahoe - but the controller is working nicely.

Read on to find full instructions, some demo videos and the software so you can try it yourself.

At the top of this post you can see the set up and a demo video.

Read on →

Raspberry Pi Webcam: a Gentle Intro to Crontab

Here’s a quick and easy first project for new Raspberry Pi owners - turn your Pi into a webcam, and learn about Linux’s ability to run repeated tasks at scheduled intervals with the cron utility.

These instructions work with Adafruit’s Occidentalis distribution for the Raspberry Pi. They likely also work with any version of the Raspbian distro, but I highly recommend Occidentalis if you’d like to do more hardware hacking with your Pi. Adafruit have good instructions on how to get started and install it on an SD card.

You will need to set up a wired or wireless internet connection to your Pi.

Choose a webcam

If you have a USB webcam lying around the house, it’s very likely that it will work just fine. If not, I used the Logitech Pro 9000 successfully, and a full compatibility list is available to check before you buy one.

Install fswebcam

fswebcam is a small and simple webcam app for *nix. Install it by issuing the following command on your Pi:

sudo apt-get install fswebcam
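Cron is what turns a one-off snapshot into a webcam. As a taste of what's covered in the full post, a crontab entry (added with `crontab -e`) might look like the following - the resolution and image path are hypothetical examples:

```
# min hour day-of-month month day-of-week  command
# capture a frame every 5 minutes:
*/5 * * * * fswebcam -r 640x480 /home/pi/webcam/latest.jpg
```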

Read on →

Living in the Future

Paul Graham’s latest essay - How to get startup ideas - is a great read.

I was struck by the Buchheit/Pirsig conjecture:

“Live in the future, then build what’s missing.”

and the following paragraph regarding ideas that come out of folks’ experience at college.

pg encourages those readers who are still studying to take classes unrelated to their CS major so that they may see more problems worth solving. In contrast, what struck me about this paragraph was how much, for me, college was like living in the future. In the late nineties, I lived in an environment where every single member of my social circle had an always-on 10 Mbit connection to the Internet and spent inordinate amounts of time communicating via email, IM, etc. It seems like no coincidence that so many successful Internet companies were born out of students of that era. I doubt that today’s students encounter much of the future at all in their dorm rooms. Perhaps universities should work hard to make campus living more like living in the future, rather than setting up mobile app development courses, incubators, and so on.

Parsing Huge XML Files With Go

I’ve recently been messing around with the XML dumps of Wikipedia. These are pretty huge XML files - for instance, the most recent revision is 36G when uncompressed. That’s a lot of XML! I’ve been experimenting with a few different languages and parsers for my task (which also happens to involve some non-trivial processing for each article) and found Go to be a great fit.

Go has a standard library package for parsing XML (encoding/xml) which is very convenient to code against. However, the simple version of the API requires parsing the whole document in one go, which for 36G is not a viable strategy. The parser can also be used in a streaming mode, but I found the documentation terse and online examples non-existent, so here is my example code for parsing Wikipedia with encoding/xml, and a little explanation!

(full example code at

Here’s a little snippet of an example Wikipedia page from the dump:

// <page> 
//     <title>Apollo 11</title> 
//      <redirect title="Foo bar" /> 
//     ... 
//     <revision> 
//     ... 
//       <text xml:space="preserve"> 
//       {{Infobox Space mission 
//       |mission_name=&lt;!--See above--&gt; 
//       |insignia=Apollo_11_insignia.png 
//     ... 
//       </text> 
//     </revision> 
// </page>

In our Go code, we define structs to match the <page> element and its nested <redirect> element, and grab a couple of fields we’re interested in (<text> and <title>).

type Redirect struct {
    Title string `xml:"title,attr"`
}

type Page struct {
    Title string   `xml:"title"`
    Redir Redirect `xml:"redirect"`
    Text  string   `xml:"revision>text"`
}

Now we would usually tell the parser that a Wikipedia dump contains a bunch of <page>s and try to read the whole thing, but let’s see how we can stream it instead.

It’s quite simple when you know how - iterate over tokens in the file until you encounter a StartElement with the name “page” and then use the magic decoder.DecodeElement API to unmarshal the whole following page into an object of the Page type defined above. Cool!

decoder := xml.NewDecoder(xmlFile) // xmlFile is an io.Reader, e.g. from os.Open

for {
    // Read tokens from the XML document in a stream.
    t, _ := decoder.Token()
    if t == nil {
        // End of the document.
        break
    }
    // Inspect the type of the token just read.
    switch se := t.(type) {
    case xml.StartElement:
        // If we just read a StartElement token
        // ...and its name is "page"
        if se.Name.Local == "page" {
            var p Page
            // decode a whole chunk of following XML into the
            // variable p which is a Page (see above)
            decoder.DecodeElement(&p, &se)
            // Do some stuff with the page.
            p.Title = CanonicalizeTitle(p.Title)
        }
    }
}

I hope this saves you some time if you need to parse a huge XML file yourself.