Tweet confirming Kickstarter backing
As close as you're going to get to an *unboxing* here

Back in November 2013, nursing a broken hand and in a period of “meaningful work” with the LA, I decided to treat myself, but this would be no ordinary treat: this would be a future treat. I decided to back the PowerUp 3.0 through its Kickstarter campaign. I’d already been very happy with The Vamp, which I’d also funded, so I wasn’t unfamiliar with the Kickstarter process.

Well, it arrived this week, Tuesday to be precise, but as I’ve been busy with other things I’ve left it until now to open the box.

Two guesses what I’ll be doing tomorrow? ☺

Strictly more Pi : part 2

Google+ Pano of Mark and Ashleigh performing
A Google+ Pano of Mark & Ashleigh's performance.

Mark and Ashleigh danced three times for us: a Viennese Waltz, a Tango and something Richard Claydermanesque. I only know the first from the movies, the second from Gotan Project’s “La Revancha del Tango”, while the third seems inexplicably and inescapably linked to the ’80s melodrama “The Thorn Birds” starring Richard Chamberlain. Go figure, and here you thought I was going all “high brow.”

The “Dance, sensor, camera thingymebob Pi™” was a one-shot deal: the scripts started on boot and ran until I manually stopped them by connecting an ethernet cable, ssh’ing back into the Pi and running sudo /etc/init.d/dance90fps stop and sudo /etc/init.d/xloborgdata stop. Not very elegant at all, but it would do for a first run.
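
For the curious, here’s a minimal sketch of what one of those init scripts might have looked like; the script path and the killall approach are illustrative rather than the exact code I used:

#!/bin/sh
# /etc/init.d/dance90fps (sketch)
# start the 90fps capture script at boot; stop by killing the recording process
case "$1" in
  start)
    /home/pi/90frames.sh &
    ;;
  stop)
    killall raspivid
    ;;
  *)
    echo "Usage: /etc/init.d/dance90fps {start|stop}"
    exit 1
    ;;
esac
exit 0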

The Viennese Waltz

As seen from Claire’s iPad

As seen from Mark’s Google Glass

The Tango

As seen from Claire’s iPad

As seen from Ashleigh’s “Dance, sensor, camera thingymebob Pi™”

The Final Dance

The Final Dance as seen from Claire’s iPad

The Final Dance as seen from Mark’s Google Glass

The Final Dance as seen from Ashleigh’s “Dance, sensor, camera thingymebob Pi™”

The thing to remember is that Ashleigh’s headcam was recording at 90fps and even then it struggled to capture the speed of her motion as they performed. In the next post (i.e. when I work out how to do it) I’ll post the data captured from the XLoBorg and complete the triptych.

Reflections aka Things I wish I’d thought of at the start

My kit bag would have included the following:

  1. a pen
  2. a notepad
  3. a stopwatch
  4. a clapperboard

Finding the start and end points of each performance in the 1.2GB 90fps file turned out to be rather more time-consuming than I’d thought, and a basic note of start/stop times would have made the last few days a lot easier.

It would also be very useful to have working video-editing software, because Linux. Pitivi did its usual trick of pretending to start and attempting to import the raw footage, only to vanish with a “What’d you expect?”. Kdenlive pulled in a monster truck of dependencies and took an absolute age to render. These are both really cool open source projects, but they’re not there yet.

So in the end my editing workflow looked something like this.

Handwritten scrawl of timing details
How not to edit.

followed by commands like

ffmpeg -ss 0 -t 00:04:32 -i ashleigh_clayderman.mp4 ashleigh_clayderman_2.mp4

Which is the polar opposite of non-destructive editing, but “you live and learn”.
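
One trick worth noting for next time (assuming I’ve read the ffmpeg docs correctly): adding -c copy turns the trim into a stream copy, so nothing is re-encoded and the cut is near-instant, at the price of the cut snapping to the nearest keyframe. The output filename here is just illustrative.

ffmpeg -ss 0 -t 00:04:32 -i ashleigh_clayderman.mp4 -c copy ashleigh_clayderman_copy.mp4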

Strictly more Pi : part 1, redux

The Raspberry Pi Head Cam

does my hair look big in this?

While you ponder the magnificence of the image above, let me explain how it was that I came to be sitting in my office wearing a Raspberry Pi camera sewn onto one of Claire’s Accessories’ finest headbands. That I’m posting it at all should answer the question that first popped into your head, “Has he no shame?”, to which the answer is a resounding no.

It all started in Starbucks on Street Lane, as all things Claire Garside are wont to do. You can read Claire’s motivations and the thinking behind the project in her blog post. I can’t remember exactly how the conversation went, probably as I was still hyperventilating having, to paraphrase Withnail, “gone cycling by mistake”. Anyway, by the end of it I’d left with one of these:

XLoBorg

XLoBorg

XLoBorg is a motion and direction sensor for the Raspberry Pi. The plan was that we, and I say we with the hyperventilating caveat still fresh in your minds, would learn to dance using some of the ideas from Tim Ferriss’s The 4-Hour Chef, in particular exploring his ideas around Meta-Learning.

Meta-learning: Deconstruction, Selection, Sequencing and Stakes; Compression, Frequency and Encoding
© Tim Ferriss.

So how did I end up with the rather fetching headband, I hear you ask? It all started when Claire mentioned that the Ten Centre had a Google Glass; at that moment the project just expanded to incorporate Glass.

First Principles

Dance as defined by a simple Google Search
Algorithm as defined by a simple Google Search
dance and algorithms.

Nothing new there; indeed, Mark Dorling has an excellent resource, Get with the ‘algor-rhytm’, on his digitalschoolhouse.org.uk site.

We wanted to find out:

  1. what a dance looked like from the dancer’s point of view.
  2. what a dance felt like from the dancer’s point of view.

The Google Glass would give us one POV and a Raspberry Pi headcam could give us the other, while the same Raspberry Pi with the XLoBorg would give us a record of the motion and direction, the G-force exerted on the dancer. Thus, from humble beginnings, I give you the “Dance, sensor, camera thingymebob Pi™”.

Dance, sensor, camera thingymebob Pi
Needles, thread, a Pi Camera and a headband
The finished headband
My sewing skills haven't improved.

Getting the thingymebob™ working

Pi Camera

As I’d done some time-lapse work with the Pi Camera before, the initial plan was to capture a series of pictures from the camera on the headband and make a time-lapse video out of the resulting stills. So testing was required, hence the image at the start of this post. Testing pointed out the first real problem with the endeavour: the shutter speed was too slow and the images were blurry, oh and they were 90º off.

So the first solution involved changing the exposure mode of the camera to sports, which would force a faster shutter speed, and adding -rot 90 to rotate the resulting image.

Modifications made, I ended up with dance_capture.sh:

#!/bin/bash
# script to take time-lapse images of a dance using a head mounted raspberry pi camera (don't ask) rotated at 90 degrees
# version 1.2 RBM 28/07/14 amended for dance capture, head band

# This script will run at startup and run for a set time taking images every x seconds
# it is based on the scripts found at
# https://github.com/raspberrypi/userland/issues/84
# and http://www.stuffaboutcode.com/2013/05/time-lapse-video-with-raspberry-pi.html

# the filenames of the images are built from the current date and time
# the output directory will be /home/pi/dance_pics

filename=$(date +"%d%m_%H%M%S")_%04d.jpg
outdir=/home/pi/dance_pics
length=1800000 # 30 minutes, in milliseconds
rate=1000 # 1000 milliseconds = 1 second

# sports exposure mode for a faster shutter, rotated 90 degrees to match the headband mounting
/opt/vc/bin/raspistill -ex sports -rot 90 -o $outdir/$filename -tl $rate -t $length &

Thanks to Martin O’Hanlon’s excellent <Stuff about=”code” /> blog for the scripts.

The images were still too blurred, so a Plan B was required.

Raspivid

Enter the 90fps mode for the Raspberry Pi camera; you can read about it on the Raspberry Pi Foundation blog.

dance_capture.sh became the slightly less documented 90frames.sh with a sleep to give the dancer time to get into position before recording.

#! /bin/sh
# script to capture VGA video at 90fps for the dance project
# give the dancer 30 seconds to get into position
sleep 30
# record 15 minutes (900000 ms) of 640x480 video at 90fps, rotated to match the headband
raspivid -rot 90 -w 640 -h 480 -fps 90 -t 900000 -o /home/pi/dance_pics/ninetyfps_dance.h264 &

As the blog explains, the 90fps mode is limited to 640x480, which is more than enough for our little experiment.
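
One wrinkle worth noting: raspivid writes a raw .h264 stream, which most players and editors won’t open directly. Wrapping it in an MP4 container with a stream copy sorts that out; something like the following should do it, the 90 matching the capture frame rate:

ffmpeg -framerate 90 -i ninetyfps_dance.h264 -c copy ninetyfps_dance.mp4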

XLoBorg

Hack is the only word I can use to describe what I did to xloborg.py, which came from the PiBorg examples. This snippet is my only alteration, and proper programmers will be able to spot why it took me an age to find xloborg_results.

# Auto-run code if this script is loaded directly
if __name__ == '__main__':
    # Load additional libraries
    import time
    # open a file to write the results to
    # (note the relative path: the file lands in whatever directory the script was started from)
    f = open('xloborg_results', 'w')
    # Start the XLoBorg module (sets up devices)
    Init()
    try:
        # Loop indefinitely
        while True:
            # Read the accelerometer, compass and temperature
            x, y, z = ReadAccelerometer()
            mx, my, mz = ReadCompassRaw()
            temp = ReadTemperature()
            everything = (x, y, z, mx, my, mz, temp)
            everythingString = str(everything)
            # print 'X = %+01.4f G, Y = %+01.4f G, Z = %+01.4f G, mX = %+06d, mY = %+06d, mZ = %+06d, T = %+03d°C' % (x, y, z, mx, my, mz, temp)
            f.write(everythingString)
            time.sleep(0.1)
    except KeyboardInterrupt:
        # User aborted
        pass
    finally:
        # make sure any buffered readings actually reach the file
        f.close()

xloborg_results did give me a 1.1MB file full of readings.

(-1.0625, -0.421875, 0.15625, 936, 85, -85, 3)(-0.828125, -0.265625, 0.265625, 930, 86, -84, 3)(-0.96875, -0.078125, 0.328125, 931, 67, -68, 3)(-1.109375, -0.046875, 0.484375, 950, 35, -61, 4)(-0.859375, -0.15625, -0.171875, 948, 23, -50, 4)(-1.046875, -0.390625, 0.4375, 942, 23, -42, 3) and so on....

which correspond to:

X, Y, Z, mX, mY, mZ, T in the Python snippet above, so for the moment I’m happy that it worked. The next challenge is to represent this data in a meaningful way, so I’ll be looking at gnuplot to do that.
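
As a first stab, and this is a sketch rather than a finished workflow (the xloborg.dat and accel.png file names are just mine), the back-to-back tuples can be split onto one reading per line and stripped of their brackets and commas, which gives gnuplot something it can chew on:

# split the concatenated tuples onto separate lines, drop the brackets and commas
sed 's/)(/\n/g' xloborg_results | tr -d '(),' > xloborg.dat
# plot the three accelerometer columns against sample number
gnuplot -e "set terminal png; set output 'accel.png'; \
  plot 'xloborg.dat' using 1 with lines title 'X', \
       '' using 2 with lines title 'Y', \
       '' using 3 with lines title 'Z'"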

tbc.

This year’s fashion must-haves

Mark and Ashleigh
Ashleigh and Mark make it look so effortless; I still think the thingymebob™ looks better on me 😏

A teaser post until tomorrow, and yes, Mark is wearing Google Glass, but wait till you see what Ashleigh’s wearing.

Mark and Ashleigh from @northleedsdance
Playing with Google Glass : part 2

Wyke Beck Cycle Route
Wyke Beck Way.

So I went out for a quick bike ride, just to test out Glass. It was either that or annoy my family all evening by enunciating at regular intervals “ok glass. Take a photo” or “ok glass. Record a video”, so I figured I’d do that in public on a bike, as you do.

Wyke Beck Way

Wyke Beck Way

I trundled off along the Wyke Beck Way talking to myself/Glass as I went. My fears of the Glass slipping off and being crushed under my wheels proved unfounded; they fit rather snugly under my helmet. I had some fun taking a few photos: “ok glass. Take a photo”

Chasing Rabbits
Sunset at Roundhay Lake
Harehill Cemetery
Photos taken on the move.

I had some fun sending messages to my wife, “ok glass. Send a message to”. The voice recognition worked pretty well considering my huffing and puffing, but you can judge for yourself.

Send a message to, Google Glass command
Voice recognition in action.

Action Sequence

No write-up about Google Glass would be complete without the obligatory “ok glass. Record a video” action sequence, so without further ado I give you the imaginatively titled:

Cycling with Google Glass