HackspaceHat part 1: WebRTC, Janus and Gstreamer

Barney and I have been working on a “HackspaceHat” – a telepresence hat so you can show people around Hackspaces. The idea is this: someone in the hackspace puts on the hat. This turns it on, and triggers a message (to twitter?) that the hat is available. Then someone can join via a web browser, see and hear what’s happening as the wearer moves around the space, and send voice requests about what direction to go in.

We were thinking of using WebRTC for it, as there’s quite a lot of browser support, which makes the “user” end of the hat quite straightforward. However, we’ve not yet found a great solution for the “hat” end.

Here’s where we’ve got to, and rough instructions to get as far as we have. I’ll do a step by step guide when we’ve got further.

Where we are:

  • happily streaming audio and video, with low latency, from a Pi 2 to Firefox on a separate machine on the same local network, by accessing a web server running on the Pi

Where we want to be:

  • sending audio the other way
  • hosting the application on a remote server and accessing it via a different network

Instructions

Install Janus gateway

I followed these instructions. I’ve made a few slight alterations to the gstreamer installation, and added audio.

Install Janus gateway as in those instructions:

$ sudo aptitude install libmicrohttpd-dev libjansson-dev libnice-dev libssl-dev libsrtp-dev libsofia-sip-ua-dev libglib2.0-dev libopus-dev libogg-dev libini-config-dev libcollection-dev pkg-config gengetopt libtool automake dh-autoreconf
$ git clone https://github.com/meetecho/janus-gateway.git
$ cd janus-gateway
$ sh autogen.sh
$ ./configure --disable-websockets --disable-data-channels --disable-rabbitmq --disable-docs --prefix=/opt/janus
$ make
$ sudo make install
$ sudo make configs
$ sudo nano /opt/janus/etc/janus/janus.plugin.streaming.cfg

Comment out (or delete) the other example streams and add this one (slightly modified from the tutorial to enable audio):

[gst-rpwc]
type = rtp
id = 1
description = RPWC H264 test streaming
audio = yes
audioport = 8005
audiopt = 10
audiortpmap = opus/48000/2
video = yes
videoport = 8004
videopt = 96
videortpmap = H264/90000
videofmtp = profile-level-id=42e028\;packetization-mode=1
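
One thing the snippet above doesn’t show: Janus itself needs to be running before any of the streaming will work. With the --prefix used in the configure step, the binary should end up in /opt/janus/bin, so (in another terminal, or backgrounded):

$ /opt/janus/bin/janus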

Install gstreamer

The instructions are slightly out of date – you need to do this:

sudo apt-get install libgstreamer0.10-0 libgstreamer0.10-dev gstreamer0.10-tools gstreamer0.10-plugins-base libgstreamer-plugins-base0.10-dev gstreamer0.10-plugins-good gstreamer0.10-plugins-ugly gstreamer0.10-plugins-bad gstreamer0.10-ffmpeg

The gstreamer pipeline I used is this:

raspivid --verbose --nopreview -hf -vf --width 640 --height 480 --framerate 15 --bitrate 1000000 --profile baseline --timeout 0 -o - | gst-launch-0.10 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=8004 alsasrc device=plughw:Set ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=8005

– basically the Raspberry Pi camera piped into a gstreamer pipeline for the video, plus a second gstreamer pipeline capturing the audio from ALSA. You seem to be able to just concatenate the two pipelines into a single gst-launch command; they’re broken out separately below.
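
For clarity, here’s the same thing as two separate commands – this is just the pipeline above split in two, with the same ports and device names:

Video – H264 from the camera, packetised as RTP and sent to Janus’s video port:

raspivid --verbose --nopreview -hf -vf --width 640 --height 480 --framerate 15 --bitrate 1000000 --profile baseline --timeout 0 -o - | gst-launch-0.10 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=8004

Audio – captured from the USB sound card, encoded as Opus and sent to the audio port:

gst-launch-0.10 -v alsasrc device=plughw:Set ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=8005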

Note: I used a cheap little USB audio card for this – like this one. To find your device name (mine is “plughw:Set”) do:

cat /proc/asound/cards

and look for the USB one. You may need to do this too:

sudo nano /etc/modprobe.d/alsa-base.conf

and change the line

options snd-usb-audio index=-2

to

options snd-usb-audio index=0

and reboot (see this post for more useful audio commands for the Pi).

It’ll need a microphone in the mic slot.
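
Before involving gstreamer at all, it’s worth checking that the mic actually records something. Assuming alsa-utils is installed (sudo apt-get install alsa-utils), something like this should capture five seconds and play it back – substitute your own device name:

arecord -D plughw:Set -f cd -d 5 test.wav
aplay test.wav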

Install nginx

(Again following the instructions here)

$ sudo aptitude install nginx

Copy the Janus HTML content to the Nginx server root:

$ sudo cp -r /opt/janus/share/janus/demos/ /usr/share/nginx/www/

Start Nginx:

$ sudo service nginx start

Go to the Janus gateway demo page

(I’ve modified the streaming example a bit to understand it better – see this gist – so you could put that in /usr/share/nginx/www/ if you like.)

Then – assuming the Pi is on the same network – go to

http://<ip.of.pi>/demos/streamingtest.html

(or http://<ip.of.pi>/demos/janus-gateway-streamtest.html if you’re using my modified version) and you should see video and hear audio in the browser.

AWS new instance ssh timing out

In case this is any use to anyone else –

I’ve had AWS instances running for a few years. Today I went to create another one for something and infuriatingly, couldn’t connect to it over ssh at all: ssh just kept timing out.

I found a few links to do with security groups, but the default group created for me in the (much improved) wizard seemed to be fine for incoming ssh connections. I then found a bunch of links about iptables and the like, and after an extremely frustrating hour got it to work. Essentially (and without having done much research on this), AWS seems to have moved to a Virtual Private Cloud (VPC) architecture; existing users like me seem to have been moved over via an automatically created default VPC, but mine somehow didn’t allow incoming connections.
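
For what it’s worth, if you have the AWS CLI set up you can at least rule the security group in or out: check whether it really allows inbound TCP on port 22, and open it up if not (the group ID below is a placeholder):

$ aws ec2 describe-security-groups --group-ids sg-xxxxxxxx --query 'SecurityGroups[0].IpPermissions'
$ aws ec2 authorize-security-group-ingress --group-id sg-xxxxxxxx --protocol tcp --port 22 --cidr 0.0.0.0/0

In my case the group looked fine, so the problem was somewhere on the VPC side of things.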

I’m really not sure precisely why it worked, but it happened shortly after I created a new VPC and then created an instance using it. My VPC has DNS resolution, DNS hostnames and ClassicLink enabled.

So: vague, but maybe it’ll help someone.

Some links:

Ubuntu EC2 AMIs

Types of AWS instances

Amazon Linux AMI Instance Type Matrix “This table shows which flavors of the Amazon Linux AMI are recommended on each Amazon EC2 instance type.”

Instance Storage

Adding an EBS (storage)

EC2 instance pricing

Connecting to your instance using ssh.

Shonkbot

UPDATE: better instructions are linked from Shonkbot’s github page

On the journey home from Maker Faire UK, Matt Venn, Richard Sewell and Anton Bowers were plotting cheap robots: easy and inexpensive to build, but still able to do something interesting.

Richard has since made a shonky robot, designed to be made for as little money as possible. It writes its own name using a pen, and currently costs about £7 – we’re hoping to reduce that.

He donated a couple of stepper motors to me so I could try and make one myself. It was easy…esque. I don’t have a gluegun and that seems to have given it an extra level of shonkiness. But it’s a start. So here are some notes for later.

shonky1

shonky2

bot2

Step 1: Assemble the components

components

  • An Arduino Nano or clone
  • Some 20cm cable ties
  • A 3 x AA battery box like this one
  • An old CD
  • A little breadboard
  • 2 stepper motors (with driver boards) like these
  • Wheels to fit the motors
  • O-rings or elastic bands to fit the wheels
  • Glue (not UHU though; a glue gun works best, or failing that, some superglue)
  • 14 male-to-female jumper wires
  • 3 AA batteries
  • A Sharpie

You’ll also need a laptop and a mini (not micro) USB cable to connect the nano.

Step 2: Glue the wheels onto the CD

wheels

The wheels need to have some sort of grip on them. 50mm length rubber bands (measured like this) would work for these ones.

Step 3: Glue the supports onto the CD

Richard’s top tip: bend the cable tie a little about an inch from the latch. Mine: make the loop as big as you can.

loops

My pen-holder cable ties just wouldn’t stick (with superglue), so I tried something else (see later). The loops were ok though.

Step 4: Programme the Arduino and try the stepper motors powered from the USB

Here’s Richard’s code. You need this stepper motor library from here on github.

Connecting it up is easy:

  • board 1: wire arduino pin D2 to driver board IN1, D3 to IN2, D4 to IN3, D5 to IN4
  • board 2: wire arduino pin D6 to driver board IN1, D7 to IN2, D8 to IN3, D9 to IN4

Then you just add ground and 5V from the Arduino to the two driver boards, and connect the battery pack to the mini breadboard.

wiring

I had to get new drivers for the cheapo Arduino clone (CH340G chip) I was using.

On Mac Yosemite, I had to use the Arduino IDE 1.6 rather than the old 1.0 I had installed.
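
For reference, here’s a minimal test sketch for the wiring above. This isn’t Richard’s code – it assumes the AccelStepper library, a common choice for these ULN2003-style driver boards (not necessarily the one he used), and the usual IN1/IN3/IN2/IN4 pin ordering; if the motors buzz or judder rather than turn, try swapping the middle two pins in each constructor.

#include <AccelStepper.h>

// Half-step drive; pin order is IN1, IN3, IN2, IN4 for these driver boards.
AccelStepper left(AccelStepper::HALF4WIRE, 2, 4, 3, 5);   // board 1: D2-D5
AccelStepper right(AccelStepper::HALF4WIRE, 6, 8, 7, 9);  // board 2: D6-D9

void setup() {
  left.setMaxSpeed(1000);
  right.setMaxSpeed(1000);
  left.setSpeed(500);    // steps per second
  right.setSpeed(-500);  // opposite sign because the motors face opposite ways
}

void loop() {
  // Spin both wheels continuously -- enough to confirm the wiring works.
  left.runSpeed();
  right.runSpeed();
}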

Step 5: Glue or fasten the boards, breadboard and battery pack onto the top of the CD

Gluedots seem to work well. Or even rubber bands – they don’t need to be very firmly stuck. If you put the stepper boards at the front the LEDs will light up satisfyingly. The battery pack is relatively heavy so needs to be more or less centred to stop it falling backwards. Make sure you leave the CD hole free for the pen.

Step 6: Add the sharpie

final

Here I attached my Sharpie to two cable ties with a rubber band, pushed through from the underside of the CD, and then wrapped a bit of wire round to secure it – ideally the pen should stay as vertical as possible. Richard did it differently:

richard_bot

Step 7: Connect up the batteries and place on A3 paper

Place the robot about halfway along the short side of the paper, with its front facing towards the centre of the paper.

This is the best I’ve managed:

shonky_word

Mine writes a bit wonkily because its wheels are slightly too big, thanks to the velcro padding I needed to attach my too-big rubber bands – but that can be fixed with a parameter in the code.
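
(A hypothetical example – the actual variable name in Richard’s code will be different, but the idea is a constant along these lines:)

const float WHEEL_DIAMETER_MM = 63.0;  // hypothetical: nudge up or down until lines come out the right length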

Here’s what it ought to look like:

shonky_writing_better

Of course there’s lots of other things that could be done with it.

Guest blog: David Miller – A solar-powered glitter ball rotator

My Dad’s always had an interest in tinkering with electronics and the like. Recently he made an interesting thing so I asked him to write it up, and here it is:

A few weeks ago my wife and I were invited to join four friends for Sunday lunch. Our host has a south-facing dining room with a glitter ball sitting in the window. We were all entertained by the light beams gradually moving around the walls and ceiling. I suggested that a small solar cell and motor attached to the glitterball might improve the entertainment.

I bought these parts from Maplin soon after, at a cost of £3.60. I glued the motor and cell to a spare ruler, and glued a strong copper wire to the ruler to attach the whole thing to the ceiling. I have this running in our garden room.

SolarMotor

It rotates too fast and I need to add a resistor. Whether our friend will want this clutter with her glitter ball I have yet to find out. I await her next visit here before further changes.

Glitterball

Walls Have Eyes goes to the Design Museum

We initially made this as a post for a presentation at work, but it doesn’t seem quite right for a work blogpost (though we will do one of those too), and it seems a shame for it not to be public.

The context is this: a fairly quick hack Andrew Nicolaou, Jasmine Cox and I made for Mozfest got nominated for Designs of the Year 2015, and so we redesigned it so that it would last five months in place at the museum as part of their exhibition (we hoped; we’ve had some teething problems).

This was something that’s completely new to me and Andrew, though Jasmine is much more experienced at making these kinds of things.

This is the story of us setting it all up.

Jasmine took most of the photos.

As Andrew pointed out, it’s come out a bit like a Peter and Jane book.

We initially made Walls Have Eyes very quickly as part of Ian and Jasmine’s Ethical Dilemma Cafe.
WallsHaveEyes5

The combination of electronics in innocuous frames

WallsHaveEyes6


and an extremely noisy dot matrix printer

WallsHaveEyes2


and an updating HTML page of output from the cameras, meant that it got the message across quite well

mozfest_image


Then we were unexpectedly nominated for the Designs of the Year, which meant we had to build something that would last five months.


So we needed to redesign it a bit and improve the code

plan

It was going to be on a wall rather than in an ambient cafe environment, so it needed a trigger to make the experience more immediate, like this ultrasonic sensor
proximity_sensor


It needed wired networking rather than wifi for reliability, and we needed to test it intensively

in_kitchen


so Andrew and Libby rewrote the code (mostly Andrew).

code

Andrew designed and laser cut some beautiful glowing fittings for the frames

closeup_frame

Andrew, Dan and Libby tested it at QCon, including creating a ‘surveillance owl’ fitting for the sensor

qcon

and working through a load of issues

issues

By Thursday we had all the bits more or less working in the kitchen

at_work

On Friday morning we took it all to the Design Museum, realising in the process that we needed better bags

packed

At the museum, this was the first time we’d put the Raspberry Pis in the frames

libby_brain

and consequently that took a while

andrew_libby_frames_baffled

jasmine

Then placement took even longer

drilled_holes

and involved drilling

drilling_jasmine

and pondering

libby_andrew_pondering

and threading wires through holes

threading

We didn’t quite get it ready by the end of Friday and had a mad dash to get trains, punctuated by Libby taking pictures of Tower Bridge

tower_bridge

On Monday, Andrew did a very slow, stressful dash across London through the roadworks to pick up some postcards and sort out the networking so we could debug remotely.

Then on Tuesday we all went over to do some final tweaks

dusting

and debugging.

is_it_working

Then, finally, the party started.

serious_andrew_libby

and it was working!

man_looking

and people were looking at it!

three_looking

so we had a small beer.

laughing_libby_jasmine


and it works still…

near_complete


…although yesterday we had to do a little fix.

gluegun