Update – I’ve been doing more experiments with WebRTC on the Pi – latest is here.
Barney and I have been working on a “HackspaceHat” – a telepresence hat so you can show people around Hackspaces. The idea is this: someone in the hackspace puts on the hat. This turns it on, and triggers a message (to twitter?) that the hat is available. Then someone can join via a web browser, see and hear what’s happening as the wearer moves around the space, and send voice requests about what direction to go in.
We were thinking of using WebRTC for it, as there’s quite a lot of support in various browsers for it, and that makes the “user” end of the hat quite straightforward. However, we’ve not yet found a great solution for the “hat” end.
Here’s where we’ve got to, and rough instructions to get as far as we have. I’ll do a step by step guide when we’ve got further.
Where we are:
- happily streaming audio and video from a Pi 2 to Firefox with low latency, by accessing a web server on the Pi from a separate machine on the same local network
Where we want to be:
- sending audio the other way
- hosting the application on a remote server and accessing it via a different network
Install Janus gateway
I followed these instructions. There are a few slight alterations in the installation of gstreamer, and I’ve added in audio.
Install Janus gateway as in those instructions:
$ sudo aptitude install libmicrohttpd-dev libjansson-dev libnice-dev libssl-dev libsrtp-dev libsofia-sip-ua-dev libglib2.0-dev libopus-dev libogg-dev libini-config-dev libcollection-dev pkg-config gengetopt libtool automake dh-autoreconf
$ git clone https://github.com/meetecho/janus-gateway.git
$ cd janus-gateway
$ sh autogen.sh
$ ./configure --disable-websockets --disable-data-channels --disable-rabbitmq --disable-docs --prefix=/opt/janus
$ make
$ sudo make install
$ sudo make configs
$ sudo nano /opt/janus/etc/janus/janus.plugin.streaming.cfg
Comment out the other stream definitions and add this one (slightly modified from the tutorial, to enable audio):
type = rtp
id = 1
description = RPWC H264 test streaming
audio = yes
audioport = 8005
audiopt = 10
audiortpmap = opus/48000/2
video = yes
videoport = 8004
videopt = 96
videortpmap = H264/90000
videofmtp = profile-level-id=42e028\;packetization-mode=1
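With the streaming plugin configured, Janus itself needs to be running before anything will stream. The tutorial covers this, but for completeness it's just the binary from the install prefix above – leave it running in its own terminal:

```shell
# Start the Janus gateway (uses the configs installed under /opt/janus/etc/janus)
/opt/janus/bin/janus
```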
The instructions are slightly out of date – you need to do this:
sudo apt-get install libgstreamer0.10-0 libgstreamer0.10-dev gstreamer0.10-tools gstreamer0.10-plugins-base libgstreamer-plugins-base0.10-dev gstreamer0.10-plugins-good gstreamer0.10-plugins-ugly gstreamer0.10-plugins-bad gstreamer0.10-ffmpeg
The gstreamer pipeline I used is this:
raspivid --verbose --nopreview -hf -vf --width 640 --height 480 --framerate 15 --bitrate 1000000 --profile baseline --timeout 0 -o - | gst-launch-0.10 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=8004 alsasrc device=plughw:Set ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=8005
- basically the raspi camera output piped into gst-launch, which runs two pipelines at once – one for video, one for audio. gst-launch lets you give it several pipelines in a single command, so you can just concatenate them.
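That one-liner gets unwieldy to edit, so it can be split into a small script – a sketch assuming the same ports (8004 for video, 8005 for audio) and device name used above:

```shell
#!/bin/sh
# Same pipeline as the one-liner, split up for readability.
# Video: H264 from raspivid over RTP to port 8004; audio: Opus from ALSA to port 8005.

VIDEO="fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=8004"
AUDIO="alsasrc device=plughw:Set ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=8005"

raspivid --verbose --nopreview -hf -vf --width 640 --height 480 \
  --framerate 15 --bitrate 1000000 --profile baseline --timeout 0 -o - \
  | gst-launch-0.10 -v $VIDEO $AUDIO
```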
Note: I used a cheap little USB audio card for this – like this one. To find your device name (mine is “plughw:Set”) do:
arecord -l
and look for the USB one. You may need to do this too:
sudo nano /etc/modprobe.d/alsa-base.conf
and change the line:
options snd-usb-audio index=-2
to:
options snd-usb-audio index=0
and reboot (see this post for more useful audio commands for the Pi).
It’ll need a microphone in the mic slot.
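It's worth checking the microphone actually captures before involving gstreamer at all – a quick test, assuming the device name “plughw:Set” as above:

```shell
# Record 5 seconds of mono audio from the USB card, then play it back
arecord -D plughw:Set -f S16_LE -r 48000 -c 1 -d 5 /tmp/mictest.wav
aplay /tmp/mictest.wav
```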
Install nginx
(Again following the instructions here.)
$ sudo aptitude install nginx
Copy the Janus HTML content to the Nginx server root:
$ sudo cp -r /opt/janus/share/janus/demos/ /usr/share/nginx/www/
$ sudo service nginx start
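Before trying a browser, you can confirm nginx is actually serving the demo files – this assumes the copy into the server root above, and uses the stock streaming demo page:

```shell
# From the Pi itself; should report HTTP/1.1 200 OK
curl -I http://localhost/demos/streamingtest.html
```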
Go to the Janus gateway demo page
(I’ve modified the streaming example a bit to understand it better – see this gist – so you could put that in /usr/share/nginx/www/ if you like.)
Then – assuming the Pi is on the same network – go to
http://<ip.of.pi>/demos/streamingtest.html (or http://<ip.of.pi>/demos/janus-gateway-streamtest.html if you’re using my modified version) and you should see video and hear audio in the browser.