Makevember and lockdown have encouraged me to make an improved version of libbybot, which is a physical stand-in for a person participating remotely. I’m trying to think of a better name – she’s not just about representing me, obviously, but anyone who can’t be somewhere in person and wants to participate. [update Jan 15: she’s now called “sock_puppet”].
This one is much, much simpler to make, thanks to the addition of a pan-tilt HAT and a simpler body. It’s also more expressive thanks to these lovely little 5×5 LED matrices.
Her main feature is that – using a laptop or phone – you can see, hear and speak to people in a different physical place to you. I used to use a version of this at work to be in meetings when I was the only remote participant. That’s not much use now of course. But perhaps in the future it might make sense for some people to be remote and some present.
New features in this version:
- easy to make*
- wears clothes**
- googly eyes
- expressive mouth (moves when the remote participant is speaking, and can look happy, sad, or anything else that fits in 25 pixels – a rough sketch is below)
- can be “told” wifi details using QR codes (sketched below)
- can move her head a bit (up / down / left / right – see the pan-tilt sketch below)
* ish
**a sock
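
To give a flavour of the mouth: here’s a minimal sketch, assuming the 5×5 matrix is Pimoroni’s RGB Matrix Breakout driven by its rgbmatrix5x5 library (the real code in the repo may do this differently). An expression is just 25 on/off pixels pushed to the matrix:

```python
from rgbmatrix5x5 import RGBMatrix5x5

# Each expression is 25 pixels: 1 = lit, 0 = off. This one is a smile.
SMILE = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 0],
]

def show_expression(matrix, frame, colour=(255, 0, 80)):
    """Push a 5x5 frame of 0s and 1s to the LED matrix."""
    r, g, b = colour
    for y, row in enumerate(frame):
        for x, lit in enumerate(row):
            matrix.set_pixel(x, y, r * lit, g * lit, b * lit)
    matrix.show()

matrix = RGBMatrix5x5()
matrix.set_brightness(0.6)
show_expression(matrix, SMILE)
```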
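
The QR-code trick uses the standard “WIFI:S:&lt;ssid&gt;;T:WPA;P:&lt;password&gt;;;” payload. This is a rough sketch of the idea using OpenCV’s QR detector on a camera frame – not necessarily how the repo does it:

```python
import cv2

def parse_wifi_qr(data):
    """Parse a 'WIFI:S:<ssid>;T:WPA;P:<password>;;' payload into a dict."""
    if not data.startswith("WIFI:"):
        return None
    fields = {}
    for part in data[len("WIFI:"):].split(";"):
        if ":" in part:
            key, value = part.split(":", 1)
            fields[key] = value
    return {"ssid": fields.get("S"), "password": fields.get("P")}

camera = cv2.VideoCapture(0)        # Pi camera (via V4L2) or a USB webcam
detector = cv2.QRCodeDetector()
ok, frame = camera.read()
if ok:
    data, _, _ = detector.detectAndDecode(frame)
    if data:
        print(parse_wifi_qr(data))
camera.release()
```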
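
And the head movement is just the pan-tilt HAT being nudged a few degrees at a time. A minimal sketch, assuming Pimoroni’s Pan-Tilt HAT and its pantilthat library:

```python
import time
import pantilthat

def nod():
    """Tilt down a little, then back to centre."""
    for angle in (20, 0):
        pantilthat.tilt(angle)   # degrees, roughly -90 to 90
        time.sleep(0.5)

def shake():
    """Pan left, right, then back to centre."""
    for angle in (-20, 20, 0):
        pantilthat.pan(angle)
        time.sleep(0.5)

nod()
shake()
```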
I’m still writing docs, but the repo is here.
