Interactive Flowerbots and legobots with a bit of machine vision

I made a bunch of robots recently, building on some of the ideas from libbybot, and as an excuse to play with some ESP32-CAMs that Tom introduced me to in the midst of his Goblin Frenzy.

The ESP32-CAMs are kind of amazing and very, very cheap. They have a few quirks, but once you get them going it’s easy to do a bit of machine vision with them. Being new to this kind of thing, I used previous work by Richard to find the weight of change between two consecutive images. Dirk has since made better versions using a PID library. I made some felt-faced ones using meccano and then some lego ones using some nice lego-compatible servos from Pimoroni. Because ESP32s can run their own webserver, you can use websockets or MJPEG streams to debug them and see what they are doing.
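
The core of the change-weight idea is tiny: grab two consecutive grayscale frames and take the mean absolute difference of their pixels. Here’s a minimal sketch of that step, assuming the standard esp32-camera driver set up for grayscale capture (the function name is mine, and this is the idea rather than Richard’s exact code):

// Rough "weight of change" between two same-sized grayscale frames:
// mean absolute per-pixel difference, from 0 (identical) to 255.
#include "esp_camera.h"

float frameChangeWeight(const uint8_t *prev, const uint8_t *cur, size_t len) {
  uint32_t total = 0;
  for (size_t i = 0; i < len; i++) {
    int d = (int)cur[i] - (int)prev[i];
    total += (d < 0) ? -d : d;
  }
  return (float)total / (float)len;
}

// Typical use: keep a copy of the previous frame and compare each new one.
// camera_fb_t *fb = esp_camera_fb_get();
// float weight = frameChangeWeight(lastFrame, fb->buf, fb->len);
// memcpy(lastFrame, fb->buf, fb->len);
// esp_camera_fb_return(fb);

If the weight crosses a threshold, something in front of the camera has moved, which is enough to drive simple “something moved, react” behaviour.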

The code’s here (here’s Dirk’s), and below are a few pictures. There are a couple of videos in the GitHub repo.

Time squish

I keep seeing these two odd time effects in my life and wondering if they are connected.

The first is that my work-life has become bunched up: a week of extremely intense brainwork (not long hours, just intensity) that wipes me out, inevitably followed by a slower, less intense week. I feel like this has something to do with everyone working from home, but I’m not really sure how to explain it (though it reminds me of my time at Joost, where, because we were distributed, we’d have an intense series of meetings with everyone together every few months – but this kind isn’t organised, it just happens). My partner pointed out that this might simply be poor planning on my part (thanks! I’m quite good at planning actually).

The second is something we’ve noticed at the Cube: people are not committing to doing stuff (coming to an event, volunteering, etc.) until very close to the event. Something like 20-30% of our tickets for gigs are being sold the day before or on the day. I don’t think it’s people waiting for something better to come along. I wonder if it’s Covid-related uncertainty? (Also, 10-15% don’t turn up; not sure if that’s relevant.)

Anyone else seeing this type of thing?

Sparkfun Edge, macOS, FTDI

More for my reference than anything else. I’ve been trying to get the toolchain set up to use a Sparkfun Edge. I had the Edge, the Beefy3 FTDI breakout, and a working USB cable.

Blurry pic of cats taken using Sparkfun Edge and HIMAX camera

This worked great for the speech example (although the actual TensorFlow part never understands my “yes”, “no”, etc. – but anyway, I was able to upload it successfully):

$ git clone --depth 1 https://github.com/tensorflow/tensorflow.git
$ cd tensorflow
# build the micro_speech example binary for the Edge
$ gmake -f tensorflow/lite/micro/tools/make/Makefile TARGET=sparkfun_edge micro_speech_bin
# the AmbiqSuite signing scripts expect keys_info.py to exist
$ cp tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/keys_info0.py tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/keys_info.py
# wrap the binary in a signed image, then in a wire-update blob
$ python3 tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/create_cust_image_blob.py --bin tensorflow/lite/micro/tools/make/gen/sparkfun_edge_cortex-m4_micro/bin/micro_speech.bin --load-address 0xC000 --magic-num 0xCB -o main_nonsecure_ota --version 0x0
$ python3 tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/create_cust_wireupdate_blob.py --load-address 0x20000 --bin main_nonsecure_ota.bin -i 6 -o main_nonsecure_wire --options 0x1
# flash it over the FTDI serial port
$ export BAUD_RATE=921600
$ export DEVICENAME=/dev/cu.usbserial-DN06A1HD
$ python3 tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/uart_wired_update.py -b ${BAUD_RATE} ${DEVICENAME} -r 1 -f main_nonsecure_wire.bin -i 6

But then I couldn’t figure out how to generalise it to the other examples – I wanted to use the camera one, because ages ago I bought a load of tiny cameras to use with the Edge.

So I tried this guide, but couldn’t figure out where the installer had put the compiler. Seems basic, but…??

So in the end I used the first set of instructions to download the tools, and the second to actually compile and install onto the board.

# locate the example inside the AmbiqSuite download
$ find . | grep lis2dh12_accelerometer_uart
# you might need this:
# mv tools/apollo3_scripts/keys_info0.py tools/apollo3_scripts/keys_info.py
$ cd ./tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/boards_sfe/edge/examples/lis2dh12_accelerometer_uart/gcc/
# put the ARM GCC toolchain downloaded earlier on the PATH
$ export PATH="/Users/libbym/personal/mayke2021/tensorflow/tensorflow/lite/micro/tools/make/downloads/gcc_embedded/bin/:$PATH"
$ make clean
# build and flash in one step
$ make COM_PORT=/dev/cu.usbserial-DN06A1HD bootload_asb ASB_UPLOAD_BAUD=921600

etc. Your COM port will be different; find it using

ls /dev/cu*

If, like me, your FTDI serial port KEEPS VANISHING ARGH – this may help (I’d installed third-party FTDI drivers ages ago and they were conflicting with Apple’s ones. Maybe. Or the reboot fixed it. No idea).

Then you have to use a serial programme to get the image. I used the Arduino serial monitor since it was there, copied and pasted the output into a text file, and at that point you can use

tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/boards_sfe/common/examples/hm01b0_camera_uart/utils/raw2bmp.py

to convert it to a PNG. Palavers.

ESP32 M5StickC, https, websockets, and Slack

I got one of these lovely M5StickCs as a present, and had a play with it as part of Makevember. I wanted to make a “push puppet” (one of those toys that you push upwards and they collapse) that reacted to Slack commands. Not for any reason really, though I like the idea of tiny colleagues that stand up when addressed on Slack. Makevember doesn’t need a reason. Or at any rate, it doesn’t need a good reason.

Here are some notes about https and websockets on the ESP32 Pico, which is the chip underlying the M5StickC.

I made a “Slack wobbler” a couple of years ago, also in Makevember: an ESP8266 that connected to Slack and used a servo to wobble when someone was mentioned. Since then I’d run into some https problems, evidently also encountered by Jeremy21212121, who fixed them using a modified version of a websockets server. That works for the ESP8266 – it turns out you can also get the same result using httpsClient.setInsecure() with BearSSL. I’ve put an example of that here.
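
For reference, the setInsecure() route looks roughly like this on the ESP8266 – a sketch rather than the exact code in my repo; it skips certificate validation entirely, which is fine for a toy:

#include <ESP8266WiFi.h>
#include <WiFiClientSecure.h>

BearSSL::WiFiClientSecure httpsClient;

void connectToSlack() {
  httpsClient.setInsecure();  // don't check the server's TLS certificate
  if (httpsClient.connect("api.slack.com", 443)) {
    Serial.println("connected to Slack");
  }
}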

For the ESP32 it seems a bit different. As far as I can tell you need the certificate rather than the fingerprint in this case. You can get it using openssl s_client -connect api.slack.com:443

For the ESP32 you also need to use the correct libraries for WiFi and WiFiMulti. The websocket client library is this one.
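
Put together, the ESP32 side looks something like this – a sketch with placeholder wifi credentials, where you paste the PEM certificate that openssl printed into rootCA:

#include <WiFi.h>
#include <WiFiMulti.h>
#include <WiFiClientSecure.h>

WiFiMulti wifiMulti;
WiFiClientSecure client;

// Root CA for api.slack.com, obtained with:
//   openssl s_client -connect api.slack.com:443 -showcerts
const char *rootCA =
    "-----BEGIN CERTIFICATE-----\n"
    "...paste the PEM here...\n"
    "-----END CERTIFICATE-----\n";

void setup() {
  Serial.begin(115200);
  wifiMulti.addAP("my-ssid", "my-password");  // placeholders
  while (wifiMulti.run() != WL_CONNECTED) {
    delay(500);
  }
  client.setCACert(rootCA);  // the certificate, not the fingerprint
  if (client.connect("api.slack.com", 443)) {
    Serial.println("TLS connection to Slack OK");
  }
}

void loop() {}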

And a final note: the M5StickC is very cool but doesn’t expose many of its GPIO pins. The only one I can find that lets you drive a servo directly is on the Grove connector, which I bodged some female jumper wires into, though you can get a Grove-to-servo converter (and there are various M5Stick hats for driving multiple servos). Here’s some code. And a video.
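
In case it’s useful, here’s roughly how to drive a servo from that pin with the ESP32 Arduino core’s LEDC PWM. This assumes core 2.x’s ledcSetup/ledcAttachPin API, and that GPIO 32 is the Grove data pin – check your unit:

#include <Arduino.h>

const int SERVO_PIN = 32;     // Grove connector GPIO (assumption)
const int LEDC_CHANNEL = 0;
const int LEDC_FREQ_HZ = 50;  // standard 20 ms servo frame
const int LEDC_RES_BITS = 16;

// Convert a pulse width in microseconds to a 16-bit duty cycle
void setServoMicros(int us) {
  uint32_t duty = (uint64_t)us * ((1 << LEDC_RES_BITS) - 1) / 20000;
  ledcWrite(LEDC_CHANNEL, duty);
}

void setup() {
  ledcSetup(LEDC_CHANNEL, LEDC_FREQ_HZ, LEDC_RES_BITS);
  ledcAttachPin(SERVO_PIN, LEDC_CHANNEL);
  setServoMicros(1500);  // centre position
}

void loop() {
  setServoMicros(1000);  // one end of travel
  delay(1000);
  setServoMicros(2000);  // the other end
  delay(1000);
}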

Sock-puppet – an improved, simpler presence robot

Makevember and lockdown have encouraged me to make an improved version of libbybot, which is a physical stand-in for a person, for remote participation. I’m trying to think of a better name – she’s not about representing me specifically, but anyone who can’t be somewhere in person and still wants to participate. [update Jan 15: she’s now called “sock_puppet”].

This one is much, much simpler to make, thanks to the addition of a pan-tilt hat and a simpler body. It’s also more expressive, thanks to these lovely little 5×5 LED matrices.

Her main feature is that – using a laptop or phone – you can see, hear and speak to people in a different physical place to you. I used to use a version of this at work to be in meetings when I was the only remote participant. That’s not much use now of course. But perhaps in the future it might make sense for some people to be remote and some present.

New recent features:

  • easy to make*
  • wears clothes**
  • googly eyes
  • expressive mouth (moves when the remote participant is speaking, can be happy, sad, etc, whatever can be expressed in 25 pixels)
  • can be “told” wifi details using QR codes (see the parsing sketch after this list)
  • can move her head a bit (up / down / left / right)

* ish
**a sock
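
On the QR-code trick: I won’t reproduce the actual sock_puppet code here, but the widely used convention for wifi QR payloads is a string like WIFI:T:WPA;S:myssid;P:mypassword;;, which is easy to pull apart once the camera has decoded the QR code. A minimal sketch of that parsing step (ignoring the escaping rules for ; and \ in real payloads):

#include <iostream>
#include <map>
#include <string>

// Split "WIFI:T:WPA;S:myssid;P:mypassword;;" into key/value pairs.
std::map<std::string, std::string> parseWifiQr(const std::string &payload) {
  std::map<std::string, std::string> fields;
  if (payload.rfind("WIFI:", 0) != 0) return fields;  // not a wifi payload
  size_t pos = 5;  // skip "WIFI:"
  while (pos < payload.size()) {
    size_t colon = payload.find(':', pos);
    size_t semi = payload.find(';', pos);
    if (colon == std::string::npos || semi == std::string::npos || colon > semi)
      break;
    fields[payload.substr(pos, colon - pos)] =
        payload.substr(colon + 1, semi - colon - 1);
    pos = semi + 1;
  }
  return fields;
}

int main() {
  auto f = parseWifiQr("WIFI:T:WPA;S:myssid;P:mypassword;;");
  std::cout << "ssid=" << f["S"] << " password=" << f["P"] << "\n";
}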

I’m still writing docs, but the repo is here.

Libbybot-lite – portrait by Damian