#Mayke Day 4 – TTN and LoRaWAN – TiL

Tarim and I have been trying to get a LoRaWAN network up and running in Bristol using some of the old Bristol Wireless antenna locations. First step for me was in January, when we got together and tried to get a Raspberry Pi gateway working, with so much #fayle – a subtly broken Pi, a dodgy PSU connector, and me not knowing that the Raspberry Pi Imager process had changed for Bullseye (you have to set a user in the settings and enable ssh there – you can also put the wifi details in, so it’s handy if you know about it).

Aaanyway, for #mayke (now on Mastodon) I’ve been trying for a couple of days to get a TTGO LoRa32 OLED v1.3(?) I bought ages ago to work with the Pi gateway. In summary: argh. There are so many partial examples around, different names for things, and allsorts. But here are some notes on what works.

On the Raspberry Pi: a 3B+ and an IC880A board that Tarim had – install Bullseye (with ssh access, wifi and a pi user) and then follow The Things Network (TTN)’s example gateway instructions. All fine. My only daftness here was finding the command /opt/ttn-station/bin/station -p and assuming (why?) that I was tailing the logs, when actually I was running another instance on top of the systemctl one. Which led to all sorts of weird errors, including ones related to not resetting the device, e.g.

[lgw_receive:1143] CONCENTRATOR IS NOT RUNNING, START IT BEFORE RECEIVING
[HAL:INFO] [lgw_spi_close:159] Note: SPI port closed
[lgw_start:764] Failed to setup sx125x radio for RF chain 0

etc.

D’oh.

The TTGO was more tricky. There seem to be multiple libraries at multiple levels of abstraction, and I wanted one that was Arduino-IDE compatible. It’s really hard to find out what pin mapping you need for these slightly obscure (and superseded) TTGO boards. Then there’s the difference between LoRaWAN Specification 1.0.3 and LoRaWAN Specification 1.1.1. After a while I realised that the MCCI_LoRaWAN_LMIC_library (0.9.2) I was using in the code I had found on the internet was made for 1.0.3 – and then configuring a TTN device was muuch easier, with fewer baffling options.

One final self-own came from my frenetic searching of forums looking for a bit of code with the right pin mapping for the TTGO.

I somehow found some old code (I think it was this – don’t use it, it’s 5 years old! – which I think is based on an old version of this, but adapted for the TTGO) which didn’t recognise all the event types from TTN. Updated below; the fix was basically adding this in setup():

LMIC_setAdrMode(1);
LMIC_setLinkCheckMode(1);

and calling LMIC_setLinkCheckMode(1) again in case EV_JOINED.
Thank you TTN forum users, and again.

A couple more things – though there are probably more I’ve forgotten.

  1. The gateway was OK to set up on the TTN console, but setting up devices was not – all the names for the different device ids were completely baffling and seem to have changed over time. You also need to set up an application before you can add a device. Two key learnings: (a) you can get the little/big-endianness and the right format for the ids by clicking on the ids themselves in the console (see image below), and (b) the Gateway has the JoinEUI you need to set up a device (check the Gateway’s messages for this, see image below).
  2. You HAVE TO hand-edit ./project_config/lmic_project_config.h in MCCI_LoRaWAN_LMIC_library on your machine to pick the right region (on a Mac, mine was in /Users/[me]/Documents/Arduino/libraries/MCCI_LoRaWAN_LMIC_library/project_config/lmic_project_config.h)
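For reference, here’s roughly what that config needs to contain for EU868 and the TTGO’s sx1276 radio – the two defines below are the ones I believe the MCCI library’s EU config uses, but double-check them against your own copy:

```shell
# Write a minimal region config for EU868 / the sx1276 radio used by
# the TTGO LoRa32. Exactly one CFG_<region> define should be active.
cat > lmic_project_config.h <<'EOF'
#define CFG_eu868 1
#define CFG_sx1276_radio 1
EOF
```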

Formatting endianness and chars

LSB is little-endian, MSB is big-endian, and <> switches between the byte format with the preceding 0x business and without. DEVEUI and APPEUI are little-endian and APPKEY is big-endian.
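If you want to sanity-check the byte reversal, here’s a little shell helper (my own, not from the TTN docs, and the example EUI is made up) that turns the MSB hex string the console shows into the little-endian C array LMIC wants. On macOS swap tac for tail -r:

```shell
# Reverse the byte order of a hex EUI and print it as a C byte array.
lsb_array() {
  echo "$1" | fold -w2 | tac | sed 's/^/0x/' | paste -sd, - | sed 's/,/, /g'
}
lsb_array 70B3D57ED0001234
# -> 0x34, 0x12, 0x00, 0xD0, 0x7E, 0xD5, 0xB3, 0x70
```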

JoinEUI for devices is in the gateway messages like this:

I somewhat enjoyed the detective work and even read some of TFM. So a happy #mayke for me.

The final code I used:

// MIT License
// https://github.com/gonzalocasas/arduino-uno-dragino-lorawan/blob/master/LICENSE
// Based on examples from https://github.com/matthijskooijman/arduino-lmic
// Copyright (c) 2015 Thomas Telkamp and Matthijs Kooijman

#include <Arduino.h>
#include "lmic.h"
#include <hal/hal.h>
#include <SPI.h>

#define LEDPIN 2


unsigned int counter = 0;
char TTN_response[30];

// This EUI must be in little-endian format, so least-significant-byte
// first. When copying an EUI from ttnctl output, this means to reverse
// the bytes.

// Copy the value from Device EUI from the TTN console in LSB mode.
static const u1_t PROGMEM DEVEUI[8]= { 0x.., 0x.., .. };
void os_getDevEui (u1_t* buf) { memcpy_P(buf, DEVEUI, 8);}

// Copy the value from Application EUI from the TTN console in LSB mode
static const u1_t PROGMEM APPEUI[8]= { 0x.., 0x.., .. };
void os_getArtEui (u1_t* buf) { memcpy_P(buf, APPEUI, 8);}

// This key should be in big endian format (or, since it is not really a
// number but a block of memory, endianness does not really apply). In
// practice, a key taken from ttnctl can be copied as-is. Anyway its in MSB mode.
static const u1_t PROGMEM APPKEY[16] = { 0x.., .. };
void os_getDevKey (u1_t* buf) { memcpy_P(buf, APPKEY, 16);}

static osjob_t sendjob;

// Schedule TX every this many seconds (might become longer due to duty
// cycle limitations).
const unsigned TX_INTERVAL = 120;

// Pin mapping
const lmic_pinmap lmic_pins = {
    .nss = 18,
    .rxtx = LMIC_UNUSED_PIN,
    .rst = 14,
    .dio = {26, 33, 32}  // Pins for the Heltec ESP32 Lora board/ TTGO Lora32 with 3D metal antenna
};

void do_send(osjob_t* j){
    // Payload to send (uplink)
    static uint8_t message[] = "Hello OTAA!";

    // Check if there is not a current TX/RX job running
    if (LMIC.opmode & OP_TXRXPEND) {
        Serial.println(F("OP_TXRXPEND, not sending"));
    } else {
        // Prepare upstream data transmission at the next possible time.
        LMIC_setTxData2(1, message, sizeof(message)-1, 0);
        Serial.println(F("Sending uplink packet..."));
        digitalWrite(LEDPIN, HIGH);
    }
    // Next TX is scheduled after TX_COMPLETE event.
}

void onEvent (ev_t ev) {
    Serial.print(os_getTime());
    Serial.print(": ");
    Serial.print(ev);
    Serial.print(": ");
    switch(ev) {
        case EV_SCAN_TIMEOUT:
            Serial.println(F("EV_SCAN_TIMEOUT"));
            break;
        case EV_BEACON_FOUND:
            Serial.println(F("EV_BEACON_FOUND"));
            break;
        case EV_BEACON_MISSED:
            Serial.println(F("EV_BEACON_MISSED"));
            break;
        case EV_BEACON_TRACKED:
            Serial.println(F("EV_BEACON_TRACKED"));
            break;
        case EV_JOIN_FAILED:
            Serial.println(F("EV_JOIN_FAILED"));
            break;
        case EV_REJOIN_FAILED:
            Serial.println(F("EV_REJOIN_FAILED"));
            break;
        case EV_LOST_TSYNC:
            Serial.println(F("EV_LOST_TSYNC"));
            break;
        case EV_RESET:
            Serial.println(F("EV_RESET"));
            break;
        case EV_RXCOMPLETE:
            // data received in ping slot
            Serial.println(F("EV_RXCOMPLETE"));
            break;
        case EV_LINK_DEAD:
            Serial.println(F("EV_LINK_DEAD"));
            break;
        case EV_LINK_ALIVE:
            Serial.println(F("EV_LINK_ALIVE"));
            break;

        case EV_SCAN_FOUND:
            Serial.println(F("EV_SCAN_FOUND"));
            break;
        
        case EV_TXSTART:
            Serial.println(F("EV_TXSTART"));
            break;
        case EV_TXCANCELED:
            Serial.println(F("EV_TXCANCELED"));
            break;
        case EV_RXSTART:
            // do not print anything -- it wrecks timing 
            break;

        case EV_TXCOMPLETE:
            Serial.println(F("EV_TXCOMPLETE (includes waiting for RX windows)"));

            if (LMIC.txrxFlags & TXRX_ACK) {
              Serial.println(F("Received ack"));
            }

            if (LMIC.dataLen) {
              Serial.print(F("Data Received: "));
              Serial.write(LMIC.frame+LMIC.dataBeg, LMIC.dataLen);
              Serial.println();
              Serial.println(LMIC.rssi);

              // Copy the payload into TTN_response, guarding against
              // overflowing its 30-byte buffer
              int n = LMIC.dataLen < (int)sizeof(TTN_response) - 1
                          ? LMIC.dataLen : (int)sizeof(TTN_response) - 1;
              for (int i = 0; i < n; i++)
                TTN_response[i] = LMIC.frame[LMIC.dataBeg + i];
              TTN_response[n] = 0;
            }

            // Schedule next transmission
            os_setTimedCallback(&sendjob, os_getTime()+sec2osticks(TX_INTERVAL), do_send);
            digitalWrite(LEDPIN, LOW);
            break;
        case EV_JOINING:
            Serial.println(F("EV_JOINING: -> Joining..."));

            break;
        case EV_JOINED: {
              Serial.println(F("EV_JOINED"));

              LMIC_setLinkCheckMode(1);
            }
            break;

        default:
            Serial.println(F("Unknown event"));
            Serial.print(ev);
            Serial.print("\n");
            break;
    }

}

void setup() {
    Serial.begin(115200);
    delay(2500);                      // Give time to the serial monitor to pick up
    Serial.println(F("Starting..."));

    // Use the Blue pin to signal transmission.
    pinMode(LEDPIN,OUTPUT);

    // LMIC init
    os_init();

    // Reset the MAC state. Session and pending data transfers will be discarded.
    LMIC_reset();
    LMIC_setClockError(MAX_CLOCK_ERROR * 1 / 100);
    // Set up the channels used by the Things Network, which corresponds
    // to the defaults of most gateways. Without this, only three base
    // channels from the LoRaWAN specification are used, which certainly
    // works, so it is good for debugging, but can overload those
    // frequencies, so be sure to configure the full frequency range of
    // your network here (unless your network autoconfigures them).
    // Setting up channels should happen after LMIC_setSession, as that
    // configures the minimal channel set.

    LMIC_setupChannel(0, 868100000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI);      // g-band
    LMIC_setupChannel(1, 868300000, DR_RANGE_MAP(DR_SF11, DR_SF7B), BAND_CENTI);      // g-band
    LMIC_setupChannel(2, 868500000, DR_RANGE_MAP(DR_SF10, DR_SF7),  BAND_CENTI);      // g-band
    LMIC_setupChannel(3, 867100000, DR_RANGE_MAP(DR_SF9, DR_SF7),  BAND_CENTI);      // g-band
    LMIC_setupChannel(4, 867300000, DR_RANGE_MAP(DR_SF8, DR_SF7),  BAND_CENTI);      // g-band
    LMIC_setupChannel(5, 867500000, DR_RANGE_MAP(DR_SF7, DR_SF7),  BAND_CENTI);      // g-band
    LMIC_setupChannel(6, 867700000, DR_RANGE_MAP(DR_SF7, DR_SF7),  BAND_CENTI);      // g-band

    
    // TTN defines an additional channel at 869.525MHz using SF9 for class B
    // devices' ping slots. LMIC does not have an easy way to set this
    // frequency, and support for class B is spotty and untested, so this
    // frequency is not configured here.

    // Enable ADR and link check validation
    LMIC_setAdrMode(1);
    LMIC_setLinkCheckMode(1);
    //LMIC_setClockError(MAX_CLOCK_ERROR * 1 / 100);

    // TTN uses SF9 for its RX2 window.
    LMIC.dn2Dr = DR_SF9;

    // Set data rate and transmit power for uplink (note: txpow seems to be ignored by the library)
    //LMIC_setDrTxpow(DR_SF11,14);
    LMIC_setDrTxpow(DR_SF9,14);

    // Start job
    do_send(&sendjob);     // Will fire up also the join
}

void loop() {
    os_runloop_once();
}

Updated real_libby GPT-2 chatbot

real_libby, my retrained-GPT-2 Slack chatbot hosted on a Raspberry Pi 4, eventually corrupted her SD card. Then I couldn’t find her brain (I could only find business-slack real_libby, who is completely different). And since I was rebuilding her anyway, I thought I’d get her up to date with Covid and the rest of it.

For the fine-tuning data: since I made the first version in 2019 I’ve more or less stopped using irc ( :’-( ) and instead use Signal. I still use iMessage, and use Whatsapp more. I couldn’t figure out for a while how to get hold of my Signal data, so I first built an iMessage/Whatsapp version, as that’s pretty easy with my setup (details below – basically sqlite3 databases from an unencrypted backup of my iPhone). I had about 30K lines to retrain with, which I did using this as before, on my M1 macbook pro.

The text/whatsapp version uses exclamation marks too much and goes on about trains excessively. Not super interesting.

It is in fact possible to get Signal messages out as long as you use the desktop client, which I do (although it doesn’t transfer messages between clients, only ones received while that device was authorised). But I still had 5K lines to play with.

I think Signal-libby is more interesting, though she also seems closer to the source crawl, so I’m more nervous about letting her loose. But she’s not said anything bad so far.

Details below for the curious. It’s much like my previous attempt but there were a few fiddly bits.

The Signal version is a bit more apt, I think, and says longer and more complex things.

She makes up urls quite a bit; Signal’s where I share links most often.

Maybe I’ll try a combo version next, see if there’s any improvement.

Getting data

A note on getting data from your phone – unencrypted backups are bad. Most of your data is just there, lying about, a bit obfuscated but trivially easy to get at. The commands below just get out your own data. Baskup helps you get out more.

iMessages are in

/Users/[me]/Library/Application\ Support/MobileSync/Backup/[long number]/[short number]/3d0d7e5fb2ce288813306e4d4636395e047a3d28

sqlite3 3d0d7e5fb2ce288813306e4d4636395e047a3d28
.once libby-imessage.txt
select text from message where is_from_me = 1 and text not like 'Liked%';
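If the hashed filename is different on your backup, the mapping from real paths to those hashes lives (as far as I know) in the backup’s Manifest.db. A sketch with a stand-in database, since I can’t ship my real one:

```shell
# Build a toy Manifest.db with the fields we need, then look up the
# fileID for the SMS database the way you would on a real backup.
sqlite3 Manifest.db "CREATE TABLE IF NOT EXISTS Files(fileID TEXT, domain TEXT, relativePath TEXT);
INSERT INTO Files VALUES('3d0d7e5fb2ce288813306e4d4636395e047a3d28','HomeDomain','Library/SMS/sms.db');"
sqlite3 Manifest.db "SELECT fileID FROM Files WHERE relativePath='Library/SMS/sms.db' LIMIT 1;"
# -> 3d0d7e5fb2ce288813306e4d4636395e047a3d28
```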

Whatsapp are in

/Users/[me]/Library/Application\ Support/MobileSync/Backup/[long number]/[short number]/7c7fba66680ef796b916b067077cc246adacf01d

sqlite3 7c7fba66680ef796b916b067077cc246adacf01d
.once libby-whatsapp.txt
SELECT ZTEXT from ZWAMESSAGE where ZISFROMME='1' and ZTEXT!='';
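One tidy-up step I’d add (my own suggestion, not part of the original workflow): strip blank lines and duplicate messages before fine-tuning, keeping first-appearance order. With stand-in files in place of the real sqlite exports:

```shell
# Stand-in exports (the real ones come from the sqlite queries above):
printf 'hello\n\nworld\nhello\n' > libby-imessage.txt
printf 'world\ntrains\n' > libby-whatsapp.txt
mkdir -p data
# NF skips empty lines; seen[$0]++ drops exact duplicates.
awk 'NF && !seen[$0]++' libby-imessage.txt libby-whatsapp.txt > data/libby.txt
cat data/libby.txt
# -> hello
#    world
#    trains
```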

Signal’s desktop backups are encrypted, so you need to use this, which I could only get to work using docker. Signal doesn’t back up from your phone.

Tweaks for finetuning on an M1 Mac

git clone git@github.com:nshepperd/gpt-2.git
cd gpt-2
git checkout finetuning
mkdir data
mv libby*.txt data/
pip3 install -r requirements.txt
python3 ./download_model.py 117M
pip3 install tensorflow-macos # for the M1
PYTHONPATH=src ./train.py --model_name=117M --dataset data/

tensorflow-macos is tf 2, but that seems OK, even though I only run tf 1.13 on the pi.

Rename the model and copy the bits you need from the initial model:

cp -r checkpoint/run1 models/libby
cp models/117M/{encoder.json,hparams.json,vocab.bpe} models/libby/

Pi 4

The only new part on the Pi 4 was that I had to install a specific version of numpy – the rest is the same as my original instructions here.

pip3 install flask numpy==1.20
curl -O https://www.piwheels.org/simple/tensorflow/tensorflow-1.13.1-cp37-none-linux_armv7l.whl
pip3 install tensorflow-1.13.1-cp37-none-linux_armv7l.whl

Interactive Flowerbots and legobots with a bit of machine vision

I made a bunch of robots recently, building on some of the ideas from libbybot and as an excuse to play with some esp32 cams that Tom introduced me to in the midst of his Gobin Frenzy.

The esp32 cams are kind of amazing and very, very cheap. They have a few quirks, but once you get them going it’s easy to do a bit of machine vision with them. In my case, being new to this kind of thing, I used previous work by Richard to find the weight of change between two consecutive images. Dirk has since made better versions using a PID library. I made some felt-faced ones using meccano, and then some lego ones using some nice lego-compatible servos from Pimoroni. Because esp32s can run their own webserver, you can use websockets or mjpeg to debug them and see what they are doing.

The code’s here (here’s Dirk’s), and below are a few pictures. There are a couple of videos in the github repo.

Time squish

I keep seeing these two odd time effects in my life and wondering if they are connected.

The first is that my work-life has become either extremely intense – and I don’t mean long hours, I mean intense brainwork for maybe a week, which wipes me out – after which the next week is inevitably slower and less intense. Basically everything gets bunched up together. I feel like this has something to do with everyone working from home, but I’m not really sure how to explain it (though it reminds me of my time at Joost, where we’d have an intense series of meetings with everyone together every few months, because we were distributed. But this type is not organised – it just happens). My partner pointed out that this might simply be poor planning on my part (thanks! I’m quite good at planning actually).

The second is something we’ve noticed at the Cube – people are not committing to doing stuff (coming to an event, volunteering etc.) until very close to the event. Something like 20-30% of our tickets for gigs are being sold the day before or on the day. I don’t think it’s people waiting for something better. I wonder if it’s Covid-related uncertainty? (Also, 10-15% don’t turn up – not sure if that’s relevant.)

Anyone else seeing this type of thing?

Sparkfun Edge, MacOS X, FTDI

More for my reference than anything else. I’ve been trying to get the toolchain set up to use a Sparkfun Edge. I had the Edge, the Beefy3 FTDI breakout, and a working USB cable.

Blurry pic of cats taken using Sparkfun Edge and HIMAX camera

This worked great for the speech example, for me (although the actual tensorflow part never understands my “yes”, “no” etc. – but anyway, I was able to upload it successfully):

$ git clone --depth 1 https://github.com/tensorflow/tensorflow.git
$ cd tensorflow
$ gmake -f tensorflow/lite/micro/tools/make/Makefile TARGET=sparkfun_edge micro_speech_bin
$ cp tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/keys_info0.py tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/keys_info.py
$ python3 tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/create_cust_image_blob.py --bin tensorflow/lite/micro/tools/make/gen/sparkfun_edge_cortex-m4_micro/bin/micro_speech.bin --load-address 0xC000 --magic-num 0xCB -o main_nonsecure_ota --version 0x0
$ python3 tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/create_cust_wireupdate_blob.py --load-address 0x20000 --bin main_nonsecure_ota.bin -i 6 -o main_nonsecure_wire --options 0x1
$ export BAUD_RATE=921600
$ export DEVICENAME=/dev/cu.usbserial-DN06A1HD
$ python3 tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/tools/apollo3_scripts/uart_wired_update.py -b ${BAUD_RATE} ${DEVICENAME} -r 1 -f main_nonsecure_wire.bin -i 6

But then I couldn’t figure out how to generalise it to use other examples – I wanted to use the camera because ages ago I bought a load of tiny cameras to use with the Edge.

So I tried this guide, but couldn’t figure out where the installer had put the compiler. Seems basic, but….??

So in the end I used the first instructions to download the tools, and then the second to actually do the compilation and installation on the board.

$ find . | grep lis2dh12_accelerometer_uart
# you might need this - 
# mv tools/apollo3_scripts/keys_info0.py tools/apollo3_scripts/keys_info.py 
$ cd ./tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/boards_sfe/edge/examples/lis2dh12_accelerometer_uart/gcc/
$ export PATH="/Users/libbym/personal/mayke2021/tensorflow/tensorflow/lite/micro/tools/make/downloads/gcc_embedded/bin/:$PATH"
$ make clean
$ make COM_PORT=/dev/cu.usbserial-DN06A1HD bootload_asb ASB_UPLOAD_BAUD=921600

etc. Your COM port will be different, find it using

ls /dev/cu*

If, like me, your FTDI serial port KEEPS VANISHING ARGH – this may help (I’d installed 3rd-party FTDI drivers ages ago and they were conflicting with Apple’s ones. Maybe. Or the reboot fixed it. No idea).

Then you have to use a serial programme to get the image out. I used the Arduino serial monitor since it was there, and copied and pasted the output into a text file, at which point you can use

tensorflow/lite/micro/tools/make/downloads/AmbiqSuite-Rel2.2.0/boards_sfe/common/examples/hm01b0_camera_uart/utils/raw2bmp.py

to convert it to a png. Palavers.