What I did in October 2022

It’s the first month of my sabbatical

Got a new TV

I’ve been slowly improving the TV and Sound System in the front-room and this month I bought the last part of that upgrade — a new TV.

The main feature I was after was HDMI 2.1 so I could pass through the Dolby Atmos signal from my games console to my soundbar. I found a nicely priced screen in the Samsung UE50BU8000, which had that feature, and while I was hesitant about getting a Samsung for privacy reasons, I’ve simply not connected it to the internet, so I haven’t had to accept their T&Cs for data collection.

This new TV has meant that the bedroom had an upgrade too, going from a 28" 1080p set to my old 50" 4K set from 2016. This has taken some adapting to, as it really lights up the room when the Apple TV shuts off, which makes it hard to sleep if we nod off halfway through a show.

Had a bit of a spring clean

The main goal for my first month on sabbatical has been to clear out the spare bedroom / office we have in the flat. I’d not really paid attention to the accumulation of various things left lying about as new tech was bought, board games stopped being played and hobbies were given up on.

So far I’ve been able to sell our various bits of old tech for about £600 and I’m currently waiting on pricing for the mound of comics and board games that I decided I no longer needed.

Hopefully the proceeds from this harvest will help cover the rent for one of the months I won’t be receiving my normal salary, but mainly I’m just happy to have the space back.

Went pumpkin picking

At the start of the month my girlfriend dragged me down to a random field just outside of Leeds to pick pumpkins for carving ahead of Halloween. However, instead of buying one each as originally planned, we ended up with five in total.

I was quite happy with how the carvings came out, but today we had to chuck them because they’d gone mouldy, so we might have been a little too early in carving them.

I did the top left one and was very happy with how it came out. I also did the M one (it’s a red pumpkin so it’s meant to be a Megamato from Kirby) but that was a right pain to carve so I won’t talk about it.

Leeds Testing Atelier

Also at the start of the month was Leeds Testing Atelier. It was the first one in about three years and it was great meeting up with everyone in the Leeds testing community again.

There were some really great talks and all round it was a really fun day.

Got a new iPhone

I was a little on the fence about buying a new iPhone as I’ve previously tried and failed to move away from Apple (my justification for that failure is that they make good phones and I’ll trust them for now as I’ve moved core things to my own NAS).

In November I’ll be going abroad for the first time since 2019, so I wanted a good camera to take with me. I looked at a number of compact systems (a DSLR feels too bulky to lug about) but none of them could match the iPhone 14 Pro for quality, and they were around the same price, so I buckled and upgraded from an iPhone 11 Pro to the iPhone 14 Pro.

The main thing I’ve enjoyed about the phone so far is the LiDAR sensor, which isn’t a new feature (I think the iPhone 12 Pro had one) but it’s new to me, and I’ve been putting it to work with a number of AR apps (I’m also looking to build my own).

The best app I’ve seen so far is Scaniverse by Niantic which makes some really amazing 3D models just by scanning objects. Below is a tweet with a video of one of my scans.

Video of a 3D scan of a tree with mushrooms on it I took in Hyde Park, Leeds


LeedsJS

LeedsJS was back this month, the first one since I was made JS & UI development capability lead at work, so it’s been good to get a little more involved in things.

This month there was a demo of building JS applications without writing JS (essentially using features of MVC frameworks to provide ‘live’ areas of a website) and Ady Stokes’ amazing Accessibility Quiz, which left everyone thinking more about some of the assumptions they hold about accessibility.

Ported my MIDI to LSDJ tool to JS

Between 2017 and 2019 I spent some time figuring out how to parse a MIDI file and then transcribe that MIDI representation of music into the structure used by Little Sound DJ, a tracker for the Nintendo Game Boy.
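At its core, that transcription involves turning MIDI note numbers into the note-name-plus-octave form a tracker displays. As a simplified sketch of that step (this is illustrative only, not the project’s actual code, and LSDJ’s own octave numbering may differ):

```python
# MIDI note numbers run 0-127; a tracker like LSDJ displays each note as a
# pitch class plus an octave. This uses the common convention where MIDI
# note 60 is middle C, written "C4".
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_note_to_name(note_number: int) -> str:
    """Convert a MIDI note number to a note name, e.g. 60 -> 'C4'."""
    octave = note_number // 12 - 1  # MIDI note 0 is C-1 in this convention
    return f"{NOTE_NAMES[note_number % 12]}{octave}"
```

So `midi_note_to_name(60)` gives `"C4"` and `midi_note_to_name(69)` gives `"A4"`; the real tool also has to handle timing, channels and LSDJ’s phrase/chain structure on top of this.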

The original version was written in Python, and the library I used to parse the MIDI file was an old one written for Python 2.7. In the last three years Python 2.7 has finally died, so it was time to upgrade.

Unfortunately the Python MIDI space seems to be a bit underserved, whereas Node.js had a number of libraries, so I experimented a little and found a good replacement. That meant porting the Python code over to Node and cleaning it up (it was a right mess).

I’m happy with the result though, as it also gave me an excuse to build something in Next.js.

Got a MacBook Pro

Similar to the iPhone upgrade, I found myself falling back into Apple’s clutches: although Framework have released some firmware to improve battery life, I still don’t seem to get more than four hours out of my first-generation Framework laptop.

Given that I’ll be travelling for a month, and that for whatever reason Linux file managers just don’t have the column view that Finder has, I decided to buy an M1 MacBook Pro (I have one for work and the battery life is crazy good).

I got a good deal on it too: Currys had a sale on the base 14" model and John Lewis matched that price, so I was able to shave about £150 off the price I’d have paid directly from Apple.

Tried out Apple Freeform

iPadOS 16.2 is now in beta, and with that new version comes Apple’s Freeform app, a collaborative whiteboard experience that takes advantage of Apple’s communication APIs and the iPad’s Apple Pencil accessory.

After trying it out, however, I found it to be utter crap: it’s so hilariously lacking in integrations with the tools people will actually want to use to pull in information to collaborate on that it’s going to go the way of Apple’s other productivity software like Numbers and Pages, where you use it because it’s there, not because you want to.

Started playing around with Face Recognition libraries

A week or so ago I found out about the face_recognition Python library and started having a bit of a play around with how that library might be used in a web context.

The library alone wouldn’t be enough to scale up to web levels of data, but I found a neat euclidean-distance-based solution that uses the cube extension in PostgreSQL. This means a request only needs to process a face to get its vector representation, which can then be used in a database query, instead of having to store and then load every vector for every face the system has seen using the Python library.
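As a rough sketch of how that matching works (the table and column names below are made up for illustration): face_recognition encodes each face as a 128-dimension vector, two encodings are typically treated as the same person when their euclidean distance is under about 0.6, and PostgreSQL’s cube extension (enabled with `CREATE EXTENSION cube;`) provides a `<->` euclidean distance operator so the comparison can happen in the database.

```python
import math

# Two face encodings are usually considered a match when the euclidean
# distance between them is below ~0.6 (face_recognition's default).
MATCH_THRESHOLD = 0.6

def euclidean_distance(a, b):
    """Euclidean distance between two face-encoding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(known_encoding, candidate_encoding, threshold=MATCH_THRESHOLD):
    """In-Python version of the comparison, useful for small data sets."""
    return euclidean_distance(known_encoding, candidate_encoding) < threshold

# The same comparison pushed into PostgreSQL via the cube extension, so
# only the query face needs encoding at request time. The "faces" table
# and "encoding" column are hypothetical.
FIND_MATCHES_SQL = """
SELECT id, cube(encoding) <-> cube(%(query)s) AS distance
FROM faces
WHERE cube(encoding) <-> cube(%(query)s) < 0.6
ORDER BY distance
LIMIT 5;
"""
```

The upshot is that the web request does one expensive step (encoding the uploaded face) and the database does the cheap vector comparisons, rather than Python iterating over every stored encoding.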

I’m currently working through a demo project for this tech using Strapi (tweet with video below) and have seen that face-api.js can provide these vectors too so I might move to that in order to make the API more responsive.

Demo of the API I built being called by Strapi when media is uploaded and then querying using a different pic of same person to get that media



Colin Wren

Currently building reciprocal.dev. Interested in building shared understanding, Automated Testing, Dev practises, Metal, Chiptune. All views my own.