Automating OctoPrint's release tests

There was something different about the 1.4.1rc4 release candidate that you probably didn’t notice (unless you follow me on Twitter or already watched OctoPrint On Air #33), but it made a world of difference to me. 1.4.1rc4 was the first release candidate to go through final release preparations using my newly created test setup, which will save me a ton of time per release cycle, and since it features some fairly cool technology and capped a journey of several years, I want to tell you a bit about it.

The problem

Let’s start with what the release preparations have looked like for the past few years. I have a release checklist that I run through for each and every release and release candidate. It’s currently maintained in Microsoft OneNote, but it would work just as well as a simple Markdown document in a Gist, and for the longest time that’s actually how I did it.

This checklist contains all the steps I need for pushing out a release, from the final commits for updating the translations, supporter list and error reporting IDs, through writing the release notes and tagging the release version, to running a whole gauntlet of install and update tests before finally pulling the actual trigger on the release, pushing a prepared blog post live and merging the release branches back through the whole branch hierarchy of the code repository.

The part of that process that so far ate the most time for each release was the install and update tests. In order to ensure that there are no nasty surprises when pushing a release out to every one of you, I flash several instances of various versions of OctoPi and manual installs, provision them, set them to a defined start state consisting of a start version and a selected release channel, then fake the release notification and check that the update finishes successfully and everything looks fine. I have been using a bunch of helper scripts powered by Fabric 1.x for huge parts of that for a long while now, but some steps of the whole process I still had to do manually for each and every single update test, of which there are usually 7-10 per release: flashing SD cards and creating my starting point.

For every update scenario I had to:

  1. Power down a test Pi
  2. Eject its SD card
  3. Insert the card into the card reader on my desktop PC
  4. Flash the target image
  5. Configure WiFi, hostname and password
  6. Eject the card from the card reader
  7. Reinsert the card into the test Pi
  8. Wait for the Pi to boot
  9. Run through the first time setup wizard
  10. Update OctoPrint to the version I want to test updating from
  11. Configure the correct release channel
  12. Install a custom plugin to fake release notifications

I automated step 5 a couple of years ago using a task defined in my Fabric file which places the needed octopi-wpa-supplicant.txt, octopi-hostname.txt and octopi-password.txt files on the boot partition of the SD card. That removed at least one time sink from the process. The next step was to automate steps 9 through 12, again via some added tasks in my Fabric file, and after some fiddling around that was achieved as well.
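
To give a rough idea of what such a provisioning task can look like, here is a minimal sketch in the spirit of a Fabric 1.x fabfile - the mount point, the wpa_supplicant template and all defaults are illustrative assumptions, not the actual fabfile:

```python
# Minimal sketch, not the actual fabfile: writes OctoPi's octopi-*.txt
# configuration files onto an already mounted boot partition.
import os

# Illustrative wpa_supplicant snippet; the real octopi-wpa-supplicant.txt
# template contains a few more options.
WPA_TEMPLATE = """country=DE
network={{
  ssid="{ssid}"
  psk="{psk}"
}}
"""

def provision_boot(boot="/media/boot", ssid="testnet", psk="secret",
                   hostname="octopi-test", password="raspberry"):
    """Drop WiFi, hostname and password config files onto the boot partition."""
    with open(os.path.join(boot, "octopi-wpa-supplicant.txt"), "w") as f:
        f.write(WPA_TEMPLATE.format(ssid=ssid, psk=psk))
    with open(os.path.join(boot, "octopi-hostname.txt"), "w") as f:
        f.write(hostname + "\n")
    with open(os.path.join(boot, "octopi-password.txt"), "w") as f:
        f.write(password + "\n")
```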

But the biggest time sink of all was still the SD card juggling and waiting for things to flash, i.e. steps 1-4 and 6-8. For those to happen, SD cards have to be physically moved around. Booting the test Pis from the network or via a configurable USB drive or something similar was not an option: to do that I’d have had to change the firmware or the boot parameters of the test Pis themselves, causing them to no longer mirror the devices you all are using out there for running OctoPrint, and thus making the tests less trustworthy.

Looking for solutions

I had been pondering this problem for several years, and over time I kept looking into possible solutions again and again while manually running my test cluster.

I looked into a flash station that would at least allow me to flash a whole bunch of cards at once for all of the test scenarios, but most available flash stations are built to flash the same image to multiple cards, not multiple images, so that was a dead end (it would also have been prohibitively expensive and would still have forced me to juggle cards).

I looked into SD card emulation, but had to realize that it would at the very least require an FPGA, a lot of time and firmware development, and even then wouldn’t guarantee the outcome I needed: tests slowed down by emulation would negate most of the speed gains achievable through automation and would also be very frustrating to use.

I looked into building something that could switch the data lines of an SD card between two readers, and that multiplexing approach looked like the most promising one. But the DIY solutions I found sounded a bit hit and miss, and I couldn’t find anything affordable that was actually still buildable or buyable. That is, until my most recent research in that direction at the start of July.

Building a prototype

I found a nifty little device produced by Linux Automation GmbH called the USB-SD-Mux, created specifically for testing embedded devices that boot from MicroSD cards and need their SD card switched between themselves and a test host in order to be presented with various test scenarios. It is a little board that you plug into the MicroSD slot of your device under test (DUT) on one side, insert your MicroSD card into a slot on the other, and then connect to your test host (running Linux with at least a 4.x kernel) via a MicroUSB cable. You can then toggle the card between host and DUT with a simple command. If it’s switched to the host, it shows up as a regular mass storage device. With some udev rules you can even make the device names it presents predictable and thus fully automate toggling and mounting.
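
To illustrate what that scripted toggling can look like from the host side, here is a minimal sketch assuming the usbsdmux command line tool is installed and a udev rule provides a stable symlink - the device path is made up for the example:

```python
# Minimal sketch: toggle a USB-SD-Mux between test host and DUT by calling
# the usbsdmux CLI. The device path is an assumption based on a custom udev
# rule; the raw /dev/sg* device works as well.
import subprocess

MUX_PI3 = "/dev/usb-sd-mux/id-000000001234"  # illustrative udev symlink

def card_to_host(mux=MUX_PI3):
    """Present the MicroSD card to the host as a mass storage device."""
    subprocess.run(["usbsdmux", mux, "host"], check=True)

def card_to_dut(mux=MUX_PI3):
    """Hand the MicroSD card back to the device under test."""
    subprocess.run(["usbsdmux", mux, "dut"], check=True)
```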

One of these boards costs around 100€, but it was pretty much exactly what I needed, so I ordered one to test.

Additionally, I looked around for a way to switch power to a bunch of USB powered Pis without having to use WiFi enabled power sockets or design something myself, and thankfully stumbled over Yepkit’s YKUSH device. It’s a 3 port powered USB hub capable of individually switching each port (again via a small command line tool available for Linux) and claiming to support up to 2A concurrently per port. Cost: ~35€. I ordered one as well.
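
Switching a DUT's power then boils down to wrapping Yepkit's ykushcmd tool in a similar way - another small sketch; to my knowledge -u powers a port up and -d powers it down, but double-check against your ykushcmd version:

```python
# Minimal sketch: switch a single YKUSH downstream port (1-3) on or off via
# Yepkit's ykushcmd command line tool.
import subprocess

def dut_power(port, on=True):
    """Power a DUT up or down by toggling its YKUSH port."""
    flag = "-u" if on else "-d"
    subprocess.run(["ykushcmd", flag, str(port)], check=True)
```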

The parts arrived and I rigged up a first prototype, with a Pi 4 as the controller/host and a single DUT - a Pi Zero W that had been collecting dust. After some late night fiddling around and installing some tools on a fresh Raspbian image, I was able to toggle the DUT on and off and switch its MicroSD card between host and DUT and back while SSH’d into the controller.

After that successful first test I quickly ordered a second USB-SD-Mux and started designing the actual testrig.

The testrig

The goal was to create a testrig capable of supporting two DUTs (a Pi 2 and a Pi 3), controlled by a Pi 4, all of it contained in a small but accessible and maintainable package that can still be expanded should I want to add a third DUT at some point.

Similar to my former test cluster setup, the DUTs are mounted vertically with the ports facing forward and the SD card with the inserted USB-SD-Mux facing backwards. I used a small printed standoff to also secure the USB-SD-Mux to the tray so that nothing can wiggle loose.

To the side of the DUTs sit the controller and the USB hub in a horizontal configuration. I modified the USB hub to have a 5.5x2.5mm power input into which I plugged a 5V/8A power supply, which should hopefully prevent any undervoltage issues from cropping up on the DUTs.

Thanks to the awesome people at Mr Beam Lasers GmbH I’ve recently been able to add their OctoPrint powered MrBeam dreamcut to my office, so that’s what I used to create the mounting hardware I needed out of 3mm plywood. And this is the result:

SSHing into the controller now allows me to quickly switch both DUTs on and off, and also to toggle their SD cards between controller and DUT in order to flash them, configure WiFi credentials and so on. I added a bunch of commands to my Fabric file to accomplish that as well, and after some more scripting I can now say that my goal has finally been achieved: all of the steps outlined above can be done with a command line like

TAG=1.4.1rc4 TARGET=pi3 fab flashhost_flash_and_provision:0.17.0 octopi_test_update_maintenance:maintenance,version=1.4.1rc3
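
To give a rough idea of what happens behind such a command, here is a purely illustrative sketch of how the individual pieces chain together - none of the helper, task or device names are taken from the actual fabfile:

```python
# Purely illustrative sketch of a flash-and-provision step; device paths and
# the image handling are assumptions, not the actual fabfile.
import subprocess
import time

def flash_and_provision(image_path, mux="/dev/usb-sd-mux/id-000000001234",
                        ykush_port=1, sd_device="/dev/sda"):
    subprocess.run(["ykushcmd", "-d", str(ykush_port)], check=True)  # DUT off
    subprocess.run(["usbsdmux", mux, "host"], check=True)            # card to host
    time.sleep(2)                                                    # let udev settle
    subprocess.run(["dd", "if=" + image_path, "of=" + sd_device,     # flash the image
                    "bs=4M", "conv=fsync"], check=True)
    # ... mount the boot partition here and drop the octopi-*.txt files ...
    subprocess.run(["usbsdmux", mux, "dut"], check=True)             # card back to DUT
    subprocess.run(["ykushcmd", "-u", str(ykush_port)], check=True)  # power the DUT up
```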

As I said, 1.4.1rc4 was the first release candidate to be pushed out with the help of the revised test setup and using the testrig, and the update testing that used to take 2h+ was done in under 1h. To say that I’m ridiculously excited would be an understatement.

Conclusion

After several years of juggling SD cards for every release and growing increasingly frustrated by it, I finally found a solution to automate all of that and then some. It wasn’t a cheap solution, but considering how much time it will save me in the future and how many cases of “did I do X yet or didn’t I?” slowing down or nuking whole test runs it will prevent, it was well worth the investment.

I might still look into further automating the actual update and “does everything look good” steps of the test runs, using end-to-end tests powered by Cypress like the ones I recently added to the continuous integration on GitHub, but for now I’m calling this long overdue build a very successful project and can’t wait to keep using it for future releases.

If you want to take a look at the Fabric file or the BOM and files of the testrig, you can find all of that on GitHub.


This is a companion discussion topic for the original entry at https://octoprint.org/blog/2020/07/29/automating-octoprints-release-tests/

Fantastic stuff! Always glad to see open source projects converging on more efficient production of the project to the benefit of both those who develop it and those who use it.

oh this makes the OCD in my heart sing. so now I can expect updates twice as fast right? :sweat_smile:

@Mcmasterp all I can promise is that they might arrive around noon CET of The Release Day™ more often now than around the afternoon :stuck_out_tongue:

It's always good to be able to automate the boring things! I use fabric extensively and help out with support on IRC when people have questions. I noticed you don't use @task decorators in your fabfile for the fabric tasks that should be accessible. You can prevent fab -l from displaying functions like env_from_yaml and dict_merge by only decorating the functions you wish to become tasks. Also since you use Fabric 1.x I assume you already know that 2.x is not quite ready for prime-time. There's currently an effort planned to bring patches from the "Fabric3" fork in to the main tree to add Python 3 support for Fabric 1.x while 2.x work continues.
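
For illustration, a minimal sketch of what that looks like in a Fabric 1.x fabfile (the function names are just examples):

```python
# Once at least one function in the fabfile carries @task, only decorated
# functions are treated as tasks and listed by `fab -l`.
from fabric.api import task

def dict_merge(a, b):
    """Plain helper - no longer shows up in `fab -l` once @task is in use."""
    result = dict(a)
    result.update(b)
    return result

@task
def provision(host="octopi.local"):
    """Decorated function - this is what `fab -l` will list."""
    print("provisioning {}".format(host))
```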

Ah, thanks for that, I'll have to add @task. The fabfile organically grew over the years and at first only consisted of tasks. I honestly never checked if there was a decorator for that ^^

I am actually running my build via "Fabric3" since I already ran into the issues you mentioned. I tried for half an afternoon to migrate my existing stuff to 2.x, then gave up frustrated and just used the fork with Py3.

Wow, that's a lot that goes into a release! I never would have guessed. Open source can be quite fulfilling, but also demanding. Keep up the great work!

Thank you for doing such complete testing, it definitely shows in the quality of the releases.

I first saw the sd-mux back in October, when colleagues who do a lot of embedded Linux work (and thus have similar use cases) were showing it off. I was considering it as a way to put files on my printer's SD card until I found out the price... (I think I can actually get away with just sharing the SPI pins, but it hasn't been important enough to me for now.)

Hadn't heard of Fabric before, looking into it.

Thanks for the useful post!

We all under-appreciate how much work and effort you put into OctoPrint. That does not mean that we are not all eternally grateful. It is great to see the actual hardware process that is necessary to deal with a microSD based system.

Impressive test rig!
I am not a programmer or developer, but I can see how I could use some of these ideas for maker projects. On the subject of automated setup: is there a way to preset things for the lay user? Such as:
- a config file that can be dropped onto the SD card after imaging
  - either by editing a text file or via a web tool similar to the Duet 3D RepRap configurator; this would also allow designating which plugins to add
- setup questions during the first boot of OctoPi
- more setup questions in the web interface setup wizard when first accessing OctoPrint

I realize not everyone is using the same Linux distro, but something that can detect the most commonly used versions and make changes to them would help. I, for one, like to assign a static IP on Raspbian. I believe a config file would also make pip and distro upgrades easier: back up the current config file as a quick way to restore any lost settings, or use it on a clean install.

I'd tried using those external SD reader things you've got in the first picture, but I found that cable length seemed to have a detrimental effect on SD cards of certain speeds. Varying the cable length seemed to mitigate it slightly, but still left questionable results.

A brief attempt was made to build something like the mux you listed, but I gave up on that due to the ridiculously small scale of the things that needed to be soldered.

Of course, I had no idea that an SD mux even existed.

Thank you for all your hard work!

That's an idea that I've had as well, but have not yet had time to dig into. Mind you, that would be more of an OctoPi feature than an OctoPrint feature (see here for the difference).

Since OctoPi usually runs headless that's pretty impossible - you can't display setup questions if there's no display.

Like? Imho the first run wizard is already fairly long and it would be detrimental to user experience to add a ton more stuff to it.

Just to make sure, you are aware of the bundled backup plugin?

I must have been lucky then :sweat_smile: I had been rocking that setup for 5 years or so before I built the new rig.

And this is improved in the upcoming 1.4.1 release, as the initial wizard step is now a restore from backup.