Setting up test automation for embedded devices poses its own unique challenges. This little podcast kicks off the first ever season of DevOps Sauna.

Heidi (00:05):

Hi, and welcome to DevOps Sauna, Eficode's brand spanking new podcast about all things DevOps, automation, and continuous delivery. So we're sitting here on the top floor of Eficode's Helsinki head office, where our sauna is. Now before we get going with our first guest, I thought I'd say something about the thinking behind DevOps Sauna as a podcast.

Heidi (00:31):

Now, every Friday at Eficode we heat up the sauna, and that's where lots of the consultants go to unwind after a busy week of pipeline building and problem-solving and what have you. Now, for those of you who aren't familiar with saunas, I've got some homework for you. You need to watch a documentary called Steam of Life, or [foreign language 00:00:54] in Finnish. It's a fantastic documentary. And it embodies the philosophy of why saunas are such places of honesty and sincerity. It's where people open up to each other.

Heidi (01:06):

And saunas have swept over the world. It's the only Finnish loanword in the English language. And wherever you go, you're going to find a sauna, which is like DevOps. DevOps has gone mainstream. It's changing the way people are building software. So that's why we thought DevOps Sauna would be a good idea. We wanted to bottle up some of the conversations we were having around the office and share them with you, really.

Heidi (01:32):

So, how about we dive right in with our first guest. I couldn't be more excited about her. She's one of our senior consultants, and I think I'm going to call her Alice because she's always looked like an Alice to me. How's it going, Alice?

Alice (01:50):

Yeah, it's okay. We have a busy day as usual, but that's what happens.

Heidi (01:56):

Fantastic. And you've made time for us, which is so great. Couldn't be more pleased to be sitting here with you. So you're a senior consultant. What kind of industries have you worked across?

Alice (02:08):

Yeah, so I've had quite a lot of different projects, I like to change often. So far I've worked in the health industry, media, some wood processing, and automotive.

Heidi (02:26):

Fantastic. And what kind of DevOps have you been doing in those industries?

Alice (02:31):

Yeah, it depends, but most companies are trying to enter this game of automating as much as they can, including automating things that were previously done by people doing ad hoc admin work. So making it simpler, easier, and more repeatable to deploy servers, to automate their tests, and in general to work on their software in a more continuous manner.

Heidi (03:09):

That is well cool. What's the best-case scenario when it comes to test automation of embedded software when working with a client?

Alice (03:18):

Yeah, so I haven't actually seen the best case. I mean, the companies that have it all together and have the best case very often just do it by themselves. They don't call in consultants. So I'm basically inferring from all the mistakes I've seen and all the problems that I've been called in to fix. And to think about the best way of doing test automation, especially in embedded software: embedded has historically been very much into the waterfall model. So you first plan it, then you implement it, you make the device. And when you have it, you test it.

Alice (04:05):

It's very difficult to guess what kind of issues you're going to have until you actually have the device in your hands and can test with it. And that of course requires some experience, so that when you sit down and plan, you can already foresee what kind of trouble you're going to have. But iterating over hardware is usually costly, so you usually want to have as few prototypes as possible, just one or two, and then go into production with that. So already when you start planning, it's important to think about what test automation you're going to do.

Alice (04:51):

Many embedded companies think, "Yeah, we're just going to test it." And then testing usually means manual testing. But manual testing and automated testing are very different. The way humans interact with the devices is a completely different story.

Heidi (05:13):

Yeah, yeah.

Alice (05:13):

And very often, developers that haven't done test automation don't even consider it until you start talking with them, and they realize, "Yeah, it's a catch-22. You haven't seen it before, you haven't tried it, but you have to design for it." So the earlier you do it, the better. The earlier you think about how you will be able to access the inner workings of the software when it's in the embedded device, and how to do that securely. So you don't want only a QA engineer who specializes in test automation for embedded, you also want a security engineer with you, to get you all those things not only transparently and easily, but also securely.

Alice (06:07):

So when this product goes into the world, it's not something that you can just hack in a matter of seconds. There are possibly lives at stake, but maybe that's because I've been in automotive, where lives actually are at stake. So [crosstalk 00:06:27] for that.

Heidi (06:27):

Yeah, you're putting actual humans in that embedded software. Yeah, so it sounds like the best-case scenario is when you get all those experts in at the very beginning. You did however mention that there are some issues that can pop up if you haven't done that, or if you're not that familiar with test automation, with embedded software. Is there anything that companies can do to prevent those issues in the beginning? What are some of the tricks that you've learned that they can take with them?

Alice (06:57):

So, the first thing that you want to take into account is that even if you've done mostly embedded software, you actually want to draw on the lessons from regular software development. Because in regular software there is the concept of the MVP: you try to get the first minimal version out very fast and already test that. You want to test and develop incrementally, in chunks, and everything in the software should be as modular as possible. Then, even before you have the device, even before you're sure exactly how it's going to be, once you have a skeleton that you designed to be very modular, you can already start unit testing and component testing the software on your computer.
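The modular skeleton Alice describes often comes down to hiding the hardware behind a small interface, so the device logic can be unit tested on an ordinary developer machine before any prototype exists. A minimal sketch, with all class and method names invented for illustration:

```python
from abc import ABC, abstractmethod

class Display(ABC):
    """Abstraction over the device's output hardware."""
    @abstractmethod
    def show(self, text: str) -> None: ...

class FakeDisplay(Display):
    """Host-side stand-in used in tests: records what would be shown."""
    def __init__(self):
        self.lines = []
    def show(self, text: str) -> None:
        self.lines.append(text)

class Thermostat:
    """Device logic written against the abstraction, not the hardware."""
    def __init__(self, display: Display):
        self.display = display
    def report(self, celsius: float) -> None:
        if celsius > 30.0:
            self.display.show("OVERHEAT")
        else:
            self.display.show(f"{celsius:.1f} C")

# A unit test that runs on any computer, no device needed.
def test_overheat_warning():
    display = FakeDisplay()
    Thermostat(display).report(35.2)
    assert display.lines == ["OVERHEAT"]
```

On the real device, a `Display` implementation would drive the actual screen; the logic under test never changes.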

Alice (07:50):

And the next thing is that when you're going to have that software inside the device, you also want a way of directly interfacing with the software on the device, like plugging a cable in and just talking to it. That matters especially if the end result is a self-contained device, where the only interface intended for the customer is buttons, a touch screen, or some other kind of input designed for hands, and the output is, again, some letters on a screen.

Alice (08:32):

If you want to automate those very physical inputs and outputs later in the game, it's much more difficult, much less reliable, and it takes much more time. You can have a camera looking at the screen and it will work somehow, but image recognition isn't the fastest. The display will take a while to show things that you could get much faster if you just plugged in a cable and had all that communication go through there. So you could already do much better if you thought about this, that's the [crosstalk 00:09:11] you will have it. And it is something that you might not think about at first.
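The cable-based interface Alice contrasts with a camera could be as simple as a line-oriented debug protocol: the test harness injects button presses and reads back what the firmware is rendering, instead of using a robot finger and image recognition. A sketch of the harness side, with a simulated device standing in for a real serial port (the protocol and all names are made up):

```python
class DebugLink:
    """Test-harness side of a line-oriented debug protocol. In a real
    setup `transport` might be a serial port; here it is anything with
    write()/readline(), so the sketch runs without hardware."""
    def __init__(self, transport):
        self.transport = transport

    def command(self, cmd: str) -> str:
        self.transport.write((cmd + "\n").encode())
        return self.transport.readline().decode().strip()

    def press(self, button: str) -> str:
        # Inject a button press directly, instead of a robot finger.
        return self.command(f"PRESS {button}")

    def screen_text(self) -> str:
        # Ask the firmware what it is rendering, instead of pointing
        # a camera at the display and running image recognition.
        return self.command("GET SCREEN")

class SimulatedDevice:
    """Trivial fake firmware for demonstration."""
    def __init__(self):
        self.screen = "READY"
        self._reply = b""
    def write(self, data: bytes) -> None:
        cmd = data.decode().strip()
        if cmd.startswith("PRESS"):
            self.screen = "MENU"
            self._reply = b"OK\n"
        elif cmd == "GET SCREEN":
            self._reply = self.screen.encode() + b"\n"
    def readline(self) -> bytes:
        return self._reply

link = DebugLink(SimulatedDevice())
assert link.press("OK") == "OK"
assert link.screen_text() == "MENU"
```

The point is that the test path bypasses the physical interface entirely, which is both faster and far less flaky than optical recognition of a screen.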

Heidi (09:20):

Yeah, it's almost counterintuitive. You have this thing that's meant to be used by hands and yet in creating it, you can actually be testing it without using humans at all. It takes a bit of work in the beginning, but it's well worth it. It saves you money, it saves you time at that point.

Alice (09:36):

Yes.

Heidi (09:36):

However, if you do it at the end of the process, that might be really difficult and expensive for you.

Alice (09:41):

Yeah, it's extremely expensive. Especially when you realize that with human inputs, there will always be some errors coming out of those. You would rather have something that repeatedly does it the same way and gives you the same results. You want the test environment to be as replicable, as [inaudible 00:10:04] as possible. Then, when you test, a failure in the test gives you a clearer signal that the software has made a mistake. It's not your test setup, it's not the inputs, it's not image recognition on the screen. It was the software.

Heidi (10:24):

Let's switch gears. Could you give me a scenario that you've seen in customers in the past when it comes to test automation of embedded devices? That would be interesting.

Alice (10:36):

Yeah. So there is actually a ... When you think about it as a consultant, since I enter situations that are already developed, I have seen situations where there was a whole embedded device that was not prepared for test automation at all. So yeah, late in the game you have the device, you have human-only inputs and outputs, and you have to automate that. This is very hard and I've seen lots of issues stemming from it, but I don't think that's the most interesting one. The one I think is more interesting is that customers, to a degree, know what they're working with. I mean, they built it.

Heidi (11:18):

Yeah. [crosstalk 00:11:19], yeah.

Alice (11:20):

So I've seen it where the customer had test automation for running the tests, but the testers had to physically go to the device, connect to it, and run the tests. So you had a calendar reservation for each device. You made a reservation in advance, you took half an hour or one hour, however long you thought the tests would take. And then when the day and hour came, you had to pack your things, go to the lab, set up, connect, run the tests, collect the results, pack yourself up, and go back to your desk.

Alice (12:09):

So that took a lot of extra time, and the scheduling with the calendar was not ideal. And if the tester had run the tests and gotten the report but did not share it, the results were basically lost to everybody else. I've seen that remedied by hooking everything up to a CI chain: all the laboratory devices were connected to computers, the computers were running a CI agent, all of those CI agents were reporting to a master, and in the master the testers could just queue up nicely with the configuration of their choice.

Heidi (12:56):

Yep. Yep.

Alice (12:57):

And that configuration would then, through CI, select the corresponding device and run the right tests. And then the results would not only go to the tester who ran them, they would also go to a database that would later be used to do full analysis of all the results.
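The dispatch step Alice describes, where the CI master matches each queued job to a free lab device with the requested configuration and stores every result centrally, could look roughly like this toy sketch (the data model and all names are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    config: dict          # e.g. {"model": "X2", "firmware": "1.4"}
    busy: bool = False

@dataclass
class Job:
    test_suite: str
    wanted: dict          # configuration the tester asked for

class LabMaster:
    """Toy version of the CI master: assigns queued jobs to matching
    lab devices and stores every result where everyone can see it."""
    def __init__(self, devices):
        self.devices = devices
        self.results = []   # stand-in for the shared results database

    def assign(self, job: Job):
        for dev in self.devices:
            if not dev.busy and all(dev.config.get(k) == v
                                    for k, v in job.wanted.items()):
                dev.busy = True
                return dev
        return None  # no matching device free: the job stays queued

    def record(self, job: Job, device: Device, passed: bool):
        device.busy = False
        self.results.append({"suite": job.test_suite,
                             "device": device.name,
                             "passed": passed})

lab = LabMaster([Device("rig-1", {"model": "X2", "firmware": "1.4"}),
                 Device("rig-2", {"model": "X3", "firmware": "2.0"})])
dev = lab.assign(Job("smoke", {"model": "X3"}))
assert dev.name == "rig-2"
```

Because every run lands in `results` rather than on one tester's desk, the same data can later feed the company-wide quality statistics Alice mentions next.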

Heidi (13:22):

Yeah, yeah.

Alice (13:22):

So you suddenly got visibility at the level of the whole company. And that became much more than just automation giving you ease of testing. It suddenly gives you long-term statistics on the progression of the quality of the software.

Heidi (13:45):

Yeah. And visibility is huge. So many companies are after visibility over their software development processes, because it's like a superpower.

Alice (13:51):

Yeah. And you get trends. You're able to get much more information out of that. And then you can develop software much, much better.

Heidi (14:03):

Yeah. That is a best-case scenario. Okay, thank you so much for that. I think that's a great place to end our first ever DevOps Sauna podcast, which ended up being mostly about test automation and embedded devices. So thank you so much for that insight, Alice.

Alice (14:21):

Thank you.

Heidi (14:22):

So just to round off, if you're not following us yet on social media, please feel free to follow us on your social media channel of choice, and Eficode too. And we had some exciting news a week and a half ago: Eficode is joining forces with a Scandinavian DevOps consultancy called Praqma, that's P-R-A-Q-M-A. They have some rock-solid content on their website and across their social media channels, so feel free to check those out as well. And I'll see you, or you'll hear me, at the next episode of DevOps Sauna. Bye.