Maestro CEO Leland Takamine joins Jamon and Robin to discuss Maestro’s growth, new tools like Studio Desktop and cloud testing, and why reliability matters for automated testing.
Show Notes
This episode is brought to you by Infinite Red!
Infinite Red is an expert React Native consultancy located in the USA. With nearly a decade of React Native experience and deep roots in the React Native community (hosts of Chain React and the React Native Newsletter, core React Native contributors, creators of Ignite and Reactotron, and much, much more), Infinite Red is the best choice for helping you build and deploy your next React Native app.
Todd Werth:
Welcome back to React Native Radio Podcast, brought to you by My Machine. Tired of things not working? Try My Machine. Everything works there. Episode 341: Catching Up with Maestro.
Jamon Holmgren:
Leland, do you remember how we got connected up? I think it was like two or three years ago, something like that.
Leland Takamine:
Yeah, it must have been a few years ago. I'm trying to think back. I think it might've been three years ago, but I think pretty soon after that we were on this podcast actually.
Jamon Holmgren:
Yeah, so maybe it was. We saw Maestro. I know we had Dima on, and it was episode 262. We looked it up beforehand. That was early 2023, so I don't remember. Did Maestro kind of come out right before that? Something like that? It was something like
Leland Takamine:
Late
Jamon Holmgren:
2022 or,
Leland Takamine:
Yeah, I remember because we released it right before Droid Con, New York 2022, so yeah, it was about, I guess three years ago now.
Jamon Holmgren:
Wow. A lot of water under the bridge since then. We're going to go over some of that stuff, but Leland, can you introduce yourself?
Leland Takamine:
Yeah, for sure. I'm Leland, the CEO and co-founder here at Maestro. We've been at it for a while now. The company's been around for I think a little over five years now, but yeah, Maestro was released about three years ago.
Jamon Holmgren:
Yeah. Awesome. How did you get into tech? What was your journey?
Leland Takamine:
Oh, let's see. Well, I was an Android engineer from day one. And my first gig was at a company called Guidebook. They're still around now, but it was an events platform.
Robin Heinze:
My gosh, that's so funny. I worked for a competitor of Guidebook, DoubleDutch, and early on CrowdCompass actually.
Leland Takamine:
Okay, I remember that.
Robin Heinze:
Yep.
Leland Takamine:
Nice. Which is now
Jamon Holmgren:
Cvent or something.
Robin Heinze:
Yeah, it's now Cvent. Yeah.
Leland Takamine:
Gotcha. Gotcha. But yeah, that was a cool gig. So I was there for, I want to say like three years, and then I was at Uber for about five, and I think that's where I cut my teeth. Got really deep on the mobile side, typical founder story, taking what I learned there and trying to scale it up to the rest of the industry.
Jamon Holmgren:
Right. Yeah, totally. I've actually found this to be kind of cool, where we get this melding of the mobile world and the web world on the React Native side of things. Obviously Maestro isn't just React Native, but we see Maestro as pretty important here on the React Native side of things.
Leland Takamine:
Yeah, I think the missions align well with React Native, and Expo specifically too: just the idea that there shouldn't really be two completely separate worlds when it comes to web and mobile, especially with companies now trying to align and unify the experiences. It's really tough when you have to train up two completely separate skill sets and completely separate tech stacks, on the development side and on the testing side, obviously. So that's what we're trying to do here as well.
Jamon Holmgren:
Awesome. Let's get into the topic. Real quick before that, I want to talk about our sponsor, my company Infinite Red. Infinite Red is a premier React Native consultancy, fully remote in the US. We're a team of 30 senior-plus level React Native developers, mostly staff level, although the whole leveling thing, that's a whole discussion we should have sometime. We've been doing this for nearly a decade. Staff,
Robin Heinze:
Principal, all the
Jamon Holmgren:
Things. Yeah, all I'll say is our average developer's professional experience is just under 15 years, 14.8 or something like that. I did the math the other day. Pretty cool. If you're looking for React Native expertise for your next project, hit us up: infinite.red/radio. Mazen is not here today. He's still out on baby leave, so it's just Robin, me, and Leland today. So let's get into our topic. As I mentioned, on React Native Radio 262, Dima came on and we were talking about the future for Maestro at that time, talking about better reliability of tests and potentially even natural language test creation, which we can talk about.
Robin Heinze:
We were still kind of introducing Maestro to the audience. It was new enough that we were kind of telling people about it as if they might not know, and two and a bit years later, it's more like, hey, what's Maestro up to now, assuming they already know about it.
Jamon Holmgren:
Yeah, exactly, and even this idea of becoming a hub for all of the app reliability stuff: screenshot testing and different things like feature flags and performance monitoring. Product Hunt, let's start there. You actually made your Product Hunt debut with Maestro Studio Desktop, so that's kind of a big deal that's happened just recently. I want to start with that. Tell us about that.
Leland Takamine:
Sure. Yeah, we just launched on Product Hunt, like you said, and ended up hitting number three for the day, so we were super stoked about that. We were kind of back and forth between second and third, how it goes, but yeah, we were happy with third. It was our first Product Hunt launch. We plan to do more in the future. This was the beta launch of the desktop app, but yeah, it went well. The idea behind Maestro Studio Desktop is that we're taking one of the most loved parts of Maestro, which was Maestro Studio, where the previous version was web-based and you start it from the command line, and making it more accessible, kind of bundling all the good stuff about Maestro into one thing. So when you're getting started with Maestro, you just have to download one thing and you're off to the races. So yeah, the idea behind that is a better, more accessible place for people to use Maestro, and then somewhere where we can build in more powerful features.
Jamon Holmgren:
Yeah, yeah, absolutely. I've been working on some desktop apps, well, basically a rewrite of our Reactotron debugger app that we've been working on for a while, and I don't know, I like working on desktop apps for some reason. I guess because you're testing it in its natural environment, not in a simulator or emulator removed from the computer
Robin Heinze:
You're on. Exactly. This is where it's going to run. It's not just a simulation of where it's going to run.
Jamon Holmgren:
Something about it.
Leland Takamine:
Maybe it's not everybody's thing, right. Yeah, it's really tangible. You get that instant feedback.
Robin Heinze:
I'm just a huge fan. This feels like an unpopular opinion, but I'm actually just a huge fan of desktop apps in general for services that I use. If there's a desktop app version of it, I much prefer to use that for some reason, it's easier for my brain to switch between apps versus going into the browser and the browser is a million things.
Leland Takamine:
It's similar, I think, to the psychology of mobile applications versus the mobile version of a website.
Robin Heinze:
Right. I need to be able to switch to the thing that I'm switching to, and not have to think about, okay, I switch to Chrome and then find the
Leland Takamine:
Right. Yeah, totally.
Robin Heinze:
So I love desktop apps.
Leland Takamine:
Yeah, I think the bar is just higher too. The expectation from a user is just that these things work really well when it's a native mobile app or a native desktop app, right?
Robin Heinze:
Yeah. You have fewer problems with getting logged out all the time, or things expiring, or caches expiring and having to refresh, or the tabs getting closed. It's just a much better-feeling experience.
Jamon Holmgren:
So I want to actually go back to a conversation that you and I had a while ago. I asked what's the status with using natural language, LLMs, AI with Maestro, because that always kind of felt like a logical next step for you folks, especially in the era of AI now. But you had a surprising but logical answer where you said you kind of went away from that, and from what I remember, it was a lot of, nobody can make the cost basis work with that. It's quite expensive to just have AI crawling your app or website or whatever all the time. Was there more that went into that?
Leland Takamine:
Well, actually, it's less about the cost of AI. It's like you said, it's kind of a natural evolution in theory of Maestro, which is already human-readable, very accessible, pretty close to human language already. So it's like, okay, why not just take it one step further? We did that. We took our swing at full natural language, end-to-end testing with our previous product, Robin, that no longer exists. The reason was very clear to us. I mean, it made for extremely impressive demos. We sold a shit ton of it because of that, but as it turns out, it didn't provide the value that our customers or we were expecting out of it. The reason is because when you have AI at runtime, when it comes
Robin Heinze:
To you want your test to be deterministic, you want to know that once you get it passing, it's going to pass every time.
Leland Takamine:
If you have to go and verify a pass or fail every single time because you've got AI at runtime, it no longer saves you anything. You have to be testing the tester at that point, right?
Jamon Holmgren:
I mean, yeah, we were even talking about how you wanted it to be less flaky with Dima. When we were talking with Dima, it was like, how do we make this as reliable as possible? Not by adding AI. At all. It's not going to make it more reliable.
Leland Takamine:
I mean, like I said, it made for a great demo,
Robin Heinze:
I'm sure,
Leland Takamine:
And there's probably 10 other companies now starting from scratch trying to do what we did with Robin, and we see the writing on the wall and we're like, okay, go for it. Yeah, good luck with that. But at the end of
Robin Heinze:
The day, I'm actually really impressed that you tried it and then recognized that it wasn't working and pulled back. There's a lot of companies that would just slap a coat of AI on it.
Leland Takamine:
Yeah, totally. And it wasn't for lack of commercial viability. Like I said, it was really that here at Maestro we're trying to provide real value to our customers and that wasn't it. So yeah, we ended up,
Jamon Holmgren:
Well, I got to say that's a refreshing attitude because everybody else, it seems like has the opposite.
Robin Heinze:
Can
Jamon Holmgren:
We get commercial viability with some flashy demo with AI
Robin Heinze:
And screw our users? Who cares what
Jamon Holmgren:
Cares about quality? Yeah. Yeah.
Leland Takamine:
You can get pretty far like that, but at the end of the day, the house of cards, right?
Robin Heinze:
There's a ceiling.
Leland Takamine:
Tumbling down it comes.
Robin Heinze:
Yeah, exactly. Yeah.
Jamon Holmgren:
Yeah, absolutely. I feel like Expo has taken a similar approach to you. I talked to them recently, and they are partnering with AI companies, but they're really looking for the AI companies that are focused on quality, on a really quality developer experience, user experience, all of those things. So yeah, I think you two are aligned in a lot of ways.
Leland Takamine:
We're in a similar boat where we're not saying, hey, AI is not useful. I mean, we just released Maestro MCP, so you can plug it into Cursor or Windsurf or Claude Code or whatever you want. That is the direction where we see it's going to be useful, because at the end of the day, you still get a Maestro script. So you've got the power of AI with the reliability of Maestro, and I think that's really where you can get some high leverage while also providing real value to users.
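For readers who haven't seen one, the Maestro script Leland mentions is a small YAML flow file; that file is the deterministic artifact you end up with whether you write it by hand or let an MCP-connected agent generate it. A minimal sketch, where the app id and on-screen labels are made up for illustration:

```yaml
# flow.yaml: a minimal Maestro flow (app id and labels are hypothetical)
appId: com.example.myapp
---
- launchApp
- tapOn: "Email"              # focus the email field
- inputText: "user@example.com"
- tapOn: "Sign In"
- assertVisible: "Welcome"    # fails the flow if the text never appears
```

Because the flow is plain commands rather than a runtime AI decision, the same steps run the same way every time.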
Jamon Holmgren:
You can have this underlying layer of base technology that speaks the language of LLMs really well, that's very pluggable, but it's deterministic. It does what computers do very well, versus what a neural net might do very well.
Leland Takamine:
Totally, totally.
Jamon Holmgren:
Yeah, that makes sense. Do you have some idea of how Maestro adoption has gone in the last few years? Just for Maestro itself, have you seen quite a bit of adoption?
Leland Takamine:
Yeah, I mean, compared to when we launched it three-ish years ago at Droidcon, it's a completely different world now. We just got back from Droidcon New York this year, so it was kind of a full circle moment: at our booth this time around, most of the people we talked to had already heard about Maestro. Back at the same event as when we launched, but now most of the people we're talking to already know about Maestro. That was a really big moment for us. But even more than that, it's knowing that some of the most sophisticated and largest companies in the world are adopting Maestro as their go-to end-to-end testing tool. That's really what we've been seeing, and we're really proud of that. Number-wise, I think open source adoption is always hard to track, but
Jamon Holmgren:
It is.
Leland Takamine:
Yeah, we've got some metrics, and I think the one that seemed the most meaningful was that between a hundred thousand and 200,000 unique Mac machines have run Maestro. We have some other metrics, but it gets messy with CI running the tool and how do you count that sort of thing. So the Mac usage seemed interesting to us, but really it's kind of more about, okay, serious companies are adopting this, right? We've got DoorDash, Microsoft, Block, Deel, Kraken, and Stripe.
Robin Heinze:
Probably all of our clients. Most of our clients.
Jamon Holmgren:
Most of our clients, yeah,
Robin Heinze:
A few. That's what our devs reach for, easily, hands down. No one wants to use Detox anymore. No one wants to use Appium anymore.
Jamon Holmgren:
I wouldn't say no one wants to. We had a discussion the other day and I was surprised that someone said, oh, I still like detox, and I was like, okay. I mean,
Leland Takamine:
Well, I need to hunt that person down and talk to them.
Jamon Holmgren:
I'll connect you with that person.
Robin Heinze:
Personally convert them.
Leland Takamine:
Yeah, please connect me. I would love to hear it.
Robin Heinze:
Well, I love how we have clients where our developers are writing most of the Maestro tests, and we have clients where the QEs are writing most of the Maestro tests, and it works both ways. Either way, it's accessible even to the people who aren't necessarily writing the code, which I love.
Jamon Holmgren:
I think the only thing he said was basically that with Detox, because it's gray box and it has some visibility into the internals of React Native, when there's a failure it's a little easier to quickly get to the problem. That was something he mentioned. But yeah, actually, I'll connect you up. It's Ryan Linton. He doesn't listen to the podcast, I'm pretty sure, so I can just say what I
Robin Heinze:
Want, so you can say whatever you want.
Jamon Holmgren:
Yeah. Also, I've worked with him, going back to my previous company, for like 12 years now.
Robin Heinze:
Well, and if you get a DM complaining about what you said, then at least he listened to the episode, so it's a win-win. Yeah, this
Jamon Holmgren:
Is true. Yeah, exactly. I should do my performance evaluations live on the podcast and see if they listen.
Leland Takamine:
Oh boy. Well, yeah, Ryan, we need to sync up. Then we've got to chat. I'll connect you. Cool.
Jamon Holmgren:
But no, Robin is absolutely correct that the vast majority of our developers immediately reach for Maestro, and it's for good reason.
Leland Takamine:
And
Jamon Holmgren:
I remember saying way long ago that this space was really ready for a competitor to Appium and Detox. It felt like nothing was really hitting the mark, and I think Maestro came out not long after that.
Leland Takamine:
Yeah, I mean, if you think about it, it's not surprising that this space hasn't seen much movement. If you look at the big players, they're these open source projects where, who knows who's behind them. But for us, we're one company and we're all in on Maestro, and that's all we do. So we're extremely incentivized to make sure that this is an amazing experience for our users, and it's going to continue evolving, with an actual company behind this driving it forward. Yeah.
Robin Heinze:
Yeah,
Jamon Holmgren:
I love that. Honestly, I love that. I mentioned earlier that we had talked with Dima about maybe making Maestro and mobile.dev a sort of hub for everything reliability, and now you're kind of talking more in terms of focus, which I like. I love the focus, I love the specialization. I love just being very, very, very good at that one thing. Do you think there's still room in the future for this kind of hub for reliability, or are you all in on, let's just make this end-to-end testing thing amazing?
Leland Takamine:
Most checks that you'd want to do as a company rely on actually using the application. I think at the core, I mean, it's kind of obvious when I say that out loud: all these checks, you have to actually walk through the application to be able to determine if things are working as expected, whether it's quality, performance like you're saying, or localization, grammatical and spelling errors, that sort of thing.
Robin Heinze:
Well, they literally have entire teams of people that do exactly this.
Leland Takamine:
Yeah, 100%. So I mean, at the core of it, it's still automation. You still need to have something that can walk through your application, and to us, that is Maestro, right? So absolutely, that is still part of our vision: to expand into other quality areas aside from just functional testing. But as it turns out, functional testing and just reliable automation is a huge problem. So yeah, I mean, we're still all in on that, and there's plenty of room to grow and expand to other quality areas.
Jamon Holmgren:
I personally would love to see in the future some way to test performance more, and actually, Maestro does power one of our tools for performance, which is Flashlight. flashlight.dev I think it is. Is that right, Robin?
Robin Heinze:
Yeah, that's the product. I didn't know that Maestro was underneath that.
Jamon Holmgren:
Right? Yeah. If you just enable their basic smoke test, it writes a quick Maestro test and then runs it, and then tests startup time, I don't know if it's time to first interaction, stuff like that. But it's powering it under the hood.
Robin Heinze:
Is that something they worked with you on or did they
Leland Takamine:
Just, I remember the author or maintainer, someone, reaching out and just saying, hey, thanks for building Maestro, we're using it for this tool. But yeah, it's amazing to see how Maestro is moving into other ecosystems and powering some other tools. I know some of the folks at Block are using Maestro for some cool stuff they're building over there. I know Airbnb as well, using Maestro as a foundation for some of the stuff they're building. So yeah, it's awesome to see Maestro kind of getting its tentacles out into other ecosystems, and also just getting deeper into the React Native ecosystem in general. We've got first-class integration with EAS on the Expo side, and the core React Native team at Meta is actually using Maestro to test the framework itself. So yeah, it's great to see how Maestro's expanding its reach beyond just the immediate use of QAs and developers testing their own application.
Jamon Holmgren:
I'm actually curious, you talked about expanding into other areas. What about web? Is web a focus for you folks or is it really just going to be about mobile?
Leland Takamine:
Yeah, we now support web. We recently released an early version of web support in Maestro, so that absolutely is a focus for us. Yeah, it's still early on, and obviously breaking into the web world is a much different
Jamon Holmgren:
Yeah, there's a lot of competitors.
Leland Takamine:
It's a different journey than on the mobile side, so we're still very much focused on mobile, but absolutely web is part of the long-term vision.
Jamon Holmgren:
Yeah, I could see that being a little bit of a barrier if it didn't, because then you have to learn two tools to test across your suite of everything if you're using React Native.
Leland Takamine:
Totally, and I think that's our most likely wedge and adoption path for us. Mobile's where the biggest need is, so let's get in there. And then we're already having our users say, hey, we would love to only use one framework and only train our team up on one tool. So now we can say, yeah, totally, you can.
Jamon Holmgren:
So what about physical device support? This is something we get questions about, and I think one of the primary reasons why people would go back to Appium was sort of this, okay, well, these device farms, they work with Appium. It's just kind of everywhere. The developer experience is not nearly as good and they're more flaky in some ways, but at least we can test on an array of physical devices in some server farm or device farm somewhere.
Leland Takamine:
I mean, it's a requirement for some of the more sophisticated companies for sure to test on physical devices or the devices that their users are actually using. Yeah, we totally understand that. It's something that's been, to be honest, slow going, but we've made some progress.
Robin Heinze:
I'm sure it's tough. It's a huge undertaking.
Leland Takamine:
Yeah. It's not just the framework support. It's figuring out how to support that in a cloud environment and run things in parallel. So we are actually having conversations about partnering with one of the big device farm players. Those are ongoing conversations. Can't talk much about that yet, but it is something that we take seriously. But yeah, it's not something that can happen overnight, but we realize the importance of it.
Robin Heinze:
Absolutely. Well, yeah, I mean, I think the number one priorities for developers or QEs writing automation are that the tests are reliable and you're not getting flaky tests; that you're able to test all of the features in your app, which sometimes include things that don't work on simulators, like push notifications or camera support or whatever; and that you can run it at scale, testing all the different flavors of device you need all at once, and not have it take a whole week to run your test suite. So those are the three things, and having a physical device farm is really the only way to get that second one.
Leland Takamine:
Yeah, I mean, our mission is to empower all teams to scale up test coverage across all platforms. So when it comes to supporting all teams, there are certainly some folks whose core features do rely on physical devices. Our mission absolutely includes making sure that we can service those folks as well.
Jamon Holmgren:
Yeah, totally. Let's talk about your cloud service as well. This is the easiest way to get started with testing with Maestro, because it's all hand-tuned for Maestro, very, very specific. And I mean, this is actually a non-trivial problem, I'm pretty sure. You can't just say, oh, just go grab a $4 DigitalOcean droplet and spin it up and everything just starts working. This is something that will take some time if you try to hand-roll it yourself, even though you can. So tell me more about that. You've had this for a little while now. How does it work? Has it been a challenge to get it up and running?
Leland Takamine:
I mean, I think a lot of folks start off using Maestro locally on their own machines, and then a natural next step for a lot of folks is like, okay, let's try to run this on the GitHub Actions infrastructure, just spin up an Android emulator or iOS simulator there on their infra and run Maestro against those virtual devices, or use Bitrise or some other provider like that, and that will work. It's just that when you get to a certain level of scale, well, how do you handle test sharding, running things in parallel, right? The bigger problem is that even if you run them in series, it's going to be really flaky, because that infrastructure is not built for executing end-to-end tests against a running virtual device reliably. So unfortunately, there is no magic wand here for folks running this on their own infrastructure or a generic provider like that.
This is what our team spends a lot of their days on, day in, day out, actually making this rock solid for Maestro users in the cloud. So that's why Maestro Cloud is kind of the best place to run Maestro if you're thinking about actually scaling up your test coverage. But I guess I'm trying to think if there are any suggestions I can give to folks who are trying to do this on their own, either on GitHub Actions or Bitrise. Make sure you're giving it enough resources, enough CPU, enough RAM. That's oftentimes the main culprit of just not being able to connect to the simulator. It just takes too long to boot up. So make sure you give it enough resources. Obviously that ends up getting kind of expensive, but that is one way to make it a little bit more reliable.
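As a rough illustration of the DIY route Leland describes, a GitHub Actions job might look something like the sketch below. The runner label, API level, and the community android-emulator-runner action are assumptions, and installing the actual app build is omitted:

```yaml
# Hypothetical sketch of running Maestro flows on a GitHub-hosted runner.
name: e2e
on: pull_request
jobs:
  android-e2e:
    # A larger runner matters: emulator boot timeouts are often just CPU/RAM starvation.
    runs-on: ubuntu-latest-4-cores
    steps:
      - uses: actions/checkout@v4
      - name: Install Maestro CLI
        run: |
          curl -Ls "https://get.maestro.mobile.dev" | bash
          echo "$HOME/.maestro/bin" >> "$GITHUB_PATH"
      - name: Boot emulator and run flows
        uses: reactivecircus/android-emulator-runner@v2
        with:
          api-level: 34
          # Building/installing the APK under test is omitted from this sketch.
          script: maestro test .maestro/
```

Even with a setup like this, you're paying for the bigger runner on every run, which is the apples-to-apples comparison Jamon makes next.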
Jamon Holmgren:
Well, yeah, just to interject there, that's how I think about it, because you might go to the pricing page on maestro.dev and say, okay, well, there's an investment happening here. Should I just be using GitHub Actions? I can just throw that in, right? Well, once you actually get it up to the resource level that you need in order to not have it just fail because it ran out of memory, you're paying for those Actions minutes anyway, and this is not going to be cheap. And you're still not going to get the specialized kind of service. So I think if you're evaluating this, you need to be thinking about it in terms of apples to apples here.
Leland Takamine:
But totally, if you're just getting started, just trying it out, that's a perfectly viable option. It's just when your team starts really caring about reliability and fast turnaround times, that's when Maestro Cloud starts to make sense.
Jamon Holmgren:
Right. Yeah, that makes sense. What about hardware, like push notifications, biometrics, haptics, things like that? Are you able to test those things?
Robin Heinze:
That kind of goes back to the physical device limitations if you're not able to run on a physical device. Yeah,
Leland Takamine:
Like I was saying, we are in the process of building out physical device support, and that's going to unlock testing some of these pieces. On Android, it actually does work locally with a physical device, so if you plug it into your computer via USB cord, that'll work just fine with Maestro locally. On iOS, there's still work to be done there,
Robin Heinze:
So it's mostly a scale thing, but you can run individual tests on a device
Leland Takamine:
On Android, locally, yeah. There's still work to be done on the iOS side. Some great folks on the Flipkart team have jumped in to help out with that as well, so there's some movement there. But yeah, no timeline on full physical device support in the cloud and things like that yet.
Robin Heinze:
Have you had any interactions with Apple? Are there limitations on what Apple allows you to do with their hardware or their simulators? Are you doing things in a way that Apple would frown on?
Leland Takamine:
No, not to my knowledge. No. I think their main requirement is you run this on Apple hardware, which we are doing.
Jamon Holmgren:
Yeah,
Leland Takamine:
Yeah,
Jamon Holmgren:
Yeah, absolutely. Yeah, I mean at the end of the day, I think if people were to say, what's the main draw to Maestro, it's really around the reliability. Of course, the developer experience is much better, and I think that does matter, and you're focused on that a lot, but I do think the reliability of the tests, the fact that they are pretty deterministic and even when they're not often, there's a lot of things you can do to improve that, but just miles better than the other ones that we've had over the years.
Leland Takamine:
Yeah, totally. I think it's about the developer experience like you were talking about, but also the fact that once you build out a Maestro test suite at scale, it really works. And I really haven't heard any success stories outside of Maestro where serious teams are at scale. I mean, how often do you talk to a team and they're like, yeah, we've got great end-to-end test coverage and we love it? That does happen with our users, which is quite a departure from what the ecosystem's been like in the past.
Robin Heinze:
It's an incredibly hard problem to solve, and developers are notoriously unforgiving, which is kind of annoying sometimes. Like, this is so much better than what you had before!
Leland Takamine:
Yeah, totally. And from an organizational standpoint too, oftentimes the teams we're talking to have the developers using it, but they also have a QE/QA team who are ramping up on Maestro as well to maintain hundreds of regression tests that they have to run through, and instead of five days, it's taking four hours to run them. Right?
Robin Heinze:
Yeah.
Leland Takamine:
So that's also a big pull toward Maestro: from an organizational standpoint, it's a lot more attractive because you can scale up much more easily with the staff you have.
Jamon Holmgren:
Yeah, absolutely. I mean, I can't tell you how many times I've seen projects come in and they have some Appium tests or something like that, but when you actually look at it, they're not running them. And it's, okay, why aren't you running these? Well, they got really flaky and we got tired of debugging that, so we just disabled them.
Robin Heinze:
It totally takes the wind out of your sails. If you're going for a release and you run your suite and some stuff is failing that wasn't failing before, and you have to figure out
Jamon Holmgren:
Locally, it's fine.
Robin Heinze:
Exactly, and so you just say, screw it, we're going to skip the tests, and then that's the beginning
Jamon Holmgren:
Of the end for your test suite. Yeah, the CI doesn't know that it works, but we know it works, so let's just ship it.
Leland Takamine:
Yeah, exactly. And that's where a lot of our efforts are going to go in the future. I can't sit here and say that when you get up to a thousand tests, you don't have to put any work into maintaining that. There's still quite a bit of work to actually maintain it, no matter what you're using. Maestro makes it easier, of course. But what we're looking at now is not full natural language testing at runtime using AI, but using AI to self-heal your Maestro test suite, right?
Jamon Holmgren:
Oh, yeah. Oh yeah. That's a cool idea.
Leland Takamine:
Yeah, and it's there. The technology is pretty much there to be able to really accelerate folks in actually maintaining their test coverage over time, so that's one of our big areas of focus going forward as well.
Jamon Holmgren:
Yeah, I could see it. Okay, on GitHub it says CI failed, but it doesn't just say CI failed. It says, Maestro's AI thinks that this could be the problem, and maybe you could make this change. Maybe it even ran it with the change to see if it fixed the problem. Of course, with AI you have to be careful, because sometimes it's like, alright, make the test pass, and it'll just delete the test that didn't pass.
Robin Heinze:
Looks good to me. Do not delete any code.
Jamon Holmgren:
This is why we know it's very human. Yeah, exactly. Because I've seen humans do it too,
Robin Heinze:
Including
Jamon Holmgren:
Myself.
Robin Heinze:
Well, the PR where the test is commented out and you just shake your head.
Jamon Holmgren:
We're coming up on the end here, but I do want to get a sense, Leland, on what you have coming up in the future, what to look for anything. Obviously, people should also go check out Maestro Studio Desktop, which of course is now out, but what else have you got going on? What's coming up?
Leland Takamine:
Yeah, I mean, that's where we're going to be building in those features I was talking about, into Maestro Studio and also on our cloud platform as well. But we do see Maestro Studio Desktop as the future of how Maestro tests are going to be written for anybody going forward. So like I said, the goal of Maestro Studio Desktop is to make Maestro more accessible and make it more powerful. So yeah, definitely check it out. We'd love feedback on that, but lots more to come there.
Jamon Holmgren:
Awesome. Well, thank you so much for coming on, Leland. I'm really happy you had the time and this worked out. If people want to learn more about Maestro, obviously maestro.dev. Where can they follow along? Do you tweet about this? Do you post about this on a blog? Where's the best place to find you on the internet?
Leland Takamine:
Yeah, just check out our website. maestro.dev has got our socials down at the bottom there as well. But actually, the easiest way to get in touch with us is to join our Slack community. We've got over 5,000 other enthusiastic Maestro users who are there to help as well. So yeah, head over to our website, click the community link at the top there, and join us on Slack.
Jamon Holmgren:
Very cool. Thanks so much, Leland. Really appreciate it.
Leland Takamine:
Thank you so much, guys.
Jamon Holmgren:
That's it for us. We'll see you all next time.
Jed Bartausky:
As always, thanks to our editor, Todd Werth, our assistant editors, Jed Bartausky and Tyler Williams, our marketing and episode release coordinator, Justin Huskey and our guest coordinator, Mazen Chami. Our producers and hosts are Jamon Holmgren, Robin Heinze and Mazen Chami. Thanks to our sponsor, Infinite Red. Check us out at infinite.red/radio. A special thanks to all of you listening today. Make sure to subscribe to React Native Radio on all the major podcasting platforms.