Yes, you can spend a lot of time and complexity on truly robust automation. But it's not always the highest-ROI move, especially in highly volatile businesses where your processes may need to change rapidly.
You didn't calculate the ROI right. If YOU are the one doing the intervention, then your time is constantly lost manually fixing failed scripts. Not to mention reputation loss, end-user dissatisfaction, etc.
Automation routines MUST be robust, must handle all the weird cases that happen with any frequency (at least once a year), and must always notify when they fail to do so. Then you should come back and see how to keep them from failing even then.
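A minimal sketch of that "notify when they fail" rule, in Python. The `notify` hook, retry count, and backoff are hypothetical stand-ins for whatever alerting and retry policy you actually use:

```python
import time

def run_with_alerting(step, retries=3, delay=1.0, notify=print):
    """Run an automation step; retry transient failures, and always
    notify a human when the step fails for good."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == retries:
                # Never fail silently: a human must hear about it.
                notify(f"step {step.__name__} failed after {retries} tries: {exc}")
                raise
            time.sleep(delay * attempt)  # simple linear backoff between retries
```

The point of the wrapper is not the retry itself but the guarantee that a permanent failure is reported rather than swallowed.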
It's easy to spend upward of $10000 on really robust automation, when the same manual process would cost only $3000 over its usable lifetime, and the economical-but-less-robust automation costs $1000 over the same period.
The robust automation, in that case, has over 10× worse ROI. What's wrong with that calculation?
The thing about really robust automation is that for it to pay off, the process has to be static over a large number of executions. For many business needs, the process, or its inputs, changes every few executions, and you never get to reap the benefits of robust automation before it needs to be redone at great expense.
As for thinking that it's a dichotomy between "no automation" and "absolutely robust automation"... well, I think you're robbing yourself of a large chunk of the strategy space by refusing to see any middle ground but the two extremes.
Edit: also note that I'm not talking about "failed scripts" at any point. I'm talking about scripts that do exactly what they are supposed to, but they are performing a narrow, easily automated slice of the work. A human can chain such scripts together in the requisite sequence by spending very few minutes of their day.
> It's easy to spend upward of $10000 on really robust automation, when the same manual process would cost only $3000 over its usable lifetime.
A manual process is incomparable to automation, because the $3k human will make mistakes; humans are not good robots. Also, your miserable $3k human can now do normal things.
> The thing about really robust automation is that for it to pay off, the process has to be static over a large number of executions.
It doesn't have to be static, it just mustn't be random. Also, how often the process changes matters, and automation with scripts (which can be changed ad hoc) allows for quick flexibility when problems arise.
> As for thinking that it's a dichotomy between "no automation" and "absolutely robust automation"... well, I think you're robbing yourself of a large chunk of the strategy space by refusing to see any middle ground but the two extremes.
You are also robbing yourself of time to do other things which may lead to more progress, since you are fixing flaky automation all the time.
> A human can chain such scripts together in the requisite sequence by spending very few minutes of their day.
I LOLed. A minute for a single script. You must have missed that in an enterprise there are hundreds of scripts. Heck, I usually have 20-30 on a single project.
Humans indeed deviate from the standard process. This causes mistakes, but it also prevents them.
It sounds like you think robust automation takes zero minutes to create, since you think of robust automation as always freeing up time. In my experience, robust automation is something that takes considerable time to create and maintain.
Maybe you know of some trick I don't. But since you keep writing about "failing scripts" and "flaky automation" despite my attempts to correct such misunderstandings, I'm starting to suspect you're interpreting my comments as what you want them to say for the sake of your argument, rather than what I'm trying to say.
I have never seen anyone want to keep a human when a machine could be put to use. No, humans do not prevent mistakes in highly unimaginative, repetitive work. Even when it happens, it's an outlier.
> It sounds like you think robust automation takes zero minutes to create, since you think of robust automation as always freeing up time. In my experience, robust automation is something that takes considerable time to create and maintain.
It does, and it gets MUCH better with experience. However, that time is finite, unlike the time spent on human corrections.
> Maybe you know of some trick I don't.
Probably - I know how to write robust and resilient automation scripts that, over time, converge to almost zero failures.
You are making some really bold claims, and you might well be an expert, but I think you should open your mind to the slight possibility that others might also know what they are talking about. The OP specifically started with the premise that fast-changing businesses would spend all their time fixing their automation, and that it might not make sense in that situation. I can't comprehend how that doesn't make immediate sense.
Let's say I have a crawler which automates some data gathering. Its sources keep changing frequently; robust automation here is probably a research project, and simple automation is orders of magnitude more bang for the buck.
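The "economical" version of such a crawler might parse only the one page layout you see today and punt anything unexpected to a human queue, instead of engineering for every layout. A toy sketch; the page layout, field names, and `crawl` interface are made up for illustration:

```python
import re

def parse_price(html):
    """Parse the single layout we see today; anything else is a miss."""
    m = re.search(r'<span class="price">\$(\d+\.\d{2})</span>', html)
    return float(m.group(1)) if m else None

def crawl(pages):
    """Economical automation: handle the common case, and queue the rest
    for a human instead of trying to handle every possible layout."""
    results, manual_queue = {}, []
    for url, html in pages.items():
        price = parse_price(html)
        if price is None:
            manual_queue.append(url)  # layout changed; a human looks at it
        else:
            results[url] = price
    return results, manual_queue
```

When a source changes its layout, you lose a few minutes of manual review rather than a robustness research project.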
This page is a report from the PowerShell framework I developed, mostly in its first year of development (https://github.com/majkinetor/au), which checks ~250 web sites for updates to various software. Today it has 6 errors, and it rarely has many more. In my own setup I keep ~60 packages, and I tackle errors maybe once a year. Stuff just works, and you rarely have to look at it; otherwise I would be tied up in this all day, and I am not, while those packages have many millions of users.
Now I spend almost 0 time maintaining packages and I am one of the top choco package owners.
Check out the options used, some of which make it so robust:
> Its sources keep changing frequently; robust automation here is probably a research project, and simple automation is orders of magnitude more bang for the buck.
Even if the source changes frequently, it's better to automate. It's not when it keeps changing daily or more often than that. By automating you learn something new, so it also pays off in experience. Manually doing the same work every day (work that may move around) doesn't involve complex thinking and is just a waste of time.
> Even if the source changes frequently, it's better to automate. It's not when it keeps changing daily or more often than that.
Now I think we can get somewhere! Is this an admission that automation is not worth it when the processes or inputs change too often?
If so, then this frequency (which you have given as daily) depends entirely on the business needs in question.
Often, there's no business case to run an automated process daily.
Weekly or even monthly are very common intervals for processes in business. For a process that needs to run monthly, you only get twelve executions in a year. If the inputs change every six months, do you still think spending 60+ commits (as in your settings example) is worth it every six months, when there are cheaper ways to do it with limited human intervention?
> Often, there's no business case to run an automated process daily.
Almost 100% of my cases run daily, hourly, or even more often (5-, 10-, 20-, or 30-minute schedules are common). I even had one recently that executed millions of requests to some REST API daily, running every few seconds. I call those "app supporting scripts", and I offload specific features of the main app to them.
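One way such an "app supporting script" can hit a REST API every few seconds without hammering it when things go wrong is a fixed-interval loop that backs off on consecutive errors. A rough sketch; the `fetch` and `handle` callables stand in for the real HTTP client and business logic:

```python
import time

def poll(fetch, handle, interval=5.0, max_errors=3, sleep=time.sleep):
    """Poll `fetch` on a fixed interval, passing each result to `handle`.
    Back off on consecutive errors and give up after `max_errors`."""
    errors = 0
    while errors < max_errors:
        try:
            item = fetch()
            errors = 0           # a success resets the error counter
            if item is None:     # source says there is nothing left
                return
            handle(item)
        except Exception:
            errors += 1
            sleep(interval * errors)  # back off harder on repeated errors
            continue
        sleep(interval)
```

Injecting `sleep` and `fetch` keeps the loop trivially testable, which is a big part of why scripts like this can converge toward zero failures.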
Must be an architectural thing, I guess. I work as a principal architect, and I design most of my services so that they rely heavily on automation support.
> Is this an admission that automation is not worth it when the processes or inputs change too often?
I don't work in a vacuum. For me there are no rules about anything; context is what matters most (patterns, best practices, etc. are for newbies). That case does lean toward the manual side at first thought, but it all depends on other factors.