DevOps Best Practices
Development Time: The Only Cost That Matters
by James Burns
It’s the Thursday before a holiday weekend and you’ve got a cost crisis. Someone in finance has just noticed that this month’s AWS bill is trending 15% higher than last month’s. An all-hands meeting is called, and everyone is asked to shut down as much capacity as they can “safely.” All the work your team has been trying to push out before end of sprint is going to be delayed for days. Chances of an operational outage when someone shuts down something critical? Pretty high…
The impact of lost development time, lost customer feedback on new features, and operational issues all pales before the long-term impact of making developers scared to develop and deploy.
Delay Is Its Own Cost
When you’re looking for product/market fit (and for a surprising amount of time after), time is the only cost that matters. Most developers have experienced a story like the one above. In the name of frugality, they agonize about whether to deploy another three instances that cost a few hundred dollars a month.
The reality is that the time they’ve spent debating about it, talking with others, and escalating to management has already wasted more than what they could be saving. And that’s just considering the time of the people involved. Once you look at opportunity cost — what could have been done instead — you could be looking at orders of magnitude more in lost value.
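To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch. Every number in it (headcount, loaded hourly rate, instance cost) is a hypothetical chosen for illustration, not a figure from the text:

```python
# Back-of-the-envelope: what does debating a small cloud spend cost?
# All numbers below are hypothetical, for illustration only.

def debate_cost(people: int, hours_each: float, loaded_hourly_rate: float) -> float:
    """Payroll cost of everyone who sat in the cost discussion."""
    return people * hours_each * loaded_hourly_rate

def monthly_savings(instances: int, cost_per_instance: float) -> float:
    """What shutting the instances down would actually save per month."""
    return instances * cost_per_instance

meeting = debate_cost(people=5, hours_each=2, loaded_hourly_rate=100)  # $1,000
savings = monthly_savings(instances=3, cost_per_instance=100)          # $300

print(f"Debate cost: ${meeting:,.0f}, monthly savings: ${savings:,.0f}")
print("Debate already cost more than a month of savings:", meeting > savings)
```

With even these modest assumptions, a single two-hour discussion outspends a month of the savings being debated, before counting any opportunity cost.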
Cost vs. Waste
What can we do to change this pattern? How can we preserve the ever-important value of development time — to get features in front of customers?
The first step is understanding the difference between cost and waste for cloud resources. Cost is what it takes to scale out a software developer’s time. Developers are paid well precisely because their work scales: instead of building each widget or serving each customer personally, they write the (programmed) directions, and we pay for computing resources to follow them at a cost far below the developers’ ongoing attention.

Waste is cost that is not being used at all. You’d think this wouldn’t happen, but it’s surprisingly common to be paying for a resource, storage or compute, that is literally doing nothing. That is waste, and it needs to be surfaced and reduced. Cost, on the other hand, we can and should expect to grow with the business, ideally faster than personnel costs.
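One way to act on this distinction is to split spend into resources doing work (cost) and resources doing nothing (waste). The sketch below assumes a simple inventory shape with an average-utilization figure per resource; the data format, names, and zero-utilization threshold are all assumptions, and in practice this data would come from your cloud provider’s monitoring and billing APIs:

```python
# Separate spend into cost (resources doing work) and waste (resources idle).
# The inventory format here is hypothetical; real data would come from
# monitoring/billing APIs.

from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    monthly_cost: float     # dollars per month
    avg_utilization: float  # 0.0 .. 1.0 over the billing period

def split_cost_and_waste(inventory: list[Resource]) -> tuple[float, float, list[str]]:
    """Return (working cost, waste, names of idle resources to surface)."""
    cost = sum(r.monthly_cost for r in inventory if r.avg_utilization > 0)
    idle = [r for r in inventory if r.avg_utilization == 0]
    waste = sum(r.monthly_cost for r in idle)
    return cost, waste, [r.name for r in idle]

inventory = [
    Resource("web-fleet", 2400.0, 0.55),
    Resource("orphaned-volume", 80.0, 0.0),       # storage nothing reads or writes
    Resource("forgotten-staging-db", 310.0, 0.0),
]

cost, waste, idle = split_cost_and_waste(inventory)
print(f"Cost: ${cost:,.0f}/mo  Waste: ${waste:,.0f}/mo  Idle: {idle}")
```

The point of the report is the idle list: those names are candidates for deletion, while the cost figure is expected to keep growing with the business.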
Value Your Development Time
The second step is to get developers to not think of costs as if they’re personal costs. Instead of “I wouldn’t spend $1500 of my own money without thinking about it for a long time,” developers should ask, “will spending $1500 get an answer to a business question a couple weeks earlier?”
If the answer is yes (and it often is), then there should be no hesitation or pushback on spending that money. You just bought yourself not only two weeks of development time, but also access to the information two weeks earlier. That is a bargain. If you can get your developers thinking “how can I spend money to make better-informed decisions?” you will start operating at a different level. Of course, it’s not just about developers having this mindset, but also their managers and directors. Everyone has to understand the value of information and time to the business, instead of thinking about everything as if it were a personal purchase decision.
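That question can be framed as a simple break-even check. A minimal sketch, where the weekly value of having the answer is necessarily an estimate you supply, and all the numbers are hypothetical:

```python
# Break-even check for trading money for time.
# value_per_week is an estimate the business supplies; numbers are hypothetical.

def worth_buying_time(spend: float, weeks_saved: float, value_per_week: float) -> bool:
    """Is the spend smaller than the estimated value of the time it buys?"""
    return spend < weeks_saved * value_per_week

# Hypothetical: $1,500 of extra capacity answers a business question 2 weeks
# sooner, and each week of earlier feedback is judged worth ~$5,000.
decision = worth_buying_time(spend=1500, weeks_saved=2, value_per_week=5000)
print("Spend the money:", decision)
```

The useful part is not the arithmetic but the framing: it forces the conversation onto the value of the answer, rather than onto the sticker price of the instances.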
Trade Money for Time, Every Time
So, the next time you’re about to declare or respond to a cost crisis, consider whether you’re concerned about cost or waste, whether you’re judging it as a business purchase or a personal one, and what the opportunity cost of the time spent addressing the crisis will be. Help your developers feel empowered to trade money for time. It will transform your business.