The meaning of “DONE”

Irina Dumitrascu
Published in 2Performant Tech
3 min read · Jun 19, 2018


“When can you say that a feature is done?” is a question that has been on our minds for some years now at 2Performant, and the answer has had an interesting evolution.

Phase 1: “The independent coders”

When the tech team was small and separated from marketing and sales, “It’s done” usually meant “I have finished coding it”. Soon after, once a few coded changes broke the test suite, it started to mean

I have finished coding it and the tests pass.

In our thinking, “someone else” had to code-review it, “someone else” had to test it, and deploying “only took a couple of minutes”.

When we started developing more complex features, we realized that

  • code review can reveal problems or paths of improvement, which can mean non-negligible extra development time
  • the QA team might find scenarios that were not properly thought out, or even bugs (yeah, I said it: bugs)
  • sometimes, deploying changes to production can require significant time and attention as well: schema migrations, data migrations and orchestrated deploys across several micro-services are just some examples (see the sketch after this list)
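To make that last point concrete, here is a minimal sketch of the kind of batched data backfill that turns a “simple” deploy into something needing real time and attention. It is only an illustration under assumed names: the commissions table, the amount and amount_cents columns and the SQLite setup are invented for this example, not 2Performant’s actual code.

    import sqlite3

    BATCH_SIZE = 1000

    def backfill_amount_cents(conn: sqlite3.Connection) -> None:
        """Copy the legacy `amount` column into the new `amount_cents` column
        in small batches, so the backfill can run next to live traffic
        without holding long locks. (Hypothetical schema for illustration.)"""
        while True:
            rows = conn.execute(
                "SELECT id, amount FROM commissions "
                "WHERE amount_cents IS NULL LIMIT ?",
                (BATCH_SIZE,),
            ).fetchall()
            if not rows:
                break  # nothing left to backfill, the migration is done
            conn.executemany(
                "UPDATE commissions SET amount_cents = ? WHERE id = ?",
                [(round(amount * 100), row_id) for row_id, amount in rows],
            )
            conn.commit()  # short transactions keep the table responsive

Running something like this against a live database also means watching load and query times while it runs, which is exactly the “time & attention” mentioned above.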

So the answer changed to

It’s coded, peer reviewed, tested and deployed to production.

Taking this full definition into account allowed us to improve our estimates and made everyone happier, as the development team became very predictable.

[Image caption] What to remember when estimating. First historical evidence at 2Performant, dated December 2015

Phase 2: “The other teams matter”

Another interesting change came some months ago. As we got much more productive and started deploying frequently, the QA team took on the role of announcing changes to the rest of the company. To save time and our colleagues’ attention, they would make these announcements at least several days apart. This meant that after a feature went live, a few days would pass between “deployed” and “announced” (and marked as “Done”). The interesting consequence of this process was that, by the time a feature was “Done”, it had already been live for a while, so we had already verified that it didn’t cause any functional or performance problems.

It’s coded, peer reviewed, tested, deployed to production and performing well.

This is enough for small features and improvements. What about large or totally new features?

After a couple of launches we learned that, once the new code is up and running, sales and marketing might come with requests for new tracked events, new reports or perspectives, or the user support team might realize they need some new information or tool to be effective. So we arrived at

The launch went fine, it’s live & performing well, we can measure KPIs and easily support customers.

Next: Phase 3

So what are we aiming for right now? We’re moving towards a long-term, business-wide definition:

The new feature reached its objectives.

Under this thinking, research, planning, tech specs, development, integration, launch, post-launch and improvements are all milestones, just parts of getting to “DONE”. Some of them are development-only, but most require a cross-functional team.
