Paying Off the Technical Debt in Your Agile Projects

Just as you should not take out a financial loan without having a plan to pay it back, you should also have a plan when incurring technical debt. The most important thing is to have transparency—adequate tracking and visibility of the debt. Armed with the knowledge of these pending tasks, the team can devise a strategy for when and how to “pay off” technical debt.

Learn about managing your technical debt and testing debt in agile teams and share your thoughts on my latest article published at and also at

***** Here are some excerpts from the article for my readers *****

Technical debt initially referred to code refactoring, but in today’s fast-paced software delivery, it has a growing and changing definition. Anything that the software development team puts off for later—be it smelly code, missing unit tests, or incomplete automated tests—can be technical debt. And just like financial debt, it is a pain to pay off.

Forming a Plan to Pay Off Technical Debt

Let’s say a development team working on a new project started out following a certain programming standard. They even set up an automated tool to run on the code periodically and give reports on the adherence to these standards. But the developers got busy and stopped running this tool after a sprint or two, and when the development manager asked for a report after a couple of months, there were hundreds of errors and warnings, all of which now need to be corrected.
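A periodic check like the one this team abandoned is cheap to keep alive. As a sketch, the helper below tallies errors and warnings from flake8-style linter output; the tool, output format, and function names are illustrative assumptions, not something from the scenario above:

```python
from collections import Counter

def summarize_lint_report(lines):
    """Tally errors and warnings from flake8-style output lines.

    Each finding line looks like "app.py:10:1: E302 expected 2 blank lines".
    Codes starting with 'E' count as errors, 'W' as warnings.
    """
    counts = Counter()
    for line in lines:
        parts = line.split(": ", 1)
        if len(parts) < 2:
            continue  # not a finding line; skip it
        rest = parts[1].split()
        if not rest:
            continue
        code = rest[0]
        if code.startswith("E"):
            counts["errors"] += 1
        elif code.startswith("W"):
            counts["warnings"] += 1
    return dict(counts)

report = [
    "app.py:10:1: E302 expected 2 blank lines, found 1",
    "app.py:42:80: E501 line too long (88 > 79 characters)",
    "utils.py:3:1: W391 blank line at end of file",
]
print(summarize_lint_report(report))  # {'errors': 2, 'warnings': 1}
```

Run as a scheduled job or CI step, a summary like this makes the debt visible every sprint instead of once every few months.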

This scenario happens all the time with agile teams focused on providing as much customer value as possible each sprint. The problem then needs to be fixed immediately, because despite having all the functionalities in place, the team doesn’t want to release code that is not up to production standards.

The team is then faced with a few options for how to service the debt:

  • Negotiate with the product owner on the number of user stories planned for the upcoming sprint in order to have some extra time for refactoring the code
  • Dedicate an entire sprint to code refactoring
  • Divide all errors and warnings among the development team and let them handle the task of corrections within the next sprint, along with their regular development tasks, by scheduling extra hours
  • Plan to spread this activity over a number of sprints and have a deadline for this report before the end of the release
  • Estimate the size of refactoring stories and either plan them into upcoming sprints as new user stories or accommodate them as part of existing user stories

Though these are all viable options, the best approach depends on the team, the context, upcoming deadlines, the risk the team is willing to take, the highest priority for functionalities that need to be shipped, and the collaboration with the product owner.

Again, just like when you take out a financial loan, you should plan to pay off technical debt as quickly as possible using the resources you have. It’s a good idea to perform a risk analysis of the situation and reach a consensus with the team about the best approach to take.
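For the options that spread the cleanup over several sprints, the underlying arithmetic is simple. Here is a minimal sketch; the numbers and function names are illustrative assumptions:

```python
import math

def sprints_needed(total_findings, fixes_per_sprint):
    """Sprints required to clear the backlog at a steady, sustainable pace."""
    return math.ceil(total_findings / fixes_per_sprint)

def per_sprint_quota(total_findings, sprints_until_release):
    """Findings the team must fix each sprint to be clean by the release."""
    return math.ceil(total_findings / sprints_until_release)

# e.g. 240 findings; the team can absorb 50 per sprint alongside feature work
print(sprints_needed(240, 50))   # 5
# or: finish before a release that is 4 sprints away
print(per_sprint_quota(240, 4))  # 60
```

Even back-of-the-envelope numbers like these give the team and the product owner something concrete to negotiate over.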

Technical Debt in Testing

Technical debt doesn’t occur only in programming. Testing activities are also likely to incur technical debt over time due to a variety of factors, including:

  • Incomplete testing of user stories
  • Letting regression tests pile up for later sprints
  • Not automating essential tests every sprint
  • Not having complete test cases written or uploaded to test management tools
  • Not cleaning up test environments before the next iterations
  • Not developing or testing with all test data combinations on the current features

Sometimes debt may be incurred intentionally for a short term, such as not updating tests with new test data when testing on the last day of the sprint due to a time crunch, but planning to do it within the first couple of days in the next sprint. As long as the team has an agreement, it’s acceptable to defer some technical debt for a short while.

On occasion, debt may be incurred intentionally for a longer term by planning it in advance, such as deciding to postpone any nonfunctional tests, like performance or security-related tests, on the system until a few sprints are out and features are stable enough to carry out the tests. Again, as long as the team agrees with the risk and has a plan to address it, it is fine to defer certain activities.

Incurring testing debt can get us out of tight situations when needed, but you still need to ensure that you plan carefully, remain aware of the debt, communicate it openly and frequently, and pay it off as soon as possible. Having a plan to service these debts reduces your burden over time and ensures your software maintains its quality.


Prevention Is Better Than Cure

Avoiding having any technical debt is always preferable. As the saying goes, an ounce of prevention is worth a pound of cure.

Every team has to devise its own strategy to prevent technical debt from accumulating, but a universal best practice is to have a definition of “done” in place for all activities, user stories, and tasks, including completing the necessary testing activities. A definition of “done” creates a shared understanding of what it means to be finished, so that everyone involved in the project means the same thing when they say it’s done. It becomes an expression of the team’s quality standards, and the team will become more productive as its definition of “done” gets more stringent.

Here’s a good example of criteria for a team’s definition of “done” for every user story they work on:

  • All acceptance criteria for the user story must be met
  • Unit tests must be written for the new code and maintain 70 percent coverage
  • Functional tests must be performed, and exploratory tests must be performed by a peer tester other than the story owner
  • No critical or high-severity issues remain open
  • All test cases for each user story must be documented and uploaded to the test management portal
  • Each major business scenario associated with the user story must be automated, added to the regression test suite, and maintain 70 percent functional test coverage

Verifying that the activities completed meet these criteria will ensure that you are delivering features that are truly done, not only in terms of functionality, but in terms of quality as well. Adhering to this definition of “done” will ensure that you do not miss out on essential activities that define the quality of the deliverable, which will help mitigate the accumulation of debt.
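A checklist like this can even be verified mechanically at the end of each story. The sketch below evaluates a story against such a definition of “done”; all field names and defaults are hypothetical, not a standard:

```python
def unmet_done_criteria(story):
    """Return the definition-of-done criteria a user story has not yet met.

    `story` is a plain dict; missing keys default to "not done" so an
    incompletely tracked story never passes silently.
    """
    checks = {
        "acceptance criteria met": story.get("acceptance_criteria_met", False),
        "unit test coverage >= 70%": story.get("unit_coverage", 0) >= 70,
        "functional and exploratory tests done": story.get("peer_tested", False),
        "no critical/high issues open": story.get("open_critical_issues", 1) == 0,
        "test cases uploaded to portal": story.get("tests_uploaded", False),
        "major scenarios automated": story.get("scenarios_automated", False),
    }
    return [name for name, passed in checks.items() if not passed]

story = {
    "acceptance_criteria_met": True,
    "unit_coverage": 65,  # below the 70 percent bar
    "peer_tested": True,
    "open_critical_issues": 0,
    "tests_uploaded": True,
    "scenarios_automated": True,
}
print(unmet_done_criteria(story))  # ['unit test coverage >= 70%']
```

An empty list means the story is genuinely done; anything else names the debt being carried forward.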

Despite best practices and intentions, technical debt often will be inevitable. As long as the team is aware of it, communicates openly about it, and has a plan in place to pay it off as quickly as possible, you can avoid getting in over your head.


‘INVEST’ing in good User Stories

User stories are requirement specifications in their simplest form. Methodologies like Scrum use the user story format to express the functional requirements of the software to be developed:

As a <user persona>, I want to <do the action> so that <need of function>

This creates a deeper understanding of the desired behavior from the user’s perspective, along with the business need and the reason for the function, making development of the software easier.
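For teams that capture stories in a tool, the template itself is trivial to encode. This tiny helper is purely illustrative:

```python
def format_user_story(persona, action, need):
    """Render a requirement in the standard user story template."""
    return f"As a {persona}, I want to {action} so that {need}."

print(format_user_story(
    "registered user",
    "reset my password via email",
    "I can regain access to my account without contacting support",
))
# As a registered user, I want to reset my password via email so that
# I can regain access to my account without contacting support.
```

The hard part, of course, is not the formatting but choosing the right persona, action, and need, which is where the questions below come in.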

But writing effective user stories isn’t as easy as it sounds. There are a lot of questions to answer: Which functionality is one user story, and which is too big and needs splitting up? What is the true intent of the user story? How do we best express the functionality in words? Which user personas should be considered?

Getting user stories right is an essential step toward successful sprints in agile, and for teams struggling with it, the INVEST principle is a guideline to follow. This principle gives the attributes to consider when writing and defining user stories so that they form a robust foundation for our product backlog. Let us look at the principle in depth:

I – Independent

Each user story must be independent as a piece of functionality and be deliverable on its own.

N – Negotiable

The user story should be negotiable in terms of implementation, which means that the implementation details, the “how” of the functionality, must not be specified in the user story. A user story should describe the business need and the customer experience.

V – Valuable

The user story must create value for the customer. We should be able to see why and how the function will be valuable to the customer; this can be gauged through direct communication with the stakeholders and can also be quantified as a “business value” associated with each story.

E – Estimable

The user story must be clear and concise enough that we can accurately estimate the amount of work required to achieve it. Any unclear parts, missing information, or discrepancies must be clarified before a user story is finalised and taken up for development.

S – Small

Each user story must be a small chunk, or slice, of work. The development of one user story must be doable within one sprint, so anything bigger needs to be split further.

T – Testable

Each user story must be testable as a feature and add a unique new function to the product.

In summary:

  • I – Independent: The user story should be self-contained, with no inherent dependency on another user story.
  • N – Negotiable: User stories, up until they are part of an iteration, can always be changed and rewritten.
  • V – Valuable: A user story must deliver value to the end user.
  • E – Estimable: You must always be able to estimate the size of a user story.
  • S – Small: User stories should not be so big as to become impossible to plan, task, and prioritize with a certain level of certainty.
  • T – Testable: The user story or its related description must provide the necessary information to make test development possible.

Keeping these points in mind when designing and formalising our team’s user stories will ensure that the sprint runs smoothly, without unforeseen scenarios, glitches in implementation due to unclear requirements, or rework due to frequent changes.
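A few INVEST attributes can even be sanity-checked mechanically; Independent, Negotiable, and Valuable still need human judgment. Here is a sketch covering the checkable ones, where the field names and the size limit are assumptions of mine, not part of the principle:

```python
def invest_warnings(story, max_points=8):
    """Flag INVEST attributes a story obviously fails.

    Only the mechanically checkable attributes are covered; the rest
    need a conversation, not a script. Field names are illustrative.
    """
    warnings = []
    if story.get("points") is None:
        warnings.append("Estimable: story has no size estimate")
    elif story["points"] > max_points:
        warnings.append("Small: story is too big for one sprint; split it")
    if not story.get("acceptance_criteria"):
        warnings.append("Testable: story has no acceptance criteria")
    return warnings

story = {"title": "Bulk export", "points": 13, "acceptance_criteria": []}
print(invest_warnings(story))
# ['Small: story is too big for one sprint; split it',
#  'Testable: story has no acceptance criteria']
```

Running a check like this during backlog grooming catches the obvious problems before sprint planning, leaving the refinement discussion for the attributes that actually need it.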

Have more questions? Wondering how to achieve all this?

Stay tuned for the next article where we will look at the steps to achieve the best user stories.

Happy Testing!


ATA community growing with yet another awesome meetup hosted on 1st July @Allscripts Bangalore

I organised and hosted the ATA 14th Meetup Bangalore @Allscripts Bangalore on Saturday, 1st July 2017, which saw a great turnout of keen testers and agile enthusiasts from various companies. We had a great line-up of speakers, including Dr. Shankar Ramamoorthy, a noted speaker who has also delivered keynote talks at GTR Pune 2017.

Below was the planned agenda and talks for the day.

All talks were appreciated by the audience and were followed by interesting Q&A sessions and discussions. We also felicitated some CP-MAT awardees with certificates and goodies, and speakers were presented with certificates as tokens of appreciation.

It is great to see the community expanding and so many passionate testers coming together for such meetups and events for knowledge sharing, learning, and networking.


Hope to continue this effort with help and support from corporates like @Allscripts.

Great thanks!


Conducted a full-day “Applied Agile Testing” workshop @Singapore

Hey there!

The day finally arrived when I had the opportunity to conduct a full-day workshop in the beautiful Hotel Village Changi @Singapore on 16th June 2017, organised by 1.21GWs.

The workshop was titled “Applied Agile Testing” and was designed specifically to bring practical agile knowledge to software testers looking for answers to their software testing dilemmas in agile teams. The agenda included practical work, team exercises, and a hands-on agile testing project: testing a live web application in sprint format. We focused on lean documentation and exploratory testing, and each participant had a chance to exercise their keen testing skills and find the best bugs!

The audience was a good mix of people from different domains with varying skills and levels of experience. We had great questions and discussions during the day, enjoyed networking over coffee and lunch, and received an amazing response from the delegates. The team exercises and live project were a big hit! I received wonderful feedback about the session, the topics, the conduct, and especially the project.

Here are a few glimpses from the day –


We also gave out handouts and participation certificates, issued by 1.21GWs, to all attendees.

Our delegate Mr. Rajesh Thomas from Semiconductor Technologies & Instruments was kind enough to share his feedback, here in his own words:

“It was a wonderful workshop we had and it indeed helped me to learn on applying testing and its technique in Agile. The topics was very well arranged which helped us to sail gradually to the core subject. The techniques taught and applying sprint on real time project did help us to understand better”

More feedback from delegates:


It was a great experience, and Singapore, being the wonderful and welcoming place that it is, offered us lots of love and appreciation! Hope to get back there again soon!! 🙂 🙂



I am speaking at the 1.21GWs Agile Testing Conference @Singapore

Hello!

Check it out — I am going to speak at the

“Agile Testing, Test Automation and Non Functional Testing Summit”

being organised by 1.21GWs @Singapore on 14-16th June 2017

Check out the details of the event at 

I will present a talk on Innovation Games for Agile Teams on 15th June, day 2 of the conference.

And I will conduct a full day workshop on “Applied Agile Testing” on 16th June


Be there! 🙂


A huge success at the 13th ATA Bangalore Meetup @CoviamTech

I organised and hosted the 13th ATA Meetup in Bangalore, held on Saturday, 13 May 2017, at the Coviam Technology office in HSR Layout. The event was a big success owing to the great support of the volunteering team @CoviamTech as well as the great talks by all the speakers. The day saw a huge turnout with a full house at the venue, and great participation and questions from the attendees.

Coviam CEO Mr. Deepak Nachnani gave an introductory talk, followed by the talks below:

  • Shrinathacharya L M. (Shrinath) from Allscripts – Thick and Thin Lines in Choosing Mobile Test Cloud Environment
  • Mr. Sundaresan Krishnaswami from Coviam
  • Ms. Felicia Kartika from Indonesia – Successful strategies to testing Microservices architecture
  • Mr. Santhosh GS from Allscripts – Machine learning telepathy for Shift Right approach of testing
  • Deepthi from Coviam – Usability Testing

Delegates appreciated all the sessions, and had the opportunity to network over tea and lunch sponsored by Coviam at the end of the sessions.


CP-MAT certificates were also distributed in a felicitation ceremony for the candidates who cleared the course last month.

Overall, the day was a great success, and ATA looks forward to such an awesome response every month at our meetups!



‘Mastering Agile Testing’ – Training the CP-MAT certification batch

I trained the latest CP-MAT training and certification batch held at Bangalore, and it was a huge success. With wonderful participation by the team from L&T, intriguing questions, and an enthusiastic response, it was a pleasure training and interacting with them!

CP-MAT is a wonderfully interactive course designed specifically for agile test practitioners.

CP-MAT stands for “Certified Professional – Master Agile Testing” certification prepared and honored by “Agile Testing Alliance” & “Universiti Teknologi Malaysia”.

The course is applicable to all roles, not just testers. The knowledge, experience, and certification are consciously designed to focus on “agile testing” rather than on “agile testers”.

CP-MAT helps participants get into the testing mindset in an agile project. It helps you utilise your testing experience while learning hands-on agile testing. It instils the “quality is everyone’s responsibility” concept in the minds of the participants.

It is useful even for an experienced tester to learn to apply “regular” testing techniques to an agile project. The concepts of the agile process in the context of testing are covered during the course, along with the associated best practices for testing in an agile project. The course takes a hands-on approach while covering release planning, user story reviews, estimation, sprint planning, agile test strategy, testing debt, testing DoD, test reporting, and metrics. It also introduces participants to the concepts of TDD, ATDD, BDD, and continuous integration (which are covered in detail in the next-level course, CP-AAT).

Here is a sneak peek into the training room:



Let the ‘Agile Manifesto’ guide your testing efforts!

Hello readers

My article on the relationship of Agile Manifesto to the efforts and dilemmas of software testing has been published at

Here are excerpts from the article – Please visit and share your views too!


The Agile Manifesto is the basis of the agile process framework for software development. It sums up the thought process of the agile mind-set over the traditional waterfall methodology, and it’s the first thing we learn about when we set out to embrace an agile transition.

The Agile Manifesto applies to all things agile: Different frameworks like Scrum, DAD (Disciplined Agile Delivery), SAFe (Scaled Agile Framework), and Crystal all stem from the same principles.

Although its values are commonly associated with agile development, they apply to all people and teams following the agile mind-set, including testers. Let’s examine the four main values of the Agile Manifesto and find out how they can bring agility to teams’ test efforts.


Individuals and Interactions over Processes and Tools

Agile as a development process values the team members and their interactions more than elaborate processes and tools.

This value also applies to testers. Agile testing bases itself in testers’ continuous interaction and communication with the rest of the team throughout the software lifecycle, instead of a one-way flow of information from the developers or business analysts at specific milestones on the project. Agile testers are involved in the requirements, design, and development of the project and have constant interaction with the entire team. They are co-owners of the user stories, and their input helps build quality into the product instead of checking for quality at the end. Tools are used as needed to support the cause and the processes.

For example, like most test teams, a team I worked on had a test management system in place, and testers added their test cases to the central repository for each user story. But it was left up to the team when in the sprint they wanted testers to do that. While some teams added and wrote their test scenarios directly on the portal, other teams found it easier to write and consolidate test cases in a shared sheet, get them reviewed, and then add them all to the repository portal all at one go.

While we did have a process and a tool in place to keep all test cases in a common repository for each sprint, we relied on the team to decide the best way for them to do that. All processes and tools are there to make life easier for the agile team, not to complicate or over-formalize the process.

Working Software over Comprehensive Documentation

With this value, the Agile Manifesto states the importance of having functioning software over exhaustively thorough documents for the project.

Similarly, agile testers embrace the importance of spending more time actually testing the system and finding new ways to exercise it, rather than documenting test cases in a detailed fashion.

Different test teams will use different techniques to achieve a balance between testing and documentation, such as using one-liner scenarios, exploratory testing sessions, risk-based testing, or error checklists instead of test cases to cover testing, while creating and working with “just enough” documentation in the project, be it through requirements, designs, or testing-related documents.

I worked on an agile project for a product where we followed Scrum and worked with user stories. Our approach was to create test scenarios (one-liners with just enough information for execution) based on the specified requirements in the user story. These scenarios were easily understood by all testers, and even by the developers to whom they were sent for review.

Execution of test scenarios was typically done by the same person who wrote them, because we had owners for each user story. Senior testers were free to buddy test or review the user story in order to provide their input for improvements before finalizing the tests and adding them into the common repository.

Customer Collaboration over Contract Negotiation

This is the core value that provides the business outlook for agile. Customer satisfaction supersedes all else. Agile values the customer’s needs and constant communication with them for complete transparency, rather than hiding behind contract clauses, to deliver what is best for them.

Agile testing takes the same value to heart, looking out for the customer’s needs and wishes at all points of delivery. What is delivered in a single user story or in a single sprint to an internal release passes under the scrutiny of a tester acting as the advocate for the customer.

Because there is no detailed document for each requirement, agile testers are bound to question everything based on their perception of what the product needs to be. They have no contract or document to hide behind if the user is not satisfied at the end of the delivery, so they constantly think with their “user glasses” on.

As an agile tester, when I saw a feature working fine, I would question whether it was placed where a user would find it. Even when the user story had no performance-related criteria, I would debate whether a page load time of six seconds was acceptable. After I saw that an application was functionally fine, I still explored and found that open background task threads were not getting closed, causing the user’s machine to hang after a few hours of operation. None of these checks was part of any specification, but they were all valuable to the user and needed correction.

Responding to Change over Following a Plan

Agile welcomes change, even late in development. The whole purpose of agile is to be flexible and able to incorporate change. So, unlike traditional software development approaches that are resistant to change, agile has to respond to change, and teams should expect to revise their plans.

In turn, the same is true for agile testing. Agile testing already faces the burden of continuous regression overload; add frequent changes to requirements, and rework can multiply, leading to testing and retesting the same functionalities over and over again.

But agile testing teams are built to accommodate that, and they should have the ability to plan in advance for such situations. They can follow approaches like implementing thorough white-box testing, continuously automating tested features, having acceptance test suites in place, and relying on more API-level tests rather than UI tests, especially in the initial stages of development when the user interface may change a lot.

These techniques lighten the testing team’s burden so that they can save their creative energies to find better user scenarios, defects, and new ways to exercise the system under test.
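As a sketch of the API-over-UI idea, a regression test pinned at the service layer survives UI redesigns; `search_products` here is a hypothetical stand-in for a real service call, not an actual API:

```python
import unittest

def search_products(catalog, term):
    """Stand-in for a service-layer API: case-insensitive name search."""
    term = term.lower()
    return [p for p in catalog if term in p["name"].lower()]

class SearchApiTest(unittest.TestCase):
    """Regression checks below the UI: stable even while screens change."""

    CATALOG = [{"name": "Red Shirt"}, {"name": "Blue Shirt"}, {"name": "Red Hat"}]

    def test_matches_are_case_insensitive(self):
        self.assertEqual(len(search_products(self.CATALOG, "RED")), 2)

    def test_no_match_returns_empty_list(self):
        self.assertEqual(search_products(self.CATALOG, "scarf"), [])

# Run the suite programmatically, as a CI step might
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SearchApiTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

When the requirements change, only the service contract needs re-verifying; the UI can be redesigned freely without invalidating this suite.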


Let the Agile Manifesto Guide Your Testing

When agile testers have dilemmas and practical problems, they can look to the Agile Manifesto for answers. Keep it in mind when designing and implementing test efforts; the Agile Manifesto’s values will guide you to the best choice for your team and your customers.


Hope you liked my write-up, please share your views too!

Happy Testing!



ATA 12th Meetup @Moolya software, Bangalore

I organised and hosted the ATA 12th Meetup last Saturday, 25th March 2017, @Moolya software, Bangalore. The event saw good participation from many enthusiastic testers and some insightful talks by great speakers. We had talks on:

>>Problem solving techniques: An attempt to apply ideas across disciplines — by Ajay Balamurugadas, Tyto Software (Creators of Sahi)

>>Behavior Driven Development – What, Why and How – from a tester’s perspective — by Mr. Vinay Krishna, Agile Technology Coach

>>Challenges of Agile for a Manager — by Preeth Pandalay, Techno Agilist, Agile Coach, Trainer

>>Create 100 mindmaps in 1 minute (demo) — by Dharamalingam K – Moolya Software

The event was a great success owing to the awesome speakers and the amazing discussions and questions by the participants. We hope to continue these meetups and bring the community together with more such open events!


Happy Testing!