The 12 Agile Principles: What We Hear vs. What They Actually Mean

The Agile Manifesto gives us 12 principles to abide by in order to implement agility in our processes. These principles are the golden rules to refer to when we’re looking for the right agile mindset. But are we getting the right meaning out of them?

In my latest article for the Gurock TestRail blog, I examine what we mistakenly hear when we’re told the 12 principles, what pain points agile teams face due to these misunderstandings, and what each principle truly means.

 

Principle 1: Our Highest Priority is to Satisfy the Customer Through Early and Continuous Delivery of Valuable Software

What we hear: Let’s have frequent releases to show the customer our agility, and if they don’t like the product, we can redo it.

The team’s pain points: Planning frequent releases that aren’t thought out well increases repetitive testing, reduces quality and raises the chances of defect leakage.

What it really means: Agile requires us to focus on quick and continuous delivery of useful software to customers in order to accelerate their time to market.

Principle 2 onwards: Check out the complete post here – Click Here to Read More

 

Do share your stories and understanding of the 12 Agile Principles!

Cheers

Nishi

A Day in the Life of an Agile Tester

An agile tester’s work life is intriguing, busy and challenging. A typical day is filled with varied activities like design discussions, test planning, strategizing for upcoming sprints, collaborating with developers on current user stories, peer reviews for teammates, test execution, working with business analysts for requirement analysis and planning automation strategies.

In my article for the Gurock TestRail blog, I explore a typical day in the life of an agile tester and how varied activities and tasks keep her engaged, busy and on her toes all the time!


Let’s sneak a peek into a day in the life of an agile tester: you will go through her daily routine and experience her complicated schedule in real time.

Read the full article: https://blog.gurock.com/agile-tester-work-life/

 

5 Mistakes to Avoid in Agile Retrospectives

Retrospectives are an integral part of every project we undertake, as well as a key ceremony in the Scrum lifecycle. The agile principles stress the need for periodic meetings where the team reflects on its functioning, processes and actions and tries to improve its shortcomings, so retrospectives are essential. The team gets to look back on their work and answer three key questions: What went well? What did not go well? How can we improve?

Even when agile teams perform retrospectives as a regular part of their project lifecycle, they may be making a few common mistakes due to a lack of understanding, perspective or communication, and these mistakes can prevent them from getting the maximum benefit out of the retrospective.

In my article for the Gurock TestRail blog, I discuss five common mistakes we must avoid in agile retrospectives.

 

Click Here to Read more

Do let me know your thoughts!

Cheers

Nishi

 

The Crucial Guide to Software Testing for Project Managers

As a project manager, you often need to take on new challenges and create guidelines for projects in a field you are not always familiar with.

You might have some experience working with a team of software developers, which gives you insight into the relevant testing disciplines. Or you may have come in directly as a project manager and need to understand the process from scratch. Whatever the case may be, we are sure you already have enough on your plate. That is why I have gathered a few basic guidelines – both technical and methodological – to help you succeed in your new assignment as a test project leader!

My guest post for PractiTest is now up on the QA Learning Center:

Dedicated to all PMs: here I cover Software Testing 101, making this a guide for PMs to all things crucial in test process management. Read more:

https://www.practitest.com/qa-learningcenter/thank-you/software-testing-guide-project-managers/


Do give it a read and share your thoughts!
-Nishi

 

This website is now featured in the 75 Best Software Testing Blogs!

It’s a big day for me as my personal blog has been featured in the ’75 Best Software Testing Blogs’ by Abstracta 🙂

Check out the complete list at — https://abstracta.us/blog/75-best-software-testing-blogs/

Elated and excited! Please give a thumbs up and follow me for testing and agile-related articles.

To follow this blog, add your email address in the panel on the right and receive periodic updates with new articles and posts!

Listed in top 75 blogs

Thanks a lot!

Nishi

Guest Post: “5 Tips to Manage Your Outsourced Testing”

Want to outsource your testing? Here are my “5 Tips to Manage Your Outsourced Testing”.

I have begun collaborating with PractiTest, and with Rachel’s help my article has now been published at the PractiTest Learning Center.

In this article I discuss the practical risks for teams that outsource their testing efforts and bring forward five key tips for managing outsourced software testing, along with the accompanying team and people issues:

  • Treat Them like your Team
  • Invest in training the Outsourced Team
  • Meet often, and also in person
  • Centralised System for Test Management
  • Account for Cultural Differences

Please give it a read and share your thoughts!

https://www.practitest.com/qa-learningcenter/thank-you/manage-your-outsourced-testing/

Thanks

Nishi

I am speaking at the ‘Selenium Summit 2018’ @Pune

Hello!

Check it out!

I am speaking at the Selenium Summit 2018, being organised by the Agile Testing Alliance (ATA) on 22nd March 2018 in Pune.

Find more details about the event at : http://seleniumsummit18.agiletestingalliance.org/

I will be presenting a 90-minute hands-on workshop on:

“Selenium with Cucumber for an extended BDD Framework”

Are you interested in looking into the trend of Behavior Driven Development? Would you like to see it in action using Cucumber? Would you like to integrate your functional tests into such a framework by driving Selenium from within Cucumber? Then this is the workshop for you! (A small illustrative sketch follows the agenda below.)

This workshop will cover:

  • Practical issues faced by most testing teams
  • Behavior Driven Development – the definition and need
  • Extending the Agile User stories and acceptance criteria in BDD scenarios
  • Cucumber as a BDD tool
  • Integration of Cucumber with Selenium in order to perform functional tests
  • Demo using Cucumber with Selenium with a real use case
  • Usage and benefits of BDD in agile teams
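
To give a flavour of what the hands-on portion looks like, here is a minimal sketch (not the actual workshop material) of a Cucumber step definition class in Java that drives Selenium WebDriver. The Gherkin scenario is shown in comments, and the URL, element locators and credentials are placeholder assumptions:

    import io.cucumber.java.en.Given;   // older Cucumber-JVM versions use cucumber.api.java.en.*
    import io.cucumber.java.en.Then;
    import io.cucumber.java.en.When;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // Gherkin scenario this class glues to Selenium:
    //   Scenario: Successful login
    //     Given the user is on the login page
    //     When the user logs in with valid credentials
    //     Then the dashboard is displayed
    public class LoginSteps {

        // Assumes chromedriver is available on the system PATH
        private final WebDriver driver = new ChromeDriver();

        @Given("the user is on the login page")
        public void theUserIsOnTheLoginPage() {
            driver.get("https://example.com/login");                   // placeholder URL
        }

        @When("the user logs in with valid credentials")
        public void theUserLogsInWithValidCredentials() {
            driver.findElement(By.id("username")).sendKeys("demo");    // placeholder locators
            driver.findElement(By.id("password")).sendKeys("secret");  // and credentials
            driver.findElement(By.id("loginBtn")).click();
        }

        @Then("the dashboard is displayed")
        public void theDashboardIsDisplayed() {
            boolean onDashboard = driver.getCurrentUrl().contains("dashboard");
            driver.quit();
            if (!onDashboard) {
                throw new AssertionError("Expected to land on the dashboard after login");
            }
        }
    }

Because each step maps one-to-one to a plain-English line in the feature file, the same scenario doubles as living documentation of the acceptance criteria.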

Let’s meet there!

-Nishi

Training on Selenium – CP-SAT Certification Batches @Bangalore

CP-SAT stands for “Certified Practitioner – Selenium Automation Testing”, a certification prepared and honoured by the Agile Testing Alliance and Universiti Teknologi Malaysia (UTM), and it is the Selenium training course I have been conducting in Bangalore. We conducted a public batch over the last weekend as well as a corporate batch this month, where participants got to build, enhance and maintain scripts in the Eclipse IDE with Selenium 3.x WebDriver.

Training Approach: This course is designed to take agile professionals from the basics of testing web applications with Selenium through to advanced topics. I approached the training as a combination of theory and hands-on execution of scripts using the features of Selenium, with ample time given to practice and a constant focus on applying Selenium to resolve common web automation testing challenges.

Agenda: This course focuses on the latest Selenium 3.x and its advantages; WebDriver 3.x configuration and execution concepts using the JUnit and TestNG frameworks; Selenium reporting mechanisms; data-driven testing; getting started with Selenium Grid; and handling various types of web elements, iframes, dynamic lists and more. To know more about the course syllabus, please click here.
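
As a taste of the hands-on exercises, here is a minimal TestNG-plus-WebDriver sketch of the kind of script participants build. It is not taken from the course material; the URL and element locators are placeholder assumptions:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.testng.Assert;
    import org.testng.annotations.AfterClass;
    import org.testng.annotations.BeforeClass;
    import org.testng.annotations.Test;

    public class SearchPageTest {

        private WebDriver driver;

        @BeforeClass
        public void setUp() {
            // Assumes chromedriver is available on the system PATH
            driver = new ChromeDriver();
        }

        @Test
        public void pageTitleMentionsSearch() {
            driver.get("https://example.com/search");              // placeholder URL
            Assert.assertTrue(driver.getTitle().toLowerCase().contains("search"));
        }

        @Test
        public void resultsAppearForAValidQuery() {
            driver.get("https://example.com/search");              // placeholder URL
            driver.findElement(By.name("q")).sendKeys("selenium"); // placeholder locator
            driver.findElement(By.id("searchBtn")).click();        // placeholder locator
            int results = driver.findElements(By.cssSelector(".result-item")).size();
            Assert.assertTrue(results > 0, "Expected at least one search result");
        }

        @AfterClass
        public void tearDown() {
            driver.quit();
        }
    }

From this starting point, the course builds up to the data-driven testing and Selenium Grid topics listed in the agenda above.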

Course Schedule: The course consists of 3 full days of training with hands-on assignments and practicals, followed by 5 days of 2-hour live web sessions with the trainer for further learning, queries and clarifications. Thereafter the candidates attempt a mock exam, which gives them an idea of the real certification exam. The final exam consists of 2 sections: theory, an online objective-type quiz, and practical, a 2-hour exam in which given case studies are implemented and submitted.

We have received a tremendous response to the CP-SAT training batches, and many more candidates are interested in the upcoming training sessions scheduled in Bangalore.

Here is a sneak peek into the training room, along with some wonderful feedback shared by our candidates:

Public CP-SAT Batch @Bangalore

Corporate CP-SAT Batch @Bangalore

If interested please check the upcoming batches calendar at – http://ataevents.agiletestingalliance.org/

Happy Learning!
Nishi


Better Software Design Ideas for the Hawaii Emergency Alert System

Continuing the discussion on the Hawaii missile alert, which made headlines in January 2018 when it turned out to be a false alarm that raised panic among almost a million people of the state all for nothing (read here for a detailed report), I would like to bring the focus back to the implications of poor software design that leads to such human errors.

Better software design aims to make the software easier to use and fit for its purpose, and to improve the overall experience of the user. While software design focuses on making all features easily accessible, understandable and usable, it can also be directed at making the user aware of all possibilities and implications before they act. Critical actions can and should be made more distinct than others, with added security or authorisation steps and visual hints indicating their critical nature.

Some of the best designers at freelancer.com came together to brainstorm ideas for better software design and to revamp the Hawaii government’s inept designs. They ran a contest amongst themselves to come up with the best designs that could avoid such a fiasco in the future.

Sarah Danseglio, from East Meadow, New York, took home the $150 grand prize, while Renan M. of Brazil and Lyza V. of the Philippines scored $100 and $75 for coming in 2nd and 3rd, respectively.

Here is a sneak peek into how they designed the improved system: Read More »

Paying Off the Technical Debt in Your Agile Projects

Just as you should not take out a financial loan without having a plan to pay it back, you should also have a plan when incurring technical debt. The most important thing is to have transparency—adequate tracking and visibility of the debt. Armed with the knowledge of these pending tasks, the team can devise a strategy for when and how to “pay off” technical debt.

Learn about managing your technical debt and testing debt in agile teams, and share your thoughts on my latest article, published at www.stickyminds.com and also at www.agileconnection.com.

***** Here are some excerpts from the article for my readers***

Technical debt initially referred to code refactoring, but in today’s fast-paced software delivery, it has a growing and changing definition. Anything that the software development team puts off for later—be it smelly code, missing unit tests, or incomplete automated tests—can be technical debt. And just like financial debt, it is a pain to pay off.

Forming a Plan to Pay Off Technical Debt

Let’s say a development team working on a new project started out following a certain programming standard. They even set up an automated tool to run on the code periodically and give reports on the adherence to these standards. But the developers got busy and stopped running this tool after a sprint or two, and when the development manager asked for a report after a couple of months, there were hundreds of errors and warnings, all of which now need to be corrected.

This scenario happens all the time with agile teams focused on providing as much customer value as possible each sprint. The problem then needs to be fixed immediately, because despite having all the functionalities in place, the team doesn’t want to release code that is not up to production standards.

The team is then faced with a few options for how to service the debt:

  • Negotiate with the product owner on the number of user stories planned for the upcoming sprint in order to have some extra time for refactoring the code
  • Dedicate an entire sprint to code refactoring
  • Divide all errors and warnings among the development team and let them handle the task of corrections within the next sprint, along with their regular development tasks, by scheduling extra hours
  • Plan to spread this activity over a number of sprints and have a deadline for this report before the end of the release
  • Estimate the size of refactoring stories and either plan them into upcoming sprints as new user stories or accommodate them as part of existing user stories

Though these are all viable options, the best approach depends on the team, the context, upcoming deadlines, the risk the team is willing to take, the highest priority for functionalities that need to be shipped, and the collaboration with the product owner.

Again, just like when you take out a financial loan, you should plan to pay off technical debt as quickly as possible using the resources you have. It’s a good idea to perform a risk analysis of the situation and reach a consensus with the team about the best approach to take.

Technical Debt in Testing

Technical debt doesn’t occur only in programming. Testing activities are also likely to incur technical debt over time due to a variety of factors, including incomplete testing of user stories, letting regression tests pile up for later sprints, not automating essential tests every sprint, not having complete test cases written or uploaded to test management tools, not cleaning up test environments before the next iterations, and not developing or testing with all test data combinations on the current features.

Sometimes debt may be incurred intentionally for a short term, such as not updating tests with new test data when testing on the last day of the sprint due to a time crunch, but planning to do it within the first couple of days in the next sprint. As long as the team has an agreement, it’s acceptable to defer some technical debt for a short while.

On occasion, debt may be incurred intentionally for a longer term by planning it in advance, such as deciding to postpone any nonfunctional tests, like performance or security-related tests, on the system until a few sprints are out and features are stable enough to carry out the tests. Again, as long as the team agrees with the risk and has a plan to address it, it is fine to defer certain activities.

Testing technical debt can get us out of tight situations when needed, but you still need to ensure that you plan carefully, remain aware of the debt, communicate it openly and frequently, and pay it off as soon as possible. Having a plan to service these debts reduces your burden over time and ensures your software maintains its quality.


Prevention Is Better Than Cure

Avoiding technical debt altogether is always preferable. As the saying goes, an ounce of prevention is worth a pound of cure.

Every team has to devise its own strategy to prevent technical debt from accumulating, but a universal best practice is to have a definition of “done” in place for all activities, user stories, and tasks, including for completing necessary testing activities. A definition of “done” creates a shared understanding of what it means to be finished so that everybody involved on the project means the same thing when they say it’s done. It becomes an expression of the team’s quality standards, and the team will become more productive as their definition of “done” gets more stringent.

Here’s a good example of criteria for a team’s definition of “done” for every user story they work on:

  • All acceptance criteria for the user story must be met
  • Unit tests must be written for the new code and maintain a 70 percent coverage
  • Functional tests must be performed, and exploratory tests must be performed by a peer tester other than the story owner
  • No critical or high severity issues remain open
  • All test cases for each user story must be documented and uploaded in the test management portal
  • Each major business scenario associated with the user story must be automated, added to the regression test suite, and maintain a 70 percent functional test coverage

Verifying that the activities completed meet these criteria will ensure that you are delivering features that are truly done, not only in terms of functionality, but in terms of quality as well. Adhering to this definition of “done” will ensure that you do not miss out on essential activities that define the quality of the deliverable, which will help mitigate the accumulation of debt.

Despite best practices and intentions, technical debt often will be inevitable. As long as the team is aware of it, communicates openly about it, and has a plan in place to pay it off as quickly as possible, you can avoid getting in over your head.

*************