Exploratory testing, session-based testing, scripted testing… concertedly

In my last assignment, a mix of exploratory, session-based, and scripted testing helped win an award and earn appreciation from managers and stakeholders at our organization. A quote from our award certificate:

"QA team has come up with some intelligent testing techniques to validate the product. The customer test team was not able to find even a single critical bug after delivery."

Though the award was a small one in terms of the award categories our organization has, it meant a lot to me and the team, because this was the first time the team had used an exploratory approach. The assignment was not purely exploratory or session-based; in fact, it was supposed to be scripted. But a mix of all three (exploratory, session-based, and scripted testing) helped us learn, enjoy ourselves, and meet our mission.

The assignment started off with developing test cases, based on the initial requirements document provided. We came up with a large number of test cases, most of which were rewritten, and some of which ended up obsolete or deferred because of changes in software, hardware, or firmware, incorrect requirements, or technical and schedule limitations. As part of the process, the test cases were then reviewed by the stakeholders and signed off.

The releases started flowing from the development team to the testing team. In the first two releases we executed the test cases, found some bugs, and were quite happy. After the second build was released for testing, the bug count started decreasing and we grew bored of executing the same tests. To add to the misery, our hardware boards became scarce: we were five testers sharing one board.

This, to me, was the right time to bring in exploratory and session-based testing.
• I divided the team into two pairs of two testers each.
• Session time was set to 1.5 hours per session; the mission for each session was provided before the session started.
• After one pair completed its exploratory session, the notes, bugs, issues, data used, and tasks performed were discussed with me, while the other pair set out to meet the mission.
Initially I observed a couple of testers finding it hard to reach the destination (the mission) without a GPS (test cases), while two other testers were enjoying their testing. I switched the pairs, and the bugs started flowing. Session-based testing kept everyone occupied with testing, let them enjoy it, and also helped me keep track of all the proceedings.
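For readers who have not run sessions like these, the debrief routine described above maps naturally onto an SBTM-style session sheet, as described in James and Jonathan Bach's session-based test management material. The sketch below follows that layout; the charter, names, and numbers are placeholders for illustration, not from the actual project:

```text
CHARTER
-----------------------------------------------
Explore the firmware upgrade flow, looking for
data loss and recovery problems.

START
-----------------------------------------------
10:00am

TESTER
-----------------------------------------------
Tester A
Tester B

TASK BREAKDOWN
-----------------------------------------------
#DURATION
90 min

#TEST DESIGN AND EXECUTION
70%

#BUG INVESTIGATION AND REPORTING
20%

#SESSION SETUP
10%

DATA FILES
-----------------------------------------------
(configs, inputs, logs used during the session)

TEST NOTES
-----------------------------------------------
(what was tried, what was observed)

BUGS
-----------------------------------------------
#BUG
(one entry per bug found)

ISSUES
-----------------------------------------------
#ISSUE
(questions, blockers, things needing follow-up)
```

A sheet like this is what makes the debrief efficient: the lead can scan the task breakdown and notes in a few minutes and still keep track of all the proceedings, as described above.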

Though exploratory was much more fun and the rate at which bugs were found was high, after some releases, when there were no new features left to add to the software, the bugs reported by the team started dropping. When we sat together, I sensed the team was running out of ideas; we had initially focused on exploring the system with functional and domain-related missions. After the discussion, we shifted the focus from functional and domain tests to flow tests. I explained to the team what to target in flow tests and what observations to note while running them, and what a change: the energy was back, and they liked testing the same module again even without any new changes in it.

From then on, test case execution was approached mainly as an activity to produce the test case execution report, but there were times when a test case reminded us of a requirement we had missed in our exploration, or a scenario we had missed running when we followed only the exploratory approach. I feel the reason we could have missed a requirement or scenario while exploring was that the missions were not planned as carefully as they should have been. Well, a lesson for the next phase.

We are through with the release, and pretty happy about meeting our mission.

I hope this little summary inspires other teams to have a look at exploratory testing and think about how it could fit their context.


Pradeep Soundararajan said...


This experience report is fantastic. I shall get you a beer or two more to listen to this story in more detail :)

Congratulations, and I wish India had more Sharaths.

daman said...

A great (small) achievement! Interesting the way that rearranging the pairs restarted the enthusiasm.
Erik Petersen

P.S. Interesting posting bug, I am not daman!!!

Keshav Ram Narla said...

Wow! You got it right on. Sounds like a lot of fun to try. I will try to share this with my team. Thanks, keep writing more.

Michele Smith said...

Congratulations on the award Sharath!

I am always impressed by the team oriented approach you have. Keeping the team aware of their mission and their strategy seems that it could have some wonderful consequences.

Using your mission as a training ground, instilling enthusiasm in your team, guiding your team in a way that they enjoy their work, and completing the mission successfully - as much can be learned from your approach to people as from your approach to testing.

Thanks for sharing the story.

Sharath Byregowda said...

@ Pradeep,

WOW! Free beer and sharing testing stories with Pradeep - I am :)

@ Erik Petersen

Thankfully I had two testers who picked up exploratory testing quickly, and two others who were a little late to come to terms with it. I also ran lots of exercises with the team, including the famous triangle example. I feel all of these helped us.

It would have been interesting if no one had liked exploratory, but I guess testers who test know the limitations of scripted testing.

@ Keshav Ram Narla

I feel testing is always fun. Let me know your team's response when you try exploratory.

@ Michele Smith

Thanks for the kind words Michele.


abhilash said...


I was one of the testers who took part in this exploratory testing session. We all thoroughly enjoyed it, and guess what: the total number of bugs we found was around 350, of which the test cases got us at most 10. So you can get a statistical idea of how much the sessions helped us.

Thanks and Regards,

Pradeep Soundararajan said...


I congratulate you and your other team members as well on this success.


Bring your team someday for a meeting. I'd be interested to talk to them as well, but of course, I can't sponsor beer for everybody :)

Keshav Ram Narla said...


Can you give us a quick 15-20 minute read of a HOWTO to start off a similar exercise?


Sharath Byregowda said...


This should get you started http://www.satisfice.com/sbtm/


PlugNPlay said...

Sharath - I am impressed with the way you adjusted your pairing to distribute the energy throughout the team. I have found that when deciding on pairs, I need to take into account domain expertise, testing expertise, and energy level and try to make groups with sufficient amounts in all three areas. That is how I integrate non-testers into testing, by making sure that they have something to contribute to the pair in at least one of the three areas, and pairing them up with someone who can complement them.

- Geordie Keitt

Sharath Byregowda said...

@ PlugNPlay - You make a very good point; pairing testers with different expertise might help them learn a lot from each other. But I must admit I had paired my testers because of a limitation on hardware resources, and when the initial pairing did not work, I shuffled them after observing what each might be good at. Luckily it worked :)


Geordie Keitt said...

Sounds like (to use my terms) you made sure the energy level was sufficiently high on both teams by realigning the pairs. Well done.

vivek said...

@Sharath - It's an amazing experience report… It reminded me of our own journey from test cases to a scenario-based exploratory approach. On a very similar line…
We have an application with more than 300 complex business rules and many medium and simple ones... I don't have the exact numbers. Requirements are scattered across many SRSs, functionalities are heavily dependent on each other, and of course there are implicit requirements. We had more than 3,000 test cases with detailed steps. It was becoming very boring for the team, as most members had been on the same project for more than 1.5 years. Despite doing many innovative things to keep the team motivated, it was difficult to keep everyone full of energy.
At that time, one of my team members (this is the right place to give him the credit: Shaham Yusuf) introduced the concept of exploratory testing to our project, and he did it in style. He followed the process of executing test cases, but side by side he also did some exploring. He demonstrated it first by discovering many defects in the application (more than the defects discovered by the rest of the team of seven); by saying this I have no intention of suggesting that the rest of the team did not work hard at finding issues. This opened the eyes of all the team members, and they started finding out what extra he was doing. We had a briefing session with him on how he finds issues in the application by exploring, and everyone was excited by the new idea. As a test lead, my responsibility was to get some extra time for this activity, and it was really tough to get even a single day for doing EXPLORATORY testing. We used a trick: we convinced all the team members to sit for extra hours on the first few days of testing, completed all the test cases one day before the actual end date, and convinced the client to allow one day of exploratory testing. We found many issues in the application on that last day of testing, and of course none of them fell directly under those test cases.
We first moved from test cases to hybrid scenarios (a combination of high-level scenarios and test cases) so that all the team members would understand the concept of scenario-based testing. We also made sure that the so-called "coverage" was maintained in the scenarios as well. Later, based on the team's comfort level, we decided to move to high-level scenarios and gave the testers more flexibility to go for session- and scenario-based exploratory testing.
We saved the time and effort of recreating those thousands of test cases every time the requirements changed. The time saved writing test cases versus writing scenarios was used to build competency, sharpen our testing skills, and deepen our functional knowledge, which in turn helped us do proper exploratory testing. In the next release, with those data points in hand, it was much easier for me to convince our client to move from writing test cases to the scenario-based approach. The number of defects we raised was much higher than in all previous releases, and we also reduced the number of UAT defects significantly.
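To make the migration Vivek describes concrete, here is a sketch of what a hybrid scenario might look like next to the detailed test case it replaces. The application, rule IDs, and numbers are invented for illustration, not taken from his project:

```text
DETAILED TEST CASE (before)
---------------------------------------------------
TC-0412: Verify discount on bulk order
  Step 1: Log in as customer "acme"
  Step 2: Add 101 units of item X to the cart
  Step 3: Open the cart page
  Expected: 5% bulk discount applied to the order total

HYBRID SCENARIO (after)
---------------------------------------------------
SC-17: Bulk-order pricing
  Mission: Explore discount calculation for orders that
  cross business-rule thresholds (quantity, customer tier).
  Coverage notes: maps to BR-22 and BR-23;
  replaces TC-0410 through TC-0419.
  Key checks: threshold boundaries, tier overrides, rounding.
```

The scenario keeps the traceability to business rules (so "coverage" can still be demonstrated) while leaving the concrete inputs and paths to the tester's judgment during a session.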
Challenges Faced:
• Not easily accepted by management
• Needed demonstration
• Needed a focused approach and a clear mission
• Initial challenge of migrating from test cases to scenarios

Better than the best practices:
• BR Traceability
• Session based scenario testing
• Pair Testing
• Competency building exercise
• Scenario Templates

@Pradeep – Yes, we do have a few more passionate exploratory testers like Sharath in India.

Sharath Byregowda said...

@ vivek.

Thanks for sharing your experience with me, I loved it. I know Shaham Yusuf, facilitator of the Mumbai chapter of Weekend Testing.

What thrilled me the most is YOU (Vivek). What a manager! I feel we have passionate testers across India, but very few leaders like you who appreciate the effort they put in and take their ideas forward to the next level. Shaham is lucky and fortunate that he has a manager like you. Thanks again Vivek.

Since you have been able to implement session-based testing so well, I recommend you look at the tool Session Tester (if you haven't already). The tool makes it very easy to record sessions, and it has some very good features that could be useful. You can download it from http://sessiontester.openqa.org/ .


shankar k said...

Hi Sharath, I have been hearing about exploratory testing, though I have not asked about it anywhere... I hope this is the right place to learn about it in detail.
Can you give some insight into how you start exploratory testing, and when we should feel we need to go for it? If you have any docs or links that explain it in detail, kindly share them.