A Software Tester's journey from manual to political tester

Test automation is an interesting activity. When teams start their journey, good things happen: teams become more efficient, test coverage increases, communication between software testers and developers improves, fewer defects are reported by customers, and so on. Right? But does it happen every single time?

Let me tell you a story, after a disclaimer! 

All persons portrayed in this story are fictitious, and any resemblance to any tester, living or dead, manual, automated or political, is purely coincidental.

A small team of a few developers and testers was working on a product at company X. The test team at company X wasn't experienced, and its lead tester was Jim. Jim was a good tester; he was excellent at finding defects. Unfortunately, he had never had an opportunity to work on test automation projects.

Management at company X had no interest in the testing approach – they wanted results, and they never supported or pushed the team towards automated testing. Management was happy with the level of testing Jim and his team were providing; however, with every new release the testing cycle was becoming longer and longer.

Let's follow this team's journey in the test automation land.

Jim had a dream about test automation

Jim was tired of checking everything for every single release. After an exhausting day at work, Jim went out with the devs for a drink, and one of the devs sympathetically mentioned that the test team is invaluable, as it takes care of all the mundane and repetitive activities.

Jim felt good; he was in good company, but he also felt bad about the tons of checks he needed to perform every single day. That night, Jim had a dream about test automation.

In his dream, Jim had an automated suite to take care of all the repetitive tasks he was doing. He used his saved time to learn more about the domain, explore different parts of the system and train new members of his test team. He woke up fresh, inspired and full of energy. Jim was determined: he wanted to bring change and introduce automation to the project.

Dream triggered thinking

When Jim went to the office, there was only one thing on his mind - test automation. What if we automate our regression pack? What if the entire team starts using automation and no one has to do any mundane and repetitive tasks? How much time would I save, and how would I use this saved time? How much would it cost to get the license for everyone? How long would it take to recover this cost? Jim was on fire; he was convinced that test automation would be a good thing for the project.

Jim realized he would need time, resources and money, and that without appropriate support from management he would not be able to proceed.

Jim sold test automation to management

Jim was convinced that he had found the perfect way to solve current and future problems around testing at company X. He approached management and requested that the project start considering test automation. He gave his reasons: new features were increasing the testing footprint, defects were being missed, and the time it took to complete a test cycle was increasing. Jim mentioned that it would be possible to shorten the testing cycle and release more frequently. Management was cautious, but Jim's passion and perseverance won in the end.

Management agreed, and they created a small task group to evaluate whether test automation was a viable strategy for them. Unfortunately, everyone in the group had their own opinion, and Jim had to fight with many folks to sell the idea. They eventually started inviting vendors to see whether their tools could automate the product. Folks in the task group started looking at budgets, presentations, and the POCs developed by the vendors. Jim thought things were moving in the right direction.

Jim was involved in all of these activities. Finally, the task group got tired of discussions, and they agreed to use Robtomatic Magical's Wonder Tool for their test automation needs. The Wonder Tool was full of promises: the vendor was the first to build a POC, it supported record and playback, it came with a 24x7 technical support line, and it also made it possible to write automation in Java.

Jim was on a high - he finally had the tool and thought: my dream is going to become a reality soon.

Fast track to failure - record all tests

Jim was aware that he had lost plenty of time selling the idea, that management was looking to him, and that it was important to demonstrate progress quickly. He looked at his options and decided that record and playback might be the easiest way to get started. He wanted to automate pretty much everything as soon as possible.

Jim's team got started on a good note, but soon they began facing problems. It was okay when they had a few scripts, but they found it difficult to scale with record and playback. The product they were working on kept evolving, and the scripts were broken most of the time. Unfortunately, the dream he had was not going to become reality that easily. On top of that, because many people were spending time creating and maintaining automation scripts, it was affecting testing.

The testing cycle suddenly became a bit longer, and when the product was released, it had a few more defects than usual.

Jim learned the hard way that test automation using record and playback wasn't a scalable solution. During the release post-mortem, everyone agreed that automation had played a role in releasing software of poor quality, but that it was still a step in the right direction.

He checked with the vendor, who explained that since the product was complex, it might be worthwhile to explore Java instead of record and playback for their automation efforts. Jim organized vendor training for everyone. Jim promised himself: the next release would be different.

Java - a new territory

After a three-day vendor training, Jim and his team started writing test automation in Java. They tried to maintain the old scripts but eventually dropped the idea. They also experimented with converting those scripts to Java, but had to abandon that as well. Pressure on Jim's team was building as the devs were churning out new features quickly and the test team was struggling with test automation.

Testing took a back seat, and the quality of the product was deteriorating.

Jim and his team were excited about using Java. However, they were new to the programming world. The test automation, the tool and the programming language were all new to them. As a result, they could not use core programming concepts such as abstraction, inheritance and polymorphism. Most of the scripts they created were copy-pasted code with a few minor changes.

Jim got these scripts reviewed by the devs. Unfortunately, automation was new to them as well. They thought automation code would never be released to production, and if it was serving its purpose well, that should be good enough.

These Java scripts were a bit more useful and robust than the recorded scripts, but they were tightly coupled with the system. As a result -

Every change in the system (in the GUI or in business logic) had an impact on nearly all the scripts.
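
To make that concrete, here is a minimal, purely illustrative sketch of what such a script often looks like. The Wonder Tool is fictional, so Selenium WebDriver stands in here, and the URL, element ids and class name are all hypothetical. Because every test repeats the same hardcoded locators and login steps, renaming a single field breaks nearly the whole suite; a small page-object layer owning those locators would have confined such a change to one class.

```java
// Illustrative sketch only: Selenium WebDriver stands in for the story's
// fictional Wonder Tool; the URL, ids and class name are hypothetical.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class CreateOrderTest {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        driver.get("http://test-env/app/login");

        // These lines are copy-pasted at the top of nearly every test,
        // so renaming one field id breaks the whole suite at once.
        driver.findElement(By.id("username")).sendKeys("jim");
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("loginBtn")).click();

        driver.findElement(By.id("newOrderBtn")).click();
        // ... checks on the new order page ...

        driver.quit();
    }
}
```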

Management soon realized that the test team was lagging and quality was deteriorating, and they asked Jim to find a solution. Jim's priority was to ensure that quality did not degrade further, so he created two groups: one group would focus on manual testing, and the other would focus on test automation.

Pressure is building - progress hampered by automation

The group of manual testers helped the devs, who picked up more speed as they did not need to spend any time on test automation. This increased speed made Jim's life a bit more difficult. Management started asking Jim tough questions: about the ROI of test automation, why the devs were more productive with manual testers on the team, and why automation was not helping with regression defects.

By this time, Jim was on a negative track. He claimed that the automation was not in shape because the application was not stable.

Constant changes in the application were making it difficult for his team to cope with the automation.

Jim was so frustrated that he demanded a code freeze and a stable application so he could maintain the existing automation scripts and write new ones.

Management was not impressed: they could not keep their customers waiting, and they could not keep their developers idle. They gave Jim more time to sort out the automation and made it clear that a code freeze was not going to happen!

Time to cut corners - sacrifice reliability for speed

Speed and automation coverage for new features became Jim's new goal. Jim tried to automate every new feature to reduce future regression cycles. Jim also wanted to prove that there was no problem with the automation or his approach as such, and his automation group started working hard to prove their worth.

Jim was busy adding new tests.

He rarely paid attention to the tests which were occasionally failing.

Writing tests for new features and accommodating changes in the old tests was more than his team could handle. Jim was focusing on speed and coverage and was okay with flaky tests. In his mind, it was better to check failed tests manually than to spend time removing the flakiness.

Unfortunately, the automated suite kept growing, and flakiness became a real problem.

After every test run, someone spent the entire day running the failed tests manually to check whether there were any real failures. They never found any issue with the application; all the failures were caused by flakiness in the automated tests.
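
A common source of this kind of flakiness is timing: fixed sleeps that are sometimes too short for the application. The sketch below is illustrative only, again using Selenium WebDriver as a stand-in for the fictional tool, with hypothetical element ids. It contrasts a sleep-based check that fails intermittently with an explicit wait that polls for a condition up to a timeout.

```java
// Illustrative only: Selenium WebDriver as a stand-in; element ids are hypothetical.
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class SearchChecks {

    // Flaky: passes or fails depending on how quickly the results happen to render.
    static void flakySearchCheck(WebDriver driver) throws InterruptedException {
        driver.findElement(By.id("searchBtn")).click();
        Thread.sleep(2000); // sometimes 2 seconds is not enough -> false alarm
        driver.findElement(By.id("results")).isDisplayed();
    }

    // Steadier: wait for an explicit condition, up to a generous timeout
    // (Selenium 4 style constructor taking a Duration).
    static void stableSearchCheck(WebDriver driver) {
        driver.findElement(By.id("searchBtn")).click();
        new WebDriverWait(driver, Duration.ofSeconds(15))
                .until(ExpectedConditions.visibilityOfElementLocated(By.id("results")));
    }
}
```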

If the failed tests were not defects, wasn't there a possibility that the passed tests were hiding defects? In test automation, false positives are annoying, but false negatives are dangerous.

Reliability of the test suite was at risk. After working so hard, Jim and his team were not prepared for this, and they became defensive.

Automation became a political issue - Jim started fighting for it

Cutting corners makes Jim defensive

Jim was a quick learner, and because of the pressure he was feeling from peers and management, he quickly picked up the "works on my machine" phrase from the developers. If the tests ran on his team's machines, he was okay with it.

The automated suite was not only coupled with the system; it was also coupled with the machines, the test environment and the test data.

Everyone tried, and everyone failed, to run the automated suite on their own machines. Jim, a busy man, never had time to solve everyone's problems. For him, it was good enough to be able to run these tests from machines his team owned, using a test environment that rarely changed and a static dataset.
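
In code, that coupling typically shows up as hardcoded URLs, machine names and file paths baked into the suite. The sketch below is illustrative only, and the property names and defaults are hypothetical; reading such values from system properties or a configuration file is one common way to let the same tests run on any machine, environment and dataset.

```java
// Illustrative only: property names, hosts and paths are hypothetical.
public final class TestConfig {

    // Coupled: works only on the team's own machines and a static environment.
    static final String HARDCODED_APP_URL  = "http://jims-test-box:8080/app";
    static final String HARDCODED_DATA_CSV = "C:\\automation\\data\\users.csv";

    // Decoupled: the environment running the tests supplies the values,
    // e.g. -Dapp.baseUrl=... -Dtest.dataPath=... on the command line.
    static String baseUrl() {
        return System.getProperty("app.baseUrl", "http://localhost:8080/app");
    }

    static String testDataPath() {
        return System.getProperty("test.dataPath", "data/users.csv");
    }
}
```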

People who were genuinely interested in test automation tried to solve a few problems and made some changes to the automated suite. As a result, tests stopped working on Jim's controlled infrastructure. Jim reverted to an old version, and this cycle repeated a couple of times.

Finally, Jim got angry, and he declared -

Automation is our baby - please don't touch it.

People fought against it; Jim fought with management. He said shared responsibility would mean everyone wasting their time. He got exclusive rights for his team, and automation became his baby. Management, however, insisted that Jim organize code review and brainstorming sessions with everyone.

The team lost interest in test automation. As far as they could see - automation wasn't adding any value.

Review and brainstorming increase the rift in the team

As suggested, Jim organized code review and brainstorming sessions. During code reviews, Jim and his team would agree with the review comments and suggestions the devs were making. However, they always had so much on their plate that the feedback would end up in the backlog of nice-to-have stuff.

The team realized that these meetings were generating tons of good suggestions but no actions. It was only a matter of time before people lost interest in the review and brainstorming exercise.

Jim felt offended: the devs did not come to the meetings he organized around test automation; they had no interest in test automation or quality.

Since no one else was coming, they stopped doing review and brainstorming sessions.

Functional to dysfunctional team

Jim responded by not going to the sessions organized by the devs. His argument was: if they are too busy to come to my sessions, I am not free either; I have tons of things to do.

However, it reduced the visibility Jim could have had into the design and architecture. Jim also lost the opportunity to raise concerns around test automation and the testability of the application in general. He lost context as well; many times when he was with the devs, he felt clueless as they discussed design and architecture.

Jim started feeling "they don't involve me", and the developers started thinking "he does not add any value to the discussion".

Test automation was causing more and more trouble for the team.

The political tester evolves

Automation was becoming more and more difficult, and it was generating lots of negativity. Since Jim was the guy who had thought of and fought for automation, everyone looked to him for an explanation.

Jim was clear: it was all the fault of the developers and management. They were moving too fast, the application was changing every day, and he had no support from the devs. The list was endless, and Jim was sure that if only he had had some support from the devs and management, things would have been different.

His main task now was to highlight why automation had not worked so far and to defend his team, his actions and his approach. As a result, he stopped asking the devs for what he needed; instead, he started explaining why automation wasn't working and could not work for the current project.

Jim wasn’t doing manual testing, wasn’t doing automated testing, wasn’t even managing test activities – he was defending his actions.

From an excellent manual tester, Jim was now a semi-technical, political tester.

I am not sure what's happening at company X these days; I will try to follow company X and Jim's story in the future. But does this story ring any bells for you? What mistakes did Jim make? What could he have done differently? How would you act, or how have you acted, in similar situations?

Please comment and we can discuss. Also, please share Jim's story with your network; folks might be on their way to becoming political testers and can probably learn from Jim.

Karthikeyan Murugesan

Director - Testing at Celcom Solutions Global Inc

Following are my views.

1. Jim's first mistake: he might have failed to identify and classify which requirements are testable and non-testable for automation. Basically, WHAT TO AUTOMATE.
2. The tool selected for automation: he might have failed to check the compatibility of the application with the tool's features. HOW TO AUTOMATE.
3. Team selection for automation: not all functional testers are comfortable with scripting, as the mindset they adopt for testing is different from scripting. They may not be able to build the right logic for automation or reusable scripts.
4. Experience matters: while implementing automation for the first time, it is always better to start with well-experienced resources (in terms of domain and technology).
5. Automating all the functionality in the application may not be feasible.

Goutham Kumar Pagadala

Senior Software Engineer in Test

Here I can see Jim's mistakes: in order to prove himself, he lost everything.

Faults:
1. To kickstart the automation he went straight to record and playback, without any insight into how to sustain it for the long run.
2. Jim took time to sell the idea, but there was no proper plan and no time spent finding the real challenges in automation in order to avoid them.
3. The myth that automation is tool-driven: click start and everything will run.
4. Lack of skills in the team. While selling the idea, he did not highlight that this requires skilled resources on a par with developers.
5. Over-commitment made the team struggle and lose interest in the project.

Learnings:
1. "A fault is always a start."
2. Explain the automation life cycle and its challenges to management and stakeholders.
3. We cannot halt changes or additions to the application, but we need to fight for the effort to keep the test pack in sync with the application. With more effort, "where is the ROI?" may become a question, but test coverage increases. Being with automation is like planting a tree: the ROI shows once the application gets into BAU and becomes stable.

Kumar Shanmugam

Enterprise Architect |Technology Strategy| IT- Business Strategy Alignment | Solution Architect | Digital Transformation | EA practice setup

Very well written

parul prajapati

QA Lead @Augmont | Passionate about Quality and Agile Delivery | Certified ScrumMaster®

Interested to know Jim's mistakes and the solution.
