Testing is vital for quality assurance, and following best practices in software testing yields higher-quality software. Here are some best practices for software QA in the Agile world:
- Learning is the key: Learn your product, its business logic, the relevant testing types, and test automation with new technologies.
- Never assume: Always check, verify, validate, and communicate to establish actual vs. expected results. Otherwise an unchecked assumption may end up as a production defect.
- Involve QA from the beginning: QA should be involved in the project from the planning stage. They should participate in requirements gathering, review the requirements, and share feedback in order to find bugs at an early stage. QA should be aware of the project goals and objectives. This helps in setting the right expectations and ensures that the QA team understands the project’s requirements.
- Define testing strategy: A clear testing strategy should be defined for each sprint. This includes identifying the scope of testing, defining test cases, and determining the type of testing to be performed. It is also essential to define the test environment and data requirements.
- Prioritize tests: Prioritizing tests is critical in Agile development. Not all tests are equally important, and some may have a higher impact on the overall quality of the product. Prioritizing tests ensures that the most critical tests are executed first, and the risk of defects is minimized.
- Automate testing: Automated testing is a critical component of Agile software development. Automating tests helps reduce testing time, increases test coverage, and improves the accuracy of test results. It is also helpful in detecting regressions quickly.
- Perform Continuous Integration and Continuous Deployment: Continuous Integration (CI) and Continuous Deployment (CD) are essential in Agile development. CI ensures that all code changes are integrated and tested regularly. CD ensures that the product is deployed to production frequently, reducing the risk of large-scale defects.
- Manage test data & environment: Enable QA members to maintain their own test environments, including test data, that are as close as possible to the production environment.
- Collaborate with the team: QA should collaborate closely with the development team, product owners, and business stakeholders. This helps in clarifying requirements, identifying potential issues, and ensuring that the product meets the expectations of all stakeholders.
- Own the bugs: Quality is not only about testing and detecting bugs; it is also about getting them fixed. Take ownership of a bug until it is fixed.
- Replicate the issue: Reproduce the issue at least twice and check the application log before you file a bug.
- Avoid over-planning: Plan enough to manage risk, but do not over-plan.
- Test Early, Test Often: Find issues as early as possible; communicate, report, and share them. This reduces cost significantly and saves development time.
- Create QA documents: Create and maintain clear, lean documentation to keep records as well as to keep everyone on the same page.
- Pesticide Paradox: The pesticide paradox principle states that if the same test cases are run again and again, after a certain point they will stop discovering new defects. Keep reviewing and updating your test suites to ensure they remain effective at detecting new defects.
- Think outside the box: Cover not only the happy path but also negative scenarios and edge cases, including different devices, operating systems, and user profiles.
- Adaptability: Adopt the agile methodology, build a quality culture, and deliver a quality product.
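As a minimal sketch of the "Automate testing" and "Prioritize tests" points above, here is a hypothetical automated check written with Python's standard unittest module; `parse_quantity` and the test names are invented for illustration, not taken from any real product:

```python
import unittest

def parse_quantity(text):
    """Toy function under test: parse a positive integer quantity."""
    value = int(text)  # raises ValueError for non-numeric input
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value

class QuantityTests(unittest.TestCase):
    # Positive case: the feature used exactly as designed.
    def test_valid_quantity(self):
        self.assertEqual(parse_quantity("5"), 5)

    # Negative case: erroneous input must be rejected, not crash.
    def test_non_numeric_rejected(self):
        with self.assertRaises(ValueError):
            parse_quantity("abcdefg")
```

Tests like these can be run with `python -m unittest` on every commit, which is exactly what the Continuous Integration point above relies on.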
Once requirements have been created and approved, and while coding is in progress, it is time to create test cases. The idea is to have the complete list of test cases ready before coding is done, so that you do not waste time during the QA process. Use the Test Type field (a choice list) to identify the type of test:
Functional/Sprint Testing: Negative & Positive
Unit and Integration Testing
Positive Testing: Positive testing is testing the software in the exact way it was designed. To create these test cases, you will want to follow the requirements document to ensure that all features of the software are tested. In positive testing, you are not trying to “trick” the system; you are simply testing all the features per the design.
Negative Testing: Negative testing tries to break the software in ways a typical user might, by entering erroneous data. Here are some examples of negative testing:
- Date formats – Try entering invalid dates (like 02/30/2006), alphabetic dates (like Jan 1, 2006), or totally bogus information (xxxxxxx).
- Numeric formats – If you know a field must allow only numeric entry, try entering character data (abcdefg), as well as commas, decimals, hyphens, and dollar signs.
- Field sizes – If you know the size of a field (say, up to 20 characters), try entering a large amount of data (like 100 X’s); the application should handle that gracefully, but many times this will cause a crash. It is worth going to every field, entering tons of xxxxxxxxxxxxxxxxxx, and seeing what happens.
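The field checks above can be sketched as small validators. The rules here are assumptions made for the example (an MM/DD/YYYY date field, a plain-numeric amount field, a 20-character limit), not any specific product's spec:

```python
from datetime import datetime

def is_valid_date(text):
    """Accept only real MM/DD/YYYY dates; 02/30/2006 does not exist."""
    try:
        datetime.strptime(text, "%m/%d/%Y")
        return True
    except ValueError:
        return False

def is_valid_amount(text):
    """Accept plain numeric input; 'abcdefg' or '$5' is rejected."""
    try:
        float(text)
        return True
    except ValueError:
        return False

def fits_field(text, max_len=20):
    """Reject oversized input instead of letting it crash the form."""
    return len(text) <= max_len

# Negative cases from the examples above:
assert not is_valid_date("02/30/2006")  # invalid date
assert not is_valid_date("xxxxxxx")     # totally bogus information
assert not is_valid_amount("abcdefg")   # character data in a numeric field
assert not fits_field("x" * 100)        # 100 X's in a 20-character field
```

The point of negative testing is that every one of these inputs should produce a clear rejection, never a crash.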
Release Preparation / Regression Test Cases: Each new release has the potential to break existing features of your software. It is a great idea to have a list of regression test cases that are run to ensure that all existing features work as originally designed. Prior to testing a release, the existing test cases should be updated to ensure a high-quality test process for the release.
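One way to keep such a regression list is as data, so the whole checklist runs on every release. The feature names and the always-passing checks below are placeholders invented for the sketch; in a real suite each check would exercise the application:

```python
# Each entry pairs an existing feature with a check function.
regression_cases = {
    "user can log in": lambda: True,
    "search returns results": lambda: True,
    "invoice totals add up": lambda: True,
}

def run_regression(cases):
    """Run every existing-feature check and report any that now fail."""
    return [name for name, check in cases.items() if not check()]

assert run_regression(regression_cases) == []  # no feature regressed
```

Keeping the list in one place also makes the pre-release review step concrete: updating the test cases means updating this table.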
Unit and integration tests: Unit tests isolate each component of your app, while integration tests assess how well the subsystems work together. Run unit tests in parallel to save time, but do not move on to integration tests until you have ensured that the individual components work as they should.
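The distinction can be sketched with Python's standard unittest.mock; `PriceService` and `TaxApi` are hypothetical names invented for this example:

```python
from unittest import mock

class TaxApi:
    def rate_for(self, region):
        # In production this would call an external service.
        raise NotImplementedError

class PriceService:
    def __init__(self, tax_api):
        self.tax_api = tax_api

    def total(self, net, region):
        return round(net * (1 + self.tax_api.rate_for(region)), 2)

# Unit test: isolate PriceService by stubbing out its TaxApi dependency.
fake_api = mock.Mock()
fake_api.rate_for.return_value = 0.20
assert PriceService(fake_api).total(100.0, "EU") == 120.0

# Integration test: only after the unit test passes, wire in a real
# TaxApi implementation and assess how the two subsystems work together.
class FixedTaxApi(TaxApi):
    def rate_for(self, region):
        return 0.05

assert PriceService(FixedTaxApi()).total(100.0, "US") == 105.0
```

Because the unit test stubs the dependency, it stays fast and can run in parallel; the integration test is the slower check that the wiring between components is right.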
Smoke/Sanity Testing: Smoke or sanity testing can be used as a very high-level production check on release day. It can also be performed after a minor code change to verify that the new changes did not break the system.
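A release-day smoke test can be as small as a quick pass over the critical paths. The endpoint list and the `health_check` helper below are assumptions invented for the sketch, not a real deployment's API:

```python
def health_check(endpoint):
    # Stand-in for an HTTP GET against the deployed system; a real check
    # would use e.g. urllib.request and verify the response status is 200.
    return endpoint in {"/login", "/search", "/checkout"}

# Only the critical paths; a smoke test is broad and shallow by design.
SMOKE_ENDPOINTS = ["/login", "/search", "/checkout"]

def smoke_test():
    """Fast, high-level pass over the core flows after a release."""
    return all(health_check(e) for e in SMOKE_ENDPOINTS)

assert smoke_test()  # release-day sanity: core paths respond
```

If the smoke test fails, the release is investigated immediately; detailed functional and regression suites come later.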