Overall Satisfaction with SpiraTest
SpiraTest is currently being used for a large, complex project with an ambitious time frame in the entertainment/hospitality industry.
- A clean view of test case steps and expected results, plus the ability to quickly record a result for each step on a single page per test set. This really helps testers work through executing manual test cases.
- Hierarchical structure of releases and builds, requirements, and test cases.
- Can quickly build test sets and/or test runs per build/release.
- Simple identification of each test case, requirement, test run, build, release, etc.
- While there are some nice, live reports, a view of test results per build/release is an obvious omission. The canned reports that can be configured are mostly text based; adding some graphs to those would be a nice improvement.
- Recording a Fail for a test step does not facilitate recording a defect; defects must be reported separately by hand. It would be nice if, when a failure was recorded, some standard fields (e.g. summary, actual result, expected result, description) were pre-populated with system-known data such as the test steps (steps to reproduce), release, tester's name, etc., and then turned into a defect to be recorded in Spira or sent to an integrated 3rd party defect tracker (JIRA, Mantis, TFS, etc.).
- While the import (from Excel) tools are nice and work efficiently (2-way sync of data, hierarchy easy to implement, tests can be linked to requirements before import/export), there are some default fields I would like to have seen included so I could import the 1500+ test cases with less post-import data manipulation required in Spira.
- Early days for me. This is still being determined.
- Better reports would improve this.
I migrated from another test management tool to Spira mid-project. The Spira import tools made this relatively easy.