1. Do you have a regression test suite identified?
A. Yes, we have a full regression suite (or multiple, if applicable)
B. Mostly, though less than 75% of our org is regression tested
C. Somewhat, less than 50% of our org is regression tested
D. Somewhat, less than 25% of our org is regression tested
E. No, not at all
2. Do you have a Salesforce QA CoE within your project / company?
A. Yes, the QA process and strategy are centrally owned
B. Somewhat, responsibilities are identified
C. Somewhat, though not organized or official
D. We have discussed or are making plans for this
E. Not at all
3. Do you have all of your test cases documented?
A. Yes, 100% of test cases are documented
B. Somewhat, under 75%
C. Somewhat, under 50%
D. Somewhat, under 25%
E. No, we have not begun documentation of our test cases
4. How do you manage your test cases?
A. With an integrated test management tool (Provar Manager, Zephyr, TestRail)
B. With a standard documentation tool (Word, Excel, Google Sheets)
C. Documented, but unorganized and difficult to maintain
D. Scarcely managed, or unorganized
E. Not at all
5. What is the frequency of releases to production?
A. Daily
B. Weekly
C. Bi-Weekly
D. Monthly
E. Ad Hoc, as needed
6. How often are your releases delayed due to testing time?
A. Never; we always release on time (0%)
B. Rarely, but more frequently than we'd prefer (~25%)
C. Sometimes (~50%)
D. Often; we struggle to release on our planned schedule (>50%)
E. Almost always; it is very rare that we release on time / as planned
7. How much time is spent documenting and maintaining your test scripts?
A. <10%
B. 10-29%
C. 30-59%
D. 60-89%
E. 90-100%
8. How often are you able to run your regression suite prior to release to production?
A. Always, 100% of the time
B. Most of the time, 75-99% of the time
C. Sometimes, 50-74% of the time
D. Occasionally, less than 50% of the time
E. Never
9. What % of Stories have defects in production after release, on average?
A. 0%
B. <10%
C. 10-30%
D. >30%
E. We don't track this
10. Are you testing end-to-end (cross-application) scenarios / flows?
A. Yes, testing end-to-end across different applications with every release
B. Yes, testing end-to-end across different applications as needed
C. No, we don't have any end-to-end scenarios today
D. No, but we want to and would like guidance
E. No, why would we do that?
11. How do you manage releases?
A. Integrated 3rd-party tool (JIRA, Flosum, Copado, Gearset, etc.) or SFDX
B. Internal process and/or in-house solution (documented)
C. Change sets, the Ant Migration Tool, or similar
D. We have tried to manage releases, but it is very unorganized, so we release ad hoc without structure
E. We don't have a release management process identified (everything is done manually)
12. Is your SDLC automated?
A. Yes, 100%; all of our tooling is integrated and our processes are automated
B. Most of our SDLC and tooling is integrated and automated
C. Some of our SDLC tooling is integrated and automated
D. We are working towards this and are creating a plan to automate and integrate tooling and CI/CD
E. No CI/CD or relevant integrations in use today
13. Is everyone in the testing team enabled to build and maintain automated tests?
A. Yes, all functional users, QA and developers are enabled to automate and maintain tests
B. Most functional users, QA and developers are enabled to automate and maintain tests
C. Our QA team is enabled to automate and maintain tests
D. We are highly dependent on a small number of people who can automate and maintain tests
E. No, nobody is enabled to automate and maintain test scripts
14. What percentage of your total tests are currently automated?
A. 100%
B. >75%
C. >50%
D. >25%
E. 0%, we have not automated any tests yet
15. Of defects opened, how many are found by automated test cases per release (vs. opened by users)?
A. 100%
B. >75%
C. >50%
D. >25%
E. <25%