Test Generation Context in TestGenie

TestGenie uses a Test Generation Context model to combine multiple inputs and produce realistic test cases.


Scenario Description

  • Primary driver of the test behavior.

  • Describes the user journey or system behavior under test.

  • The clearer and more detailed the scenario, the more effectively the AI can generate:

    • Main flows

    • Alternate paths

    • Edge cases


Requirement Issue

  • Represents the requirement or feature being validated.

  • Commonly a Story, Task, Feature, or Bug.

  • Used for:

    • Traceability: which tests cover which issues

    • Additional context for Jira test case generation


Requirement Issue Description

  • Provides richer detail such as:

    • Business rules

    • Acceptance criteria

    • Edge cases or constraints

  • TestGenie combines this with the scenario description to ensure:

    • More complete AI test case generation

    • Lower risk of missing important behaviors


Preconditions

  • Represent states that must hold before executing a test, for example:

    • "User exists in system with active subscription."

  • TestGenie uses preconditions to:

    • Guide which tests are realistic

    • Avoid generating invalid scenarios (e.g., tests that assume nonexistent data)


Precondition Descriptions

  • Provide detailed explanation of the setup or environment.

  • Used to:

    • Enrich the AI's understanding of system assumptions

    • Influence how Test Steps and expected results are generated
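
The precondition structure described above (a short summary plus a richer description) can be pictured as a simple record. This is an illustrative sketch only — TestGenie's internal model is not public, so the class and field names below are assumptions, not the actual schema:

```python
from dataclasses import dataclass

@dataclass
class Precondition:
    """Hypothetical shape of a precondition: a short state summary
    plus an optional detailed setup/environment description."""
    summary: str          # e.g. "User exists in system with active subscription."
    description: str = "" # setup details that enrich the AI's assumptions

pre = Precondition(
    summary="User exists in system with active subscription.",
    description="Account created via the signup flow; subscription renews monthly.",
)
print(pre.summary)
```

Keeping the summary short and putting environment detail in the description mirrors how the two fields are used: the summary gates which tests are realistic, while the description shapes generated Test Steps and expected results.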


How Inputs Are Combined

The Test Generation Context model:

  1. Takes the scenario description as the central narrative.

  2. Enriches it with:

    • Requirement issue summary and description (requirements, rules, acceptance criteria).

    • Precondition summaries and descriptions (setup, environment).

  3. Passes the enriched context to the AI, which:

    • Proposes test case ideas (planned tests).

    • Expands each idea into a detailed Test Case draft with steps and expected results.

This results in realistic, context-aware AI test case generation tailored to your Jira environment.
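
The combination steps above can be sketched as follows. Note this is a minimal illustration under stated assumptions — `GenerationContext` and `build_prompt` are invented names, and the real context model and prompt layout will differ:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: these names are assumptions, not TestGenie internals.
@dataclass
class GenerationContext:
    scenario: str                        # the central narrative
    issue_summary: str = ""              # requirement issue summary
    issue_description: str = ""          # rules, acceptance criteria, constraints
    preconditions: List[str] = field(default_factory=list)

def build_prompt(ctx: GenerationContext) -> str:
    """Combine inputs in the order described above:
    scenario first, then requirement detail, then preconditions."""
    parts = [f"Scenario: {ctx.scenario}"]
    if ctx.issue_summary:
        parts.append(f"Requirement: {ctx.issue_summary}\n{ctx.issue_description}".rstrip())
    for pre in ctx.preconditions:
        parts.append(f"Precondition: {pre}")
    return "\n\n".join(parts)

ctx = GenerationContext(
    scenario="User logs in and renews a subscription.",
    issue_summary="Subscription renewal",
    issue_description="Acceptance: renewal succeeds only for active accounts.",
    preconditions=["User exists in system with active subscription."],
)
print(build_prompt(ctx))
```

The ordering reflects the model's priorities: the scenario anchors the narrative, while requirement and precondition detail constrain which flows, alternate paths, and edge cases are worth proposing.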


Where Generated Test Cases Appear

After the Rovo agent generates test cases, you review and approve them in the generation panel. Once approved:

  • Each test case is created as a child issue of the requirement issue in Jira, using the Test Case issue type configured in Issue Mapping.

  • All generated test cases appear in the Test Cases screen — the central library where you can view, search, and manage every test case across your project.

  • From the Test Cases screen, test cases can be added to a Test Plan and executed in a Test Execution run.
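
For orientation, a child test case like the one described above corresponds to a standard Jira Cloud create-issue request (`POST /rest/api/3/issue`). The payload below is a sketch, not TestGenie's actual implementation: the "Test Case" type name comes from whatever Issue Mapping configures, and the parent link assumes that type sits below the requirement in your project's hierarchy:

```python
import json

def build_test_case_payload(project_key: str, parent_key: str,
                            summary: str, issue_type: str = "Test Case") -> dict:
    """Sketch of a Jira Cloud create-issue payload.
    parent_key is the requirement issue; issue_type is whatever
    Issue Mapping designates as the Test Case type."""
    return {
        "fields": {
            "project": {"key": project_key},
            "parent": {"key": parent_key},
            "issuetype": {"name": issue_type},
            "summary": summary,
        }
    }

payload = build_test_case_payload("APP", "APP-42", "Login succeeds with valid credentials")
print(json.dumps(payload, indent=2))
```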

See Test Cases Screen for the full workflow after generation.