TestGenie uses a Test Generation Context model to combine multiple inputs and produce realistic test cases.
## Scenario Description

- Primary driver of the test behavior.
- Describes the user journey or system behavior under test.

The clearer and more detailed the scenario, the better the AI can generate:

- Main flows
- Alternate paths
- Edge cases
## Requirement Issue

- Represents the requirement or feature being validated.
- Commonly a Story, Task, Feature, or Bug.
- Used to supply requirements, rules, and acceptance criteria, and as the parent issue under which generated test cases are created.

### Requirement Issue Description

The requirement issue's summary and description are added to the generation context as the source of requirements, rules, and acceptance criteria.
## Preconditions

Preconditions represent states that must hold before a test is executed: for example, that a required user account or data record already exists in the system.

TestGenie uses preconditions to:

- Guide which tests are realistic
- Avoid generating invalid scenarios (e.g., tests that assume nonexistent data)
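The realism check described above can be pictured as a simple set-containment test: a candidate test is kept only if every state it assumes is declared as a precondition. This is an illustrative sketch, not TestGenie's actual implementation; the `Precondition` class and `is_realistic` helper are hypothetical names.

```python
# Illustrative sketch only: TestGenie's internal logic is not public.
# The names Precondition and is_realistic are hypothetical.

from dataclasses import dataclass

@dataclass
class Precondition:
    summary: str          # e.g. "A registered user account exists"
    description: str = "" # optional setup/environment details

def is_realistic(required_states: set[str], preconditions: list[Precondition]) -> bool:
    """A candidate test is realistic only if every state it assumes
    is covered by a declared precondition."""
    declared = {p.summary for p in preconditions}
    return required_states <= declared

pre = [Precondition("A registered user account exists")]
# A test assuming an existing account is realistic...
print(is_realistic({"A registered user account exists"}, pre))  # True
# ...one assuming data no precondition declares is not.
print(is_realistic({"Order history is populated"}, pre))        # False
```

The same idea explains why preconditions prevent tests that assume nonexistent data: anything the test requires but no precondition provides fails the check.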
### Precondition Descriptions

Precondition summaries and descriptions supply the setup and environment details that are added to the generation context.
The Test Generation Context model:

- Takes the scenario description as the central narrative.
- Enriches it with:
  - Requirement issue summary and description (requirements, rules, acceptance criteria).
  - Precondition summaries and descriptions (setup, environment).

The AI then generates the main flows, alternate paths, and edge cases that fit this combined context.

This results in realistic, context-aware AI test case generation tailored to your Jira environment.
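Under stated assumptions, the assembly steps above could be modeled roughly as follows. `TestGenerationContext` and its field names are hypothetical illustrations, not TestGenie's actual schema.

```python
# Hedged sketch of how a Test Generation Context might be assembled.
# TestGenie's real model is internal; these field names are assumptions.

from dataclasses import dataclass, field

@dataclass
class TestGenerationContext:
    scenario: str                      # central narrative
    requirement_summary: str = ""
    requirement_description: str = ""  # requirements, rules, acceptance criteria
    preconditions: list[str] = field(default_factory=list)  # setup/environment

    def to_prompt(self) -> str:
        """Flatten the context into a single prompt-style string."""
        parts = [f"Scenario: {self.scenario}"]
        if self.requirement_summary:
            parts.append(f"Requirement: {self.requirement_summary}")
        if self.requirement_description:
            parts.append(f"Details: {self.requirement_description}")
        for p in self.preconditions:
            parts.append(f"Precondition: {p}")
        return "\n".join(parts)

ctx = TestGenerationContext(
    scenario="User resets a forgotten password",
    requirement_summary="Password reset via email link",
    preconditions=["A registered user account exists"],
)
print(ctx.to_prompt())
```

The key design point the sketch captures is that the scenario always comes first as the central narrative, with requirement and precondition details layered on as enrichment.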
## Where Generated Test Cases Appear

After the Rovo agent generates test cases, you review and approve them in the generation panel. Once approved:

- Each test case is created as a child issue of the requirement issue in Jira, using the Test Case issue type configured in Issue Mapping.
- All generated test cases appear in the Test Cases screen: the central library where you can view, search, and manage every test case across your project.
- From the Test Cases screen, test cases can be added to a Test Plan and executed in a Test Execution run.
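For illustration, a child issue of this shape can be created through the Jira Cloud REST API (`POST /rest/api/3/issue`). Whether TestGenie calls that endpoint directly is an assumption, and `child_test_case_payload` is a hypothetical helper; the issue type name must match whatever is configured in Issue Mapping.

```python
# Sketch of the kind of payload the Jira Cloud REST API (POST /rest/api/3/issue)
# accepts when creating an issue as a child of another issue.
# Assumption: this mirrors what TestGenie does; its actual calls are internal.

def child_test_case_payload(project_key: str, parent_key: str,
                            summary: str, issue_type: str = "Test Case") -> dict:
    return {
        "fields": {
            "project": {"key": project_key},
            "parent": {"key": parent_key},      # links it under the requirement issue
            "issuetype": {"name": issue_type},  # as configured in Issue Mapping
            "summary": summary,
        }
    }

payload = child_test_case_payload("PROJ", "PROJ-42", "Verify password reset email")
print(payload["fields"]["parent"]["key"])  # PROJ-42
```

Here "PROJ-42" stands in for the requirement issue key; in practice the parent/child link requires the Test Case issue type to sit below the requirement's type in the project's issue hierarchy.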
See Test Cases Screen for the full workflow after generation.