Reports
The report generated by this tool includes the title chosen by the user during generation, the report's file path, and the date and time of execution. These details make it easy to categorize reports and refer back to past test executions. Additionally, all test files executed in the session are listed; this list appears in the section below the file path and can be expanded or collapsed as needed.
In the main section of the report there are two visualizations.
The first concerns the test session's status: a pie chart shows the percentage of tests passed, failed, and ignored, alongside the counts of total, passed, failed, and ignored tests. At the bottom of the tab, the pass rate (passed tests relative to the total) and the overall duration of the test execution are reported.
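As an illustration of the figures behind this tab, the following sketch computes the same quantities from a list of test results. The `TestResult` shape is an assumption for illustration, not the tool's actual data model.

```typescript
// Minimal sketch (assumed data shape): summarizing a test session
// as shown in the first visualization.
type TestStatus = "passed" | "failed" | "ignored";

interface TestResult {
  name: string;
  status: TestStatus;
  durationMs: number;
}

function summarizeSession(results: TestResult[]) {
  const total = results.length;
  const passed = results.filter(r => r.status === "passed").length;
  const failed = results.filter(r => r.status === "failed").length;
  const ignored = results.filter(r => r.status === "ignored").length;
  // Pass rate relative to the total, as reported at the bottom of the tab.
  const passRate = total === 0 ? 0 : (passed / total) * 100;
  // Overall duration of the test execution.
  const durationMs = results.reduce((sum, r) => sum + r.durationMs, 0);
  return { total, passed, failed, ignored, passRate, durationMs };
}
```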
The second visualization, titled "Top Failed Features in Percentage," uses a bar chart to display the features with the highest percentage of failed tests relative to their total, with feature names on the x-axis and the percentage of failed tests on the y-axis. To keep the chart organized and readable, only the top five features by failure percentage are shown, arranged in descending order.
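The selection behind this chart can be sketched as follows; the `FeatureStats` shape is a hypothetical input, not the tool's API.

```typescript
// Hedged sketch of how the "Top Failed Features in Percentage" data
// could be derived from per-feature statistics.
interface FeatureStats {
  name: string;
  total: number;
  failed: number;
}

function topFailedFeatures(features: FeatureStats[], limit = 5) {
  return features
    .map(f => ({
      name: f.name,                                                // x-axis label
      failedPct: f.total === 0 ? 0 : (f.failed / f.total) * 100,   // y-axis value
    }))
    .sort((a, b) => b.failedPct - a.failedPct)                     // descending order
    .slice(0, limit);                                              // keep only the top 5
}
```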
Following that, there is a summary of feature statistics, including the feature name, the number of tests in each state, the percentage of passed tests relative to the total, and the duration of each feature. Next to each name, an icon reflects the overall feature status: a green checkmark identifies features whose tests all passed, a red cross marks features with at least one failed test, and a yellow question mark marks features containing at least one ignored test.
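The status-to-icon mapping can be summarized by the sketch below. The precedence of failed over ignored, when a feature contains both, is an assumption not spelled out in the report description.

```typescript
// Sketch of the feature status icon described above (precedence is assumed).
type FeatureIcon = "green-checkmark" | "red-cross" | "yellow-question-mark";

function featureIcon(passed: number, failed: number, ignored: number): FeatureIcon {
  if (failed > 0) return "red-cross";             // at least one failed test
  if (ignored > 0) return "yellow-question-mark"; // at least one ignored test
  return "green-checkmark";                       // only passed tests
}
```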
In addition to displaying this data, each feature name is a hyperlink for quickly navigating to its details, so the summary also serves as an index for easy access to each feature. For this reason, a persistent link at the bottom right of the screen returns to the index when the user is analyzing one feature and needs to quickly consult others.
The report offers controls to filter scenarios by status, implemented as checkboxes that show or hide the corresponding tests. It is therefore possible to view only passed, failed, or ignored scenarios, or to combine multiple filters to obtain the desired subset. This makes searching and navigating the report easier.
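The filtering behavior can be sketched as below; the checkbox ids and CSS class names (`filter-passed`, `scenario-passed`, and so on) are hypothetical, not taken from the generated report.

```typescript
// Illustrative sketch of the status filters, assuming each scenario tab is
// rendered with a status-specific CSS class.
const statuses = ["passed", "failed", "ignored"] as const;

function applyFilters(): void {
  for (const status of statuses) {
    const checkbox = document.getElementById(`filter-${status}`) as HTMLInputElement | null;
    const show = checkbox?.checked ?? true;
    document
      .querySelectorAll<HTMLElement>(`.scenario-${status}`)
      .forEach(tab => {
        tab.style.display = show ? "" : "none"; // show or hide matching scenarios
      });
  }
}
```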
The following section, depicted in the "Features with Passed Scenarios" figure, is dedicated to test details. Here, information about features and their scenarios is presented; organizing it in this hierarchy gives a clear arrangement of the test session.
For each feature, the name, file path, and duration are provided, along with the description and list of associated tags if specified. Next to the name, colored badges indicate the number of scenarios in each state within the feature: green for passed scenarios, red for failed ones, and yellow for ignored ones. In the example illustrated in the "Features with Passed Scenarios" figure, there are two passed scenarios and one failed scenario, while the badge for ignored scenarios is not visible because there are none.
The scenarios contained within a feature are characterized by the background color of their tab: green for passed scenarios, red for those with one or more failed steps, and yellow for ignored ones. The same state is conveyed by an icon next to the name: a green checkmark for passed scenarios, a red cross for failed ones, and a yellow question mark for ignored ones. Each tab provides the scenario name, execution time, the steps it comprises, and any error message, if applicable; the description and tags are also included if they have been defined.
The steps that compose a scenario show, listed starting from the left, the state icon, the Gherkin keyword, the action specified in natural language, and the duration. The state icon can be a green checkmark for passed steps or a red cross for failed ones.
The error message can be shown or hidden, as illustrated in the "Feature with Failed Scenario and Error Detail" figure, through the arrow in the bottom left corner of the scenario tab. This allows the error to be inspected when necessary and hidden to avoid distractions.
Ignored scenarios, shown in the "Feature with Skipped Scenarios" figure, are distinguished by a yellow tab. This tab contains the essential information: the scenario name and, if defined, its description and associated tags. Next to the associated feature, a yellow badge indicates the number of ignored scenarios.
Scenario outlines allow a test scenario to be defined generically by inserting placeholders that are later populated with values from the examples table. This eliminates duplication, since the same scenario is executed once for each row of the table with different values. In the report, these scenarios are displayed in expanded form, with a single scenario for each row of the table. Apart from the list of arguments appended to the name, they are handled like standard scenarios.
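As a hedged illustration of this expansion, the sketch below turns an outline with an examples table into one concrete scenario per row; the `ScenarioOutline` shape and the `<placeholder>` handling are assumptions for illustration, not the tool's implementation.

```typescript
// Sketch: expand a scenario outline into one scenario per example row,
// as the report displays them.
interface ScenarioOutline {
  name: string;            // e.g. "Withdraw <amount> from account" (hypothetical)
  exampleHeader: string[]; // e.g. ["amount", "balance"]
  exampleRows: string[][]; // one expanded scenario per row
}

function expandOutline(outline: ScenarioOutline) {
  return outline.exampleRows.map(row => {
    // Replace each <placeholder> in the name with the value from this row.
    let name = outline.name;
    outline.exampleHeader.forEach((column, i) => {
      name = name.split(`<${column}>`).join(row[i]);
    });
    // The report appends the list of arguments to the scenario name.
    return { name, arguments: row };
  });
}
```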