JUnit XML reports are the compact test-result files that many continuous integration systems use to summarize a run. A report can contain one suite or many suites, and each suite can contain individual cases with names, classes, elapsed time, and outcome tags such as failure, error, or skipped.

Those files matter because a CI job log is often too noisy for quick release triage. The XML report turns a test run into a smaller set of decisions: which suites failed, whether the pass rate is high enough, which cases deserve attention first, and whether a skipped or retried case is hiding risk that a simple green build badge would miss.

[Diagram: testsuites, testsuite, testcase, and outcome tags used to classify JUnit XML results.]

A report analyzer is most useful before a failed run turns into a long debugging session. It should separate assertion failures from execution errors, keep skipped tests visible, call out repeat-run signals, and make slow passing tests easy to spot. It should also avoid treating a pass percentage as the whole story, because one critical failure can matter more than a high overall pass rate.

JUnit XML is a shared exchange shape rather than a single perfectly uniform standard. Pytest, Maven Surefire, Gradle, Jest adapters, Jenkins, GitLab, and other CI systems can all produce or consume JUnit-style reports with small differences. The safest reading is to trust the fields that are present in the file, then verify surprising counts against the original job output when a release, deployment, or incident response depends on the answer.

Technical Details:

A JUnit-style report usually has a testsuites root, one or more testsuite nodes, and zero or more testcase nodes inside each suite. Suite attributes such as tests, failures, errors, skipped, and time give aggregate counts when individual cases are sparse. Case attributes such as classname, name, and time identify a specific test and how long it ran.
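
To make that shape concrete, here is a minimal, invented report parsed with Python's standard xml.etree.ElementTree. The suite and case names are illustrative only; this is a reading sketch, not the page's own in-browser parser.

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal report: one suite with one pass, one failure, one skip.
SAMPLE = """<testsuites>
  <testsuite name="backend.auth" tests="3" failures="1" errors="0" skipped="1" time="0.91">
    <testcase classname="backend.auth.LoginTest" name="test_valid_login" time="0.25"/>
    <testcase classname="backend.auth.LoginTest" name="test_bad_password" time="0.06">
      <failure message="expected 401, got 200">assertion detail</failure>
    </testcase>
    <testcase classname="backend.auth.LegacyTest" name="test_export" time="0">
      <skipped message="legacy export disabled"/>
    </testcase>
  </testsuite>
</testsuites>"""

root = ET.fromstring(SAMPLE)
for suite in root.iter("testsuite"):
    print(suite.get("name"), "declares", suite.get("tests"), "tests")
    for case in suite.iter("testcase"):
        print("  case:", case.get("classname"), case.get("name"), case.get("time"))
```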

Outcome child elements decide the case status. A failure usually means the test reached an assertion and the assertion did not hold. An error usually points to setup, teardown, fixture, environment, or uncaught-exception trouble. A skipped child means the case did not run as an ordinary pass/fail assertion. Retry markers such as flakyFailure, flakyError, rerunFailure, and rerunError are not part of every producer, but they are valuable when present because they show a case needed another attempt or carried rerun evidence.
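
A sketch of how those children could map to a status, assuming only direct children count (as the limits table below notes) and the retry tag names listed above. The precedence order and status labels here are illustrative, not the analyzer's confirmed internals.

```python
import xml.etree.ElementTree as ET

RETRY_TAGS = {"flakyFailure", "flakyError", "rerunFailure", "rerunError"}

def case_status(case: ET.Element) -> str:
    """Classify one <testcase> from its direct outcome children."""
    tags = {child.tag for child in case}   # direct children only
    if "failure" in tags:
        return "failed"
    if "error" in tags:
        return "errored"
    if "skipped" in tags:
        return "skipped"
    if tags & RETRY_TAGS:
        return "flaky pass"                # passed, but with rerun evidence
    return "passed"
```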

The analyzer reads the common report structure and uses a deterministic counting rule. When concrete testcase rows exist for a suite, the case rows drive the suite counts. When a suite has no direct case rows, the suite attributes provide the fallback counts. A report that contains nested aggregate suites without direct cases is reduced to the non-aggregate suite rows that can produce useful totals.
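
A minimal sketch of that fallback rule under the same assumptions. findall returns only direct testcase children, which matches the "direct case rows" idea; a real implementation may handle attributes more defensively.

```python
import xml.etree.ElementTree as ET

def suite_counts(suite: ET.Element) -> dict:
    cases = suite.findall("testcase")        # direct case rows only
    if cases:                                # case rows drive the suite counts
        tags = [{child.tag for child in c} for c in cases]
        return {
            "tests": len(cases),
            "failures": sum("failure" in t for t in tags),
            "errors": sum("error" in t for t in tags),
            "skipped": sum("skipped" in t for t in tags),
        }
    # No direct case rows: fall back to the suite's aggregate attributes.
    keys = ("tests", "failures", "errors", "skipped")
    return {k: int(suite.get(k) or 0) for k in keys}
```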

Rule Core:

Pass rate is calculated from counted cases after failures, errors, and skipped cases are removed from the passed count. The release gate then adds a stricter rule: the pass rate must meet the selected target, and the report must have no failures or errors.

passed = tests − failures − errors − skipped
passRate = (passed ÷ tests) × 100
gatePass = (passRate ≥ target) AND (failures + errors = 0)
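
Expressed as code, the same rule might read as follows; the 99.0 default mirrors the page's strict target, and the function name is illustrative.

```python
def gate(tests: int, failures: int, errors: int, skipped: int,
         target: float = 99.0) -> tuple:
    passed = tests - failures - errors - skipped
    pass_rate = passed / tests * 100 if tests else 0.0
    gate_pass = pass_rate >= target and failures + errors == 0
    return passed, pass_rate, gate_pass

# One failure fails the gate even at a 99.5% pass rate:
print(gate(200, 1, 0, 0))   # (199, 99.5, False)
```
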
JUnit XML fields used by the analyzer

XML item | How it affects the result | Important limit
testsuite name | Becomes the suite label in Suite Health and Failure Triage. | If missing, the case classname or a default suite name may be used.
tests, failures, errors, skipped | Provide aggregate suite counts when direct case rows do not provide them. | Case-level rows take priority when they are available for that suite.
testcase classname and name | Identify the case shown in triage rows and exported JSON. | A missing name is replaced with a generated case label for display.
testcase time | Feeds suite duration and slow-test detection after seconds are converted to milliseconds. | Missing or nonnumeric time is treated as zero seconds.
failure, error, skipped | Set the case status and supply the message text shown as the triage signal. | Only the direct outcome child is used for the case status.
Retry children | Raise the retry count and can make a passing case appear as flaky pass. | Retry markers are producer-specific, so their absence does not prove a suite is never flaky.

Slow-test detection uses the case duration in seconds multiplied by 1000, then compares it with the selected slow-test cutoff. A passing case at or above that cutoff enters Failure Triage as slow pass. Failed, errored, skipped, and retry-marked cases enter triage for their outcome first, so the table stays focused on cases that should be reviewed before a gate decision.
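
A sketch of that comparison, assuming case time arrives in seconds and the cutoff in milliseconds, as the page's 500 ms default suggests; missing or nonnumeric times fall back to zero exactly as the limits table above states.

```python
def is_slow(time_attr, cutoff_ms: float = 500.0) -> bool:
    """True when a passing case's duration meets or exceeds the cutoff."""
    try:
        seconds = float(time_attr or 0)
    except (TypeError, ValueError):
        seconds = 0.0                  # missing or nonnumeric time counts as zero
    return seconds * 1000 >= cutoff_ms

print(is_slow("0.600"))   # True: 600 ms >= 500 ms cutoff
print(is_slow(None))      # False
```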

Result fields and how to audit them

Output cue | Meaning | Best audit check
gate passes or gate fails | Compares total pass rate with the target and blocks any failure or error count above zero. | Check Failures, Errors, and Pass rate together.
Suite Health | Shows suite-level totals, retry signals, time, pass rate, and the gate note. | Confirm that suite totals match the CI artifact for the same job.
Failure Triage | Lists up to 40 failed, errored, skipped, retry-marked, or slow passing cases. | Use the Status, Signal, and Action columns before rerunning tests.
Suite Outcome Map | Shows the first eight suites as stacked counts for clean passes, retry signals, failures, errors, and skips. | Use it for quick comparison, then use the tables for exact row-level work.
JSON | Exposes totals, suites, cases, and gate values for copying into tickets or automation notes. | Inspect sensitive suite and case names before sharing an export.
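
As a rough picture of what such an export could carry, here is a sketch with invented totals; the real export's key names may differ, so treat this as shape only.

```python
import json

# Invented totals from a seven-test run: 1 failure, 1 skip.
tests, failures, errors, skipped = 7, 1, 0, 1
passed = tests - failures - errors - skipped
pass_rate = round(passed / tests * 100, 1)

export = {
    "totals": {"tests": tests, "failures": failures, "errors": errors,
               "skipped": skipped, "passed": passed},
    "passRate": pass_rate,
    "gatePass": pass_rate >= 99 and failures + errors == 0,
}
print(json.dumps(export, indent=2))   # review suite and case names before sharing
```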

Everyday Use & Decision Guide:

Start with the XML artifact from the same CI job you are investigating. Paste the report into JUnit XML report, drop an XML or TXT file on the textarea, or use Browse XML/TXT. File input is capped at 2 MiB for browser-side analysis, so split unusually large reports by job or suite before using the page.
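
If you pre-check artifacts in a script before pasting, a size guard along these lines keeps a report under the same limit; the constant and the report.xml path are hypothetical, and the constant simply mirrors the page's 2 MiB cap.

```python
from pathlib import Path

MAX_BYTES = 2 * 1024 * 1024   # mirrors the page's 2 MiB browser-side cap

def fits_cap(path: str) -> bool:
    """True when the report file is small enough to paste or browse."""
    return Path(path).stat().st_size <= MAX_BYTES

artifact = Path("report.xml")          # hypothetical artifact path
if artifact.exists():
    print(fits_cap(str(artifact)))     # split oversized reports by job or suite
```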

Set CI job or suite to the job name that will make sense in an exported table. Leave Pass-rate target at 99% for a strict release gate, lower it only when the team has an explicit temporary policy, and remember that any failure or error still makes the gate fail even when the percentage is high.

  • Use Normalize when the XML is minified or pasted as one long line. It only improves spacing between tags; it does not repair malformed XML (a minimal sketch follows this list).
  • Open Advanced and set Slow test cutoff to the team threshold for acceptable case duration. The default 500 ms is intentionally sensitive for unit-style suites.
  • Add project-specific words to Flaky signal terms when test names or failure messages include markers such as timeout, retry, intermittent, or quarantine.
  • Read Suite Health before the chart. The table tells you which suite missed the gate and why.
  • Use Failure Triage for the next action. The Action column separates assertion work, fixture or environment work, skip review, retry review, and slow-test profiling.
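
For the Normalize behavior in the first bullet, a minimal spacing pass might look like this; like the page's button, it only inserts breaks between adjacent tags and cannot repair malformed XML.

```python
import re

def normalize_spacing(xml_text: str) -> str:
    """Insert a line break between adjacent tags; purely cosmetic."""
    return re.sub(r">\s*<", ">\n<", xml_text.strip())

print(normalize_spacing("<testsuites><testsuite name='a'/></testsuites>"))
```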

A common misread is to treat gate passes as proof that the test run is healthy. A report can pass the gate while still showing retry signals or slow passing cases, and those rows can become tomorrow's hard failure. Review nonzero retry and slow badges before closing a release ticket.

Calculations run in the page from the pasted text or selected file. The report contents are not uploaded to a backend by this analyzer, but copied tables, downloaded files, and shared JSON can still expose internal suite names, case names, and failure messages.

Step-by-Step Guide:

Use one report from one CI job first, then compare additional jobs after the first result looks coherent.

  1. Paste the report into JUnit XML report or choose a file with Browse XML/TXT. After a file loads, the badge near the field shows the filename and character count; if the file is larger than 2 MiB, use a smaller report.
  2. Press Normalize if the pasted XML is hard to inspect. If the warning says JUnit XML parse failed, fix the XML source and rerun instead of trusting any table (a script-level equivalent of this check is sketched after these steps).
  3. Set CI job or suite to the job label you want in exports and summary text. The summary line should update to show the total number of tests from that label.
  4. Set Pass-rate target. The summary badge should switch between gate passes and gate fails as the target crosses the computed pass rate or as failures and errors remain present.
  5. Open Advanced when performance or flake review matters, then adjust Slow test cutoff and Flaky signal terms. Watch the slow and retry badges for changes.
  6. Review Suite Health. Confirm the Tests, Passed, Failures, Errors, Skipped, Pass rate, and Gate note columns before using the chart.
  7. Open Failure Triage for row-level work. If the table says there are no rows, the current input has no failed, errored, skipped, retry-marked, or slow passing cases within the first triage limit.
  8. Use Suite Outcome Map to compare the first eight suites visually, then use JSON or table exports only after the result matches the CI artifact you intended to analyze.
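
The parse-failed warning in step 2 corresponds to an ordinary XML parse error. Here is a script-level sketch of the same check, using the standard library's exception rather than the page's warning text:

```python
import xml.etree.ElementTree as ET

def try_parse(xml_text: str):
    try:
        return ET.fromstring(xml_text)
    except ET.ParseError as exc:       # truncated or malformed input
        print("parse failed:", exc)    # fix the source; do not trust any table
        return None

try_parse("<testsuite><testcase name='t'")   # truncated: prints parse failed
```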

A useful finish is to copy the Failure Triage rows into the release or incident ticket, then attach the original CI artifact for anyone who needs the full stack trace.

Interpreting Results:

Read the gate badge as a release triage cue, not as a full quality verdict. gate fails means either the pass rate missed the selected target or at least one failure or error exists. gate passes means those specific checks passed; it does not clear retry signals, slow tests, skipped tests, coverage gaps, or missing test types.

  • Failures point first to assertions and expected behavior. Start with the named case and recent code path.
  • Errors point first to setup, fixtures, services, uncaught exceptions, or environment assumptions.
  • Skipped cases should have a reason you are willing to defend. A skip can be valid, but it still lowers pass-rate confidence.
  • Retry signals need stabilization review even when the suite meets the gate. A retry-marked pass is weaker evidence than a clean pass.
  • slow pass rows are performance warnings, not correctness failures. Use them to identify cases that may cause long queues or flaky timeouts later.

When the result will decide a release, verify the reported totals against the CI job that produced the artifact. A mismatch usually means the wrong file was pasted, a report was truncated, or the producer's aggregate attributes differ from its case rows.

Worked Examples:

A failing backend unit job

The sample report has seven tests across backend.auth and backend.billing. One case has a failure, one case is skipped, and one passing case takes 0.600 seconds. With Pass-rate target set to 99 and Slow test cutoff set to 500 ms, the summary should show a 71.4% pass rate and gate fails. In Suite Health, backend.auth gets the note "Fix failed or errored tests before gate approval," while Failure Triage shows the failed assertion, the skipped legacy export, and the slow passing case.
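
The arithmetic checks out against the rule core: passed = 7 − 1 − 0 − 1 = 5, and 5 ÷ 7 × 100 ≈ 71.4, while the single failure forces the gate to fail regardless of the target.

```python
tests, failures, errors, skipped = 7, 1, 0, 1
passed = tests - failures - errors - skipped            # 5
pass_rate = passed / tests * 100                        # 71.428...
print(round(pass_rate, 1), failures + errors == 0)      # 71.4 False
```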

A green suite with retry evidence

A report with three checkout cases, no failure, no error, and one flakyFailure child can still show 100.0% in Pass rate and gate passes when the target is 99. The useful warning is in Retry signals and Failure Triage, where the case appears as flaky pass with a retry-related Signal. The release decision may still be yes, but the owner should stabilize the case before it becomes a hard failure.

A malformed XML paste

A copied artifact that ends halfway through a testcase element should not produce suite rows. The page reports JUnit XML parse failed, the summary changes to Check input, and the chart area has no chartable rows. The fix is to copy the full XML artifact again, use Normalize only for readability, and confirm that Suite Health totals appear before exporting anything.

FAQ:

What JUnit XML shapes can I paste?

The analyzer reads reports with testsuites, testsuite, and testcase nodes. It uses common attributes such as name, classname, time, tests, failures, errors, and skipped, plus outcome children such as failure, error, and skipped.

Why can the gate fail when the pass rate is high?

The gate has two checks. Pass rate must meet Pass-rate target, and Failures plus Errors must equal zero. One failure or error is enough to show gate fails.

Why did a passing test appear in Failure Triage?

A passing case can appear when it has a retry marker or when its duration meets or exceeds Slow test cutoff. In those cases the Status column shows flaky pass or slow pass instead of an assertion failure.

Does the analyzer upload my test report?

No. Pasted text and selected files are parsed in the page for this analyzer. Be careful with exports and copied rows, because suite names, case names, and failure messages can still reveal internal project details.

What should I do when XML parsing fails?

Use the original CI artifact again, make sure the report is complete XML, and keep the file under the 2 MiB browser-side limit. Normalize can space tags out for reading, but it cannot fix a missing closing tag or a truncated file.

Glossary:

JUnit XML
A common XML report shape used by test frameworks and CI systems to exchange test results.
testsuite
A group of related test cases, often matching a class, namespace, job, module, or framework suite.
testcase
An individual test entry with a name, optional class, optional time, and optional outcome child.
failure
A case outcome usually tied to an assertion that did not match expected behavior.
error
A case outcome usually tied to setup, fixture, environment, or uncaught-exception trouble.
Retry signal
A producer-specific marker such as flakyFailure or rerunFailure that indicates rerun evidence.
