JSON formatting is not only about making braces line up nicely. In real work, the same payload often needs several views at once: a readable draft for review, a compact transport version for APIs, and a stable byte sequence for comparison or signing workflows. This formatter treats those as separate outputs from the same source.
It can parse pasted JSON or an uploaded file, emit review and wire variants, build a deterministic key-sorted signature view, convert arrays to one-record-per-line output, and explain what it found through a node ledger, warnings, type counts, a change delta, and a short runbook.
A developer may want a review draft but also need to see how much a payload shrinks in transport form. A platform engineer may be checking whether a signed payload is stable for hashing. A support engineer may care more about duplicate keys and the JSON Pointer path involved.
The page also has a clear boundary. It formats what the browser can parse and analyse; it does not validate against a schema, enforce application semantics, or prove that a package-specific signing protocol will accept the result. The so-called signature view is a deterministic package output, useful for repeatable hashing and review, but it is not presented as universal canonical JSON compliance for every external standard.
That distinction is important because JSON looks simple while hiding a lot of operational edge cases. Duplicate object member names, large integers, precision-heavy decimals, comments in draft payloads, and giant files can all produce output that appears tidy yet still causes trouble downstream. This tool is strongest when you want those risks surfaced alongside the formatted text instead of after a deployment or signature mismatch.
The first practical choice is the formatting profile. Review is the best default when humans need to read the payload. Transport is the better fit when byte count and compactness matter more than line breaks. Signature matters when you want deterministic key ordering and no decorative whitespace. Custom is for the cases where you know exactly which knobs need to differ from those presets.
The second choice is whether your source is strict JSON or a draft that behaves more like JSONC. If comments are present, the package can strip them before parsing. That is convenient for hand-edited drafts, but it is also the point where users can fool themselves. A payload that formats successfully with comment stripping enabled is still not strict RFC 8259 JSON until those comments are removed at the source.
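The stripping step can be sketched in a few lines. This is a deliberately naive version (a hypothetical helper, not the package's code): it removes block and line comments with regular expressions, which means it would mangle a string value that happens to contain `//`, such as a URL. A real tokenizer must skip comment markers inside string literals.

```javascript
// Naive JSONC-style comment stripper (sketch only).
// WARNING: this breaks on string values containing "//" or "/*",
// e.g. {"url": "http://example.com"} — a real stripper must be string-aware.
function stripJsonComments(src) {
  return src
    .replace(/\/\*[\s\S]*?\*\//g, '') // block comments: /* ... */
    .replace(/\/\/[^\n\r]*/g, '');    // line comments: // ... to end of line
}

JSON.parse(stripJsonComments('{"a": 1 // draft note\n}')); // → { a: 1 }
```

The key point from the paragraph above survives in code form: the stripped text parses, but the original draft is still not strict RFC 8259 JSON until the comments are removed at the source.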
Use the tab set according to the job you are doing. Editor Draft is the human-facing version. Wire Payload is the compressed transport string. Signature Canonical is the deterministic sorted version. Node Ledger answers structure questions. Line Stream is useful for one-record-per-line ingestion. Change Delta shows how far the formatted result moved away from the normalized input. Type Footprint and Runbook are better for diagnostics than for raw editing.
The warning strip should be treated as a gate, not decoration. Duplicate keys are especially important because many parsers keep only the last occurrence, which means a payload can appear valid and still hide discarded data. Unsafe integers and over-precise number literals deserve the same respect when the JSON carries identifiers or amounts that must survive round-tripping without numeric drift.
If you need to hand off the result, choose the export that matches the audience. The output panes can be copied or downloaded directly, the node ledger can be exported as CSV or DOCX, the type chart can be saved as an image or CSV, and the bundled JSON result can preserve metrics, warnings, recommendations, and generated outputs together. That makes the tool usable not just for formatting, but for documenting what changed and why.
The formatter begins with the raw text in the editor or uploaded file. If comment stripping is enabled, the script removes both line comments and block comments before it hands the result to JSON.parse. Once parsing succeeds, the package optionally sorts keys recursively in ascending or descending order for the working value. From that working value it derives the readable draft, the minified wire payload, and the package's signature-oriented canonical string.
The three main JSON outputs are deliberately not identical. Editor Draft uses the chosen indent width, can gain one trailing newline, and can escape slashes or the HTML-sensitive characters <, >, and &. Wire Payload is the compact JSON.stringify form of the same working value and can receive the same escaping rules. Signature Canonical is different: it always serializes the original parsed value after recursive ascending key sort and does not inherit the draft-only newline rule or the optional slash and HTML escaping toggles.
That means the so-called signature output is deterministic inside this package, but it is narrower than a general canonicalization standard. The code does not claim full RFC 8785 JSON Canonicalization Scheme handling for every edge case. Instead, it gives you a repeatable, key-sorted representation without added spacing, useful for local hash, HMAC, and diff workflows so long as you remember its package-specific boundary.
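The split between the three outputs can be sketched in a few lines. The names below are illustrative, not the package's actual API; the sketch assumes a recursive ascending key sort as described above.

```javascript
// Recursively sort object keys ascending; arrays keep their element order.
function sortKeysDeep(value) {
  if (Array.isArray(value)) return value.map(sortKeysDeep);
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.keys(value).sort().map((k) => [k, sortKeysDeep(value[k])])
    );
  }
  return value;
}

const parsed = JSON.parse('{"b":1,"a":{"d":2,"c":3}}');

const draft = JSON.stringify(parsed, null, 2) + '\n';   // Editor Draft: indented, trailing newline
const wire = JSON.stringify(parsed);                    // Wire Payload: compact, original key order
const canonical = JSON.stringify(sortKeysDeep(parsed)); // Signature Canonical: sorted, no spacing
// canonical === '{"a":{"c":3,"d":2},"b":1}'
```

Note that `wire` preserves the source's member order while `canonical` does not; that is exactly why only the latter is suitable for repeatable hashing.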
Diagnostics are generated in parallel with formatting. Duplicate keys are detected from the source text rather than from the parsed object, which matters because a normal JSON parser will only keep the last value for repeated member names. The script also scans numeric tokens for values with more than fifteen significant digits, counts integers beyond JavaScript's safe precision range, measures parse time, and raises warnings when the input exceeds 5 MB or formatting takes at least 120 milliseconds.
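Two of those numeric checks can be approximated as follows. Both helpers are illustrative sketches, not the package's implementation, and the token scan is an approximation: its regex also matches digit runs inside string values, and it counts trailing zeros as significant.

```javascript
// Count integers outside JavaScript's safe range in a parsed value.
function countUnsafeIntegers(value) {
  if (Array.isArray(value)) {
    return value.reduce((n, v) => n + countUnsafeIntegers(v), 0);
  }
  if (value !== null && typeof value === 'object') {
    return Object.values(value).reduce((n, v) => n + countUnsafeIntegers(v), 0);
  }
  return typeof value === 'number' && Number.isInteger(value) &&
         !Number.isSafeInteger(value) ? 1 : 0;
}

// Flag number tokens in the raw text with more than maxDigits mantissa digits.
function findPrecisionHeavyTokens(src, maxDigits = 15) {
  const tokens = src.match(/-?\d+(?:\.\d+)?(?:[eE][+-]?\d+)?/g) || [];
  return tokens.filter((t) => {
    const mantissa = t
      .replace(/[eE].*$/, '')    // drop the exponent part
      .replace(/[^0-9]/g, '')    // keep digits only
      .replace(/^0+/, '');       // leading zeros are not significant
    return mantissa.length > maxDigits;
  });
}
```

The unsafe-integer walk runs on the parsed value, but by then the damage is already done: `JSON.parse('9007199254740993')` has silently rounded, which is why the warning exists at all.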
The support views are built from that same parsed structure. Node Ledger walks the tree and records a dot path, a JSON Pointer, type, detail summary, and preview value for every node. Line Stream emits either the whole payload on one line or one line per array item, depending on NDJSON mode. Change Delta compares the normalized input against the selected output target. Type Footprint counts objects, arrays, strings, numbers, booleans, and nulls, then renders those counts in a pie chart and CSV export.
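The ledger walk can be sketched roughly as below. `buildLedger` is a hypothetical helper; note that per RFC 6901 the root node's JSON Pointer is the empty string, and the member-name tokens escape `~` as `~0` and `/` as `~1`.

```javascript
// Walk a parsed JSON value and emit one row per node with a dot path,
// a JSON Pointer, and a type label (sketch; detail/preview columns omitted).
function buildLedger(value, path = '$', pointer = '') {
  const type = Array.isArray(value) ? 'array'
    : value === null ? 'null'
    : typeof value;
  const rows = [{ path, pointer, type }];
  if (type === 'object') {
    for (const [key, child] of Object.entries(value)) {
      // RFC 6901 escaping: "~" → "~0", "/" → "~1"
      const token = key.replace(/~/g, '~0').replace(/\//g, '~1');
      rows.push(...buildLedger(child, `${path}.${key}`, `${pointer}/${token}`));
    }
  } else if (type === 'array') {
    value.forEach((child, idx) => {
      rows.push(...buildLedger(child, `${path}[${idx}]`, `${pointer}/${idx}`));
    });
  }
  return rows;
}
```

Running this over `{ a: [10] }` yields rows for the root object, `$.a` at pointer `/a`, and `$.a[0]` at pointer `/a/0`, which is the shape of path pair the ledger exposes.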
| Stage | What the package does | Resulting output or risk |
|---|---|---|
| Input normalization | Reads editor text or uploaded file and optionally strips comments | Creates the parse source used for formatting and duplicate-key scanning |
| Parsing | Runs JSON.parse against the normalized source | Stops immediately on invalid JSON and reports a line and column |
| Working-order transform | Recursively sorts keys if the selected order is ascending or descending | Feeds Editor Draft, Wire Payload, NDJSON, ledger, diff, and metrics |
| Canonical transform | Always serializes a recursively ascending-sorted version of the parsed value | Produces the deterministic Signature Canonical pane |
| Safety checks | Detects duplicate keys, unsafe integers, excessive numeric precision, large size, and slow parse time | Populates warnings and the runbook |
| Structure analysis | Builds node ledger rows, type counts, byte counts, line metrics, and diff data | Supports the ledger, chart, delta, and bundle exports |
| Output | Serialization rule | What to use it for |
|---|---|---|
| Editor Draft | Formatted with the chosen indent, optional newline, and optional slash or HTML escaping | Human review, code comments, troubleshooting, and documentation |
| Wire Payload | Minified JSON.stringify output from the working value | Transport-size inspection and API payload handoff |
| Signature Canonical | Serialization of the parsed value after recursive ascending key sort, without added spacing | Deterministic hashing or signing workflows inside this package's boundary |
| Line Stream | One line for the whole value or one line per array item, depending on mode | Streaming ingestion or one-record-per-line processing |
| Result bundle | Pretty-printed JSON containing metrics, warnings, recommendations, and generated outputs | Archiving the formatter session or sharing diagnostics |
| Condition | How the package detects it | Why it matters |
|---|---|---|
| Duplicate keys | Text-level duplicate-member scan on the normalized input | Parsers commonly keep only the last value, so earlier values can be silently lost |
| Unsafe integers | Walk of parsed values against JavaScript safe-integer bounds | Large numeric identifiers can drift after parse and stringify cycles |
| Precision-heavy decimals | Number-token scan for more than fifteen significant digits | High-precision numeric literals may not round-trip cleanly in JavaScript |
| Large payloads | Input byte count above 5 MB | Formatting and rendering can become slow on low-memory devices |
| Slow parse time | Measured runtime of at least 120 ms | Signals that chunking or upstream reduction may be safer than in-browser formatting |
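The text-level duplicate-member scan in the first row can be approximated with a small stateful pass over the source. This is a hedged sketch that assumes well-formed JSON text, not the package's implementation: it tracks one set of seen member names per open object and treats a string followed by a colon as a key.

```javascript
// Detect duplicate member names from JSON text (sketch; assumes valid JSON).
// A plain JSON.parse cannot do this — it silently keeps only the last value.
function findDuplicateKeys(src) {
  const stack = [];      // one Set of seen member names per open object
  const duplicates = [];
  let i = 0;
  while (i < src.length) {
    const ch = src[i];
    if (ch === '"') {
      // Consume the whole string literal, honoring backslash escapes.
      let j = i + 1, raw = '';
      while (j < src.length && src[j] !== '"') {
        if (src[j] === '\\') { raw += src[j]; j++; }
        raw += src[j];
        j++;
      }
      // Peek past whitespace: a colon means this string was a member name.
      let k = j + 1;
      while (k < src.length && /\s/.test(src[k])) k++;
      if (src[k] === ':' && stack.length > 0) {
        const seen = stack[stack.length - 1];
        if (seen.has(raw)) duplicates.push(raw);
        seen.add(raw);
      }
      i = j + 1;
      continue;
    }
    if (ch === '{') stack.push(new Set());
    else if (ch === '}') stack.pop();
    i++;
  }
  return duplicates;
}
```

Because braces inside string literals are consumed as part of the string, the depth tracking stays correct for payloads like `{"note": "use { carefully }"}`.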
The summary box is a quick health check, not a score. Root type, node count, depth, duplicate count, compression delta, and parse time tell you what kind of payload you are dealing with and whether the formatting run surfaced obvious risk. A clean headline does not mean the payload is semantically correct.
The Signature Canonical pane should be read carefully. It is useful precisely because it is deterministic, but determinism alone does not make it universal canonical JSON. If duplicate keys or unsafe integers are present, the package marks readiness as blocked for good reason, even though it can still render a sorted output string. The right interpretation is "not trustworthy for signing yet," not "the feature is broken."
Change Delta is easiest to misread. A large diff may simply mean the input was messy, compact, or comment-heavy. A small diff may still hide a serious issue if one of the warnings is about duplicate names or precision. The delta view tells you how much the bytes changed, not whether the payload became safer.
Node Ledger and Type Footprint are structural tools. If you are tracing where a suspicious field lives, use the ledger path and JSON Pointer. If you are trying to understand overall complexity or payload mix, use the footprint counts and chart. They answer different questions, and together they are often more useful than staring at formatted braces alone.
An engineer pastes a hand-edited configuration file that still contains // comments. With comment stripping enabled, the package can parse and re-emit the payload so the review draft is readable and the change delta shows what disappeared. The runbook reminder to disable comment stripping for strict production pipelines remains important because comments are not valid JSON in the standard format.
A platform team needs deterministic bytes for a hash or detached signature step. They switch to the signature profile, check duplicate-key warnings first, then inspect the signature output and canonical-readiness label. If duplicate names or unsafe integers are present, the right next step is to repair the producer rather than sign the current payload just because the pane looks tidy.
A support engineer receives a deeply nested API response and wants to explain it without sending raw application logs. They format the payload, use Node Ledger to isolate a field path and its JSON Pointer, export the ledger as DOCX, and include the result bundle JSON for context. The formatted text helps readability, while the ledger and warnings make the explanation more precise.
The signature view is not universal canonical JSON. The package creates a deterministic, ascending-key serialization without added spacing that is useful for local hashing and signing workflows, but it does not claim full standards compliance for every canonicalization edge case.
Duplicate keys matter because a normal parser keeps only one value for a repeated object member name. The payload may look valid after parse, while the earlier values have already been silently discarded.
Editor Draft is the readable, optionally indented view. Wire Payload is the compact serialized form intended for transport-size inspection or API handoff.
NDJSON mode is most useful when the root value is an array and your downstream system expects one JSON value per line. In single mode, the package emits the whole value as one line instead.
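A minimal sketch of that per-line behaviour (`toLineStream` is an illustrative name, not the package's API):

```javascript
// Emit NDJSON for an array root, or a single compact line otherwise.
function toLineStream(value, ndjson = true) {
  if (ndjson && Array.isArray(value)) {
    return value.map((item) => JSON.stringify(item)).join('\n');
  }
  return JSON.stringify(value); // single mode: whole value on one line
}

toLineStream([{ id: 1 }, { id: 2 }]);
// → '{"id":1}\n{"id":2}'
```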