Introduction

Lossless compression shrinks data by encoding repeated patterns more efficiently, while archiving wraps bytes into a container that is easier to move or store. Those two ideas solve different problems. Sometimes you want the smallest practical file for transfer; sometimes you want a familiar package format; sometimes you need both.

This compressor focuses on one selected file at a time and turns that file into a downloadable ZIP, TAR, TAR.GZ, TAR.BR, GZ, or BR artifact. After the run finishes, the page reports the original size, the produced size, the percentage saved, and a SHA-256 checksum so you can compare transfer efficiency and verify what was generated.

That scope makes it useful for concrete jobs such as shrinking a large log export, packaging a CSV before attaching it to a ticket, or preparing a text-heavy asset for distribution in a format another system expects. You can drag and drop a file or pick it from disk, choose the algorithm and level, optionally split the finished artifact into parts, and then export both human-readable and machine-readable summaries.

A realistic example is a 180 MB text log that needs to move through an attachment limit. ZIP or GZ may cut the size dramatically, while part splitting lets you break the output into smaller numbered downloads. A second example is a video or already-zipped package, where the size may barely change because most of the easy redundancy is already gone.

The main caution is simple: compression is not encryption. This tool does not add passwords, secrecy, or authenticity guarantees. It creates a smaller or differently packaged artifact and then hashes that artifact so you can confirm integrity later.

Everyday Use & Decision Guide

The first decision is format, because the best choice depends on what the receiver expects. ZIP (Deflate) is the compatibility-first option for many desktop workflows. ZIP (Store) keeps the ZIP container but skips compression entirely. GZ and BR compress the selected file directly, which is a good fit when you want a single compressed stream instead of an archive container.

TAR, TAR.GZ, and TAR.BR add a tar wrapper first. For one file, that wrapper does not magically improve compression, but it can matter when another toolchain expects tar-based packaging or when you want a .tar.gz or .tar.br style output. If you only want the smallest practical single-file result and you do not need a tar container, direct .gz or .br output is usually the simpler choice.

The compression level slider controls the size-versus-speed tradeoff. Higher levels usually spend more time searching for patterns in exchange for smaller output, especially on text, logs, source code, and other repetitive data. TAR and ZIP Store ignore the level because they are packaging modes rather than active compression modes.

The smart compression switch matters most when you pick ZIP Deflate for files that are already compressed, such as JPEG images, MP4 video, MP3 audio, ZIP archives, PDFs, font files, or disk images. In those cases, trying to deflate the file again can waste time and sometimes even make the result a little larger. Smart compression avoids that by storing those file types without re-compressing them inside the ZIP.
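The extension check behind smart compression can be sketched as a small lookup. This is a minimal illustration, not the tool's actual code: the function name and the exact list of extensions are assumptions for the example.

```typescript
// Illustrative only: the real tool's list and logic may differ.
// File types that are typically already compressed and gain little
// from being deflated again inside a ZIP.
const ALREADY_COMPRESSED = new Set([
  "jpg", "jpeg", "png", "gif", "mp4", "mp3", "zip", "gz", "br",
  "pdf", "woff", "woff2", "iso",
]);

// Decide whether a ZIP entry should be stored raw (Store) rather
// than deflated, based on the file's extension.
function shouldStoreRaw(filename: string): boolean {
  const ext = filename.split(".").pop()?.toLowerCase() ?? "";
  return ALREADY_COMPRESSED.has(ext);
}
```

The key design point is that the decision is made per file before compression starts, so no time is spent deflating bytes that will not shrink.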

  • Choose ZIP Deflate when broad compatibility matters and the source file is likely compressible.
  • Choose ZIP Store when you want a ZIP package but do not want compression overhead.
  • Choose GZ for direct single-file gzip output, especially in Unix-style workflows.
  • Choose BR when text-heavy data benefits from Brotli and the receiver can read Brotli output.
  • Use split parts when the finished artifact is too large for email, ticket systems, or manual transfer limits.
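The decision guide above can be condensed into a toy helper. The function and its option names are purely illustrative; they mirror the bullets rather than any real API in the tool.

```typescript
type Format = "zip-deflate" | "zip-store" | "gz" | "br";

// Toy decision helper mirroring the guide above; purely illustrative.
function pickFormat(opts: {
  needsZipContainer: boolean;   // receiver expects a .zip package
  likelyCompressible: boolean;  // text, logs, CSV, source code
  receiverReadsBrotli: boolean; // downstream tooling accepts .br
}): Format {
  if (opts.needsZipContainer) {
    return opts.likelyCompressible ? "zip-deflate" : "zip-store";
  }
  return opts.receiverReadsBrotli ? "br" : "gz";
}
```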

Technical Details

The tool processes the file in the browser. There is no package-level server helper for the compression path, so the selected bytes stay in the local session unless you intentionally download or copy an export. The page builds the compressed artifact, optionally slices that artifact into numbered parts, and then computes SHA-256 over the produced bytes so the checksum matches what you actually downloaded.

Only one file is compressed in a run. If you select or drop more than one file, the page keeps the first file and reports how many extras were ignored. The default output name comes from that file's base name, and the extension changes with the chosen algorithm: .zip, .tar, .tar.gz, .tar.br, .gz, or .br. When splitting is enabled and the compressed artifact exceeds the chosen threshold, numbered files such as .part01 and .part02 are created.
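The naming scheme described above can be sketched as follows. The mapping keys, function names, and the exact part-number padding are assumptions for illustration; only the extensions and the .partNN pattern come from the description.

```typescript
// Illustrative naming sketch: derive a default output name from the
// source file's base name and the chosen format's extension, plus
// zero-padded part suffixes when the artifact is split.
const EXT_BY_FORMAT: Record<string, string> = {
  "zip-deflate": "zip",
  "zip-store": "zip",
  tar: "tar",
  "tar-gz": "tar.gz",
  "tar-br": "tar.br",
  gz: "gz",
  br: "br",
};

function outputName(sourceName: string, format: string): string {
  const base = sourceName.replace(/\.[^.]+$/, ""); // strip last extension
  return `${base}.${EXT_BY_FORMAT[format]}`;
}

function partName(outName: string, index: number): string {
  return `${outName}.part${String(index).padStart(2, "0")}`;
}
```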

The reported saving percentage comes from the finished artifact, not from an estimate. The page compares the original byte size with the compressed byte size and computes:

Saving = (1 − Sc / So) × 100 %

Here So is the original size and Sc is the produced size, or the sum of all produced parts after splitting. Positive values mean the output is smaller. Values near zero mean there was little useful redundancy to remove. A negative value means the chosen format added more wrapper or compression overhead than it saved.
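The same computation in code, assuming a simple function name for illustration:

```typescript
// Compute the saving percentage exactly as described above:
// positive when the output shrank, negative when overhead grew it.
function savingPercent(originalBytes: number, compressedBytes: number): number {
  if (originalBytes <= 0) return 0; // guard against empty input
  return (1 - compressedBytes / originalBytes) * 100;
}
```

For a 1,000-byte file compressed to 250 bytes, this reports a 75% saving; for a 100-byte file that grows to 103 bytes, it reports a small negative value.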

Algorithm behavior in the file compressor

Format        | What the tool does                                                                 | Level range | Typical use
------------- | ---------------------------------------------------------------------------------- | ----------- | --------------------------------
ZIP (Deflate) | Creates a ZIP archive and deflates the file unless smart compression stores it raw | 0 to 9      | Compatibility-first compression
ZIP (Store)   | Creates a ZIP archive with no compression                                          | Ignored     | Packaging without size reduction
TAR           | Wraps the file in a tar container without compression                              | Ignored     | Tar-based packaging
TAR.GZ        | Builds a tar container and then gzip-compresses it                                 | 0 to 9      | Common tar-and-gzip delivery
TAR.BR        | Builds a tar container and then Brotli-compresses it                               | 0 to 11     | Tar-based output with Brotli
GZ            | Gzip-compresses the selected file directly                                         | 0 to 9      | Direct single-file compression
BR            | Brotli-compresses the selected file directly                                       | 0 to 11     | Direct single-file Brotli output

The results are intentionally presented through more than a single download button. The Compression Metrics tab provides a table that can be copied as CSV, downloaded as CSV, exported as DOCX, or used to copy the checksum directly. The Size Trend tab draws a bar chart comparing original and compressed sizes, and that chart can be exported as PNG, WebP, JPEG, or CSV. The JSON tab serializes the chosen inputs, the computed totals, the ignored-file count, the part list, the checksum, and the file list as structured output.

Part splitting uses decimal megabytes, because the code multiplies the chosen value by 1,000,000 bytes. That is useful when a mail system or attachment rule talks about “MB” in decimal terms, but it also means the threshold is not a binary MiB setting. If exact transport limits matter, leave a little headroom.
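The decimal interpretation can be made concrete with a small sketch. The function name is an assumption; the 1,000,000-byte multiplier is the behavior described above.

```typescript
// Decimal megabytes, matching the behavior described above:
// a split size of 50 means 50,000,000-byte chunks, not 50 MiB.
// A split size of 0 disables splitting entirely.
function splitSizes(totalBytes: number, splitMb: number): number[] {
  if (splitMb <= 0) return [totalBytes];
  const chunk = splitMb * 1_000_000;
  const sizes: number[] = [];
  for (let left = totalBytes; left > 0; left -= chunk) {
    sizes.push(Math.min(chunk, left));
  }
  return sizes;
}
```

A 120,000,000-byte artifact with a split size of 50 would yield two 50,000,000-byte parts and one 20,000,000-byte remainder.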

The checksum is an integrity aid, not a trust label. A matching SHA-256 tells you that the received artifact matches the produced artifact byte for byte. It does not mean the content is safe to execute, and it does not identify who created it.

Step-by-Step Guide

  1. Select one file with the picker or drag and drop it into the drop zone. If you accidentally provide several files, the page keeps the first and tells you how many extras were ignored.
  2. Choose the output format that matches your transfer goal or the receiver's expectations.
  3. Adjust the compression level if the selected format supports it, and rename the output if you do not want the default base name.
  4. Leave smart compression on for ZIP Deflate unless you have a reason to force compression of media or archive formats that are already compact.
  5. Set a split size if the finished artifact needs to fit under an attachment or upload limit. Leave it at zero for a single download.
  6. Run compression, review the metrics, copy or save any exports you need, and then download the finished artifact or all part files.

Interpreting Results

The headline percentage is the quickest summary, but it is only meaningful alongside the original and compressed sizes. A 60% reduction on a large text file is excellent. A 1% reduction on a JPEG or MP4 is completely normal. A slightly negative value usually means the file was already compressed and the chosen wrapper added overhead.

The checksum row becomes valuable once you send the artifact elsewhere. If a recipient re-hashes the file and gets the same SHA-256 value, the bytes match exactly. If the file was split, the tool hashes the combined produced bytes after splitting so the checksum still describes the artifact as delivered.

  • Strong positive saving: the file had enough repeated structure for the selected algorithm to shrink it materially.
  • Near-zero saving: the data was already dense, already compressed, or too small for the wrapper overhead to matter.
  • Negative saving: the format overhead outweighed any compressible patterns in the source file.
  • Multiple parts: the artifact was sliced for transfer convenience, so you need every part to reconstruct the full output later.

Worked Examples

  1. Compressing a plain-text log for transfer
    Select the log file, choose GZ or ZIP (Deflate), and leave the default level unless you have a reason to optimize more aggressively. Text logs usually compress well, so the metrics table should show a clear reduction and the chart will make that difference obvious.
  2. Packaging a video that is already compressed
    Select the video, choose ZIP (Deflate), and keep smart compression enabled. The file will usually be stored inside the ZIP rather than re-compressed, which avoids wasting time for little or no gain. If you only need the wrapper, ZIP (Store) may be an equally sensible choice.
  3. Splitting a large artifact for a ticket attachment
    Set the format you need, then enter a split size such as 45 or 50 so the result is broken into numbered parts. After compression, review the parts count in the metrics table, copy the SHA-256 if the receiver will verify integrity, and download every part file instead of only the first one.

FAQ

Does the file leave my browser during compression?

The shipped compression path runs in the browser. There is no package-level server helper for the compression workflow, so the selected file stays in the local session unless you choose to download or copy results.

Can I compress several files or a whole folder?

No. This tool processes one file per run. If you select or drop multiple files, only the first is used and the rest are counted as ignored extras.

Why would I use TAR for one file?

Because some workflows expect tar-based outputs such as .tar.gz or .tar.br. TAR is a packaging choice here, not a promise of better compression.

Why did the compressed file get bigger?

Already-compressed media and archives often have little redundant data left to remove. In those cases, archive headers and compression metadata can make the result slightly larger.

What does the SHA-256 value prove?

It proves byte-level integrity of the produced artifact when another party computes the same hash. It does not encrypt the file, identify the author, or say anything about whether the content is safe.

Does split size use MB or MiB?

It uses decimal megabytes. A split size of 50 means chunks of about 50,000,000 bytes rather than 52,428,800 bytes.

Glossary

Lossless compression
Compression that reduces size without discarding any original bytes.
Archive
A container format that packages data for storage or transfer.
Deflate
A widely used lossless compression method commonly found inside ZIP and Gzip workflows.
Gzip
A compressed data format commonly used for single-file streams.
Brotli
A lossless compression format that often performs well on text-heavy data.
SHA-256
A cryptographic hash function used here to verify that the produced artifact has not changed.
Split archive
A finished artifact sliced into numbered parts for easier transfer.