{{ summaryTitle }}
{{ summaryPrimary }}
{{ summaryLine }}
{{ badge.label }}
SQL INSERT generator inputs
Use the destination table name, such as public.users or audit_event.
Pick the database syntax you plan to paste into.
Choose whether columns come from the field below or from the first source row.
Enter comma-separated column names matching each source value.
Paste one data row per line, or drop a CSV/TXT file onto the textarea.
{{ fileStatus || 'Drop CSV, TSV, or TXT onto the textarea.' }}
Auto detects comma, tab, semicolon, and pipe-separated data.
Use infer for seed data; use text mode when every source value should stay a string.
Choose NULL for database imports, or empty string when blanks are meaningful text.
Use one-per-row for logs and compact multi-row for seed fixtures.
Leave on unless your destination style guide forbids quoted identifiers.
{{ quote_identifiers ? 'On' : 'Off' }}
Keep enabled for most SQL consoles and migration snippets.
{{ include_semicolons ? 'On' : 'Off' }}
{{ result.sql || result.placeholderSql }}
Row Status Cells Issues Statement Preview Copy
{{ row.index }} {{ row.status }} {{ row.cellSummary }} {{ row.issueText }} {{ row.preview }}
No row ledger entries.
Ordinal Source Name SQL Identifier Filled Rows Blank Rows Sample Copy
{{ column.ordinal }} {{ column.name }} {{ column.sqlIdentifier }} {{ column.filledRows }} {{ column.blankRows }} {{ column.sample }}
No column audit entries.
{{ jsonOutput }}
Customize
Advanced

Introduction

SQL INSERT scripts turn rows of data into statements that add new records to a database table. They are common in seed files, small migration notes, support reproductions, documentation examples, and quick handoff work where a person needs to inspect every row before running it.

A useful INSERT script depends on four facts lining up: the destination table, the column order, the values in each row, and the SQL dialect that will execute the statement. If any one of those is wrong, the statement can still look valid while loading data into the wrong column, treating text as a number, or failing on an identifier that needs quoting.

Delimited source data adds one more source of error. A comma, tab, semicolon, or pipe can separate fields, yet a comma can also belong inside a value, such as a quoted name or address. Header rows can name columns, or they can be mistaken for data. Blank cells may mean unknown values, empty strings, or missing trailing fields.

INSERT scripts are best for small, reviewable batches and reproducible examples. They are not a replacement for a bulk loader, schema migration, or parameterized application query. The safest habit is to inspect the generated statements, run them against a scratch database or transaction first, and let the destination table's constraints confirm what the script cannot know by itself.

Technical Details

The basic INSERT shape is stable across SQL systems: a target table is followed by a column list, and each value tuple is matched to that column list from left to right. Explicit column lists matter because they make the script independent of the table's stored column order. They also leave omitted columns to their database defaults, nullable rules, or identity behavior instead of forcing a value for every column in the table.
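That point about explicit column lists can be checked against an in-memory SQLite scratch table (the users table and its columns here are illustrative, not part of the tool):

```python
import sqlite3

# Hypothetical scratch table; the names are examples only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE users ("
    "id INTEGER PRIMARY KEY, "
    "name TEXT NOT NULL, "
    "active INTEGER DEFAULT 1)"
)

# The explicit column list ties each value to a named column, so the
# statement does not depend on the table's stored column order, and the
# omitted columns fall back to identity behavior and defaults.
conn.execute("INSERT INTO users (name) VALUES ('Mei Chen')")
row = conn.execute("SELECT id, name, active FROM users").fetchone()
print(row)  # (1, 'Mei Chen', 1): id assigned, active defaulted
```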

Delimited text needs to become a rectangular grid before it can become SQL. Quoted fields protect separators that belong inside a value, such as the comma in Chen, Mei. Doubled quote characters preserve literal quotes inside quoted fields. Once the grid is parsed, the header row or manual column list supplies the SQL column names, and each following row becomes either one INSERT statement or one tuple inside a multi-row INSERT.
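The quoting rules can be seen with Python's csv module, which follows the same doubled-quote convention (a small illustration, not the page's actual parser):

```python
import csv
import io

# A quoted field keeps the comma in "Chen, Mei" inside one cell, and a
# doubled quote character collapses back to a literal quote.
source = 'id,name,note\n1,"Chen, Mei","say ""hi"""\n'
rows = list(csv.reader(io.StringIO(source)))
print(rows[0])  # ['id', 'name', 'note'] -> header row
print(rows[1])  # ['1', 'Chen, Mei', 'say "hi"'] -> one data row, three cells
```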

Rows: CSV, TSV, semicolon, or pipe input, with quoted fields kept together.
Columns: a manual list or a header row, with duplicate names renamed.
Literals: NULL, text, numbers, and booleans by dialect.
Output: the INSERT script plus row and column checks.
Every generated value is tied back to the parsed row and column position.
The SQL text is only as trustworthy as the parsed grid, column mapping, literal typing, and dialect choice that produced it.

Literal handling is deliberately conservative. Empty cells can be emitted as NULL or as an empty string. A case-insensitive source token of NULL always becomes SQL NULL. In inferred mode, plain integers, decimals, and exponent notation stay unquoted, while true and false become dialect-specific boolean output. Text values are trimmed and enclosed in single quotes, with embedded apostrophes doubled so O'Brien becomes 'O''Brien'.
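A minimal sketch of those literal rules, assuming a hypothetical to_sql_literal helper and PostgreSQL-style boolean output (the page's actual code may differ):

```python
import re

def to_sql_literal(cell, blank_as_null=True):
    # Hypothetical helper sketching the rules above, not the page's code.
    text = cell.strip()
    if text == "":
        return "NULL" if blank_as_null else "''"      # Blank cells setting
    if text.upper() == "NULL":                        # case-insensitive token
        return "NULL"
    if text.lower() in ("true", "false"):             # dialect-specific output
        return text.upper()                           # PostgreSQL style shown
    if re.fullmatch(r"[+-]?\d+(\.\d+)?([eE][+-]?\d+)?", text):
        return text                                   # numbers stay unquoted
    return "'" + text.replace("'", "''") + "'"        # apostrophes doubled

print(to_sql_literal("O'Brien"))  # 'O''Brien'
print(to_sql_literal(" 42 "))     # 42
print(to_sql_literal("null"))     # NULL
```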

Identifier quoting protects table and column names that contain mixed case, spaces, punctuation, or reserved words. Dotted table names are split into segments before quoting, so a schema-qualified PostgreSQL target such as public.users becomes "public"."users" when quoting is enabled. Turning quoting off is sometimes useful for house style, but it raises warnings when names are no longer portable identifier tokens.
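The segment-by-segment quoting can be sketched as follows, using the identifier styles described in this section (escaping of quote characters inside identifiers is left out for brevity):

```python
# Opening/closing identifier quote characters per dialect, as described here.
QUOTE_STYLES = {
    "postgresql": ('"', '"'),
    "mysql": ("`", "`"),
    "sqlserver": ("[", "]"),
}

def quote_table(name, dialect):
    # Sketch only: split a dotted name and quote each segment.
    open_q, close_q = QUOTE_STYLES[dialect]
    return ".".join(open_q + part + close_q for part in name.split("."))

print(quote_table("public.users", "postgresql"))     # "public"."users"
print(quote_table("dbo.import_queue", "sqlserver"))  # [dbo].[import_queue]
```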

SQL dialect output choices used by the INSERT generator:
PostgreSQL quotes identifiers as "table" and "column" and emits inferred booleans as TRUE or FALSE. Best fit: seed rows and reviewable fixtures for PostgreSQL tables.
MySQL quotes identifiers as `table` and `column` and emits inferred booleans as TRUE or FALSE. Best fit: scripts for MySQL-style object names and backtick quoting.
SQLite quotes identifiers as "table" and "column" and emits inferred booleans as 1 or 0. Best fit: small local databases, test data, and portable fixtures.
SQL Server quotes identifiers as [table] and [column] and emits inferred booleans as 1 or 0. Best fit: Transact-SQL scripts where bracketed names are expected.
ANSI SQL quotes identifiers as "table" and "column" and emits inferred booleans as TRUE or FALSE. Best fit: generic examples when no vendor-specific output is needed.
How source cell situations map into generated INSERT values and review notes:
Missing trailing cell: filled as NULL or '' based on the Blank cells setting. Review cue: Row Ledger marks the row as Review.
Extra source cell: ignored after the last target column. Review cue: Generation notes and row issues should be checked before copying SQL.
Duplicate column name: renamed with a numeric suffix for the generated script. Review cue: Column Audit shows the resulting SQL identifier.
Quoted field left open: the remaining text stays in that field rather than being split loosely. Review cue: Generation notes warn that the quoted field was not closed.
Identifier quoting disabled with unsafe names: names are emitted unquoted. Review cue: warnings call out table or column names that are safer with quoting enabled.

Everyday Use & Decision Guide

Start with Table name and SQL dialect. Those two choices decide the target name and quoting style, so they should be correct before you judge the script text. Leave Quote identifiers enabled for the first pass, especially when names include schema prefixes, reserved words, mixed case, or punctuation.

Use Column source to match the data you actually have. If the pasted text begins with labels such as id,name,email, choose Use first source row as columns. If you are pasting bare rows from a spreadsheet selection, keep Use column list field and enter the target columns in the exact order the values should load.

Most comma and tab files work well with Delimiter set to Auto detect. Pick a fixed delimiter when the previewed row count or column count feels wrong, or when semicolons and pipes are part of your normal export format. Normalize lines is useful after pasting from email or chat because it standardizes line endings and trims trailing whitespace without changing the field order.

Choose Value typing based on what the destination columns mean. Infer numbers, NULL, and booleans is a good first pass for seed data with numeric IDs, flags, and real nulls. Quote every non-NULL value as text is safer for account codes, ZIP codes, SKUs, and other values where leading zeros or exact text shape matters.

Use Blank cells before deciding whether a short row is acceptable. Emit NULL represents unknown or missing values. Emit empty string represents a known text value with no characters. That distinction can matter when the destination table treats NULL and '' differently.
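That distinction is easy to demonstrate in SQLite (a stand-in destination; the notes table is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER, body TEXT)")
conn.execute("INSERT INTO notes (id, body) VALUES (1, NULL)")
conn.execute("INSERT INTO notes (id, body) VALUES (2, '')")

# IS NULL matches only the unknown value; '' is a present, empty text value.
null_count = conn.execute("SELECT COUNT(*) FROM notes WHERE body IS NULL").fetchone()[0]
empty_count = conn.execute("SELECT COUNT(*) FROM notes WHERE body = ''").fetchone()[0]
print(null_count, empty_count)  # 1 1
```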

The SQL Script tab holds the final copyable script. Row Ledger is the quickest review surface when some rows are shorter or longer than the column list. Column Audit helps confirm the generated identifiers, filled-row counts, blank-row counts, first sample value, and detected type mix. JSON is useful when you need a machine-readable record of the options, generated SQL, rows, warnings, and audit tables.

Do not copy the script while the summary says Check input or the alert says SQL cannot be generated yet. Resolve missing table names, missing columns, and missing source rows first, then read any Generation notes before running the SQL in a database client.

Step-by-Step Guide

Use one of the two column paths first, then review the same result tabs before copying SQL.

  1. Enter Table name, such as public.users or audit_event, and pick the target SQL dialect.
  2. For manual columns, leave Column source on Use column list field and fill Columns with comma-separated names in the load order.
  3. For header-based input, switch Column source to Use first source row as columns and make sure the first non-empty source row contains labels, not data.
  4. Paste rows into Source rows, use Browse CSV, or drop a CSV, TSV, or TXT file onto the textarea. Files over 2 MB are rejected with a browser-side file error.
  5. Open Advanced only when the defaults need adjustment. Confirm Delimiter, Value typing, Blank cells, Statement format, Quote identifiers, and Statement terminators.
  6. Read the summary. A ready run shows SQL INSERT script ready, row count, column count, dialect, delimiter, and statement count.
  7. If an error alert appears, fix the named issue first. Missing table name, missing columns, or missing source rows must be corrected before the SQL Script tab has real output.
  8. Review Generation notes and the Row Ledger. Rows marked Review need a decision about missing cells, extra cells, or malformed quoted fields.
  9. Check Column Audit for unexpected renamed columns, blank-row counts, or text values that should have stayed numeric.
  10. Copy or download from SQL Script only after the review tabs match the destination table you intend to load.

The best final check is outside the page: run the script in a transaction or scratch database, inspect the inserted rows, then commit only when the table constraints and row values match your intent.
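That habit can be sketched with Python's sqlite3 module, using isolation_level=None for manual transaction control (the table and row are stand-ins for the generated script):

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # manual transactions
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")

conn.execute("BEGIN")
conn.execute("INSERT INTO users (id, name) VALUES (1, 'Ada')")  # the script
inspected = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
conn.execute("ROLLBACK")  # nothing sticks until the rows look right

print(inspected)  # 1 row visible inside the transaction
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 0 after rollback
```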

Interpreting Results

Trust the SQL Script tab only after the summary, row ledger, and column audit agree with the source data. A script that says INSERT INTO "public"."users" ("id", "name") VALUES ... is syntactically shaped for the selected dialect, but the page cannot prove that public.users exists, that the columns have compatible types, or that constraints will accept the rows.

Ready in the Row Ledger means a row's cell count matched the active column list. It does not mean the row is semantically correct. Review means at least one structural issue needs attention, such as a missing trailing cell filled as NULL or an extra cell ignored after the last target column.

Column Audit is the practical check for false confidence. If a column sample looks shifted, if blank rows are higher than expected, or if the type summary says mostly text for a numeric field, correct the source rows or value typing before copying the script.

The generated SQL is a literal script for reviewed data, not a safe application pattern for untrusted user input. Production application code should use prepared statements or database loader APIs so values are bound by the database driver instead of pasted into SQL text.
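A minimal contrast, using sqlite3's '?' placeholders as one example of driver-side binding:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")

# A hostile-looking value is bound as data, never spliced into SQL text.
untrusted_name = "O'Brien'); DROP TABLE users; --"
conn.execute("INSERT INTO users (id, name) VALUES (?, ?)", (1, untrusted_name))

stored = conn.execute("SELECT name FROM users WHERE id = 1").fetchone()[0]
print(stored == untrusted_name)  # True: the table survives, the value is intact
```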

Worked Examples

Seed three PostgreSQL users from manual columns

A developer enters public.users, keeps PostgreSQL, leaves Columns as id, name, email, active, created_at, and pastes three comma-separated rows. The summary reports 3 rows, 5 columns, and one per row. In SQL Script, the name Ben O'Brien is emitted as 'Ben O''Brien', while true and false become TRUE and FALSE. That output is suitable for a scratch seed file after the row ledger shows every row as Ready.

Use a header row for a compact MySQL fixture

A test fixture starts with sku|name|active followed by pipe-delimited rows. The user chooses MySQL, switches Column source to Use first source row as columns, sets Delimiter to Pipe, and changes Statement format to One multi-row INSERT. The generated script uses backtick identifiers such as `sku` and one VALUES block with several tuples. Column Audit then confirms each generated identifier and shows whether any active values were blank.
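The resulting shape can be sketched in a few lines (the fixtures table name and the pre-rendered literals are illustrative assumptions, not output from the page):

```python
# Build one multi-row INSERT in the backtick style described above.
columns = ["sku", "name", "active"]
rows = [["'A-1'", "'Widget'", "TRUE"], ["'A-2'", "'Gadget'", "FALSE"]]

col_list = ", ".join("`" + c + "`" for c in columns)
tuples = ",\n".join("(" + ", ".join(r) + ")" for r in rows)
sql = "INSERT INTO `fixtures` (" + col_list + ") VALUES\n" + tuples + ";"
print(sql)
```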

Spot a shifted row before it reaches SQL Server

A support engineer pastes rows for dbo.import_queue with four manual columns but one source row has five cells. The alert area shows Generation notes, and Row Ledger marks that row Review with an extra-cell issue. The corrective path is to fix the source row or add the missing target column before copying SQL. If the fifth value is ignored, the generated statement may still run, but it will not load the data the engineer expected.

Preserve text codes instead of inferred numbers

A small SQLite lookup table contains product codes such as 00127 and 00084. With Value typing on infer, those values match the numeric pattern and would lose their leading zeros if emitted unquoted. Switching to Quote every non-NULL value as text makes SQL Script use quoted string literals instead, and Column Audit shows the sample value in the same shape the destination column should keep.
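The difference is the same one Python shows when a code is treated as a number versus text:

```python
code = "00127"
print(int(code))         # 127 -> numeric typing strips the leading zeros
print("'" + code + "'")  # '00127' -> a quoted text literal keeps the shape
```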

FAQ:

Why does the page say SQL cannot be generated yet?

The generator needs a destination table name, at least one target column, and at least one data row. If Column source uses the first source row as columns, there must also be a data row after that header.

Why did true and false become 1 and 0?

The selected dialect controls boolean output. SQLite and SQL Server emit inferred booleans as 1 and 0, while PostgreSQL, MySQL, and ANSI SQL emit TRUE and FALSE.

Can the first pasted row become the column list?

Yes. Set Column source to Use first source row as columns. The first parsed row supplies column names, and the remaining parsed rows become data rows.

What happens when a row has too few or too many cells?

Missing cells are filled using the Blank cells setting, either NULL or an empty string. Extra cells are ignored after the final target column. In both cases, Row Ledger marks the row for review.
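In sketch form, that per-row behavior might look like the following hypothetical helper (not the page's actual code):

```python
def normalize_row(cells, column_count, blank_as_null=True):
    # Sketch of the described behavior: pad short rows, drop extra cells,
    # and flag any mismatch for Row Ledger review.
    fill = None if blank_as_null else ""
    padded = list(cells[:column_count])               # extra cells ignored
    padded += [fill] * (column_count - len(padded))   # missing cells filled
    needs_review = len(cells) != column_count
    return padded, needs_review

print(normalize_row(["1", "Ada"], 3))            # (['1', 'Ada', None], True)
print(normalize_row(["1", "Ada", "x", "y"], 3))  # (['1', 'Ada', 'x'], True)
```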

Does the file content leave the browser?

No. CSV, TSV, and TXT files are read into the browser session for generation, and the file picker rejects files over 2 MB. Standard site assets still load normally, but the pasted rows and dropped files are not sent to any dedicated conversion endpoint.

Should I use this for untrusted application input?

No. The output is a reviewable SQL script for controlled data. Application code that accepts user input should use prepared statements or loader APIs so the database driver binds values safely.

Glossary:

INSERT statement
A SQL command that adds one or more rows to an existing table.
Column list
The ordered set of target columns that each value tuple must match from left to right.
Value tuple
A parenthesized set of SQL values that represents one row inside an INSERT statement.
SQL dialect
The vendor-specific SQL style that affects identifier quoting and boolean literals here.
Identifier
A database object name, such as a table, schema, or column name.
Literal
A value written directly in SQL text, such as NULL, 42, or 'Alice'.
Row Ledger
The result table that shows each generated row's status, cell count, issues, and statement preview.
Column Audit
The result table that summarizes generated identifiers, filled and blank counts, sample values, and detected value types.
