JSON to SQL Converter
Convert JSON arrays to SQL CREATE TABLE and INSERT statements.
This tool transforms JSON data into SQL statements for database import. It generates CREATE TABLE statements with inferred column types and INSERT statements with properly escaped values.
Supported Databases
| Database | Identifier Quotes | JSON Support |
|---|---|---|
| PostgreSQL | "column" | JSONB type |
| MySQL | `column` | JSON type |
| SQLite | "column" | TEXT (store as string) |
| SQL Server | [column] | NVARCHAR (store as string) |
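As a quick sketch of the quoting differences (using a hypothetical users table), the same CREATE TABLE statement would be quoted as follows for each target:

-- PostgreSQL and SQLite use double quotes
CREATE TABLE "users" ("id" INT NOT NULL);
-- MySQL uses backticks
CREATE TABLE `users` (`id` INT NOT NULL);
-- SQL Server uses square brackets
CREATE TABLE [users] ([id] INT NOT NULL);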
Type Mapping
The converter infers SQL types from JSON values:
| JSON Type | PostgreSQL | MySQL |
|---|---|---|
| string | VARCHAR / TEXT | VARCHAR / TEXT |
| integer | INT / BIGINT | INT / BIGINT |
| decimal | DOUBLE PRECISION | DOUBLE |
| boolean | BOOLEAN | TINYINT(1) |
| null | TEXT NULL | VARCHAR NULL |
| date string | DATE | DATE |
| datetime string | TIMESTAMP | DATETIME |
| object/array | JSONB | JSON |
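As a rough sketch of the mapping (field and table names here are hypothetical, and exact lengths and nullability depend on your data), a row like this:

{"id": 7, "price": 19.99, "signup_date": "2024-01-15", "profile": {"tier": "gold"}}

might produce a PostgreSQL schema along these lines:

CREATE TABLE "orders" (
  "id" INT NOT NULL,
  "price" DOUBLE PRECISION NOT NULL,
  "signup_date" DATE NOT NULL,
  "profile" JSONB NOT NULL
);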
Example Output
From this JSON:
[
{"id": 1, "name": "Alice", "active": true},
{"id": 2, "name": "Bob", "active": false}
]
PostgreSQL output:
CREATE TABLE "users" (
"id" INT NOT NULL,
"name" VARCHAR(50) NOT NULL,
"active" BOOLEAN NOT NULL
);
INSERT INTO "users" ("id", "name", "active")
VALUES
(1, 'Alice', TRUE),
(2, 'Bob', FALSE);
Options Explained
Table name
The name used for the CREATE TABLE and INSERT statements. Automatically quoted with the appropriate syntax for your database.
Output type
- CREATE + INSERT — Full table creation with data
- CREATE TABLE only — Schema definition only
- INSERT only — Data insertion only (for existing tables)
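For example, selecting INSERT only for the sample data above would emit just the data statements, which is useful when the table already exists (a sketch; formatting may differ slightly):

INSERT INTO "users" ("id", "name", "active")
VALUES
(1, 'Alice', TRUE),
(2, 'Bob', FALSE);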
Include NULLs
When enabled, null values are inserted as NULL. When disabled, they are emitted as DEFAULT, which requires the corresponding columns to have default values defined.
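For example, with a hypothetical nullable phone column, the two settings would produce values like these (a sketch):

-- Include NULLs enabled
INSERT INTO "users" ("id", "name", "phone") VALUES (3, 'Carol', NULL);
-- Include NULLs disabled; "phone" must have a DEFAULT defined
INSERT INTO "users" ("id", "name", "phone") VALUES (3, 'Carol', DEFAULT);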
Handling Special Cases
Nested objects
Nested JSON objects and arrays are stored as JSONB columns in PostgreSQL, JSON columns in MySQL, or as TEXT/NVARCHAR strings in SQLite and SQL Server.
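For example, a nested address object (hypothetical field names) ends up in a single JSON-typed column, with the nested value inserted as a JSON literal; a PostgreSQL sketch:

{"id": 1, "name": "Alice", "address": {"city": "Oslo", "zip": "0150"}}

CREATE TABLE "users" (
  "id" INT NOT NULL,
  "name" VARCHAR(50) NOT NULL,
  "address" JSONB NOT NULL
);
INSERT INTO "users" ("id", "name", "address")
VALUES (1, 'Alice', '{"city": "Oslo", "zip": "0150"}');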
String escaping
Single quotes in strings are automatically escaped by doubling them (''), which keeps string values from breaking out of their quotes in the generated statements.
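For example, a value containing an apostrophe would come out like this (a sketch):

{"id": 4, "name": "O'Brien"}

INSERT INTO "users" ("id", "name") VALUES (4, 'O''Brien');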
Large datasets
Insert statements are batched (default 100 rows per statement) for better performance and to avoid query size limits.
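With a hypothetical batch size of 2 (the default is 100), four rows would be split into two statements along these lines:

INSERT INTO "users" ("id", "name") VALUES
(1, 'Alice'),
(2, 'Bob');
INSERT INTO "users" ("id", "name") VALUES
(3, 'Carol'),
(4, 'Dave');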
Common Use Cases
- Database seeding — Initialize test/dev databases
- Data migration — Move data from NoSQL to SQL
- API data import — Import JSON API responses
- Schema generation — Create tables from JSON structure
Related Tools
- JSON Validator — Validate JSON before converting
- JSON to CSV — Export as CSV instead
- JSON to Excel — Export as Excel
- JSON Flatten — Flatten nested objects first
Frequently Asked Questions
Is the output safe from SQL injection?
The generated SQL properly escapes string values. However, always review generated SQL before running on production databases, and prefer parameterized queries in application code.
Can I convert a single JSON object?
Yes. Single objects are automatically wrapped in an array to generate a single-row insert.
How are column types determined?
Types are inferred from the first row of data. If your data has varying types, you may need to adjust the generated schema manually.
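For instance, if a field is numeric in the first row but a string later on, the inferred type will not fit the rest of the data (hypothetical column names; adjust as needed):

[
{"id": 1, "score": 10},
{"id": 2, "score": "N/A"}
]
-- inferred from the first row: "score" INT NOT NULL
-- manual adjustment, e.g.:     "score" TEXT NULL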
What about primary keys and indexes?
The generator creates basic column definitions. Add PRIMARY KEY, UNIQUE, INDEX, and other constraints manually based on your requirements.