How to Convert JSON to SQL INSERT Statements
Last updated: 2026-04-14
Working with a REST API that returns JSON but need to store the data in a relational database? Instead of writing a script to loop through objects and construct INSERT statements by hand, you can paste the JSON into Convertee and get production-ready SQL in seconds. Column types are inferred automatically, and you can choose your target dialect.
Step-by-Step: Convert JSON to SQL with Convertee
- Open Convertee, set source to JSON and target to SQL.
- Paste a JSON array of objects into the input panel. Each object becomes a row; keys become column names.
- Configure SQL options: table name, dialect (MySQL, PostgreSQL, SQLite, SQL Server), CREATE TABLE toggle, and batch size.
- Hit Convert.
- Copy or download the SQL and execute it in your database client.
The entire process runs locally in your browser. No JSON payloads are sent to any server.
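The conversion the steps above describe amounts to a small transformation. Here is a minimal Python sketch of the same idea (a simplified illustration, not Convertee's actual code; the escaping and type handling shown are assumptions):

```python
import json

def sql_literal(value):
    """Render a JSON value as a MySQL-style SQL literal."""
    if value is None:
        return "NULL"
    if isinstance(value, bool):       # check bool before int: bool subclasses int
        return "1" if value else "0"  # MySQL TINYINT(1) convention
    if isinstance(value, (int, float)):
        return str(value)
    return "'" + str(value).replace("'", "''") + "'"  # double embedded quotes

def json_to_insert(json_text, table="import_data"):
    """Turn a JSON array of objects into one multi-row INSERT statement."""
    rows = json.loads(json_text)
    # Union of keys across all objects, preserving first-seen order;
    # keys missing from a given row become NULL.
    cols = list(dict.fromkeys(k for row in rows for k in row))
    tuples = ",\n".join(
        "(" + ", ".join(sql_literal(row.get(c)) for c in cols) + ")"
        for row in rows
    )
    col_list = ", ".join(f"`{c}`" for c in cols)
    return f"INSERT INTO `{table}` ({col_list})\nVALUES\n{tuples};"

payload = '[{"sku": "A100", "price": 49.99, "in_stock": true}]'
print(json_to_insert(payload))
```

A real converter also has to handle dates, nested objects, and identifier quoting per dialect, which is what the options below control.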
Understanding the Formats
JSON
JSON (RFC 8259) is the standard data interchange format for web services. For SQL conversion, the expected input is an array of objects whose keys become column names; the keys do not have to be identical across objects. Nested objects are serialized as TEXT columns by default.
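To make the "nested objects become TEXT" rule concrete, a nested value can be serialized back to a JSON string before insertion (a sketch of the described behavior, not the tool's code):

```python
import json

row = {"sku": "A100", "dims": {"w": 30, "h": 12}}

# A nested object gets no columns of its own: it is serialized back to
# a JSON string and stored in a single TEXT column.
flat = {
    k: json.dumps(v) if isinstance(v, (dict, list)) else v
    for k, v in row.items()
}
print(flat["dims"])  # → {"w": 30, "h": 12}  (now a string, not an object)
```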
SQL INSERT Statements
An INSERT statement (ISO/IEC 9075) adds one or more rows to a table. Multi-row INSERT syntax (supported by all four major dialects) is more efficient than individual statements because it reduces round-trips and lets the database engine optimize the write path.
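The efficiency difference is easy to visualize: N single-row statements mean N round-trips, while one multi-row statement batches them. A toy sketch (the tuple-repr shortcut only happens to produce valid SQL for these simple values):

```python
rows = [("A100", 49.99), ("B200", 29.99)]

# Per-row INSERTs: one statement, and one server round-trip, per row.
per_row = [f"INSERT INTO t (sku, price) VALUES {r};" for r in rows]

# Multi-row INSERT: all rows in one statement, a single round-trip.
multi_row = "INSERT INTO t (sku, price) VALUES " + ", ".join(map(repr, rows)) + ";"

print(len(per_row))  # → 2
print(multi_row)
```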
Before / After Example
Input JSON:
```json
[
  { "sku": "A100", "name": "Keyboard", "price": 49.99, "in_stock": true },
  { "sku": "B200", "name": "Mouse", "price": 29.99, "in_stock": false }
]
```

Output SQL (MySQL dialect):
```sql
CREATE TABLE IF NOT EXISTS `import_data` (
  `sku` TEXT,
  `name` TEXT,
  `price` DOUBLE,
  `in_stock` TINYINT(1)
);

INSERT INTO `import_data` (`sku`, `name`, `price`, `in_stock`)
VALUES
  ('A100', 'Keyboard', 49.99, 1),
  ('B200', 'Mouse', 29.99, 0);
```

Common Use Cases
- API data archival. You pull data from a third-party API (payment transactions, user events, product catalog) and need a fast way to store snapshots in your SQL database.
- Backend prototyping. You are designing an API and want sample data in your local database without writing a seeder script.
- Data engineering. JSON logs or event streams need to be loaded into a data warehouse for ad-hoc analysis.
- Cross-format pipelines. Combine this with the CSV-to-JSON step to go from spreadsheet to database in two conversions.
Options and Configuration
| Option | Default | What It Does |
|---|---|---|
| Table name | import_data | The target table for INSERT and optional CREATE TABLE. |
| Dialect | PostgreSQL | Controls identifier quoting, type mapping (e.g., BOOLEAN vs. TINYINT(1)), and syntax details. |
| CREATE TABLE | On | Generates a schema definition with types inferred from the first object in the array (TEXT, INTEGER, DOUBLE/REAL, BOOLEAN). |
| DROP TABLE | Off | Adds DROP TABLE IF EXISTS before the schema. Handy for idempotent migration scripts. |
| Batch size | 100 | Rows per INSERT. Large datasets are split into multiple INSERT blocks to avoid exceeding query-length limits. |
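The batch-size option can be pictured as simple chunking: rows are split into fixed-size groups, and each group becomes its own INSERT statement (a sketch of the described behavior, not the tool's implementation):

```python
def batches(rows, size=100):
    """Yield successive chunks so each INSERT stays under query-length limits."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

# 250 rows with the default batch size produce three INSERT blocks.
chunk_sizes = [len(chunk) for chunk in batches(list(range(250)), size=100)]
print(chunk_sizes)  # → [100, 100, 50]
```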
Frequently Asked Questions
- What happens if my JSON objects have different keys?
- Convertee collects all unique keys across every object and uses them as columns. Missing values in a given row are inserted as NULL.
- Can I change the table name?
- Yes. The 'Table name' field in the options panel lets you set any valid SQL identifier as the target table.
- How are boolean values mapped to SQL?
- It depends on the dialect. PostgreSQL uses BOOLEAN (TRUE/FALSE), MySQL uses TINYINT(1) (1/0), SQLite uses INTEGER (1/0), and SQL Server uses BIT (1/0).
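The dialect-specific boolean mapping from the FAQ can be captured in a small lookup table (an illustration of the mapping; the names here are assumptions, not Convertee's internals):

```python
# How a JSON true/false is rendered per target dialect.
BOOL_LITERALS = {
    "postgresql": {True: "TRUE", False: "FALSE"},  # BOOLEAN column
    "mysql":      {True: "1",    False: "0"},      # TINYINT(1) column
    "sqlite":     {True: "1",    False: "0"},      # INTEGER column
    "sqlserver":  {True: "1",    False: "0"},      # BIT column
}

def render_bool(value, dialect):
    """Return the SQL literal for a boolean in the given dialect."""
    return BOOL_LITERALS[dialect][value]

print(render_bool(True, "postgresql"))  # → TRUE
print(render_bool(False, "mysql"))      # → 0
```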