PDFFlare
8 min read

How to Convert JSON to CSV (and CSV to JSON) for Excel and Sheets

You've got a JSON response from an API — a list of users, orders, analytics events, whatever — and the request is the same one developers have heard a thousand times: “can you put this in a spreadsheet?” Or the reverse: an analyst hands you a CSV export from Salesforce and the ingestion API only accepts JSON.

In this guide you'll learn how to convert JSON to CSV (and CSV back to JSON) for free in your browser using PDFFlare's JSON to CSV converter — it's bi-directional, handles quoted fields, embedded commas, and newlines, and never uploads your data anywhere.

Why Convert JSON to CSV?

JSON is what APIs and databases speak. CSV is what humans, spreadsheets, and most legacy data tools speak. Converting between them comes up constantly:

  • Open API responses in Excel or Google Sheets: Filter, sort, pivot, chart — all the tools non-developers know.
  • Hand off data to non-technical teammates: Marketing, ops, and finance live in spreadsheets. CSV is the lingua franca.
  • Bulk data uploads: Plenty of admin panels and SaaS tools accept CSV but not JSON. Convert and upload.
  • Analytics and reporting: Quick ad-hoc analysis is faster in a spreadsheet than writing a script.
  • Database imports: Most databases have a COPY FROM CSV or equivalent bulk-load path.

How to Convert JSON to CSV (Step by Step)

PDFFlare's converter is bi-directional and runs entirely in your browser.

  1. Open PDFFlare's JSON to CSV tool — no signup needed.
  2. Paste your JSON into the input pane. The tool expects an array of objects (the most common shape) — for example [{"name":"alice","age":30},{"name":"bob","age":25}].
  3. Click Convert. The right pane shows a clean CSV with one row per object and a header row built from the union of all keys.
  4. Copy or download. Save as data.csv and open in Excel, Sheets, Numbers, or any text editor.
  5. To go the other way — toggle the direction selector to CSV → JSON, paste your CSV, and click Convert. The first row is treated as headers and each subsequent row becomes an object.
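
The steps above can be sketched in a few lines of Python — a minimal illustration of the described behaviour (union-of-keys header, one row per object), not PDFFlare's actual implementation:

```python
import csv
import io
import json

def json_to_csv(json_text: str) -> str:
    """Convert a JSON array of objects to CSV.

    Header is the union of all keys in first-seen order;
    missing values become empty cells.
    """
    rows = json.loads(json_text)
    header: list[str] = []
    for row in rows:
        for key in row:
            if key not in header:
                header.append(key)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=header, restval="")
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

csv_text = json_to_csv('[{"name":"alice","age":30},{"name":"bob","age":25}]')
# csv_text.splitlines() == ["name,age", "alice,30", "bob,25"]
```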

Handling Tricky CSV Edge Cases

A real CSV converter has to handle the messy reality of CSV — quoted fields, embedded commas, newlines inside values, escaped quotes. PDFFlare follows RFC 4180, the de facto CSV spec, so the round-trip preserves data faithfully.

  • Embedded commas: A value like "Smith, John" stays inside its single CSV cell because it's wrapped in quotes.
  • Embedded newlines: Multi-line values (like a long address) inside a quoted field stay multi-line — Excel imports them into a single cell.
  • Embedded quotes: A single " inside a value gets escaped as "" per the RFC. The reverse direction unescapes them.
  • Header row: The first CSV row is always treated as field names. The JSON output is an array of objects keyed by those names.
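
Python's stdlib csv module applies the same RFC 4180 quoting rules, so you can verify the round-trip behaviour yourself:

```python
import csv
import io

# Values that exercise the edge cases above: embedded comma,
# embedded newline, and an embedded quote.
rows = [
    ["name", "note"],
    ["Smith, John", 'Said "hello"\nthen left'],
]

buf = io.StringIO()
csv.writer(buf).writerows(rows)  # quotes and escapes per RFC 4180
encoded = buf.getvalue()

# Round-trip: parsing the CSV back yields the original values.
decoded = list(csv.reader(io.StringIO(encoded)))
assert decoded == rows
```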

What About Nested JSON?

CSV is a flat format — rows and columns, no nesting. If your JSON has nested objects ({"user": {"name": "alice"}}), the converter has two options:

  • Flatten first: Use PDFFlare's JSON Flatten to turn {"user": {"name": "alice"}} into {"user.name": "alice"}, then convert to CSV. You'll get one column per leaf path, fully spreadsheet-compatible.
  • JSON-encode the nested value: The converter stringifies nested objects so they fit in one cell as {"name":"alice"}. Useful when you need to keep structure intact for re-parsing later.
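
If you want the flatten behaviour in a script, a dot-path flattener is only a few lines — a sketch of the transformation described above, not PDFFlare's code:

```python
def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested objects into dot-separated leaf paths."""
    flat = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, path))
        else:
            flat[path] = value
    return flat

flatten({"user": {"name": "alice", "age": 30}, "active": True})
# → {"user.name": "alice", "user.age": 30, "active": True}
```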

CSV Dialects: Excel, RFC 4180, and TSV

Not all CSVs are created equal. The format has historical baggage and regional flavors that catch people off-guard:

  • RFC 4180 (the “real” spec): Comma separators, double-quote field wrapping, doubled quotes for escaping. PDFFlare emits RFC 4180-compliant CSV by default.
  • Excel-flavoured CSV: Mostly RFC 4180-compatible, but Excel uses the regional list separator from your OS settings. On many European systems that's a semicolon, not a comma. If your CSV opens with all data in one column, a semicolon vs comma mismatch is usually why.
  • TSV (tab-separated): Often safer than CSV when data contains commas. Tabs almost never appear in text data, so quoting is rarely needed. Useful as an intermediate format.
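
When you receive a CSV of unknown dialect, Python's csv.Sniffer can guess the separator — it's a heuristic, but it catches the common semicolon case:

```python
import csv
import io

# A semicolon-delimited file, as produced by Excel on many European locales.
semicolon_csv = "name;age\nalice;30\nbob;25\n"

dialect = csv.Sniffer().sniff(semicolon_csv)  # guesses the delimiter
rows = list(csv.reader(io.StringIO(semicolon_csv), dialect))
# rows == [["name", "age"], ["alice", "30"], ["bob", "25"]]
```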

Date Formats: The Silent CSV Trap

JSON has no native date type — dates are strings. CSV has no types at all. Excel auto-detects and reformats anything that looks like a date, often destructively:

  • ISO 8601 strings (2026-05-01): Excel converts to a serial number internally and displays it in your locale's date format. Saving back as CSV may emit 5/1/2026 or 01/05/2026 depending on locale — the round trip is no longer the same string.
  • The gene-name disaster: Excel famously corrupts biology data because gene names like SEPT1 are interpreted as dates. The fix is to format the column as Text on import, not after.
  • Mitigation: Prefix sensitive cells with a single quote character before pasting ('2026-05-01) — Excel keeps the value as text. Or import via Data → From Text and mark the column as Text explicitly.

When to Use JSON Lines (NDJSON) Instead

Big-data pipelines (BigQuery, Athena, Spark, Snowflake) often prefer JSON Lines — one JSON object per line, no top-level array. Pros: streamable (you can process line by line without holding the whole file in memory), append-friendly, plays well with grep and awk. If your downstream consumer is one of those tools, output JSONL instead of CSV — it preserves types and handles nested data natively. PDFFlare's converter targets CSV; for JSONL, strip the array's enclosing brackets and the commas between objects so each object sits on its own line.
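
In Python, producing JSONL from a parsed array is a one-liner per record:

```python
import json

def to_jsonl(records: list) -> str:
    """Emit JSON Lines: one compact JSON object per line, no array wrapper."""
    return "\n".join(json.dumps(r, separators=(",", ":")) for r in records)

to_jsonl([{"a": 1}, {"a": 2}])
# → '{"a":1}\n{"a":2}'
```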

Common Scenarios

Exporting an API Response to Excel

Hit an endpoint, copy the JSON response, paste into the converter, download the CSV, double-click to open in Excel. From start to spreadsheet in under a minute. Faster than installing a Python package.

Postman or Insomnia → Spreadsheet

Postman lets you copy JSON results to clipboard. Paste straight into the converter, get CSV, share with your team. No need for a custom export script.

CSV → API JSON Body

Going the other way: an analyst gives you a CSV, you need to bulk-import it through a JSON-only API. Convert CSV to JSON, wrap it in your request body, fire off a script. The converter handles the tedious quoting and escaping for you.

Log Analysis

Application logs often emit JSON lines. Concatenate them into a JSON array, drop into the converter, open the resulting CSV in Excel. Pivot tables and filtering are dramatically faster than grep + awk for ad-hoc analysis.
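
That first step — wrapping NDJSON log lines into a single JSON array — is a short Python sketch:

```python
import json

log_lines = '{"level":"info","msg":"started"}\n{"level":"error","msg":"boom"}\n'

# Parse each non-empty line, then wrap into one JSON array
# ready for the converter.
records = [json.loads(line) for line in log_lines.splitlines() if line.strip()]
as_array = json.dumps(records)
```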

Database Bulk Import

PostgreSQL COPY FROM, MySQL LOAD DATA INFILE, and most cloud warehouses support CSV import. Convert your JSON dataset to CSV first, then bulk-load — orders of magnitude faster than row-by-row INSERT statements.
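
As a small-scale illustration (using stdlib sqlite3 rather than a real warehouse), bulk-loading parsed CSV rows with executemany follows the same pattern:

```python
import csv
import io
import sqlite3

csv_text = "name,age\nalice,30\nbob,25\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")

reader = csv.reader(io.StringIO(csv_text))
next(reader)  # skip the header row
# executemany batches the inserts — the same idea as
# COPY FROM / LOAD DATA INFILE at a smaller scale.
conn.executemany("INSERT INTO users (name, age) VALUES (?, ?)", reader)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```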

Generating Test Fixtures

Need 100 rows of dummy data for a unit test? Use PDFFlare's JSON Generator to produce realistic JSON, then convert to CSV for tools that expect CSV input. Saves you from hand-rolling a fixture file or writing a generator script for a one-off test.

Sharing Data with Non-Technical Stakeholders

Marketing wants the latest signups. Finance wants this month's orders. They live in spreadsheets. Pull the data from your API as JSON, drop it in the converter, hand them a CSV. No back-and-forth, no “can you export it differently?” — they get exactly what their tools speak.

Best Practices

  • Always inspect the JSON shape first. Run it through PDFFlare's JSON Formatter so you can see if it's an array of flat objects (CSV-ready) or has nested structure (flatten first).
  • Watch out for type drift. CSV has no types — every cell is a string. When you re-parse a CSV → JSON, numbers become "42" not 42. Cast where needed.
  • Excel mangles leading zeros. ZIP codes, IDs, and phone numbers starting with 0 lose the zero on import. Format the cell as Text in Excel before importing if this matters.
  • UTF-8 is your friend. Save CSV with UTF-8 encoding for emoji, accented characters, and non-Latin scripts. Excel sometimes defaults to UTF-16; choose UTF-8 explicitly.
  • For repeat conversions, automate. The converter is great for one-offs. If you're doing this daily, write a script — but for ad-hoc analyst requests, the browser tool wins.
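
On the type-drift point: a best-effort cast back to numbers when reparsing CSV looks like this — a sketch with a hypothetical cast helper, not a general-purpose type inferencer:

```python
import csv
import io

def cast(value: str):
    """Best-effort cast of a CSV cell back to a JSON type.

    Every cell a CSV parser hands you is a string, so numbers
    need explicit casting.
    """
    for parse in (int, float):
        try:
            return parse(value)
        except ValueError:
            pass
    return value

csv_text = "name,age\nalice,30\n"
rows = [
    {k: cast(v) for k, v in row.items()}
    for row in csv.DictReader(io.StringIO(csv_text))
]
# rows == [{"name": "alice", "age": 30}]  — age is an int again
```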

Common Mistakes When Converting JSON to CSV

  • Forgetting that the JSON must be an array of objects. A single object {"a": 1, "b": 2} won't convert cleanly into CSV rows. Wrap it in an array first if you want a single-row CSV.
  • Inconsistent keys across objects. If the first object has 5 keys and the second has 7, the converter unions all keys for the header and fills missing cells with empty strings. That's usually what you want, but it means your CSV is sparse — sort by completeness if downstream tools choke on empty cells.
  • Losing array values inside cells. A JSON value like "tags": ["a","b","c"] ends up as a single cell containing ["a","b","c"]. If you want one row per tag, you need to denormalize — typically with a script, not a generic converter.
  • Encoding troubles. If your data has non-ASCII characters (emoji, accents, CJK), save the CSV as UTF-8 with BOM so Excel opens it correctly on Windows. Save as plain UTF-8 for Mac, Linux, or Google Sheets.
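
For the array-values case, the denormalization script is short — a sketch assuming a tags field, one output row per tag:

```python
records = [
    {"id": 1, "tags": ["a", "b"]},
    {"id": 2, "tags": ["c"]},
]

# Explode the tags array: one output row per (id, tag) pair —
# the denormalization step a generic converter won't do for you.
exploded = [
    {"id": r["id"], "tag": t}
    for r in records
    for t in r["tags"]
]
# exploded == [{"id": 1, "tag": "a"}, {"id": 1, "tag": "b"}, {"id": 2, "tag": "c"}]
```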

Privacy: Your Data Stays on Your Device

PDFFlare's JSON to CSV converter runs entirely in your browser. Conversion logic is JavaScript executed locally — your data never uploads to any server. This is critical for production datasets that contain customer PII, financial records, or anything covered by GDPR, HIPAA, or similar regulations. Close the tab, the data is gone.

Wrapping Up

JSON ↔ CSV conversion is one of the most-asked-for tasks in data plumbing. Done right, it preserves your data faithfully through the round trip; done sloppily, embedded commas and quotes corrupt rows silently. PDFFlare follows the RFC and gives you a clean, browser-based conversion in one click.

Got a JSON dataset that needs to land in Excel — or a CSV that needs to feed an API? Open PDFFlare's JSON to CSV converter and you're unblocked.

Related Tools

  • JSON Flatten — flatten nested JSON before CSV conversion
  • JSON Formatter — inspect the shape of your JSON before converting
  • JSON to YAML — convert to YAML for config files and Kubernetes
  • JSON Stats — inspect key counts and types in your dataset