Common JSON Errors and How to Fix Them


"Unexpected token" errors are the worst. You're looking at JSON that looks perfectly fine, but the parser refuses to accept it. The error message points to some random position and you're left playing detective.

Nine times out of ten, it's one of these common mistakes. Here's how to spot and fix them.

Trailing Commas

This is the number one JSON killer. JavaScript lets you put a comma after the last item in an array or object. JSON does not.

// BROKEN - trailing comma after "blue"
{
  "colors": ["red", "green", "blue",]
}

// FIXED
{
  "colors": ["red", "green", "blue"]
}

// BROKEN - trailing comma after age
{
  "name": "Alice",
  "age": 28,
}

// FIXED
{
  "name": "Alice",
  "age": 28
}

The error message usually says something like "Unexpected token ] at position X" or "Expected property name". If you see either, check for trailing commas first.

Single Quotes Instead of Double

Python developers run into this all the time. Python prints dicts with single quotes by default, but JSON requires double quotes. Always.

// BROKEN - single quotes
{'name': 'Alice', 'active': true}

// FIXED - double quotes
{"name": "Alice", "active": true}

This applies to both keys and string values. Numbers, booleans, and null go without quotes (quoting them would turn them into strings).

If you're converting Python output to JSON, use json.dumps() instead of str(). It'll handle the quote conversion for you.
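On the JavaScript side, JSON.stringify plays the same role: serialize the object instead of building the string by hand, and the double quotes come out right.

```javascript
// JSON.stringify always emits double-quoted keys and strings
const user = { name: "Alice", active: true };
console.log(JSON.stringify(user)); // {"name":"Alice","active":true}
```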

Unquoted Keys

In JavaScript, you can write { name: "Alice" } without quotes on the key. That's not valid JSON.

// BROKEN - unquoted key
{name: "Alice"}

// FIXED
{"name": "Alice"}

Every key in JSON must be a string, and every string must have double quotes. No exceptions.

Comments in JSON

JSON doesn't support comments. At all. Not //, not /* */, not #.

// BROKEN - comments aren't allowed
{
  // This is a user object
  "name": "Alice",
  "age": 28 /* years old */
}

// FIXED - remove all comments
{
  "name": "Alice",
  "age": 28
}

This is annoying for config files. If you need comments, consider using JSONC (JSON with Comments, supported by VS Code), YAML, or TOML instead.

Some parsers have a "relaxed" mode that allows comments, but standard JSON parsers will reject them.

Unescaped Special Characters

Certain characters inside strings need to be escaped with a backslash. The most common culprits:

  • Double quotes inside strings
  • Backslashes
  • Literal newlines, tabs, and other control characters

// BROKEN - unescaped quote
{"message": "She said "hello""}

// FIXED - escaped quote
{"message": "She said \"hello\""}

// BROKEN - unescaped backslash in Windows path
{"path": "C:\Users\Alice"}

// FIXED - escaped backslashes
{"path": "C:\\Users\\Alice"}

// BROKEN - literal newline in string
{"text": "Line one
Line two"}

// FIXED - use \n for newlines
{"text": "Line one\nLine two"}
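Rather than hand-escaping, let the serializer do it. JSON.stringify produces all of the escaped forms shown above:

```javascript
// JSON.stringify inserts the backslash escapes for you
const data = {
  message: 'She said "hello"',
  path: "C:\\Users\\Alice",   // one real backslash per separator
  text: "Line one\nLine two", // one real newline
};
console.log(JSON.stringify(data));
```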

Wrong Value Types

A few values trip people up:

undefined vs null

JavaScript has undefined. JSON doesn't. Use null instead.

// BROKEN
{"value": undefined}

// FIXED
{"value": null}
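Watch out: JSON.stringify won't error on undefined either. It silently drops or converts it, which can hide the problem until something downstream breaks.

```javascript
// undefined object values are dropped; array entries become null
console.log(JSON.stringify({ value: undefined })); // {}
console.log(JSON.stringify([undefined]));          // [null]
```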

NaN and Infinity

These aren't valid JSON numbers. You'll need to use strings or null.

// BROKEN
{"result": NaN, "limit": Infinity}

// FIXED
{"result": null, "limit": "Infinity"}
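The serializer won't save you here either: JSON.stringify quietly converts both to null.

```javascript
// NaN and Infinity both become null on serialization
console.log(JSON.stringify({ result: NaN, limit: Infinity }));
// {"result":null,"limit":null}
```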

True/False capitalization

Booleans must be lowercase. True and False (Python style) don't work.

// BROKEN
{"active": True}

// FIXED
{"active": true}

Encoding Issues

JSON must be UTF-8 encoded (or UTF-16/UTF-32, but UTF-8 is standard). If you're getting weird errors with special characters, check your file encoding.

Common symptoms:

  • Characters like é, ñ, or Chinese characters causing errors
  • Invisible characters at the start of the file (BOM)
  • The file looks fine but the parser chokes

In VS Code, you can check and change the encoding in the bottom right corner of the window. Make sure it says "UTF-8".

Advanced Error Patterns

In my experience building AI translation systems and processing JSON from various sources, I've encountered some less obvious errors that can be maddening to debug.

BOM (Byte Order Mark) issues

I spent an entire afternoon once debugging JSON that looked perfect but wouldn't parse. The culprit was an invisible BOM character at the start of the file. This happens when files are saved with UTF-8 BOM encoding.

// The file starts with invisible bytes: EF BB BF
// These appear before the first {
// Most parsers reject this

// Solution: strip BOM before parsing
const stripBOM = (str) => {
  if (str.charCodeAt(0) === 0xFEFF) {
    return str.slice(1);
  }
  return str;
};

const data = JSON.parse(stripBOM(jsonString));

When uploading product data to PIM systems using translated JSON, BOM issues would break the entire upload. Now I always strip it first.

Floating point precision errors

JSON numbers are parsed as JavaScript floats, which can lead to precision issues with certain decimal values. I've seen this cause problems in financial data and coordinates.

// What you write
{"price": 0.1, "tax": 0.2, "total": 0.3}

// Each value parses to the nearest binary double, so the arithmetic drifts:
// 0.1 + 0.2 evaluates to 0.30000000000000004, which !== the parsed 0.3

// For financial data, use integers (cents) or strings
{"price": 10, "tax": 20, "total": 30} // cents
{"price": "0.10", "tax": "0.20", "total": "0.30"} // strings

Large number handling

JavaScript can only safely represent integers up to 2^53-1 (9007199254740991). When translating e-commerce product catalogs with 19-digit IDs from external systems, I learned that precision loss is silent and destructive.

// ID from external system
const json = '{"productId": 9007199254740993}';
const parsed = JSON.parse(json);
console.log(parsed.productId);
// 9007199254740992 - wrong! Lost precision

// Solution: use strings for large integers
const safe = '{"productId": "9007199254740993"}';
// Or use BigInt-aware parsers

Control characters in strings

Any control character (ASCII 0-31) is invalid in a JSON string unless it's escaped. Newlines and tabs are the familiar ones; the obscure ones like backspace and form feed slip in unnoticed. When building content translation workflows, user input sometimes contains these.

// BROKEN - a literal backspace character (ASCII 8) between the words
// (the control byte is invisible in most editors)
{"text": "HelloWorld"}

// FIXED - use the escape sequence
{"text": "Hello\bWorld"}
// or the equivalent unicode escape: "Hello\u0008World"
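If the string comes from user input, serialize it with JSON.stringify rather than templating it into JSON by hand; control characters get escaped automatically.

```javascript
// "\b" in this JS literal is a real backspace character (ASCII 8)
const out = JSON.stringify({ text: "Hello\bWorld" });
console.log(out); // the backspace is emitted as the two-character escape \b
```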

Debugging Tools & Techniques

Here are the tools and techniques I rely on when tracking down JSON errors in production systems.

Online validators

When I get a parse error, my first step is always a validator. They pinpoint the exact character position and explain what's wrong. I use our JSON tool for quick checks, or JSONLint for detailed error messages.

IDE extensions

VS Code has excellent JSON validation built in. Install the "JSON" extension for real-time error highlighting and auto-formatting. It's saved me countless times when editing config files for translation pipelines.

Command-line validation

For automated checks in CI/CD pipelines:

# Using jq (install with: brew install jq)
jq empty file.json
# If invalid, it prints the error

# Using Python
python -m json.tool file.json
# Pretty prints if valid, errors if not

# Using Node.js
node -e "JSON.parse(require('fs').readFileSync('file.json'))"

Understanding error messages

Parser error messages can be cryptic. Here's what they mean:

  • "Unexpected token } at position 45" - Usually a trailing comma before the }
  • "Expected property name" - Missing quote on a key, or trailing comma
  • "Unexpected end of JSON input" - Truncated file or missing closing brace
  • "Unexpected token < in JSON" - Server returned HTML error page instead of JSON

Debugging Tips

When you hit a parse error and can't find the problem:

1. Use a JSON validator

Paste your JSON into a validator tool (like ours). It'll point to the exact line and character where things go wrong.

2. Look at the error position

Most parsers tell you the character position of the error. Count to that position and look at what's there, plus a few characters before.

3. Binary search it

If you have a huge JSON file, delete half of it and try again. If it works, the problem is in the deleted half. If it still fails, the problem is in what's left. Repeat until you find it.

4. Check for invisible characters

Sometimes there are weird invisible characters (like zero-width spaces) hiding in your JSON. Copy-paste it into a plain text editor and look for anything suspicious.

5. Regenerate from source

If the JSON was generated from code, check your serialization. Using the wrong function (like Python's str() instead of json.dumps()) is a common source of malformed JSON.