Merging JSON Objects and Arrays: A Developer's Guide
You've got config files from multiple environments that need combining. Or API responses from paginated calls that should become one list. Or split files that need reassembling. Merging JSON is a common task, but the right approach depends on your data structure.
Types of Merging
Array concatenation
Combining multiple arrays into one. Order is preserved, duplicates are kept.
// Input
[1, 2, 3] + [4, 5, 6]
// Output
[1, 2, 3, 4, 5, 6]
Shallow object merge
Combining objects at the top level. If both have the same key, the second value wins.
// Input
{"a": 1, "b": 2} + {"b": 3, "c": 4}
// Output
{"a": 1, "b": 3, "c": 4}
Deep object merge
Recursively merging nested objects instead of just replacing them.
// Input
{"user": {"name": "Alice"}} + {"user": {"age": 28}}
// Shallow merge (nested object replaced)
{"user": {"age": 28}}
// Deep merge (nested objects combined)
{"user": {"name": "Alice", "age": 28}}
Merging Arrays
The simplest case. Just concatenate them.
JavaScript
const arr1 = [1, 2, 3];
const arr2 = [4, 5, 6];
// Spread operator
const merged = [...arr1, ...arr2];
// Or concat
const merged = arr1.concat(arr2);
Python
arr1 = [1, 2, 3]
arr2 = [4, 5, 6]
merged = arr1 + arr2
# or
merged = [*arr1, *arr2]
Removing duplicates
If you want unique values only:
// JavaScript
const unique = [...new Set([...arr1, ...arr2])];
// For arrays of objects, you need a different approach
const unique = [...arr1, ...arr2].filter(
  (item, index, self) => index === self.findIndex(t => t.id === item.id)
);
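The findIndex approach works but is O(n²), since every element rescans the whole array. For large arrays, a Map keyed by id is linear; note one behavioral difference: it keeps the last occurrence of each id rather than the first. A minimal sketch (the sample data is hypothetical):

```javascript
// O(n) de-duplication: Map keys are unique, so later entries overwrite earlier ones
const list1 = [{ id: 1, name: 'a' }, { id: 2, name: 'b' }];
const list2 = [{ id: 2, name: 'B' }, { id: 3, name: 'c' }];

const byId = new Map([...list1, ...list2].map(item => [item.id, item]));
const uniqueById = [...byId.values()];
// [{ id: 1, name: 'a' }, { id: 2, name: 'B' }, { id: 3, name: 'c' }]
```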
Merging Objects
Shallow merge in JavaScript
const obj1 = { a: 1, b: 2 };
const obj2 = { b: 3, c: 4 };
// Spread operator (most common)
const merged = { ...obj1, ...obj2 };
// { a: 1, b: 3, c: 4 }
// Object.assign
const merged = Object.assign({}, obj1, obj2);
Shallow merge in Python
obj1 = {"a": 1, "b": 2}
obj2 = {"b": 3, "c": 4}
# Dict unpacking (Python 3.5+)
merged = {**obj1, **obj2}
# {"a": 1, "b": 3, "c": 4}
# Or with update (modifies in place)
merged = obj1.copy()
merged.update(obj2)
# Or the | union operator (Python 3.9+)
merged = obj1 | obj2
Remember: with shallow merge, later values overwrite earlier ones for the same key.
Deep Merging
Shallow merge doesn't work well when you have nested objects and want to preserve data at all levels.
JavaScript deep merge
function deepMerge(target, source) {
  const result = { ...target };
  for (const key of Object.keys(source)) {
    // Only recurse when both sides are plain objects;
    // arrays and primitives are replaced, not merged
    const bothPlainObjects =
      source[key] && typeof source[key] === 'object' && !Array.isArray(source[key]) &&
      target[key] && typeof target[key] === 'object' && !Array.isArray(target[key]);
    if (bothPlainObjects) {
      result[key] = deepMerge(target[key], source[key]);
    } else {
      result[key] = source[key];
    }
  }
  return result;
}
// Usage
const config1 = { db: { host: 'localhost', port: 5432 } };
const config2 = { db: { port: 3306, user: 'admin' } };
const merged = deepMerge(config1, config2);
// { db: { host: 'localhost', port: 3306, user: 'admin' } }
Using lodash
If you're using lodash, it has a built-in deep merge. Note that _.merge also recurses into arrays, merging them element by index:
const _ = require('lodash');
const merged = _.merge({}, config1, config2);
Python deep merge
def deep_merge(dict1, dict2):
    result = dict1.copy()
    for key, value in dict2.items():
        if key in result and isinstance(result[key], dict) and isinstance(value, dict):
            result[key] = deep_merge(result[key], value)
        else:
            result[key] = value
    return result
Deep Merge Strategies
When building AI translation systems that merge configuration from multiple sources, I learned that shallow merge silently destroys nested data.
Shallow vs Deep Merge Comparison
Here's why shallow merge causes data loss:
const defaultConfig = {
  database: {
    host: 'localhost',
    port: 5432,
    pool: { min: 2, max: 10 }
  },
  api: {
    timeout: 5000,
    retries: 3
  }
};
const prodConfig = {
  database: {
    host: 'prod.example.com',
    pool: { max: 50 }
  }
};
// Shallow merge (LOSES DATA)
const shallow = { ...defaultConfig, ...prodConfig };
console.log(shallow.database);
// { host: 'prod.example.com', pool: { max: 50 } }
// Lost: port: 5432 and pool.min: 2
// Deep merge (PRESERVES DATA)
const deep = deepMerge(defaultConfig, prodConfig);
console.log(deep.database);
// { host: 'prod.example.com', port: 5432, pool: { min: 2, max: 50 } }
// Everything preserved, only specified values overridden
Custom merge functions for specific use cases
When translating product data across languages, I need merge logic that handles arrays and objects differently. Here's the pattern I use:
function smartMerge(target, source, options = {}) {
  const { arrayStrategy = 'replace', maxDepth = 10 } = options;
  function merge(t, s, depth) {
    if (depth > maxDepth) return s; // Prevent runaway recursion
    const result = Array.isArray(t) ? [...t] : { ...t };
    for (const [key, value] of Object.entries(s)) {
      if (!(key in result)) {
        result[key] = value;
      } else if (Array.isArray(value) && Array.isArray(result[key])) {
        // Handle arrays based on strategy
        if (arrayStrategy === 'concat') {
          result[key] = [...result[key], ...value];
        } else if (arrayStrategy === 'unique') {
          result[key] = [...new Set([...result[key], ...value])];
        } else {
          result[key] = value; // replace (default)
        }
      } else if (typeof value === 'object' && value !== null &&
                 typeof result[key] === 'object' && result[key] !== null) {
        result[key] = merge(result[key], value, depth + 1);
      } else {
        result[key] = value;
      }
    }
    return result;
  }
  return merge(target, source, 0);
}
// Usage examples
const base = { tags: ['a', 'b'], meta: { version: 1 } };
const update = { tags: ['c'], meta: { author: 'Alice' } };
smartMerge(base, update, { arrayStrategy: 'concat' });
// { tags: ['a', 'b', 'c'], meta: { version: 1, author: 'Alice' } }
smartMerge(base, update, { arrayStrategy: 'replace' });
// { tags: ['c'], meta: { version: 1, author: 'Alice' } }
Performance considerations
Deep merge is slower than shallow merge. When processing thousands of product translations, I've measured the difference:
- Shallow merge: O(n), where n is the number of top-level keys
- Deep merge: O(k), where k is the total number of keys across every nesting level (roughly n × d for average depth d)
For large datasets, I pre-flatten structures when possible, merge, then reconstruct. This is 3-5x faster than naive deep merge.
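A minimal sketch of that flatten-then-merge idea. It assumes keys never contain the '.' separator and treats arrays as leaf values; the config objects are reused from the deepMerge example above:

```javascript
// Flatten nested objects into a single-level map of dotted paths
function flatten(obj, prefix = '', out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value && typeof value === 'object' && !Array.isArray(value)) {
      flatten(value, path, out);
    } else {
      out[path] = value; // arrays and primitives are leaves
    }
  }
  return out;
}

// Rebuild the nested structure from dotted paths
function unflatten(flat) {
  const result = {};
  for (const [path, value] of Object.entries(flat)) {
    const parts = path.split('.');
    let node = result;
    for (const part of parts.slice(0, -1)) {
      if (!(part in node)) node[part] = {};
      node = node[part];
    }
    node[parts[parts.length - 1]] = value;
  }
  return result;
}

const config1 = { db: { host: 'localhost', port: 5432 } };
const config2 = { db: { port: 3306, user: 'admin' } };

// Deep merge becomes a single shallow merge of the flattened maps
const merged = unflatten({ ...flatten(config1), ...flatten(config2) });
// { db: { host: 'localhost', port: 3306, user: 'admin' } }
```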
Handling Conflicts
What happens when both objects have the same key? A few strategies:
Last wins (default)
The second value overwrites the first. This is what the spread operator does.
First wins
Keep the original value, ignore the new one.
const merged = { ...obj2, ...obj1 }; // Just reverse the order
Combine arrays
If the values are arrays, concatenate instead of replacing.
function mergeWithArrayConcat(obj1, obj2) {
  const result = { ...obj1 };
  for (const [key, value] of Object.entries(obj2)) {
    if (Array.isArray(result[key]) && Array.isArray(value)) {
      result[key] = [...result[key], ...value];
    } else {
      result[key] = value;
    }
  }
  return result;
}
Custom resolver
Pass a function to decide what to do on conflicts.
function mergeWith(obj1, obj2, resolver) {
  const result = { ...obj1 };
  for (const [key, value] of Object.entries(obj2)) {
    if (key in result) {
      result[key] = resolver(result[key], value, key);
    } else {
      result[key] = value;
    }
  }
  return result;
}
// Usage: keep the larger value
const merged = mergeWith(obj1, obj2, (a, b) => a > b ? a : b);
Conflict Resolution Patterns
When merging translated content from multiple language files or combining product data from different sources, systematic conflict resolution prevents data corruption. Here's what I've learned works in production.
Timestamp-based resolution
Keep the most recent value when both objects have timestamps. I use this pattern when merging product updates from multiple systems.
function mergeByTimestamp(obj1, obj2) {
  const t1 = new Date(obj1.updated_at).getTime();
  const t2 = new Date(obj2.updated_at).getTime();
  return t2 > t1 ? obj2 : obj1;
}
// Usage
const local = { name: 'Product A', price: 19.99, updated_at: '2026-01-20' };
const remote = { name: 'Product A', price: 24.99, updated_at: '2026-01-25' };
const merged = mergeByTimestamp(local, remote);
// Uses remote data because it's newer
Priority-based resolution
Assign priority levels to data sources. When building translation workflows, production translations always override draft translations.
function mergeByPriority(sources) {
  // sources: array of { data, priority } objects
  // Sort ascending so the highest-priority data is spread last and wins
  return [...sources]
    .sort((a, b) => a.priority - b.priority)
    .reduce((merged, source) => ({ ...merged, ...source.data }), {});
}
// Usage
const result = mergeByPriority([
  { data: { text: 'Draft' }, priority: 1 },
  { data: { text: 'Production' }, priority: 10 },
  { data: { author: 'Alice' }, priority: 5 }
]);
// { text: 'Production', author: 'Alice' }
Validation-based resolution
When merging configuration or product attributes, validate values and reject invalid ones. This saved me from deploying broken configs multiple times.
function mergeWithValidation(obj1, obj2, validators) {
  const result = { ...obj1 };
  for (const [key, value] of Object.entries(obj2)) {
    if (validators[key] && !validators[key](value)) {
      console.warn(`Invalid value for ${key}: ${value}, keeping original`);
      continue;
    }
    result[key] = value;
  }
  return result;
}
// Usage
const validators = {
  port: (v) => v > 0 && v < 65536,
  timeout: (v) => v > 0,
  email: (v) => v.includes('@')
};
mergeWithValidation(
  { port: 5432, timeout: 5000 },
  { port: -1, timeout: 3000 }, // Invalid port
  validators
);
// { port: 5432, timeout: 3000 } - kept valid original port
Common Scenarios
Merging API pagination results
async function fetchAllUsers() {
  let allUsers = [];
  let page = 1;
  while (true) {
    const response = await fetch(`/api/users?page=${page}`);
    const data = await response.json();
    if (data.users.length === 0) break;
    allUsers = [...allUsers, ...data.users];
    page++;
  }
  return allUsers;
}
Merging config files
const defaultConfig = require('./config.default.json');
const envConfig = require('./config.production.json');
const config = deepMerge(defaultConfig, envConfig);
Combining multiple JSON files
import json
from pathlib import Path
def merge_json_files(directory, output_file):
    all_data = []
    for json_file in Path(directory).glob('*.json'):
        with open(json_file) as f:
            data = json.load(f)
        if isinstance(data, list):
            all_data.extend(data)
        else:
            all_data.append(data)
    with open(output_file, 'w') as f:
        json.dump(all_data, f, indent=2)
Quick Tool Solution
For one-off merges, use our JSON merger tool. Paste multiple JSON objects or arrays, and it combines them automatically.
- Arrays get concatenated
- Objects get shallow merged (later overwrites earlier)
- Mixing types throws an error (can't merge array with object)
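If you want the same behavior in your own code, the rules above can be sketched like this (mergeDocuments is a hypothetical helper, not the tool's actual implementation):

```javascript
// Concatenate arrays, shallow-merge objects, throw on mixed input types
function mergeDocuments(docs) {
  const allArrays = docs.every(Array.isArray);
  const allObjects = docs.every(d => d && typeof d === 'object' && !Array.isArray(d));
  if (allArrays) return docs.flat();               // one-level flatten = concatenation
  if (allObjects) return Object.assign({}, ...docs); // later keys overwrite earlier
  throw new TypeError("Can't merge arrays with objects");
}

mergeDocuments([[1, 2], [3]]);              // [1, 2, 3]
mergeDocuments([{ a: 1 }, { a: 2, b: 3 }]); // { a: 2, b: 3 }
```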