# FAQ - uhop/stream-json GitHub Wiki
## Why do I get "Cannot find module" or "ERR_MODULE_NOT_FOUND"?
Always include the `.js` extension when importing stream-json modules:
```javascript
// ✅ Correct
const {parser} = require('stream-json/parser.js');
const {pick} = require('stream-json/filters/pick.js');

// ❌ Wrong - fails under Node.js subpath-exports resolution
const {parser} = require('stream-json/parser');     // missing the .js extension
const {pick} = require('stream-json/filters/Pick'); // wrong case and missing .js
```
Node.js does not automatically resolve file extensions for packages that use subpath exports. The stream-json package uses `"./*": "./src/*"` in its `exports` field, so the full path, including the `.js` extension, is required.

This applies to both `require()` and `import`.
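The relevant part of stream-json's package.json looks like this (abridged; the pattern on the left must match your specifier exactly before Node.js maps it to a file via the target on the right):

```json
{
  "exports": {
    "./*": "./src/*"
  }
}
```

Because the target `./src/*` is used verbatim, `stream-json/parser` maps to `./src/parser`, which does not exist, while `stream-json/parser.js` maps to `./src/parser.js`, which does.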
## How do I use stream-json with ESM import?
stream-json is a CommonJS package, but Node.js can import CommonJS modules from `.mjs` files or when your package.json has `"type": "module"`. Use the `.js` extension in all import paths:
```javascript
import {parser} from 'stream-json/parser.js';
import chain from 'stream-chain';
import pick from 'stream-json/filters/pick.js';
import replace from 'stream-json/filters/replace.js';
import ignore from 'stream-json/filters/ignore.js';
import filter from 'stream-json/filters/filter.js';
import streamValues from 'stream-json/streamers/stream-values.js';
import streamArray from 'stream-json/streamers/stream-array.js';
import streamObject from 'stream-json/streamers/stream-object.js';
import stringer from 'stream-json/stringer.js';
import emitter from 'stream-json/emitter.js';
import Assembler from 'stream-json/assembler.js';
import disassembler from 'stream-json/disassembler.js';
import batch from 'stream-json/utils/batch.js';
import verifier from 'stream-json/utils/verifier.js';
import emit from 'stream-json/utils/emit.js';
import jsonlParser from 'stream-json/jsonl/parser.js';
import jsonlStringer from 'stream-json/jsonl/stringer.js';
```
Note that some modules export a default function while others use named exports. Filters and streamers export their factory as a named property (e.g., `pick.pick`, `streamArray.streamArray`), so you can use either style:
```javascript
// default import - the module itself is the factory
import pick from 'stream-json/filters/pick.js';
chain([source, parser(), pick({filter: 'data'}), streamValues()]);
```

```javascript
// named import - destructure the named export
import {pick} from 'stream-json/filters/pick.js';
chain([source, parser(), pick({filter: 'data'}), streamValues()]);
```
## How do I parse a large JSON array?
The most common use case: read a huge JSON file containing an array of objects and process each one with minimal memory.
```javascript
const fs = require('node:fs');
const {chain} = require('stream-chain');
const streamArray = require('stream-json/streamers/stream-array.js');

const pipeline = chain([
  fs.createReadStream('large-file.json'),
  streamArray.withParser(),
]);

pipeline.on('data', ({key, value}) => {
  // key is the array index, value is the parsed object
  console.log(key, value);
});
pipeline.on('end', () => console.log('done'));
```
`streamArray.withParser()` combines a JSON parser with `StreamArray` in one pipeline step. Each array element is emitted as `{key, value}` without keeping the whole array in memory.
For `async`/`await`, Node.js streams are async-iterable:
```javascript
// inside an async function, or at top level in an ES module
const pipeline = chain([
  fs.createReadStream('large-file.json'),
  streamArray.withParser(),
]);

for await (const {key, value} of pipeline) {
  console.log(key, value);
}
```
See Recipe: streaming basics for more shapes (objects, JSONL) and tips.
## How can I improve performance?
There is a dedicated page for that: Performance.