A powerful and efficient Node.js library for streaming JSON processing. Transform JSON strings to objects and objects to JSON strings with support for custom separators, multiple encodings, and high-performance streaming operations.
- Dual Package: Full ES Modules (ESM) and CommonJS (CJS) support
- TypeScript: Complete type definitions included
- High Performance: Based on native Node.js stream methods
- Multiple Encodings: Support for utf8, base64, latin1, binary, and hex
- Custom Separators: Configure start, middle, and end separators
- Memory Efficient: Streaming approach for large JSON datasets
- Zero Dependencies: No external dependencies
## Installation

```bash
npm install @sergdudko/objectstream
```

## Quick Start

```javascript
import { Parser, Stringifer } from '@sergdudko/objectstream';

// String to Object conversion
const parser = new Parser();
parser.on('data', (obj) => {
  console.log('Parsed object:', obj);
});
parser.write('{"name":"John","age":30}');
parser.end();

// Object to String conversion
const stringifer = new Stringifer();
stringifer.on('data', (jsonString) => {
  console.log('JSON string:', jsonString.toString());
});
stringifer.write({ name: 'John', age: 30 });
stringifer.end();
```

### CommonJS

```javascript
const { Parser, Stringifer } = require('@sergdudko/objectstream');

// Or using the default export
const objectstream = require('@sergdudko/objectstream');
const { Parser, Stringifer } = objectstream.default;
```

### TypeScript

```typescript
import { Parser, Stringifer } from '@sergdudko/objectstream';

interface User {
  name: string;
  age: number;
}

const parser = new Parser();
parser.on('data', (user: User) => {
  console.log(`User: ${user.name}, Age: ${user.age}`);
});
```

## API

### Parser

Transform stream that converts JSON strings to JavaScript objects.
#### Constructor

```typescript
new Parser(start?: string, middle?: string, end?: string)
```

- `start` (optional): First separator character (default: none)
- `middle` (optional): Middle separator character (default: none)
- `end` (optional): End separator character (default: none)
#### Methods

- `setEncoding(encoding)`: Set input encoding (`utf8`, `utf-8`, `base64`, `latin1`, `binary`, `hex`)
#### Events

- `data`: Emitted when an object is parsed
- `error`: Emitted when parsing fails
- `end`: Emitted when the stream ends
- `finish`: Emitted when the stream finishes
### Stringifer

Transform stream that converts JavaScript objects to JSON strings.
#### Constructor

```typescript
new Stringifer(start?: string, middle?: string, end?: string)
```

- `start` (optional): First separator character (default: none)
- `middle` (optional): Middle separator character (default: none)
- `end` (optional): End separator character (default: none)
#### Methods

- `setEncoding(encoding)`: Set output encoding (`utf8`, `utf-8`, `base64`, `latin1`, `binary`, `hex`)
#### Events

- `data`: Emitted when a JSON string is generated
- `error`: Emitted when stringification fails
- `end`: Emitted when the stream ends
- `finish`: Emitted when the stream finishes
## Examples

### Basic Usage

```javascript
import { Parser, Stringifer } from '@sergdudko/objectstream';

const parser = new Parser();
const stringifer = new Stringifer();

// Parse JSON string
parser.on('data', (obj) => {
  console.log('Parsed:', obj);
});
parser.write('{"message":"Hello World"}');
parser.end();

// Stringify object
stringifer.on('data', (data) => {
  console.log('Stringified:', data.toString());
});
stringifer.write({ message: 'Hello World' });
stringifer.end();
```

### Custom Separators

```javascript
import { Parser, Stringifer } from '@sergdudko/objectstream';

// Process JSON array with custom separators
const parser = new Parser('[', ',', ']');
const stringifer = new Stringifer('[', ',', ']');

stringifer.on('data', (data) => {
  console.log('JSON Array chunk:', data.toString());
});

// Write multiple objects
stringifer.write({ id: 1, name: 'Alice' });
stringifer.write({ id: 2, name: 'Bob' });
stringifer.write({ id: 3, name: 'Charlie' });
stringifer.end();
// Output: [{"id":1,"name":"Alice"},{"id":2,"name":"Bob"},{"id":3,"name":"Charlie"}]
```

### Encodings

```javascript
import { Parser, Stringifer } from '@sergdudko/objectstream';

// Base64 encoding
const stringifer = new Stringifer();
stringifer.setEncoding('base64');
stringifer.on('data', (data) => {
  console.log('Base64 JSON:', data); // Base64 encoded JSON string
});
stringifer.write({ encoded: true });
stringifer.end();

// Parse Base64 encoded JSON
const parser = new Parser();
parser.setEncoding('base64');
parser.on('data', (obj) => {
  console.log('Decoded object:', obj);
});

// Write base64 encoded JSON
parser.write(Buffer.from('{"decoded":true}').toString('base64'));
parser.end();
```

### Stream Pipelines

```javascript
import { Parser, Stringifer } from '@sergdudko/objectstream';
import { Transform } from 'stream';

// Create a processing pipeline
const parser = new Parser();
const processor = new Transform({
  objectMode: true,
  transform(obj, encoding, callback) {
    // Process each object
    obj.processed = true;
    obj.timestamp = Date.now();
    callback(null, obj);
  }
});
const stringifer = new Stringifer();

// Pipe the streams together
parser
  .pipe(processor)
  .pipe(stringifer)
  .on('data', (data) => {
    console.log('Processed JSON:', data.toString());
  });

// Input data
parser.write('{"name":"test"}');
parser.end();
```

### Error Handling

```javascript
import { Parser } from '@sergdudko/objectstream';

const parser = new Parser();

parser.on('data', (obj) => {
  console.log('Valid object:', obj);
});

parser.on('error', (errors) => {
  console.error('Parsing errors:', errors);
});

// Valid JSON
parser.write('{"valid":true}');
// Invalid JSON
parser.write('{"invalid":}');
parser.end();
```

## Supported Encodings

| Encoding | Input | Output | Description |
|---|---|---|---|
| `utf8` (default) | ✅ | ✅ | Standard UTF-8 text |
| `utf-8` | ✅ | ✅ | Alias for `utf8` |
| `base64` | ✅ | ✅ | Base64 encoded data |
| `latin1` | ✅ | ✅ | Latin-1 encoding |
| `binary` | ✅ | ✅ | Binary data encoding |
| `hex` | ✅ | ✅ | Hexadecimal encoding |
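These names correspond to Node's standard `Buffer` encodings, so the transformations involved can be illustrated with built-ins alone (no library code here):

```javascript
// Round-trip a JSON string through base64 and hex using Node's Buffer,
// the same encodings setEncoding() accepts.
const json = JSON.stringify({ encoded: true });

const asBase64 = Buffer.from(json, 'utf8').toString('base64');
const asHex = Buffer.from(json, 'utf8').toString('hex');

console.log(Buffer.from(asBase64, 'base64').toString('utf8')); // {"encoded":true}
console.log(Buffer.from(asHex, 'hex').toString('utf8'));       // {"encoded":true}
```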
ObjectStream is optimized for high-performance streaming operations:
- Memory Efficient: Processes data in chunks, suitable for large JSON files
- Zero-Copy Operations: Minimizes memory copying where possible
- Stream-Based: Non-blocking operations using Node.js streams
- Optimized Parsing: Efficient JSON parsing with error recovery
The library includes comprehensive TypeScript tests:
```bash
npm test
```

Test coverage includes:
- ✅ Parser functionality with various data types
- ✅ Stringifer functionality with validation
- ✅ Custom separators and encodings
- ✅ Stream piping and event handling
- ✅ Error handling and edge cases
- ✅ Performance benchmarks
- ✅ ESM/CJS compatibility
## Development

```bash
# Install dependencies
npm install

# Run tests
npm test

# Build dual package (ESM + CJS)
npm run build

# Lint code
npm run lint
```

Build output:

```
dist/
├── esm/    # ES Modules build
├── cjs/    # CommonJS build
└── types/  # Shared TypeScript definitions
```

## Contributing

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- v3.x: TypeScript rewrite, dual package support, modern Node.js features
- v2.x: Enhanced performance and encoding support
- v1.x: Initial release with basic streaming functionality
MIT License - see LICENSE file for details.
- 🐛 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions
- 📧 Email: siarhei@dudko.dev
If ObjectStream helps you build amazing applications, consider supporting its development:
- ☕ Buy me a coffee
- 💳 PayPal
- 🎯 Patreon
- More options
Your support helps maintain and improve ObjectStream for the entire community!
Made with ❤️ by Siarhei Dudko