12/04/2026 18:16

JS2GO EP.38 Buffer and Stream Management in Node.js and Go
#Golang
#Go
#Node.js
#Stream
#Buffer
In a production environment, working with large-scale data — such as
- Reading massive log files
- Downloading data over a network
- Or sending API responses
requires careful consideration of performance, memory efficiency, and system stability.
Both JavaScript (Node.js) and Go (Golang) provide intelligent solutions for handling large data efficiently through two core concepts: Buffer and Stream.
In this article, we’ll explore how to use Buffer and Stream effectively in both Node.js and Go, along with real-world code examples and best practices for production-ready systems. 🚀
1. What is a Buffer?
A Buffer is a temporary memory space used to store data before or during read/write operations.
It allows applications to process large data efficiently without loading everything into memory at once (for example, a 1GB file).
🔹 Buffer in Node.js
In Node.js, the Buffer class handles raw binary data directly.
const fs = require('fs');

fs.readFile('example.txt', (err, data) => {
  if (err) throw err;
  console.log(data);            // Display raw buffer
  console.log(data.toString()); // Convert to string
});
Creating your own buffer:
const buf = Buffer.from('Hello Superdev!');
console.log(buf); // <Buffer 48 65 6c 6c 6f ...>
console.log(buf.toString()); // Hello Superdev!
Advantages of Buffer in Node.js:
✅ Handles binary data directly
✅ Great for file and network I/O operations
✅ More memory-efficient than loading entire data at once
🔹 Buffer in Go
In Go, you can use bytes.Buffer from the bytes package — a dynamic data structure that efficiently stores byte streams.
package main

import (
    "bytes"
    "fmt"
)

func main() {
    var buffer bytes.Buffer
    buffer.WriteString("Hello Superdev!")
    fmt.Println(buffer.String())
}
Advantages of Buffer in Go:
✅ Efficient memory usage
✅ Can append data continuously without creating new strings
✅ Ideal for sequential data processing like log aggregation or network streaming
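To make the log-aggregation point concrete, here is a minimal sketch: each line is appended to a single bytes.Buffer instead of building new strings with +, which would allocate a fresh string on every concatenation. The aggregateLogs helper name is ours, not a standard API:

```go
package main

import (
    "bytes"
    "fmt"
)

// aggregateLogs appends each line to one growing bytes.Buffer,
// avoiding the per-concatenation allocations of string +.
func aggregateLogs(lines []string) string {
    var buf bytes.Buffer
    for _, line := range lines {
        buf.WriteString(line)
        buf.WriteByte('\n')
    }
    return buf.String()
}

func main() {
    out := aggregateLogs([]string{"start", "processing", "done"})
    fmt.Print(out)
}
```

The buffer grows its internal byte slice as needed, so repeated WriteString calls stay cheap even for thousands of log lines.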
2. What is a Stream?
A Stream processes data piece by piece (chunk by chunk) instead of loading it all at once. This approach allows systems to handle large datasets efficiently while minimizing memory consumption.
🔹 Stream in Node.js
In Node.js, almost every I/O operation (File, Network, HTTP) can be performed as a Stream.
const fs = require('fs');
const readable = fs.createReadStream('bigfile.txt', { encoding: 'utf8' });
readable.on('data', chunk => {
  console.log('📦 Received chunk:', chunk.length);
});

readable.on('end', () => {
  console.log('✅ Done reading file');
});
Explanation:
- Node.js file streams read in chunks (64KB by default, configurable via the highWaterMark option).
- No need to load the entire file into memory.
- Perfect for logs, media, or API streaming tasks.
Writing data as a stream:
const writeStream = fs.createWriteStream('output.txt');
writeStream.write('Superdev Stream Example\n');
writeStream.end('✅ Finished writing!');
🔹 Stream in Go
Go uses the io.Reader and io.Writer interfaces to handle data streaming.
package main

import (
    "fmt"
    "io"
    "os"
)

func main() {
    file, err := os.Open("bigfile.txt")
    if err != nil {
        panic(err)
    }
    defer file.Close()

    buffer := make([]byte, 1024)
    for {
        n, err := file.Read(buffer)
        // Process the chunk before checking the error: Read may
        // return the final bytes together with io.EOF.
        if n > 0 {
            fmt.Printf("📦 Read %d bytes\n", n)
        }
        if err == io.EOF {
            break
        }
        if err != nil {
            panic(err)
        }
    }
    fmt.Println("✅ Done reading file")
}
Explanation:
- Reads data in chunks of 1024 bytes.
- Gives developers full control over memory usage.
- The same principle applies to network connections (e.g., net.Conn).
Writing to a stream:
output, err := os.Create("output.txt")
if err != nil {
    panic(err)
}
defer output.Close()

output.Write([]byte("Superdev Stream Example\n"))
output.Write([]byte("✅ Finished writing!"))
3. Comparing Buffer and Stream
| Feature | Node.js | Go |
|---|---|---|
| Buffer Management | Uses Buffer class | Uses bytes.Buffer |
| File Streaming (Read) | fs.createReadStream() | io.Reader |
| File Streaming (Write) | fs.createWriteStream() | io.Writer |
| Network Handling | net.Socket (TCP/HTTP) | net.Conn |
| Memory Control | Automatic | Fine-grained control |
| I/O Performance | Excellent | Excellent, with more predictable memory use at scale |
4. Best Practices for Production Systems
💡 Use Buffer when:
- Data size is small enough to fit into memory
- You’re working with strings, JSON, or small log files
⚡ Use Stream when:
- Data size is large (e.g., file uploads/downloads, network transfers)
- You want to minimize memory footprint
- You’re handling real-time or continuous data such as video streaming or API responses
Node.js:
Perfect for API servers, file streaming, and real-time applications.
Go:
Ideal for backend systems requiring high throughput and low latency, such as data pipelines or log processors.
5. Real-World Use Cases
| Use Case | Recommended Language | Approach |
|---|---|---|
| File Upload / Download | Node.js | Use fs.createReadStream() + .pipe() |
| Real-time Data Processing | Go | Use io.Reader and io.Writer |
| Network Proxy | Go | Use net.Conn + bufio for streaming |
| Video Streaming | Node.js | Use Stream API + HTTP Chunked Responses |
Conclusion
Understanding Buffer and Stream is essential for developers working with large-scale data systems.
By mastering both concepts, you can build software that runs faster, uses less memory, and handles heavier loads efficiently.
✅ Use Stream for large I/O operations
✅ Use Buffer for small or intermediate data handling
💻 Node.js: Flexible, simple, and great for real-time use cases
⚙️ Go: High-performance, concurrency-friendly, and ideal for enterprise systems
Next Episode
In EP.39 of JS2GO, we’ll explore Channels and Pipelines in Go for Data Processing —
diving deep into asynchronous and parallel processing concepts, and how to design pipelines that perform multiple tasks concurrently with maximum efficiency ⚡
Read more
🔵 Facebook: Superdev Academy
🔴 YouTube: Superdev Academy
📸 Instagram: Superdev Academy
🎬 TikTok: https://www.tiktok.com/@superdevacademy?lang=th-TH
🌐 Website: https://www.superdevacademy.com/en