Add MarshalWithBuffer function to support buffered JSON serialization #5186
chowyu12 wants to merge 3 commits into zeromicro:master
Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests.
Thank you for your contribution and the effort to improve performance! 🙏 However, I have some concerns about this implementation that I'd like to discuss.

Memory Safety Issues

The current buffer pooling approach has a potential memory leak issue:
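(Editorial sketch, not part of the original thread: roughly the pooling pattern this concern is about, with assumed names such as `bufferPool`, `getBuffer`, and `putBuffer` — every buffer goes back to the pool no matter how large it has grown.)

```go
package jsonx

import (
    "bytes"
    "sync"
)

// bufferPool hands out reusable *bytes.Buffer values.
var bufferPool = sync.Pool{
    New: func() any { return new(bytes.Buffer) },
}

func getBuffer() *bytes.Buffer {
    return bufferPool.Get().(*bytes.Buffer)
}

func putBuffer(buf *bytes.Buffer) {
    // Reset drops the contents but keeps the grown capacity, so a buffer
    // that once held a 10 MB payload stays 10 MB while cached in the pool.
    buf.Reset()
    bufferPool.Put(buf)
}
```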
Suggested Improvements

If we want to implement buffer pooling, we should add size limits:

```go
const maxBufferSize = 64 * 1024 // 64KB

func putBuffer(buf *bytes.Buffer) {
    // Only return to pool if buffer isn't too large
    if buf.Cap() <= maxBufferSize {
        buf.Reset()
        bufferPool.Put(buf)
    }
    // Large buffers are discarded and will be GC'd
}
```

Performance Data Request

Could you provide benchmark data comparing:
This would help us understand the actual performance benefits vs. memory safety trade-offs.

Current Recommendation

I'm not recommending merging this PR as-is due to the memory safety concerns. The current approach of creating new buffers might actually be safer for production systems, as:
Would you be interested in implementing the size-limited version, or would you prefer to explore other optimization approaches? Thanks again for contributing to go-zero! 🚀
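(Editorial sketch, not part of the original thread: one way the requested comparison could be benchmarked, assuming `MarshalWithBuffer` lives in the same package; the payload sizes and sub-benchmark names are illustrative.)

```go
package jsonx

import (
    "encoding/json"
    "fmt"
    "strings"
    "testing"
)

// Run with: go test -bench=Marshal -benchmem
// ns/op gives the per-operation time; B/op and allocs/op show the
// allocation behaviour for each payload size.
func BenchmarkMarshal(b *testing.B) {
    for _, size := range []int{64, 4 << 10, 256 << 10} {
        payload := map[string]string{"data": strings.Repeat("x", size)}

        b.Run(fmt.Sprintf("stdlib/%dB", size), func(b *testing.B) {
            for i := 0; i < b.N; i++ {
                if _, err := json.Marshal(payload); err != nil {
                    b.Fatal(err)
                }
            }
        })

        b.Run(fmt.Sprintf("pooled/%dB", size), func(b *testing.B) {
            for i := 0; i < b.N; i++ {
                if _, err := MarshalWithBuffer(payload); err != nil {
                    b.Fatal(err)
                }
            }
        })
    }
}
```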
great advice
… MarshalWithBuffer function and replacing it with a built-in buffer pool to improve performance and simplify the code. Meanwhile, update the relevant test cases to verify the correctness of the new implementation.
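(Editorial sketch, not part of the original thread: one way the "relevant test cases" mentioned above might check correctness, assuming the pooled path should match encoding/json byte-for-byte; the test name and inputs are hypothetical.)

```go
package jsonx

import (
    "encoding/json"
    "testing"
)

// The pooled path should produce exactly the bytes encoding/json produces,
// with no trailing newline, and the result must stay valid after the
// pooled buffer has been reused by a later call.
func TestMarshalWithBufferMatchesStdlib(t *testing.T) {
    in := map[string]any{"name": "go-zero", "count": 3}

    want, err := json.Marshal(in)
    if err != nil {
        t.Fatal(err)
    }
    got, err := MarshalWithBuffer(in)
    if err != nil {
        t.Fatal(err)
    }
    if string(got) != string(want) {
        t.Fatalf("got %q, want %q", got, want)
    }

    // Reuse the pool and make sure the first result was not clobbered.
    if _, err := MarshalWithBuffer(map[string]string{"other": "value"}); err != nil {
        t.Fatal(err)
    }
    if string(got) != string(want) {
        t.Fatalf("result aliased pooled memory: got %q, want %q", got, want)
    }
}
```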
As seen from the benchmark:
But with simplified pool management:

📈 Performance Improvement Summary
🎯 Key Insights
✅ Final Recommendation

Reasons: This is a perfect optimization result! The simplified design not only improves performance but also makes the code clearer and more maintainable.
In production, a key service using fasthttp crashed when an upstream returned large responses. fasthttp's buffer pooling caused all pods to OOM, leading to a major outage. What's the time of each operation in the benchmark? I'd like to keep it as simple as possible.
Hi @chowyu12

```go
func MarshalWithBuffer(v any) ([]byte, error) {
    // ....
    bs := buf.Bytes()
    // Remove trailing newline added by json.Encoder.Encode
    if len(bs) > 0 && bs[len(bs)-1] == '\n' {
        bs = bs[:len(bs)-1]
    }
    res := make([]byte, len(bs))
    copy(res, bs)
    return res, nil
}
```
@yguilai So, you must perform a reset operation before put. https://github.com/ickey-cn/go-zero/blob/sync_pool/core/jsonx/json.go#L24C1-L29C2 |
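(Editorial sketch, not part of the original thread: pulling together the points raised here — the size cap suggested earlier, the reset-before-Put just mentioned, and the copy from the previous snippet. Names like `bufferPool` and `maxBufferSize` are assumptions, not the PR's exact code.)

```go
package jsonx

import (
    "bytes"
    "encoding/json"
    "sync"
)

const maxBufferSize = 64 * 1024 // size cap suggested earlier in the thread

var bufferPool = sync.Pool{
    New: func() any { return new(bytes.Buffer) },
}

func putBuffer(buf *bytes.Buffer) {
    // Reset before Put, and drop oversized buffers so the pool never
    // retains memory grown by an unusually large payload.
    if buf.Cap() <= maxBufferSize {
        buf.Reset()
        bufferPool.Put(buf)
    }
}

func MarshalWithBuffer(v any) ([]byte, error) {
    buf := bufferPool.Get().(*bytes.Buffer)
    defer putBuffer(buf)

    if err := json.NewEncoder(buf).Encode(v); err != nil {
        return nil, err
    }
    bs := buf.Bytes()
    // json.Encoder.Encode appends '\n'; trim it to match json.Marshal.
    if n := len(bs); n > 0 && bs[n-1] == '\n' {
        bs = bs[:n-1]
    }
    // Copy so the caller never holds a slice into the pooled buffer.
    res := make([]byte, len(bs))
    copy(res, bs)
    return res, nil
}
```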
