Commit b134ccb
fix: Use lazy Stream operations in decode_json_stream to avoid memory spike (#3799)
## Summary
Changed `Enum.filter` and `Enum.map` to `Stream.filter` and `Stream.map`
in `decode_json_stream/1` to prevent loading the entire shape log into
memory at once.
## Problem
The `decode_json_stream/1` function in `Materializer` broke lazy
evaluation partway through its pipeline:
```elixir
stream
|> Stream.map(&Jason.decode!/1) # lazy ✓
|> Enum.filter(...) # EAGER - materializes entire stream!
|> Enum.map(...) # eager
```
When a shape has a long history (millions of operations), this caused a
massive memory spike during Materializer startup, because `Enum.filter/2`
realizes the entire stream into a list before processing continues.
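For a standalone illustration (not code from this repository; the input range and predicate below are invented for the demo, only the `Stream`, `Enum`, and `Jason` calls are real APIs), a single eager step is enough to force everything upstream into memory:
```elixir
# Build a lazy stream of JSON lines, standing in for the shape log.
lines = Stream.map(1..1_000_000, &Jason.encode!(%{"op" => &1}))

# Eager: Enum.filter/2 walks the whole stream and returns a full list here,
# so all one million decoded maps exist in memory at the same moment.
_eager = lines |> Stream.map(&Jason.decode!/1) |> Enum.filter(&(&1["op"] > 0))

# Lazy: Stream.filter/2 only composes; nothing runs until a consumer asks.
lazy = lines |> Stream.map(&Jason.decode!/1) |> Stream.filter(&(&1["op"] > 0))
Enum.reduce(lazy, 0, fn _record, count -> count + 1 end)
```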
## Solution
Replace the eager `Enum` functions with lazy `Stream` equivalents:
```elixir
stream
|> Stream.map(&Jason.decode!/1) # lazy ✓
|> Stream.filter(...) # lazy ✓
|> Stream.map(...) # lazy ✓
```
The downstream `apply_changes/2` function uses `Enum.reduce`, which
correctly consumes the stream one element at a time, so records now flow
through the entire pipeline lazily.
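As a hedged sketch of that consumption pattern (the real `apply_changes/2` lives in the Materializer and does more than this; `update_state/2` is a hypothetical stand-in):
```elixir
defmodule MaterializerSketch do
  # Enum.reduce/3 demands one element from the lazy stream at a time, so
  # only the record currently being folded into the accumulator is live.
  def apply_changes(stream, initial_state) do
    Enum.reduce(stream, initial_state, fn record, state ->
      # update_state/2 stands in for the real per-record logic; each record
      # becomes garbage-collectable once it has been folded into the state.
      update_state(state, record)
    end)
  end

  defp update_state(state, record), do: Map.put(state, record["key"], record)
end
```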
## Test Plan
- [x] All existing tests pass
---
Generated with [Claude Code](https://claude.com/claude-code)
---------
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
File tree: 2 files changed, +7 −2
- `.changeset` (new changeset file, 5 lines added)
- `packages/sync-service/lib/electric/shapes/consumer` (2 additions, 2 deletions, around lines 225 and 228)