Conversation
This commit adds comprehensive support for resuming interrupted model downloads, significantly improving user experience when downloading large model files.

Key changes:

- Added checkExistingFile() function to validate existing file state before download:
  - Detects complete files and skips re-downloading
  - Identifies partial files for resume capability
  - Handles corrupted files (larger than expected) by removing and re-downloading
  - Returns appropriate size indicators for each state
- Enhanced download logic in StartDownload():
  - Checks for existing files before initiating new downloads
  - Resumes downloads from chunk-aligned offsets to ensure data integrity
  - Properly tracks progress for already-downloaded portions
  - Skips complete files entirely, reporting progress immediately
- Removed pre-download cleanup in Store.Pull():
  - Previously removed models before pulling, preventing resume capability
  - Now preserves existing files to enable resumable downloads
- Added comprehensive test coverage:
  - Tests for nonexistent files
  - Tests for complete files
  - Tests for partial files
  - Tests for corrupted files (larger than expected)
  - Tests for empty files

This implementation ensures that interrupted downloads can be resumed seamlessly, reducing bandwidth usage and improving reliability for large model downloads.

Closes #803
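A minimal sketch of what the checkExistingFile() helper described above might look like. The function name and the three file states (complete, partial, corrupted) come from the commit message; the package name and the exact return convention (downloaded bytes, complete flag, error) are assumptions for illustration, not the actual implementation.

```go
// Sketch only: validates the on-disk state of a partially downloaded file
// before starting or resuming a download. The return signature is assumed.
package hub

import "os"

func checkExistingFile(path string, expectedSize int64) (downloaded int64, complete bool, err error) {
	info, err := os.Stat(path)
	if os.IsNotExist(err) {
		return 0, false, nil // nothing on disk: start a fresh download
	}
	if err != nil {
		return 0, false, err
	}
	switch size := info.Size(); {
	case size == expectedSize:
		return size, true, nil // already complete: skip the download
	case size > expectedSize:
		// corrupted (larger than expected): remove and re-download from scratch
		if err := os.Remove(path); err != nil {
			return 0, false, err
		}
		return 0, false, nil
	default:
		return size, false, nil // partial file: resume from this offset
	}
}
```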
Implement resumable downloads in the model hub using persistent chunk markers instead of relying only on file size. This improves reliability when downloads are interrupted and avoids re-downloading completed chunks.

Changes:

- Add chunkProgress/chunkTracker to persist completed chunk IDs to a per-file JSON marker file (via sonic)
- loadOrCreateTracker loads existing markers or creates a fresh tracker; discards incompatible markers (different fileSize/chunkSize) and corrupted JSON
- Skip chunks already marked complete and resume from the next chunk; align progress to chunk boundaries for data integrity
- On full file completion, remove the marker file and optionally trim the target file to the expected size
- Replace the checkExistingFile-based resume with chunk-level tracking for accurate resume across restarts
- Add tests for chunkTracker: fresh state, persist/reload, incompatible markers, completedBytes, and concurrent markComplete

Closes #803
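A rough sketch of the JSON-marker approach this commit describes, assuming the type names chunkProgress, chunkTracker, loadOrCreateTracker, and markComplete from the commit message. The JSON field names, marker-file path handling, and mutex-guarded write-back are assumptions; only the overall behavior (discard incompatible or corrupted markers, persist completed chunk IDs via sonic) is taken from the commit.

```go
// Sketch only: persist completed chunk IDs to a JSON marker file using sonic.
// Field and file-path conventions are assumptions for illustration.
package hub

import (
	"os"
	"sync"

	"github.com/bytedance/sonic"
)

type chunkProgress struct {
	FileSize  int64          `json:"fileSize"`
	ChunkSize int64          `json:"chunkSize"`
	Completed map[int64]bool `json:"completed"` // completed chunk IDs
}

type chunkTracker struct {
	mu         sync.Mutex
	progress   chunkProgress
	markerPath string
}

// loadOrCreateTracker reuses an existing marker when it matches the current
// fileSize/chunkSize, and otherwise starts from a fresh, empty tracker.
func loadOrCreateTracker(markerPath string, fileSize, chunkSize int64) *chunkTracker {
	fresh := &chunkTracker{
		markerPath: markerPath,
		progress: chunkProgress{
			FileSize:  fileSize,
			ChunkSize: chunkSize,
			Completed: map[int64]bool{},
		},
	}
	data, err := os.ReadFile(markerPath)
	if err != nil {
		return fresh // no marker yet
	}
	var p chunkProgress
	if sonic.Unmarshal(data, &p) != nil || p.FileSize != fileSize || p.ChunkSize != chunkSize {
		return fresh // corrupted JSON or incompatible marker: start over
	}
	if p.Completed == nil {
		p.Completed = map[int64]bool{}
	}
	fresh.progress = p
	return fresh
}

// markComplete records one finished chunk and rewrites the marker file.
func (t *chunkTracker) markComplete(chunkID int64) error {
	t.mu.Lock()
	defer t.mu.Unlock()
	t.progress.Completed[chunkID] = true
	data, err := sonic.Marshal(t.progress)
	if err != nil {
		return err
	}
	return os.WriteFile(t.markerPath, data, 0o644)
}
```

Rewriting the whole JSON document on every markComplete is what motivates the binary format in the next commit: every completed chunk forces a full re-serialization under a lock.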
Replace the JSON (sonic) chunk-progress marker with a fixed binary marker file format to reduce overhead, avoid locking, and simplify concurrent writes. Each goroutine writes only its own chunk byte via WriteAt.

Changes:

- Define the binary marker layout: magic "NXCK", fileSize, chunkSize, totalChunks (24-byte header), then one byte per chunk (0x00/0x01)
- chunkTracker: in-memory byte slice plus an open file for WriteAt; remove the mutex and map; markComplete writes a single byte at the chunk's offset
- loadOrCreateTracker: parse the binary header and validate magic/sizes; incompatible or corrupted markers start fresh; initMarkerFile writes the header plus zeroed chunk bytes
- Add close() to release the file handle; remove() calls close() and then deletes the marker; defer closing all trackers when the download finishes
- Discard old JSON markers (bad magic) and overwrite them with the binary format for seamless migration

Tests:

- Add defer tracker.close() in all tests that create trackers
- Rename the corrupted-JSON test to corrupted-marker; add a binary layout test (magic, sizes, chunk bytes) and a migration-from-JSON test
- Concurrent test: reload from disk to verify WriteAt persistence
- Add a markComplete out-of-range chunkID test and a close-idempotence test
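A sketch of the binary marker layout this commit describes: a 24-byte header (magic "NXCK", fileSize, chunkSize, totalChunks) followed by one byte per chunk, where 0x00 means pending and 0x01 means complete. The field widths (two uint64 sizes plus a uint32 count, which is what fits a 24-byte header) and little-endian byte order are assumptions; the structure, the single-byte WriteAt in markComplete, and the idempotent close() follow the commit message.

```go
// Sketch only: binary marker file for resumable downloads. Endianness and
// exact field widths are assumed; one goroutine per chunk can call
// markComplete concurrently because each write touches a distinct byte.
package hub

import (
	"encoding/binary"
	"errors"
	"os"
)

const markerHeaderSize = 24 // 4B magic + 8B fileSize + 8B chunkSize + 4B totalChunks

var markerMagic = [4]byte{'N', 'X', 'C', 'K'}

type chunkTracker struct {
	chunks []byte   // in-memory copy of the per-chunk status bytes
	file   *os.File // open marker file, written with WriteAt
}

// initMarkerFile writes the header followed by zeroed chunk bytes.
func initMarkerFile(f *os.File, fileSize, chunkSize int64, totalChunks uint32) error {
	buf := make([]byte, markerHeaderSize+int(totalChunks))
	copy(buf[0:4], markerMagic[:])
	binary.LittleEndian.PutUint64(buf[4:12], uint64(fileSize))
	binary.LittleEndian.PutUint64(buf[12:20], uint64(chunkSize))
	binary.LittleEndian.PutUint32(buf[20:24], totalChunks)
	// chunk bytes after the header are already zeroed (0x00 = not downloaded)
	_, err := f.WriteAt(buf, 0)
	return err
}

// markComplete flips one chunk's byte in memory and on disk; no mutex is
// needed because each chunkID maps to its own byte offset.
func (t *chunkTracker) markComplete(chunkID int) error {
	if chunkID < 0 || chunkID >= len(t.chunks) {
		return errors.New("chunkID out of range")
	}
	t.chunks[chunkID] = 0x01
	_, err := t.file.WriteAt([]byte{0x01}, int64(markerHeaderSize+chunkID))
	return err
}

// close releases the marker file handle and is safe to call more than once.
func (t *chunkTracker) close() error {
	if t.file == nil {
		return nil
	}
	err := t.file.Close()
	t.file = nil
	return err
}
```

Migration from the old JSON markers falls out naturally: a JSON file cannot start with the "NXCK" magic, so loadOrCreateTracker treats it as incompatible, discards it, and initMarkerFile overwrites it with the binary layout.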
Force-pushed from addea77 to 384baa0.
Force-pushed from 384baa0 to 2e43ff0.
RemiliaForever approved these changes on Feb 6, 2026.
Closes #1033
Closes #803