Live Collab M4 - Segment Editing #9124
Conversation
…th BE; some other misc stuff
…dating of segment
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings. Use the following commands to manage reviews:
Use the checkboxes below for quick actions:
📝 Walkthrough
Large frontend and test refactor adding volume/segment merge/metadata/group update actions and handlers, introducing volume-diffing sagas, renaming position fields to anchorPosition, moving utility types to a new module, adding many test fixtures/helpers, and adding a debugging visualizer with server and data output.
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
🚥 Pre-merge checks: ✅ 4 passed | ❌ 1 failed (1 warning)
… update mechanism of segments
…egment group action
…freading saga to finish (also check segment existence)
Actionable comments posted: 18
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
frontend/javascripts/viewer/view/ai_jobs/components/bounding_box_selection.tsx (1)
69-79: ⚠️ Potential issue | 🟠 Major
Update optionFilterProp to use the correct Ant Design v6 API. The optionFilterProp prop should be nested within the showSearch object, not used as a standalone prop. Change showSearch and optionFilterProp="children" to showSearch={{ optionFilterProp: "label" }} to align with Ant Design v6's semantic API. Example fix:

```diff
 <Select
-  showSearch
+  showSearch={{ optionFilterProp: "label" }}
   placeholder="Select a bounding box"
-  optionFilterProp="children"
   options={options}
   filterOption={filterOption}
   disabled={userBoundingBoxes.length < 1}
   onSelect={setSelectedBoundingBoxId}
   style={style}
   value={value}
 />
```

Optionally, consider adding a prefix (e.g., an icon) for better visual context if it aligns with the UI design.
frontend/javascripts/test/sagas/proofreading/proofreading_skeleton_test_utils.ts (1)
38-79: ⚠️ Potential issue | 🟠 Major
Traverse adjacency as undirected to avoid missing nodes. With a directed adjacency list (as produced by AgglomerateMapping), the BFS only follows outgoing edges. If merges connect through a non-root node, nodes reachable only via incoming edges will be skipped, producing incomplete skeletons.
🐛 Suggested fix (include incoming edges during BFS)

```diff
 while (queue.length) {
   const n = queue.shift()!;
-  const neighbours = adjacencyList.get(n);
-  if (!neighbours) continue;
+  const neighbours = new Set<number>();
+  for (const nb of adjacencyList.get(n) ?? []) neighbours.add(nb);
+  for (const [node, outs] of adjacencyList) {
+    if (outs.has(n)) neighbours.add(node);
+  }
+  if (neighbours.size === 0) continue;
   for (const nb of neighbours) {
     if (!visited.has(nb)) {
       visited.add(nb);
       queue.push(nb);
     }
   }
 }
```

frontend/javascripts/viewer/view/version_entry.tsx (1)
386-386: ⚠️ Potential issue | 🟡 Minor
Same typo: double space in description string.
✏️ Proposed fix

```diff
- description: `Updated the segment with id ${firstAction.value.id} in the segments list  of layer ${layerName}.`,
+ description: `Updated the segment with id ${firstAction.value.id} in the segments list of layer ${layerName}.`,
```
🤖 Fix all issues with AI agents
In `@frontend/javascripts/libs/utils.ts`:
- Around line 1454-1464: The function replaceOrAdd incorrectly uses
Array.prototype.splice and returns its return value (the removed elements)
instead of the modified array; update replaceOrAdd (operating on volumes,
newVolumeTracing and predicate) to produce a new array with the replacement:
create a shallow copy of volumes (e.g., via volumes.slice() or [...volumes]),
then if index === -1 return the copy with newVolumeTracing appended, otherwise
call splice on the copy to replace the element at index and return that copy (or
alternatively use map to replace the matching element and return the new mapped
array).
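The fix described in this prompt can be sketched as a small non-mutating helper. The names and signature below are hypothetical (taken from the prompt, not from the actual `libs/utils.ts`):

```typescript
// Hypothetical sketch of the non-mutating replaceOrAdd described above;
// the real function in frontend/javascripts/libs/utils.ts may differ in shape.
function replaceOrAdd<T>(
  items: readonly T[],
  newItem: T,
  predicate: (item: T) => boolean,
): T[] {
  const copy = [...items];
  const index = copy.findIndex(predicate);
  if (index === -1) {
    // Not found: append to the copy.
    copy.push(newItem);
  } else {
    // Found: replace within the copy and return the copy,
    // not splice's return value (which is the array of removed elements).
    copy.splice(index, 1, newItem);
  }
  return copy;
}
```

The key point is returning the copied array itself; the bug flagged above came from returning the result of `splice`, which is the list of removed elements.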
In
`@frontend/javascripts/test/sagas/proofreading/proofreading_mesh_interaction.spec.ts`:
- Around line 553-555: The test "should perform partitioned min-cut correctly"
uses a mocked segment item for agglomerate 6 with an impossible anchor position
(1337^3), which likely invalidates the partitioned min-cut behaviour; locate
where the mock for agglomerate/segment item 6 is constructed in
proofreading_mesh_interaction.spec.ts (the test block for "should perform
partitioned min-cut correctly") and replace the anchor position with a realistic
coordinate consistent with the mesh/voxel space used by other mocks (or derive
it from the same helper that computes anchors for other agglomerates), then
update any expected outcomes/assertions in that test to match the corrected
anchor; run the test suite to ensure partitioned min-cut assertions now reflect
the corrected mock.
In `@frontend/javascripts/test/sagas/volumetracing/volume_diffing.spec.ts`:
- Around line 173-218: The test setup mistakenly checks
segmentItemsToCreateInSetup.includes(id1) twice so createSegment2 never runs
when only id2 is present; update the second guard to check
segmentItemsToCreateInSetup.includes(id2) so newState applies createSegment2 via
VolumeTracingReducer when id2 is in the array (affecting the
mergeSegmentsAction/id1/id2 scenario used by uncachedDiffSegmentLists and the
expectations around VOLUME_TRACING_ID).
In `@frontend/javascripts/viewer/api/wk_dev.ts`:
- Line 15: The config sets liveCollab: true (in the object in wk_dev.ts) which
must be reverted to opt-in; change liveCollab back to false (or remove the
property so default is false) and update the inline comment from "todop: revert"
to a standard "TODO: revert before merge" so IDE/scan tools pick it up; ensure
the change is made on the liveCollab property in wk_dev.ts and run any local dev
checks to confirm behavior.
In `@frontend/javascripts/viewer/model/helpers/diff_helpers.ts`:
- Around line 121-137: The isDescendant function currently mutates the array
stored in childrenMap because stack is a direct reference and pop() changes it;
change the initialization of stack in isDescendant to use a shallow copy of the
children array (e.g., spread or slice) so popping does not mutate childrenMap,
keep the rest of the logic (visited Set, popping, pushing
childrenMap.get(current) ?? []) intact to preserve DFS behavior.
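A non-mutating version of the DFS described in this prompt might look like the following sketch (identifiers are taken from the prompt, not from the actual `diff_helpers.ts` module):

```typescript
// Hypothetical sketch of isDescendant with a copied stack, as described above.
// childrenMap maps a group id to the ids of its direct children.
function isDescendant(
  childrenMap: Map<number, number[]>,
  root: number,
  candidate: number,
): boolean {
  const visited = new Set<number>();
  // Copy the children array so pop() cannot mutate the array stored in childrenMap.
  const stack = [...(childrenMap.get(root) ?? [])];
  while (stack.length > 0) {
    const current = stack.pop()!;
    if (current === candidate) return true;
    if (visited.has(current)) continue;
    visited.add(current);
    stack.push(...(childrenMap.get(current) ?? []));
  }
  return false;
}
```

The only behavioral change versus the flagged code is the shallow copy on initialization; pushing `childrenMap.get(current) ?? []` elements individually (rather than the array reference) keeps later pops from touching the map's stored arrays.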
In `@frontend/javascripts/viewer/model/reducers/volumetracing_reducer_helpers.ts`:
- Around line 462-464: Remove the debug logging calls
(ColoredLogger.logGreen(...) around "merging segments" and the prints of
sourceSegment and targetSegment) from the merge logic so production runs aren't
cluttered; delete the three ColoredLogger.logGreen(...) lines that reference
sourceSegment and targetSegment in the volumetracing reducer helper, and
afterwards run a quick build/lint to remove or delete the ColoredLogger import
if it becomes unused.
In `@frontend/javascripts/viewer/model/sagas/saving/rebasing_helpers_sagas.ts`:
- Around line 2-4: The file imports lodash helpers from "lodash/..." which
conflicts with the project's declared dependency "lodash-es"; update the imports
to use "lodash-es" (either individual paths like "lodash-es/flattenDeep" and
"lodash-es/omitBy" or a single named import from "lodash-es") so that symbols
like flattenDeep and omitBy are imported from lodash-es, leaving other imports
(ColoredLogger, getAdaptToTypeFunction) unchanged.
In
`@frontend/javascripts/viewer/view/right-border-tabs/trees_tab/tree_hierarchy_view_helpers.ts`:
- Around line 292-319: createGroupHelper is creating newGroup without the
required isExpanded property; update it to either call the existing
makeBasicGroupObject(name, newGroupId) to construct newGroup (which sets
isExpanded: false) or explicitly add isExpanded: false to the newGroup literal,
and keep the rest of the logic using MISSING_GROUP_ID, cloneDeep, and callDeep
unchanged; also remove returning newGroupId (return only newSegmentGroups and
newGroup) unless the helper signature must mirror another API that expects
newGroupId to be returned.
In `@frontend/javascripts/viewer/view/version_entry.tsx`:
- Line 372: Fix the minor typo in the description string by removing the extra
space between "list" and "of" in the object that builds the description (the
line using description: `Updated the segment with id ${firstAction.value.id} in
the segments list of layer ${layerName}.`); locate the string near the usage of
firstAction and layerName in version_entry.tsx and update it to a single space
between "list" and "of".
In `@tools/debugging/version-visualizer/public/main.js`:
- Around line 336-339: renderGraph currently reads width/height from the global
main element causing skeletonRenderer (and any other renderer) to size
incorrectly when a different parent (e.g., main2) is used; update renderGraph to
obtain sizing from the renderer's actual container (use a property like
this.container or this.rendererParent or accept a parentElement argument)
instead of the global main, and ensure any code that constructs or calls
renderGraph (including skeletonRenderer initialization) passes or sets the
correct container element so width/height are computed against that element's
clientWidth/clientHeight rather than global main.
- Around line 190-193: Rename the unused reviver parameter in the JSON.parse
call: in the creation of storeState (the JSON.parse(..., (key, value) => { ...
}) reviver) change the first parameter name from key to _key so the linter stops
flagging an unused variable; keep the logic that checks value != null && typeof
value === "object" && value._isDiffableMap and calls parseDiffableMap(value)
unchanged.
- Around line 445-450: Remove the unused d3 line generator by deleting the const
line = d3.line()... declaration (the variable named "line" that calls
d3.line().x(...).y(...).curve(d3.curveBundle.beta(0.5))). The rendering uses SVG
<line> elements (x1,y1,x2,y2) so you should simply remove the entire "line"
variable declaration from main.js to eliminate dead code and any unused d3.curve
references.
- Around line 208-233: The current build of tableHTML inserts unescaped segment
values (e.g., s.name and s.anchorPosition) into innerHTML on segmentViewDiv
which enables XSS; replace this with safe DOM construction: iterate
Array.from(segments.values()) and for each segment create tr/td elements via
document.createElement, set their text via textContent (or explicitly escape
strings) for s.id, s.name and JSON.stringify(s.anchorPosition), append the rows
to a tbody, then replace segmentViewDiv.innerHTML by first clearing children and
appending the constructed table element; update the logic around tableHTML,
segments, and segmentViewDiv accordingly.
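In browser code, building rows via `document.createElement` and `textContent` (as the prompt suggests) is the robust route. Purely as an illustration of the "explicitly escape strings" alternative mentioned above, a hypothetical escaping helper could look like this:

```typescript
// Hypothetical escapeHtml helper illustrating the "explicitly escape strings"
// alternative; prefer textContent on created DOM nodes where a DOM is available.
function escapeHtml(raw: string): string {
  return raw
    .replace(/&/g, "&amp;") // must run first so later entities are not double-escaped
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}
```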
In `@tools/debugging/version-visualizer/server.js`:
- Around line 13-24: Wrap the body of the app.get("/versions", (req, res) => {
... }) handler in a try/catch, rename the unused req parameter to _req to
satisfy linting, and add a guard that checks fs.existsSync(DATA_DIR) (or handle
ENOENT) before calling fs.readdirSync; on error return an appropriate HTTP
response (e.g., res.status(500).json({ error: 'Failed to read versions' }) or
res.status(404).json([])) and log the error, while preserving the existing logic
that reads files with fs.readFileSync, JSON.parse and sorts by version.
- Around line 30-50: The route handler for app.get("/creationTime") declares an
unused req parameter and has inconsistent indentation in the lines building the
files array; update the handler signature to either remove req or rename it to
_req to satisfy the linter, and fix indentation of the block that reads DATA_DIR
(the const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
and the subsequent if (files.length === 0) { ... } and const fileName =
files[0]; lines) so they align with the surrounding code and maintain consistent
two-space indentation; keep the rest of the logic (filePath, stat, creationTime,
error handling) unchanged.
In `@unreleased_changes/9124.md`:
- Line 3: Fix the typo in the release note sentence that currently reads "In the
past, this only happend when the automatic meshing feature was enabled." by
changing "happend" to "happened" so the sentence reads "...this only happened
when the automatic meshing feature was enabled."; update the string in the
unreleased change entry (the sentence starting "When proofreading, segment items
are now always created for new agglomerates.") accordingly.
In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/MetadataEntry.scala`:
- Around line 20-21: Remove the TODOM note in MetadataEntry.scala and choose one
implementation approach (keep the current
single-option-with-multiple-optional-fields or switch to a sealed trait/ADT with
one case per metadata type); update the code accordingly and delete the TODOM
comment so the design decision is recorded in code or documentation (referencing
the MetadataEntry type) and no TODO markers remain.
In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeUpdateActions.scala`:
- Around line 587-612: The (None, Some(targetSegment)) branch constructs a new
Segment and drops fields from targetSegment; change it to start from
targetSegment.copy(...) so all existing fields (color, coordinates, etc.) are
preserved, overriding only segmentId to sourceId, creationTime to
actionTimestamp, isVisible/name/metadata as needed (use mergeSegmentNames(...)
for name and keep targetSegment.isVisible), i.e., replace the manual
Segment(...) with targetSegment.copy(...) updating those specific fields so the
rest of targetSegment's data remains intact.
🧹 Nitpick comments (20)
frontend/javascripts/viewer/model/sagas/saving/save_queue_draining_saga.tsx (1)
181-183: Avoid committed commented-out debug logs. If this log is still useful, gate it behind a dev flag and use the logger; otherwise remove it to reduce noise.
🧹 Suggested cleanup

```diff
- // ColoredLogger.logRed("sendSaveRequestToServer fullSaveQueue", fullSaveQueue);
```

frontend/javascripts/viewer/model/helpers/action_logger_middleware.ts (1)
55-59: Consider removing commented code and optionally preserving payload visibility. The console grouping is now commented out, and logging has been simplified to show only action.type instead of the full action object. While this reduces log verbosity (likely beneficial with increased action volume from live collaboration), it also removes payload visibility that can be valuable during debugging. Consider:
- Remove commented lines: Lines 55 and 59 should be deleted rather than commented out; git history preserves them if needed later.
- Optional enhancement: If detailed debugging is occasionally needed, you could conditionally log the full action based on an additional flag or action type filter.
♻️ Proposed cleanup

```diff
 if (WkDevFlags.logActions) {
-  // console.group(action.type);
   console.info("dispatching", action.type);
   let result = next(action);
-  // console.log('next state', store.getState())
-  // console.groupEnd();
   return result;
 }
```

frontend/javascripts/test/sagas/volumetracing/segment_group_fixtures.ts (1)
139-141: Minor formatting: extra blank line. There's a stray blank line at line 140 that breaks consistency with the other fixture entries.
🧹 Remove extra blank line

```diff
   isExpanded: false,
-
   children: [
```

frontend/javascripts/viewer/model/sagas/undo_saga.ts (1)
47-49: Tighten typing to avoid the string[] cast (and the any cast downstream). You can preserve type safety by typing the list and using a small type guard instead of casting. This keeps includes type-safe and avoids widening action.
♻️ Suggested refactor

```diff
-const UndoRedoRelevantBoundingBoxActions = AllUserBoundingBoxActions.filter(
-  (action) => action !== "SET_USER_BOUNDING_BOXES",
-);
+const UndoRedoRelevantBoundingBoxActions: ReadonlyArray<UserBoundingBoxAction["type"]> =
+  AllUserBoundingBoxActions.filter(
+    (action): action is UserBoundingBoxAction["type"] =>
+      action !== "SET_USER_BOUNDING_BOXES",
+  );
+
+const isUserBoundingBoxAction = (action: Action): action is UserBoundingBoxAction =>
+  UndoRedoRelevantBoundingBoxActions.includes(action.type);
```

```diff
-  } else if ((UndoRedoRelevantBoundingBoxActions as string[]).includes(action.type)) {
+  } else if (isUserBoundingBoxAction(action)) {
     return {
-      userBoundingBoxAction: action as any as UserBoundingBoxAction,
+      userBoundingBoxAction: action,
     };
   }
```

Also applies to: 123-126
test/backend/VolumeUpdateActionsUnitTestSuite.scala (2)
246-292: Minor inconsistency in assertion variable. The test logic is correct, but line 286 uses segment.segmentId from the first result instead of segment2.segmentId from the second result. While the values are identical (both are segment ID 5), using segment2 would be more consistent with the surrounding assertions.
Proposed fix

```diff
 val segmentMetadata2 = segment2.metadata.map(MetadataEntry.fromProto)
-assert(segment.segmentId == updateMetadataAction.id)
+assert(segment2.segmentId == deleteMetadataAction.id)
 assert(
```
496-528: Consider adding edge case tests for delete operations. The recursive deletion tests are solid, but consider adding coverage for:
- Deleting a leaf group (e.g., groupId4, groupId6, or groupId7) to verify no unintended side effects
- Deleting a non-existent group to verify graceful handling
These edge cases would strengthen confidence in the delete implementation.
frontend/javascripts/viewer/model/sagas/saving/save_mutex_saga.tsx (1)
266-269: Template literal whitespace may affect display. The multi-line template literal preserves the leading whitespace from lines 267-269, which will result in extra spaces in the rendered toast message. Consider using string concatenation or removing the indentation to ensure clean message formatting.
🔧 Suggested fix

```diff
 Toast.warning(
-  `Could not get the annotations write-lock for more than ${MAX_AD_HOC_RETRY_TIME / 1000} seconds.
-  User ${blockingUserName} is currently blocking the annotation.
-  This might be due to using non-live collab supported features.
+  `Could not get the annotations write-lock for more than ${MAX_AD_HOC_RETRY_TIME / 1000} seconds. ` +
+  `User ${blockingUserName} is currently blocking the annotation. ` +
+  `This might be due to using non-live collab supported features. ` +
   Ensure they are sticking to tools supporting live collaboration.`,
```

frontend/javascripts/viewer/model/reducers/skeletontracing_reducer.ts (1)
1319-1321: Consider addressing or tracking the TODO. The todop: prefix suggests this is a personal reminder. If this represents a known gap that should be addressed, consider either:
- Creating a follow-up issue to track the uniqueness enforcement
- Removing the TODO if uniqueness is already handled elsewhere (e.g., during metadata merging as mentioned in PR objectives)
frontend/javascripts/test/sagas/save_mutex_saga.spec.ts (1)
411-447: Consider simplifying the tool parameterization structure. Wrapping AnnotationTool values in { tool: ... } objects adds indirection. If only the tool is needed for parameterization, the simpler array [AnnotationTool.SKELETON, AnnotationTool.BOUNDING_BOX] with direct access would be cleaner:

```js
describe.each([AnnotationTool.SKELETON, AnnotationTool.BOUNDING_BOX])(
  "[With AnnotationTool=$id]:",
  (tool) => { ... }
)
```

However, if this structure anticipates future properties per tool entry, the current approach is fine.
frontend/javascripts/test/helpers/agglomerate_mapping_helper.ts (1)
118-130: Clarify directed vs. undirected edge semantics.
The adjacency list stores edges directionally (only A→B in addEdge at Line 79), but removeEdge attempts to delete both A→B and B→A (Lines 119-124). This asymmetry means didRemove may be true for only one direction. The BFS traversal (Lines 150-161) correctly handles this by checking both directions during graph traversal, so the split logic works correctly. However, the code could be clearer about whether the graph is intended to be directed or undirected.
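One way to make the storage symmetric (undirected) can be sketched as follows. The class below is hypothetical and not the real AgglomerateMapping helper:

```typescript
// Hypothetical sketch of symmetric (undirected) edge storage; the real
// AgglomerateMapping helper in the test utilities stores edges directionally.
class UndirectedAdjacency {
  private adjacency = new Map<number, Set<number>>();

  addEdge(a: number, b: number): void {
    // Store both directions so traversal never has to scan for incoming edges.
    if (!this.adjacency.has(a)) this.adjacency.set(a, new Set());
    if (!this.adjacency.has(b)) this.adjacency.set(b, new Set());
    this.adjacency.get(a)!.add(b);
    this.adjacency.get(b)!.add(a);
  }

  removeEdge(a: number, b: number): boolean {
    // With symmetric storage, both directions stay in sync.
    const removedAB = this.adjacency.get(a)?.delete(b) ?? false;
    const removedBA = this.adjacency.get(b)?.delete(a) ?? false;
    return removedAB && removedBA;
  }

  neighbours(n: number): number[] {
    return [...(this.adjacency.get(n) ?? [])];
  }
}
```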
Consider adding a comment clarifying this is an intentional directed storage with bidirectional traversal, or making addEdge symmetric by adding the reverse edge as well.
frontend/javascripts/types/type_utils.ts (1)
12-17: Mutable<T> and Writeable<T> are identical. Both types use the same implementation (-readonly [K in keyof T]: T[K]) to remove the readonly modifier. Consider removing one to avoid redundancy, or add a comment explaining why both are needed (e.g., for compatibility with different naming conventions in the codebase).
Suggested consolidation

```diff
 export type Mutable<T> = {
   -readonly [K in keyof T]: T[K];
 };
-export type Writeable<T> = {
-  -readonly [P in keyof T]: T[P];
-};
+// Alias for compatibility
+export type Writeable<T> = Mutable<T>;
```

webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/MetadataEntry.scala (1)
22-57: Consider documenting the priority order behavior. When that has multiple optional fields defined (e.g., both stringValue and boolValue), only the first matching field by priority order (stringValue > boolValue > numberValue > stringListValue) will be applied. This implicit prioritization should either be documented or the method should validate that only one field is defined.
🔧 Optional: Add validation for single-field constraint

```scala
def update(that: MetadataEntry): MetadataEntry =
  if (this.key != that.key) {
    this
  } else {
    val definedFields = Seq(
      that.stringValue.isDefined,
      that.boolValue.isDefined,
      that.numberValue.isDefined,
      that.stringListValue.isDefined
    ).count(identity)
    require(definedFields <= 1, "MetadataEntry update should have at most one value field defined")
    // ... rest of implementation
  }
```

frontend/javascripts/test/reducers/update_action_application/volume.spec.ts (1)
235-237: Remove debug logging from the test loop. These logs will make CI output noisy without adding signal.
🧹 Proposed cleanup

```diff
- ColoredLogger.logYellow("Segments before", Array.from(volumeTracing2.segments.keys()));
- console.log("actionsToApply", actionsToApply);
- ColoredLogger.logYellow("Segments after", Array.from(volumeTracing3.segments.keys()));
```

webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeUpdateActions.scala (2)
423-463: Track the TODO for metadata updates in partial segment action. Metadata updates are currently excluded here; ensure callers rely on UpdateMetadataOfSegmentVolumeAction or implement the TODO. If you want, I can draft the metadata-update extension for this action.
659-678: Avoid debug placeholder name for new groups. "Default Name …" looks like a temporary value; consider using the same user-facing default as the frontend or requiring a name.
🧽 Suggested tweak

```diff
- name = name.getOrElse(s"Default Name $groupId"), // TODOM: undo default name - used for debugging
+ name = name.getOrElse(s"Group $groupId"),
```

frontend/javascripts/test/sagas/volumetracing/volume_segment_items_multi_user.spec.ts (1)
164-173: Resolve the TODO around mutex switching.
Line 172 notes the flow relies on an extra save to switch strategies; please address or track it to avoid brittle behavior. I can help draft a fix or follow-up issue if useful.
frontend/javascripts/test/helpers/debugging_state_serializer.ts (1)
84-92: Consider handling a missing output directory gracefully.
If the debug directory doesn't exist in a fresh checkout, readdir will throw. A small ENOENT guard keeps the helper more robust.
🔧 Optional ENOENT guard

```diff
 async function deleteOldFiles() {
-  const entries = await readdir(DEBUG_OUTPUT_DIR, { withFileTypes: true });
+  let entries;
+  try {
+    entries = await readdir(DEBUG_OUTPUT_DIR, { withFileTypes: true });
+  } catch (err) {
+    if ((err as NodeJS.ErrnoException).code === "ENOENT") return;
+    throw err;
+  }
   const files = entries.filter((entry) => entry.isFile() && entry.name !== ".gitignore");
   for (const file of files) {
```

frontend/javascripts/viewer/model/bucket_data_handling/layer_rendering_manager.ts (1)
364-374: Reuse empty journals to keep memoization effective. Passing a fresh [] on each call defeats referential equality and weakens caching. Consider hoisting a shared empty array until real journals are wired.
Proposed refactor

```diff
 const LOOKUP_CUCKOO_TEXTURE_WIDTH = 256;
+const EMPTY_SEGMENT_JOURNAL = [];
 @@
     for (const updateAction of cachedDiffSegmentLists(
       this.name,
       prevSegments,
       newSegments,
-      [],
-      [],
+      EMPTY_SEGMENT_JOURNAL,
+      EMPTY_SEGMENT_JOURNAL,
     )) {
```

frontend/javascripts/test/sagas/proofreading/proofreading_skeleton_interaction.spec.ts (1)
513-745: TODOs here flag known gaps worth tracking. The notes about duplicate split actions, mismatched edge counts, and a passing test with an incorrect skeleton could hide regressions. Consider tracking these as follow-up issues or adding assertions. I can help draft fixes or test assertions if you want.
frontend/javascripts/viewer/model/sagas/volume/volume_diffing.ts (1)
365-381: Consider documenting the fallback creationTime: 0 behavior.
When prevSourceSegment is null and fallbackCreationTime is also unavailable, the segment is created with creationTime: 0. While this is a safe default, it might be worth adding a brief comment explaining this edge case, as a creationTime of 0 could be confusing when debugging or analyzing segment data.
```diff
 export const WkDevFlags = {
   logActions: false,
-  liveCollab: false,
+  liveCollab: true, // todop: revert
```
Revert this flag before merging to master.
The liveCollab flag is set to true with a TODO to revert. Per the PR objectives, the live-collab property should be opt-in behavior, not enabled globally by default. This change should be reverted before the PR is merged.
Additionally, consider using the standard TODO format instead of todop to ensure it's caught by standard TODO scanners and IDE integrations.
Suggested fix

```diff
- liveCollab: true, // todop: revert
+ liveCollab: false,
```
```js
app.get("/versions", (req, res) => {
  const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));

  const versions = files
    .map((file) => {
      const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
      return json;
    })
    .sort((a, b) => a.version - b.version);

  res.json(versions);
});
```
Add error handling and fix unused parameter.
The /versions endpoint lacks error handling if DATA_DIR doesn't exist or is inaccessible. Also, req parameter should be prefixed with underscore per linting rules.
Proposed fix

```diff
-app.get("/versions", (req, res) => {
-  const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
-
-  const versions = files
-    .map((file) => {
-      const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
-      return json;
-    })
-    .sort((a, b) => a.version - b.version);
-
-  res.json(versions);
+app.get("/versions", (_req, res) => {
+  try {
+    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
+
+    const versions = files
+      .map((file) => {
+        const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
+        return json;
+      })
+      .sort((a, b) => a.version - b.version);
+
+    res.json(versions);
+  } catch (err) {
+    console.error(err);
+    res.status(500).json({ error: String(err) });
+  }
 });
```
🧰 Tools
🪛 Biome (2.3.13)
[error] 13-13: This parameter is unused.
Unused parameters might be the result of an incomplete refactoring.
Unsafe fix: If this is intentional, prepend req with an underscore.
(lint/correctness/noUnusedFunctionParameters)
```js
app.get("/creationTime", (req, res) => {
  try {
    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
    if (files.length === 0) {
      return res.status(404).json({ error: "no json files in data directory" });
    }
    const fileName = files[0];

    const filePath = path.join(DATA_DIR, fileName);

    const stat = fs.statSync(filePath);
    const tsMs = stat.birthtimeMs || stat.ctimeMs || stat.mtimeMs;
    const creationTime = new Date(tsMs).toISOString();

    res.json({ creationTime });
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: String(err) });
  }
});
```
Fix unused parameter and inconsistent indentation.
The `req` parameter is unused (flagged by Biome) and lines 32-33 have inconsistent indentation.

Proposed fix

```diff
-app.get("/creationTime", (req, res) => {
+app.get("/creationTime", (_req, res) => {
   try {
-  const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
-  if (files.length === 0) {
-    return res.status(404).json({ error: "no json files in data directory" });
-  }
-  const fileName = files[0];
+    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
+    if (files.length === 0) {
+      return res.status(404).json({ error: "no json files in data directory" });
+    }
+    const fileName = files[0];
     const filePath = path.join(DATA_DIR, fileName);
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```js
app.get("/creationTime", (_req, res) => {
  try {
    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
    if (files.length === 0) {
      return res.status(404).json({ error: "no json files in data directory" });
    }
    const fileName = files[0];
    const filePath = path.join(DATA_DIR, fileName);
    const stat = fs.statSync(filePath);
    const tsMs = stat.birthtimeMs || stat.ctimeMs || stat.mtimeMs;
    const creationTime = new Date(tsMs).toISOString();
    res.json({ creationTime });
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: String(err) });
  }
});
```
🧰 Tools
🪛 Biome (2.3.13)
[error] 30-30: This parameter is unused.
Unused parameters might be the result of an incomplete refactoring.
Unsafe fix: If this is intentional, prepend req with an underscore.
(lint/correctness/noUnusedFunctionParameters)
🤖 Prompt for AI Agents
In `@tools/debugging/version-visualizer/server.js` around lines 30 - 50, The route
handler for app.get("/creationTime") declares an unused req parameter and has
inconsistent indentation in the lines building the files array; update the
handler signature to either remove req or rename it to _req to satisfy the
linter, and fix indentation of the block that reads DATA_DIR (the const files =
fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json")); and the subsequent
if (files.length === 0) { ... } and const fileName = files[0]; lines) so they
align with the surrounding code and maintain consistent two-space indentation;
keep the rest of the logic (filePath, stat, creationTime, error handling)
unchanged.
```diff
@@ -0,0 +1,3 @@
+### Changed
+- Segment items can now be edited by multiple users at the same time (when the annotation is set up for collaborative use).
+- When proofreading, segment items are now always created for new agglomerates. In the past, this only happend when the automatic meshing feature was enabled.
```
Fix typo: "happend" → "happened"
📝 Proposed fix

```diff
-- When proofreading, segment items are now always created for new agglomerates. In the past, this only happend when the automatic meshing feature was enabled.
+- When proofreading, segment items are now always created for new agglomerates. In the past, this only happened when the automatic meshing feature was enabled.
```

📝 Committable suggestion (‼️ IMPORTANT: carefully review the code before committing; ensure it accurately replaces the highlighted code, contains no missing lines, and has no indentation issues):

- When proofreading, segment items are now always created for new agglomerates. In the past, this only happened when the automatic meshing feature was enabled.
🧰 Tools
🪛 LanguageTool
[grammar] ~3-~3: Ensure spelling is correct
Context: ...ew agglomerates. In the past, this only happend when the automatic meshing feature was ...
(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)
🤖 Prompt for AI Agents
In `@unreleased_changes/9124.md` at line 3, Fix the typo in the release note
sentence that currently reads "In the past, this only happend when the automatic
meshing feature was enabled." by changing "happend" to "happened" so the
sentence reads "...this only happened when the automatic meshing feature was
enabled."; update the string in the unreleased change entry (the sentence
starting "When proofreading, segment items are now always created for new
agglomerates.") accordingly.
```scala
// TODOM: The advantage of this implementation: Only one of the optional fields is set
// Disadvantage: wordiness. Discuss which option to choose
```
Resolve TODOM comment before merging.
This TODO marker indicates a pending design decision about the implementation approach. Please resolve this discussion and remove the comment before merging.
🤖 Prompt for AI Agents
In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/MetadataEntry.scala`
around lines 20 - 21, Remove the TODOM note in MetadataEntry.scala and choose
one implementation approach (keep the current
single-option-with-multiple-optional-fields or switch to a sealed trait/ADT with
one case per metadata type); update the code accordingly and delete the TODOM
comment so the design decision is recorded in code or documentation (referencing
the MetadataEntry type) and no TODO markers remain.
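The trade-off named in the TODOM can be sketched in TypeScript (hypothetical field names; the real type is a Scala case class in the tracingstore). Option A is one record with several optional value fields, of which only one should be set by convention; option B is a tagged union, the TypeScript analogue of a sealed trait/ADT, where "exactly one variant" is enforced by the type system at the cost of wordier construction:

```typescript
// Option A: several optional fields; nothing stops a caller from setting two.
type MetadataValueA = {
  stringValue?: string;
  boolValue?: boolean;
  numberValue?: number;
  stringListValue?: string[];
};
const a: MetadataValueA = { stringValue: "x" }; // boolValue could be added too

// Option B: a tagged union — exactly one variant by construction.
type MetadataValueB =
  | { kind: "string"; value: string }
  | { kind: "bool"; value: boolean }
  | { kind: "number"; value: number }
  | { kind: "stringList"; value: string[] };

// Exhaustive handling falls out of the union; the compiler flags missing cases.
function render(v: MetadataValueB): string {
  switch (v.kind) {
    case "string":
      return v.value;
    case "bool":
      return String(v.value);
    case "number":
      return v.value.toString();
    case "stringList":
      return v.value.join(",");
  }
}
```

The same dichotomy applies on the Scala side: a sealed trait with one case class per value type versus the current single case class with optional fields.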
```scala
val resultSegment = (sourceSegmentOpt, targetSegmentOpt) match {
  case (None, None) => Segment(segmentId = sourceId, creationTime = actionTimestamp, isVisible = Some(true))
  case (Some(sourceSegment), None) => sourceSegment
  case (None, Some(targetSegment)) =>
    Segment(
      segmentId = sourceId,
      creationTime = actionTimestamp,
      isVisible = targetSegment.isVisible,
      metadata = targetSegment.metadata,
      anchorPosition = targetSegment.anchorPosition,
      groupId = targetSegment.groupId,
      name = mergeSegmentNames(sourceSegmentOpt.flatMap(_.name), targetSegment.name)
    )
  case (Some(sourceSegment), Some(targetSegment)) =>
    sourceSegment.copy(
      name = mergeSegmentNames(sourceSegmentOpt.flatMap(_.name), targetSegment.name),
      metadata = mergeSegmentMetadata(sourceSegment, targetSegment)
    )
}

val withResultSegment =
  if (sourceSegmentOpt.isDefined) tracing.segments.map { segment: Segment =>
    if (segment.segmentId == sourceId) resultSegment else segment
  } else tracing.segments :+ resultSegment

tracing.withSegments(withResultSegment.filter(_.segmentId != targetId))
```
Merge fallback drops target fields when source is missing.
In the `(None, Some(target))` case the new segment only copies a subset of fields, so color/additional coordinates/other attributes are lost. Prefer basing the result on `targetSegment.copy(...)` to preserve all data.
🛠️ Suggested fix

```diff
-      case (None, Some(targetSegment)) =>
-        Segment(
-          segmentId = sourceId,
-          creationTime = actionTimestamp,
-          isVisible = targetSegment.isVisible,
-          metadata = targetSegment.metadata,
-          anchorPosition = targetSegment.anchorPosition,
-          groupId = targetSegment.groupId,
-          name = mergeSegmentNames(sourceSegmentOpt.flatMap(_.name), targetSegment.name)
-        )
+      case (None, Some(targetSegment)) =>
+        targetSegment.copy(
+          segmentId = sourceId,
+          creationTime = actionTimestamp.orElse(targetSegment.creationTime),
+          name = mergeSegmentNames(None, targetSegment.name)
+        )
```
+ )🤖 Prompt for AI Agents
In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeUpdateActions.scala`
around lines 587 - 612, The (None, Some(targetSegment)) branch constructs a new
Segment and drops fields from targetSegment; change it to start from
targetSegment.copy(...) so all existing fields (color, coordinates, etc.) are
preserved, overriding only segmentId to sourceId, creationTime to
actionTimestamp, isVisible/name/metadata as needed (use mergeSegmentNames(...)
for name and keep targetSegment.isVisible), i.e., replace the manual
Segment(...) with targetSegment.copy(...) updating those specific fields so the
rest of targetSegment's data remains intact.
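The gist of the suggested fix — start from a copy of the target so unspecified fields survive — can be illustrated with a small TypeScript sketch (hypothetical, simplified segment shape; the real merge lives in the Scala `VolumeUpdateActions`). Object spread plays the role of Scala's `copy`:

```typescript
// Hypothetical, simplified segment shape for illustration only.
type Segment = {
  segmentId: number;
  name?: string;
  color?: string;
  anchorPosition?: [number, number, number];
  groupId?: number;
};

// Basing the result on the target and overriding only what must change
// keeps every field that is not explicitly listed (color, anchorPosition, …),
// instead of silently dropping them as a from-scratch constructor call would.
function adoptTargetAsSource(sourceId: number, targetSegment: Segment): Segment {
  return {
    ...targetSegment,
    segmentId: sourceId,
  };
}
```

This is the same reasoning as in the diff above: only `segmentId`, `creationTime`, and `name` need explicit overrides; everything else rides along via the copy.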
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In
`@frontend/javascripts/test/sagas/proofreading/proofreading_multi_user.spec.ts`:
- Line 1169: Remove the leftover debug logging call
ColoredLogger.logGreen("expectMapping") from the proofreading_multi_user.spec.ts
test; locate the call inside the test/sagas/proofreading suite (search for
ColoredLogger.logGreen) and delete that single statement so the test no longer
produces extraneous console output, ensuring no other test logic is altered.
🧹 Nitpick comments (2)

frontend/javascripts/test/helpers/agglomerate_mapping_helper.ts (1)

125-129: Minor: Consider using `previous` instead of `this.versions.at(-1)!` for consistency. When no edge is removed, line 127 commits `this.versions.at(-1)!`. Since `previous` was just retrieved from `this.versions[this.currentVersion]` and no modifications occurred, using `previous` would be clearer and more consistent with the rest of the method.

💡 Suggested change

```diff
   if (!didRemove) {
     // Commit the current version again.
-    this.commit(this.versions.at(-1)!, bumpVersion);
+    this.commit(previous, bumpVersion);
     return;
   }
```

frontend/javascripts/test/reducers/update_action_application/volume.spec.ts (1)

291-291: Nitpick: Variables declared with `let` but never reassigned. `reappliedNewState` is declared with `let` but never reassigned in any of the three new test cases. Consider using `const` for clarity.

💡 Suggested change (apply to all three occurrences)

```diff
-  let reappliedNewState = transformStateAsReadOnly(state1, (state) =>
+  const reappliedNewState = transformStateAsReadOnly(state1, (state) =>
```

Also applies to: 317-317, 343-343
frontend/javascripts/test/sagas/proofreading/proofreading_multi_user.spec.ts
```ts
      yield call(() => api.tracing.save()); // Also pulls newest version from backend.
    }

    // TODO: this test creates a segment item 6 (for agglomerate 6) with an incorrect anchor position (1337^3).
```
needs to be fixed in #9036.
```ts
  it("should split two agglomerate skeletons if interfering merge makes it an no-op.", async (context: WebknossosTestContext) => {
    const backendMock = mockInitialBucketAndAgglomerateData(context);
    // TODO: the same split update action is sent twice to the server
```
needs to be fixed in #9036.
```ts
    await task.toPromise();
  }, 8000);

  // TODO: the loaded agglomerate skeleton doesnt have the same amount of edges as the actual agglomerate 1
```
needs to be fixed in #9036.
```ts
  it("should try to min cut agglomerate via node ids but interfering merge adds new edge which is not cut. Resulting mapping should be correct.", async (context: WebknossosTestContext) => {
    const backendMock = mockInitialBucketAndAgglomerateData(context);
    // TODO: this test succeeds even though the final agglomerate skeleton is incorrect.
```
needs to be fixed in #9036.
```ts
      this.name,
    );
    for (const updateAction of cachedDiffSegmentLists(this.name, prevSegments, newSegments)) {
      // todop: passing [] as the segment journals doesn't make optimal use of caching (the
```
todo
```diff
@@ -0,0 +1,389 @@
+import { diffDiffableMaps } from "libs/diffable_map";
```
lots of code was simply moved.
```diff
@@ -0,0 +1,98 @@
+<!DOCTYPE html>
```
no need to review since it's a only a debugging tool.
```diff
@@ -0,0 +1,525 @@
+/* ================================
```
no need to review since it's a only a debugging tool.
```diff
@@ -0,0 +1,54 @@
+import express from "express";
```
no need to review since it's a only a debugging tool.
```scala
          stringListValue.getOrElse(Seq.empty)
        )

  // TODOM: The advantage of this implementation: Only one of the optional fields is set
```
@MichaelBuessemeyer todo
@MichaelBuessemeyer there are still some smaller todo comments in the code, but I think you can already start reviewing, as the PR is quite large. The open checkboxes in the PR description could be done in another PR. That way, we can tackle the merge conflicts with your PR a bit sooner.
Actionable comments posted: 7
🤖 Fix all issues with AI agents
In
`@frontend/javascripts/test/sagas/proofreading/proofreading_multi_user.spec.ts`:
- Around line 1160-1163: The test contains a duplicated yield
take("FINISH_MAPPING_INITIALIZATION") call; in the proofreading_multi_user.spec
test remove the extra consecutive yield take("FINISH_MAPPING_INITIALIZATION") so
only a single take remains (i.e., keep one yield
take("FINISH_MAPPING_INITIALIZATION") after the yield call(waitUntilNotBusy) and
delete the duplicate).
In
`@frontend/javascripts/viewer/model/reducers/update_action_application/volume.ts`:
- Around line 78-103: The updateMetadataOfSegment case currently assumes
segment.metadata exists and calls metadata.map, which crashes for segments
without metadata; change the extraction so metadata defaults to an empty array
(e.g., replace "const { metadata } = segment;" with a guarded/default assignment
such as const metadata = segment.metadata ?? [] or
Array.isArray(segment.metadata) ? segment.metadata : []), then proceed with the
same removeKeySet/upsertDict logic and return VolumeTracingReducer(state,
updateSegmentAction(...)) as before; ensure you reference the same identifiers
(case "updateMetadataOfSegment", getSegmentsForLayer, metadata,
VolumeTracingReducer, updateSegmentAction) so behavior and types remain
consistent.
- Around line 117-149: The upsertSegmentGroup case passes raw newParentId
(null/undefined) into moveGroupsHelper which causes moves-to-root to fail
because moveGroupsHelper compares against the virtual root id MISSING_GROUP_ID;
update moveGroupsHelper to normalize its targetParentId (or when calling it from
the "upsertSegmentGroup" branch) by converting null/undefined to
MISSING_GROUP_ID (same normalization used in createGroupHelper) so comparisons
like parentGroup.groupId === targetGroupId succeed and groups attach to the
virtual root as expected.
- Around line 67-76: The createSegment and updateSegmentPartial cases call
VolumeTracingReducer with updateSegmentAction and currently pass Date.now() for
the timestamp, which overwrites server-provided timestamps; change both branches
to read const { actionTracingId, creationTime, ...segment } = ua.value and pass
creationTime ?? Date.now() to updateSegmentAction (i.e., use
ua.value.creationTime when present, otherwise fallback to Date.now()) so
VolumeTracingReducer and updateSegmentAction preserve server timestamps.
In `@frontend/javascripts/viewer/model/reducers/volumetracing_reducer_helpers.ts`:
- Around line 415-449: The code in handleUpdateSegment uses
getSelectedIds(state)[0].group which will throw when getSelectedIds(state)
returns an empty array; to fix, capture the selected entry safely and set
groupId from it with a fallback (e.g., null or undefined) before building
newSegment: call const selected = getSelectedIds(state)[0] (or use optional
chaining), then set groupId: selected ? selected.group : null (or
selected?.group) when creating newSegment so updateSegments/handleUpdateSegment
no longer dereferences undefined.
In `@frontend/javascripts/viewer/model/sagas/saving/rebasing_helpers_sagas.ts`:
- Around line 292-338: partialMapping is currently built from all backend merge
actions and can remap segment ids from other tracings; restrict it to the
current tracing by filtering mergeActions using the tracing identifier. Update
the mergeActions construction (from
appliedBackendUpdateActions.flatMap(...).filter(...)) to also check that each
mergeAction's actionTracingId equals the current action/tracing id (e.g.,
mergeAction.actionTracingId === action.actionTracingId or === currentTracingId)
so partialMapping only includes merges from the same tracing before you iterate
and adapt values.
In `@frontend/javascripts/viewer/model/sagas/volume/volume_diffing.ts`:
- Around line 123-141: In diffSegmentJournals the code uses sortedIndexBy to
compute splitIndex then returns segmentJournal.slice(splitIndex + 1), which will
skip the first newer entry when the previous journal entry is absent or not
found; change the slice call to segmentJournal.slice(splitIndex) (i.e., remove
the +1) so that entries from the correct insertion point are included; keep the
existing guard on prevLatestJournalEntryIndex and the use of sortedIndexBy
(referencing prevLatestJournalEntryIndex, sortedIndexBy, splitIndex, and
segmentJournal.slice) to locate and return the proper diff.
🧹 Nitpick comments (2)

frontend/javascripts/test/sagas/proofreading/proofreading_single_user.spec.ts (2)

106-106: Unused variable can be removed. `_backendMock` is captured but never used in this test. Unlike the multi-user tests where `backendMock` is used for `planVersionInjection`, this single-user test doesn't need it.

🧹 Suggested cleanup

```diff
-  const _backendMock = mockInitialBucketAndAgglomerateData(context);
+  mockInitialBucketAndAgglomerateData(context);
```

148-150: Verify necessity of double save pattern. The pattern of calling `save()` twice with `waitUntilNotBusy` in between appears intentional (a similar pattern exists in the multi-user tests with a comment about it). However, it would be helpful to add a brief comment explaining why two saves are needed here for the split scenario.

📝 Suggested documentation

```diff
+  // First save triggers the split action processing, second save ensures all actions are persisted.
   yield call(() => api.tracing.save());
   yield call(waitUntilNotBusy);
   yield call(() => api.tracing.save());
```
Resolved review threads:

- frontend/javascripts/test/sagas/proofreading/proofreading_multi_user.spec.ts
- frontend/javascripts/viewer/model/reducers/update_action_application/volume.ts (three threads, one outdated)
- frontend/javascripts/viewer/model/reducers/volumetracing_reducer_helpers.ts
- frontend/javascripts/viewer/model/sagas/saving/rebasing_helpers_sagas.ts
…tracted from update actions
URL of deployed dev instance (used for testing):
Steps to test:
TODOs:
- `mergeSegments` action
(Please delete unneeded items, merge only when none are left open)
`$PR_NUMBER.md` file in `unreleased_changes` or use `./tools/create-changelog-entry.py`)