
Conversation

@philippotto philippotto commented Dec 3, 2025

URL of deployed dev instance (used for testing):

  • https://___.webknossos.xyz

Steps to test:

  • monkey test:
    • normal WK
    • normal proofreading (doesn't need to be collaborative in my opinion because it's not enabled yet with this PR)

TODOs:

  • implement basic mechanism for concurrent editing
    • create
    • update
    • delete
  • consider proofreading use case
    • split
    • merge
  • new update actions (see design doc)
  • fine-grained segment update
    • backend
    • frontend
  • fine-grained metadata update
    • backend
    • frontend
  • fine-grained segment groups update
    • backend
    • frontend
  • new mergeSegments action
    • backend
    • frontend
  • write tests with combinations for
    • create
    • update
    • delete
    • proofreading
      • merge
      • split
  • add live-collab property to annotation to make opt-in clearer
    • backend
    • frontend
      • integrate into UI
      • when activated, only proofreading and move tool should be supported
  • handle when two users create a new group (and use the same id as a result)

Issues:


(Please delete unneeded items, merge only when none are left open)

  • Added changelog entry (create a $PR_NUMBER.md file in unreleased_changes or use ./tools/create-changelog-entry.py)
  • Added migration guide entry if applicable (edit the same file as for the changelog)
  • Updated documentation if applicable
  • Adapted wk-libs python client if relevant API parts change
  • Removed dev-only changes like prints and application.conf edits
  • Considered common edge cases
  • Needs datastore update after deployment

@philippotto philippotto self-assigned this Dec 3, 2025
coderabbitai bot commented Dec 3, 2025

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough

A large frontend and test refactor: it adds volume segment merge, metadata, and group update actions with matching handlers, introduces volume-diffing sagas, renames segment position fields to anchorPosition, moves utility types to a new module, adds many test fixtures and helpers, and includes a debugging visualizer with a server and data output.

Changes

  • Type utilities & imports
    frontend/javascripts/types/globals.d.ts, frontend/javascripts/types/type_utils.ts, frontend/javascripts/**/...
    Moved many utility types from globals.d.ts into type_utils.ts and updated numerous type-only imports across the frontend to the new module.
  • Segment position / payload rename
    frontend/javascripts/viewer/store.ts, frontend/javascripts/viewer/model/actions/volumetracing_actions.ts, frontend/javascripts/viewer/**/segments_tab/*, frontend/javascripts/viewer/**/sagas/**/*
    Renamed payload and type fields: somePosition → anchorPosition and someAdditionalCoordinates → additionalCoordinates in actions, reducers, sagas, UI, and tests.
  • Volume update actions & backend serialization
    frontend/javascripts/viewer/model/sagas/volume/update_actions.ts, webknossos-tracingstore/.../VolumeUpdateActions.scala, .../UpdateActions.scala
    Introduced partial/metadata/merge/upsert/delete segment-group update actions and legacy aliases; updated backend JSON (de)serialization to recognize the new action names.
  • Volumetracing reducer & helpers
    frontend/javascripts/viewer/model/reducers/volumetracing_reducer.ts, .../volumetracing_reducer_helpers.ts
    Added handlers for set/remove/update/merge segments, segmentJournal support, upsert via replaceOrAdd, metadata merge resolution, and helpers to expand parents.
  • Volume diffing & sagas
    frontend/javascripts/viewer/model/sagas/volume/volume_diffing.ts, frontend/javascripts/viewer/model/bucket_data_handling/layer_rendering_manager.ts
    New diffVolumeTracing, cached/uncached segment-list diffing, metadata/group diffs, and updated callers to new signatures.
  • Update-application guards
    frontend/javascripts/viewer/model/reducers/update_action_application/volume.ts, .../skeleton.ts
    Added an ignoreUnsupportedActionTypes flag to server-applied action application paths to optionally skip unsupported actions instead of throwing.
  • Save, rebasing & save-queue adaptation
    frontend/javascripts/viewer/model/sagas/saving/save_saga.tsx, .../rebasing_helpers_sagas.ts
    Rebasing logic extended to accept applied backend update batches and adapt save-queue entries (including mergeSegments/create→partial translations and metadata adjustments).
  • Proofreading & mesh flows
    frontend/javascripts/viewer/model/sagas/volume/proofread_saga.ts, .../meshes/precomputed_mesh_saga.ts
    Consolidated the mesh-loading flow, added ensureSegmentItemAndLoadCoarseMesh, integrated clickSegmentAction/mergeSegmentsAction, and changed channel initialization order.
  • Tests, fixtures & helpers
    frontend/javascripts/test/helpers/*, frontend/javascripts/test/fixtures/*, frontend/javascripts/test/sagas/**
    Added many fixtures and helpers (getFlattenedUpdateActions, waitUntilNotBusy, expectMapping), new multi-user and volume-diffing test suites, and updated snapshots.
  • Agglomerate mapping & visualizer tool
    frontend/javascripts/test/helpers/agglomerate_mapping_helper.ts, tools/debugging/version-visualizer/*, .gitignore
    Refactored agglomerate mapping to VersionSnapshot; added a version-visualizer web UI, server, data writer, and .gitignore rules to ignore generated data while allowing the folder's .gitignore.
  • Utility library updates
    frontend/javascripts/libs/utils.ts, frontend/javascripts/libs/*
    Added an optional equalityFn to diffMaps, exported replaceOrAdd, adjusted ColoredLogger output, and other small utility changes.
  • Store & public types
    frontend/javascripts/viewer/store.ts, frontend/javascripts/types/api_types.ts
    Added SegmentJournalEntry and segmentJournal on VolumeTracing, rebase volumes in default state, and made ServerSegment.groupId optional.
  • UI helpers & descriptions
    frontend/javascripts/viewer/view/version_entry.tsx, .../tree_hierarchy_view_helpers.ts
    Added createGroupHelper and expanded version/action description handlers to cover new actions (merge/upsert/delete-group, updateSegmentPartial, etc.).
  • Infra, lint & misc
    biome.jsonc, package.json, frontend/javascripts/viewer/api/wk_dev.ts, various small files
    Added biome rules, adjusted scripts, toggled WkDevFlags.liveCollab, minor logging/comment removals, formatting tweaks.

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120 minutes

Possibly related issues

  • #9045 — Milestone 4: proofreading: this PR implements multi-user proofreading features (merges, mapping/journal handling, rebasing) that align with the milestone objectives.
  • #9036 — mesh/skeleton sync & live-collab follow-ups: changes include mesh/skeleton sync refactors and rebasing/adaptation logic referenced by that issue.

Possibly related PRs

  • #8723 — Live-collab / mutex and proofreading flows: overlapping changes around save/mutex, rebasing, and proofreading plumbing.
  • #8676 — Save-path and differ/wait coordination refactors: related save/rebasing and differ invocation changes.
  • #8546 — Segment visibility and group-related surface: overlaps in segment/group visibility, types, and reducer behavior.

Suggested reviewers

  • fm3

Poem

🐇
I hop through diffs with whiskered cheer,
I stitch the journals, make merges clear,
Maps and graphs in moonlit run,
Tests pass softly — the job is done,
A rabbit applauds this merge, my dear.

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning)
    Docstring coverage is 6.38%, which is below the required threshold of 80.00%.
    Resolution: Write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (4 passed)

  • Title check (✅ Passed)
    The title "Live Collab M4 - Segment Editing" clearly describes the main feature being implemented: segment editing functionality for the live collaboration Milestone 4.
  • Description check (✅ Passed)
    The description is comprehensive and directly related to the changeset, detailing test steps, completed TODOs, remaining work, and linked issues that guide the implementation.
  • Linked Issues check (✅ Passed)
    The PR implements the core requirements from issue #9045: support for update actions, fine-grained segment/metadata/group updates, the mergeSegments action, and concurrent editing mechanisms, all with corresponding tests.
  • Out of Scope Changes check (✅ Passed)
    All changes are directly scoped to implementing segment editing for live collaboration M4: segment update actions, metadata handling, segment group operations, merge semantics, testing utilities, and visualization tools for debugging.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



Comment @coderabbitai help to get the list of available commands and usage tips.

@philippotto philippotto changed the title [WIP] Live Collab M4 - Segment Editing Live Collab M4 - Segment Editing Feb 5, 2026
@philippotto philippotto marked this pull request as ready for review February 5, 2026 12:34

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 18

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
frontend/javascripts/viewer/view/ai_jobs/components/bounding_box_selection.tsx (1)

69-79: ⚠️ Potential issue | 🟠 Major

Update optionFilterProp to use the correct Ant Design v6 API.

The optionFilterProp prop should be nested within the showSearch object, not used as a standalone prop. Change showSearch and optionFilterProp="children" to showSearch={{ optionFilterProp: "label" }} to align with Ant Design v6's semantic API.

Example fix:
    <Select
-     showSearch
+     showSearch={{ optionFilterProp: "label" }}
      placeholder="Select a bounding box"
-     optionFilterProp="children"
      options={options}
      filterOption={filterOption}
      disabled={userBoundingBoxes.length < 1}
      onSelect={setSelectedBoundingBoxId}
      style={style}
      value={value}
    />

Optionally, consider adding a prefix (e.g., an icon) for better visual context if it aligns with the UI design.

frontend/javascripts/test/sagas/proofreading/proofreading_skeleton_test_utils.ts (1)

38-79: ⚠️ Potential issue | 🟠 Major

Traverse adjacency as undirected to avoid missing nodes.
With a directed adjacency list (as produced by AgglomerateMapping), the BFS only follows outgoing edges. If merges connect through a non-root node, nodes reachable only via incoming edges will be skipped, producing incomplete skeletons.

🐛 Suggested fix (include incoming edges during BFS)
-  while (queue.length) {
-    const n = queue.shift()!;
-    const neighbours = adjacencyList.get(n);
-    if (!neighbours) continue;
-    for (const nb of neighbours) {
-      if (!visited.has(nb)) {
-        visited.add(nb);
-        queue.push(nb);
-      }
-    }
-  }
+  while (queue.length) {
+    const n = queue.shift()!;
+    const neighbours = new Set<number>();
+    for (const nb of adjacencyList.get(n) ?? []) neighbours.add(nb);
+    for (const [node, outs] of adjacencyList) {
+      if (outs.has(n)) neighbours.add(node);
+    }
+    if (neighbours.size === 0) continue;
+    for (const nb of neighbours) {
+      if (!visited.has(nb)) {
+        visited.add(nb);
+        queue.push(nb);
+      }
+    }
+  }
frontend/javascripts/viewer/view/version_entry.tsx (1)

386-386: ⚠️ Potential issue | 🟡 Minor

Same typo: double space in description string.

✏️ Proposed fix
-      description: `Updated the segment with id ${firstAction.value.id} in the segments list  of layer ${layerName}.`,
+      description: `Updated the segment with id ${firstAction.value.id} in the segments list of layer ${layerName}.`,
🤖 Fix all issues with AI agents
In `@frontend/javascripts/libs/utils.ts`:
- Around line 1454-1464: The function replaceOrAdd incorrectly uses
Array.prototype.splice and returns its return value (the removed elements)
instead of the modified array; update replaceOrAdd (operating on volumes,
newVolumeTracing and predicate) to produce a new array with the replacement:
create a shallow copy of volumes (e.g., via volumes.slice() or [...volumes]),
then if index === -1 return the copy with newVolumeTracing appended, otherwise
call splice on the copy to replace the element at index and return that copy (or
alternatively use map to replace the matching element and return the new mapped
array).
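The fix described above can be sketched as follows. This is a hypothetical reconstruction based purely on the comment's description (the real replaceOrAdd in utils.ts operates on volume tracings and may have a different signature):

```typescript
// Hypothetical sketch of a corrected replaceOrAdd: copy the input array, then
// replace the matching element on the copy (or append when nothing matches) and
// return the copy itself, never the return value of splice, which is the array
// of removed elements.
function replaceOrAdd<T>(
  items: readonly T[],
  newItem: T,
  predicate: (item: T) => boolean,
): T[] {
  const copy = [...items];
  const index = copy.findIndex(predicate);
  if (index === -1) {
    copy.push(newItem);
  } else {
    // splice mutates `copy` in place; its return value is deliberately discarded.
    copy.splice(index, 1, newItem);
  }
  return copy;
}
```

Returning the copy (rather than splicing the original) also keeps the input array untouched, which matters for reducer-style code.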

In
`@frontend/javascripts/test/sagas/proofreading/proofreading_mesh_interaction.spec.ts`:
- Around line 553-555: The test "should perform partitioned min-cut correctly"
uses a mocked segment item for agglomerate 6 with an impossible anchor position
(1337^3), which likely invalidates the partitioned min-cut behaviour; locate
where the mock for agglomerate/segment item 6 is constructed in
proofreading_mesh_interaction.spec.ts (the test block for "should perform
partitioned min-cut correctly") and replace the anchor position with a realistic
coordinate consistent with the mesh/voxel space used by other mocks (or derive
it from the same helper that computes anchors for other agglomerates), then
update any expected outcomes/assertions in that test to match the corrected
anchor; run the test suite to ensure partitioned min-cut assertions now reflect
the corrected mock.

In `@frontend/javascripts/test/sagas/volumetracing/volume_diffing.spec.ts`:
- Around line 173-218: The test setup mistakenly checks
segmentItemsToCreateInSetup.includes(id1) twice so createSegment2 never runs
when only id2 is present; update the second guard to check
segmentItemsToCreateInSetup.includes(id2) so newState applies createSegment2 via
VolumeTracingReducer when id2 is in the array (affecting the
mergeSegmentsAction/id1/id2 scenario used by uncachedDiffSegmentLists and the
expectations around VOLUME_TRACING_ID).

In `@frontend/javascripts/viewer/api/wk_dev.ts`:
- Line 15: The config sets liveCollab: true (in the object in wk_dev.ts) which
must be reverted to opt-in; change liveCollab back to false (or remove the
property so default is false) and update the inline comment from "todop: revert"
to a standard "TODO: revert before merge" so IDE/scan tools pick it up; ensure
the change is made on the liveCollab property in wk_dev.ts and run any local dev
checks to confirm behavior.

In `@frontend/javascripts/viewer/model/helpers/diff_helpers.ts`:
- Around line 121-137: The isDescendant function currently mutates the array
stored in childrenMap because stack is a direct reference and pop() changes it;
change the initialization of stack in isDescendant to use a shallow copy of the
children array (e.g., spread or slice) so popping does not mutate childrenMap,
keep the rest of the logic (visited Set, popping, pushing
childrenMap.get(current) ?? []) intact to preserve DFS behavior.
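A minimal sketch of the non-mutating DFS, assuming a childrenMap of group id to child ids (the actual isDescendant in diff_helpers.ts may use different types):

```typescript
// Sketch of the suggested fix: seed the DFS stack with a shallow copy of the
// children array so that pop() never mutates the array stored in childrenMap.
function isDescendant(
  childrenMap: Map<number, number[]>,
  ancestorId: number,
  candidateId: number,
): boolean {
  // Shallow copy instead of a direct reference into childrenMap.
  const stack = [...(childrenMap.get(ancestorId) ?? [])];
  const visited = new Set<number>();
  while (stack.length > 0) {
    const current = stack.pop()!;
    if (current === candidateId) return true;
    if (visited.has(current)) continue;
    visited.add(current);
    stack.push(...(childrenMap.get(current) ?? []));
  }
  return false;
}
```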

In `@frontend/javascripts/viewer/model/reducers/volumetracing_reducer_helpers.ts`:
- Around line 462-464: Remove the debug logging calls
(ColoredLogger.logGreen(...) around "merging segments" and the prints of
sourceSegment and targetSegment) from the merge logic so production runs aren't
cluttered; delete the three ColoredLogger.logGreen(...) lines that reference
sourceSegment and targetSegment in the volumetracing reducer helper, and
afterwards run a quick build/lint to remove or delete the ColoredLogger import
if it becomes unused.

In `@frontend/javascripts/viewer/model/sagas/saving/rebasing_helpers_sagas.ts`:
- Around line 2-4: The file imports lodash helpers from "lodash/..." which
conflicts with the project's declared dependency "lodash-es"; update the imports
to use "lodash-es" (either individual paths like "lodash-es/flattenDeep" and
"lodash-es/omitBy" or a single named import from "lodash-es") so that symbols
like flattenDeep and omitBy are imported from lodash-es, leaving other imports
(ColoredLogger, getAdaptToTypeFunction) unchanged.

In
`@frontend/javascripts/viewer/view/right-border-tabs/trees_tab/tree_hierarchy_view_helpers.ts`:
- Around line 292-319: createGroupHelper is creating newGroup without the
required isExpanded property; update it to either call the existing
makeBasicGroupObject(name, newGroupId) to construct newGroup (which sets
isExpanded: false) or explicitly add isExpanded: false to the newGroup literal,
and keep the rest of the logic using MISSING_GROUP_ID, cloneDeep, and callDeep
unchanged; also remove returning newGroupId (return only newSegmentGroups and
newGroup) unless the helper signature must mirror another API that expects
newGroupId to be returned.

In `@frontend/javascripts/viewer/view/version_entry.tsx`:
- Line 372: Fix the minor typo in the description string by removing the extra
space between "list" and "of" in the object that builds the description (the
line using description: `Updated the segment with id ${firstAction.value.id} in
the segments list  of layer ${layerName}.`); locate the string near the usage of
firstAction and layerName in version_entry.tsx and update it to a single space
between "list" and "of".

In `@tools/debugging/version-visualizer/public/main.js`:
- Around line 336-339: renderGraph currently reads width/height from the global
main element causing skeletonRenderer (and any other renderer) to size
incorrectly when a different parent (e.g., main2) is used; update renderGraph to
obtain sizing from the renderer's actual container (use a property like
this.container or this.rendererParent or accept a parentElement argument)
instead of the global main, and ensure any code that constructs or calls
renderGraph (including skeletonRenderer initialization) passes or sets the
correct container element so width/height are computed against that element's
clientWidth/clientHeight rather than global main.
- Around line 190-193: Rename the unused reviver parameter in the JSON.parse
call: in the creation of storeState (the JSON.parse(..., (key, value) => { ...
}) reviver) change the first parameter name from key to _key so the linter stops
flagging an unused variable; keep the logic that checks value != null && typeof
value === "object" && value._isDiffableMap and calls parseDiffableMap(value)
unchanged.
- Around line 445-450: Remove the unused d3 line generator by deleting the const
line = d3.line()... declaration (the variable named "line" that calls
d3.line().x(...).y(...).curve(d3.curveBundle.beta(0.5))). The rendering uses SVG
<line> elements (x1,y1,x2,y2) so you should simply remove the entire "line"
variable declaration from main.js to eliminate dead code and any unused d3.curve
references.
- Around line 208-233: The current build of tableHTML inserts unescaped segment
values (e.g., s.name and s.anchorPosition) into innerHTML on segmentViewDiv
which enables XSS; replace this with safe DOM construction: iterate
Array.from(segments.values()) and for each segment create tr/td elements via
document.createElement, set their text via textContent (or explicitly escape
strings) for s.id, s.name and JSON.stringify(s.anchorPosition), append the rows
to a tbody, then replace segmentViewDiv.innerHTML by first clearing children and
appending the constructed table element; update the logic around tableHTML,
segments, and segmentViewDiv accordingly.

In `@tools/debugging/version-visualizer/server.js`:
- Around line 13-24: Wrap the body of the app.get("/versions", (req, res) => {
... }) handler in a try/catch, rename the unused req parameter to _req to
satisfy linting, and add a guard that checks fs.existsSync(DATA_DIR) (or handle
ENOENT) before calling fs.readdirSync; on error return an appropriate HTTP
response (e.g., res.status(500).json({ error: 'Failed to read versions' }) or
res.status(404).json([])) and log the error, while preserving the existing logic
that reads files with fs.readFileSync, JSON.parse and sorts by version.
- Around line 30-50: The route handler for app.get("/creationTime") declares an
unused req parameter and has inconsistent indentation in the lines building the
files array; update the handler signature to either remove req or rename it to
_req to satisfy the linter, and fix indentation of the block that reads DATA_DIR
(the const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
and the subsequent if (files.length === 0) { ... } and const fileName =
files[0]; lines) so they align with the surrounding code and maintain consistent
two-space indentation; keep the rest of the logic (filePath, stat, creationTime,
error handling) unchanged.

In `@unreleased_changes/9124.md`:
- Line 3: Fix the typo in the release note sentence that currently reads "In the
past, this only happend when the automatic meshing feature was enabled." by
changing "happend" to "happened" so the sentence reads "...this only happened
when the automatic meshing feature was enabled."; update the string in the
unreleased change entry (the sentence starting "When proofreading, segment items
are now always created for new agglomerates.") accordingly.

In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/MetadataEntry.scala`:
- Around line 20-21: Remove the TODOM note in MetadataEntry.scala and choose one
implementation approach (keep the current
single-option-with-multiple-optional-fields or switch to a sealed trait/ADT with
one case per metadata type); update the code accordingly and delete the TODOM
comment so the design decision is recorded in code or documentation (referencing
the MetadataEntry type) and no TODO markers remain.

In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeUpdateActions.scala`:
- Around line 587-612: The (None, Some(targetSegment)) branch constructs a new
Segment and drops fields from targetSegment; change it to start from
targetSegment.copy(...) so all existing fields (color, coordinates, etc.) are
preserved, overriding only segmentId to sourceId, creationTime to
actionTimestamp, isVisible/name/metadata as needed (use mergeSegmentNames(...)
for name and keep targetSegment.isVisible), i.e., replace the manual
Segment(...) with targetSegment.copy(...) updating those specific fields so the
rest of targetSegment's data remains intact.
🧹 Nitpick comments (20)
frontend/javascripts/viewer/model/sagas/saving/save_queue_draining_saga.tsx (1)

181-183: Avoid committed commented-out debug logs.

If this log is still useful, gate it behind a dev flag and use the logger; otherwise remove it to reduce noise.

🧹 Suggested cleanup
-  // ColoredLogger.logRed("sendSaveRequestToServer fullSaveQueue", fullSaveQueue);
frontend/javascripts/viewer/model/helpers/action_logger_middleware.ts (1)

55-59: Consider removing commented code and optionally preserving payload visibility.

The console grouping is now commented out, and logging has been simplified to show only action.type instead of the full action object. While this reduces log verbosity (likely beneficial with increased action volume from live collaboration), it also removes payload visibility that can be valuable during debugging.

Consider:

  1. Remove commented lines: Lines 55 and 59 should be deleted rather than commented out—git history preserves them if needed later.
  2. Optional enhancement: If detailed debugging is occasionally needed, you could conditionally log the full action based on an additional flag or action type filter.
♻️ Proposed cleanup
       if (WkDevFlags.logActions) {
-        // console.group(action.type);
         console.info("dispatching", action.type);
         let result = next(action);
-        // console.log('next state', store.getState())
-        // console.groupEnd();
         return result;
       }
frontend/javascripts/test/sagas/volumetracing/segment_group_fixtures.ts (1)

139-141: Minor formatting: extra blank line.

There's a stray blank line at line 140 that breaks consistency with the other fixture entries.

🧹 Remove extra blank line
       isExpanded: false,
-
       children: [
frontend/javascripts/viewer/model/sagas/undo_saga.ts (1)

47-49: Tighten typing to avoid the string[] cast (and the any cast downstream).

You can preserve type safety by typing the list and using a small type guard instead of casting. This keeps includes type-safe and avoids widening action.

♻️ Suggested refactor
-const UndoRedoRelevantBoundingBoxActions = AllUserBoundingBoxActions.filter(
-  (action) => action !== "SET_USER_BOUNDING_BOXES",
-);
+const UndoRedoRelevantBoundingBoxActions: ReadonlyArray<UserBoundingBoxAction["type"]> =
+  AllUserBoundingBoxActions.filter(
+    (action): action is UserBoundingBoxAction["type"] =>
+      action !== "SET_USER_BOUNDING_BOXES",
+  );
+
+const isUserBoundingBoxAction = (action: Action): action is UserBoundingBoxAction =>
+  UndoRedoRelevantBoundingBoxActions.includes(action.type);
-  } else if ((UndoRedoRelevantBoundingBoxActions as string[]).includes(action.type)) {
+  } else if (isUserBoundingBoxAction(action)) {
     return {
-      userBoundingBoxAction: action as any as UserBoundingBoxAction,
+      userBoundingBoxAction: action,
     };
   }

Also applies to: 123-126

test/backend/VolumeUpdateActionsUnitTestSuite.scala (2)

246-292: Minor inconsistency in assertion variable.

The test logic is correct, but line 286 uses segment.segmentId from the first result instead of segment2.segmentId from the second result. While the values are identical (both are segment ID 5), using segment2 would be more consistent with the surrounding assertions.

Proposed fix
       val segmentMetadata2 = segment2.metadata.map(MetadataEntry.fromProto)
-      assert(segment.segmentId == updateMetadataAction.id)
+      assert(segment2.segmentId == deleteMetadataAction.id)
       assert(

496-528: Consider adding edge case tests for delete operations.

The recursive deletion tests are solid, but consider adding coverage for:

  1. Deleting a leaf group (e.g., groupId4, groupId6, or groupId7) to verify no unintended side effects
  2. Deleting a non-existent group to verify graceful handling

These edge cases would strengthen confidence in the delete implementation.

frontend/javascripts/viewer/model/sagas/saving/save_mutex_saga.tsx (1)

266-269: Template literal whitespace may affect display.

The multi-line template literal preserves the leading whitespace from lines 267-269, which will result in extra spaces in the rendered toast message. Consider using string concatenation or removing the indentation to ensure clean message formatting.

🔧 Suggested fix
       Toast.warning(
-        `Could not get the annotations write-lock for more than ${MAX_AD_HOC_RETRY_TIME / 1000} seconds.
-        User ${blockingUserName} is currently blocking the annotation.
-        This might be due to using non-live collab supported features.
-        Ensure they are sticking to tools supporting live collaboration.`,
+        `Could not get the annotations write-lock for more than ${MAX_AD_HOC_RETRY_TIME / 1000} seconds. ` +
+        `User ${blockingUserName} is currently blocking the annotation. ` +
+        `This might be due to using non-live collab supported features. ` +
+        `Ensure they are sticking to tools supporting live collaboration.`,
frontend/javascripts/viewer/model/reducers/skeletontracing_reducer.ts (1)

1319-1321: Consider addressing or tracking the TODO.

The todop: prefix suggests this is a personal reminder. If this represents a known gap that should be addressed, consider either:

  1. Creating a follow-up issue to track the uniqueness enforcement
  2. Removing the TODO if uniqueness is already handled elsewhere (e.g., during metadata merging as mentioned in PR objectives)
frontend/javascripts/test/sagas/save_mutex_saga.spec.ts (1)

411-447: Consider simplifying the tool parameterization structure.

Wrapping AnnotationTool values in { tool: ... } objects adds indirection. If only the tool is needed for parameterization, the simpler array [AnnotationTool.SKELETON, AnnotationTool.BOUNDING_BOX] with direct access would be cleaner:

describe.each([AnnotationTool.SKELETON, AnnotationTool.BOUNDING_BOX])(
  "[With AnnotationTool=$id]:",
  (tool) => { ... }
)

However, if this structure anticipates future properties per tool entry, the current approach is fine.

frontend/javascripts/test/helpers/agglomerate_mapping_helper.ts (1)

118-130: Clarify directed vs. undirected edge semantics.

The adjacency list stores edges directionally (only A→B in addEdge at Line 79), but removeEdge attempts to delete both A→B and B→A (Lines 119-124). This asymmetry means didRemove may be true for only one direction.

The BFS traversal (Lines 150-161) correctly handles this by checking both directions during graph traversal, so the split logic works correctly. However, the code could be clearer about whether the graph is intended to be directed or undirected.

Consider adding a comment clarifying this is an intentional directed storage with bidirectional traversal, or making addEdge symmetric by adding the reverse edge as well.

frontend/javascripts/types/type_utils.ts (1)

12-17: Mutable<T> and Writeable<T> are identical.

Both types use the same implementation (-readonly [K in keyof T]: T[K]) to remove the readonly modifier. Consider removing one to avoid redundancy, or add a comment explaining why both are needed (e.g., for compatibility with different naming conventions in the codebase).

Suggested consolidation
 export type Mutable<T> = {
   -readonly [K in keyof T]: T[K];
 };
-export type Writeable<T> = {
-  -readonly [P in keyof T]: T[P];
-};
+// Alias for compatibility
+export type Writeable<T> = Mutable<T>;
webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/MetadataEntry.scala (1)

22-57: Consider documenting the priority order behavior.

When `that` has multiple optional fields defined (e.g., both stringValue and boolValue), only the first matching field by priority order (stringValue > boolValue > numberValue > stringListValue) will be applied. This implicit prioritization should either be documented or the method should validate that only one field is defined.

🔧 Optional: Add validation for single-field constraint
def update(that: MetadataEntry): MetadataEntry =
  if (this.key != that.key) {
    this
  } else {
    val definedFields = Seq(
      that.stringValue.isDefined,
      that.boolValue.isDefined,
      that.numberValue.isDefined,
      that.stringListValue.isDefined
    ).count(identity)
    
    require(definedFields <= 1, "MetadataEntry update should have at most one value field defined")
    
    // ... rest of implementation
  }
frontend/javascripts/test/reducers/update_action_application/volume.spec.ts (1)

235-237: Remove debug logging from the test loop.

These logs will make CI output noisy without adding signal.

🧹 Proposed cleanup
-        ColoredLogger.logYellow("Segments before", Array.from(volumeTracing2.segments.keys()));
-        console.log("actionsToApply", actionsToApply);
-        ColoredLogger.logYellow("Segments after", Array.from(volumeTracing3.segments.keys()));
webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeUpdateActions.scala (2)

423-463: Track the TODO for metadata updates in partial segment action.

Metadata updates are currently excluded here; ensure callers rely on UpdateMetadataOfSegmentVolumeAction or implement the TODO.

If you want, I can draft the metadata‑update extension for this action.


659-678: Avoid debug placeholder name for new groups.

"Default Name …" looks like a temporary value; consider using the same user‑facing default as the frontend or requiring a name.

🧽 Suggested tweak
-      name = name.getOrElse(s"Default Name $groupId"), // TODOM: undo default name - used for debugging
+      name = name.getOrElse(s"Group $groupId"),
frontend/javascripts/test/sagas/volumetracing/volume_segment_items_multi_user.spec.ts (1)

164-173: Resolve the TODO around mutex switching.
Line 172 notes the flow relies on an extra save to switch strategies; please address or track it to avoid brittle behavior. I can help draft a fix or follow-up issue if useful.

frontend/javascripts/test/helpers/debugging_state_serializer.ts (1)

84-92: Consider handling a missing output directory gracefully.
If the debug directory doesn’t exist in a fresh checkout, readdir will throw. A small ENOENT guard keeps the helper more robust.

🔧 Optional ENOENT guard
 async function deleteOldFiles() {
-  const entries = await readdir(DEBUG_OUTPUT_DIR, { withFileTypes: true });
+  let entries;
+  try {
+    entries = await readdir(DEBUG_OUTPUT_DIR, { withFileTypes: true });
+  } catch (err) {
+    if ((err as NodeJS.ErrnoException).code === "ENOENT") return;
+    throw err;
+  }
 
   const files = entries.filter((entry) => entry.isFile() && entry.name !== ".gitignore");
 
   for (const file of files) {
frontend/javascripts/viewer/model/bucket_data_handling/layer_rendering_manager.ts (1)

364-374: Reuse empty journals to keep memoization effective.

Passing fresh [] each call defeats referential equality and weakens caching. Consider hoisting a shared empty array until real journals are wired.

Proposed refactor
 const LOOKUP_CUCKOO_TEXTURE_WIDTH = 256;
+const EMPTY_SEGMENT_JOURNAL = [];
@@
       for (const updateAction of cachedDiffSegmentLists(
         this.name,
         prevSegments,
         newSegments,
-        [],
-        [],
+        EMPTY_SEGMENT_JOURNAL,
+        EMPTY_SEGMENT_JOURNAL,
       )) {
frontend/javascripts/test/sagas/proofreading/proofreading_skeleton_interaction.spec.ts (1)

513-745: TODOs here flag known gaps worth tracking.

The notes about duplicate split actions, mismatched edge counts, and a passing test with an incorrect skeleton could hide regressions. Consider tracking these as follow-up issues or adding assertions. I can help draft fixes or test assertions if you want.

frontend/javascripts/viewer/model/sagas/volume/volume_diffing.ts (1)

365-381: Consider documenting the fallback creationTime: 0 behavior.

When prevSourceSegment is null and fallbackCreationTime is also unavailable, the segment is created with creationTime: 0. While this is a safe default, it might be worth adding a brief comment explaining this edge case, as a creationTime of 0 could be confusing when debugging or analyzing segment data.

 export const WkDevFlags = {
   logActions: false,
-  liveCollab: false,
+  liveCollab: true, // todop: revert
Contributor

⚠️ Potential issue | 🟡 Minor

Revert this flag before merging to master.

The liveCollab flag is set to true with a TODO to revert. Per the PR objectives, the live-collab property should be opt-in behavior, not enabled globally by default. This change should be reverted before the PR is merged.

Additionally, consider using the standard TODO format instead of todop to ensure it's caught by standard TODO scanners and IDE integrations.

Suggested fix
-  liveCollab: true, // todop: revert
+  liveCollab: false,
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
liveCollab: true, // todop: revert
liveCollab: false,
🤖 Prompt for AI Agents
In `@frontend/javascripts/viewer/api/wk_dev.ts` at line 15, The config sets
liveCollab: true (in the object in wk_dev.ts) which must be reverted to opt-in;
change liveCollab back to false (or remove the property so default is false) and
update the inline comment from "todop: revert" to a standard "TODO: revert
before merge" so IDE/scan tools pick it up; ensure the change is made on the
liveCollab property in wk_dev.ts and run any local dev checks to confirm
behavior.

Comment on lines +13 to +24
app.get("/versions", (req, res) => {
  const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));

  const versions = files
    .map((file) => {
      const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
      return json;
    })
    .sort((a, b) => a.version - b.version);

  res.json(versions);
});
Contributor

⚠️ Potential issue | 🟡 Minor

Add error handling and fix unused parameter.

The /versions endpoint lacks error handling if DATA_DIR doesn't exist or is inaccessible. Also, req parameter should be prefixed with underscore per linting rules.

Proposed fix
-app.get("/versions", (req, res) => {
-  const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
-
-  const versions = files
-    .map((file) => {
-      const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
-      return json;
-    })
-    .sort((a, b) => a.version - b.version);
-
-  res.json(versions);
+app.get("/versions", (_req, res) => {
+  try {
+    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
+
+    const versions = files
+      .map((file) => {
+        const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
+        return json;
+      })
+      .sort((a, b) => a.version - b.version);
+
+    res.json(versions);
+  } catch (err) {
+    console.error(err);
+    res.status(500).json({ error: String(err) });
+  }
 });
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
app.get("/versions", (req, res) => {
  const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));

  const versions = files
    .map((file) => {
      const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
      return json;
    })
    .sort((a, b) => a.version - b.version);

  res.json(versions);
});
app.get("/versions", (_req, res) => {
  try {
    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));

    const versions = files
      .map((file) => {
        const json = JSON.parse(fs.readFileSync(path.join(DATA_DIR, file), "utf8"));
        return json;
      })
      .sort((a, b) => a.version - b.version);

    res.json(versions);
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: String(err) });
  }
});
🧰 Tools
🪛 Biome (2.3.13)

[error] 13-13: This parameter is unused.

Unused parameters might be the result of an incomplete refactoring.
Unsafe fix: If this is intentional, prepend req with an underscore.

(lint/correctness/noUnusedFunctionParameters)

🤖 Prompt for AI Agents
In `@tools/debugging/version-visualizer/server.js` around lines 13 - 24, Wrap the
body of the app.get("/versions", (req, res) => { ... }) handler in a try/catch,
rename the unused req parameter to _req to satisfy linting, and add a guard that
checks fs.existsSync(DATA_DIR) (or handle ENOENT) before calling fs.readdirSync;
on error return an appropriate HTTP response (e.g., res.status(500).json({
error: 'Failed to read versions' }) or res.status(404).json([])) and log the
error, while preserving the existing logic that reads files with
fs.readFileSync, JSON.parse and sorts by version.

Comment on lines +30 to +50
app.get("/creationTime", (req, res) => {
  try {

      const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
      if (files.length === 0) {
        return res.status(404).json({ error: "no json files in data directory" });
      }
      const fileName = files[0];

    const filePath = path.join(DATA_DIR, fileName);

    const stat = fs.statSync(filePath);
    const tsMs = stat.birthtimeMs || stat.ctimeMs || stat.mtimeMs;
    const creationTime = new Date(tsMs).toISOString();

    res.json({ creationTime });
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: String(err) });
  }
});
Contributor

⚠️ Potential issue | 🟡 Minor

Fix unused parameter and inconsistent indentation.

The req parameter is unused (flagged by Biome) and lines 32-33 have inconsistent indentation.

Proposed fix
-app.get("/creationTime", (req, res) => {
+app.get("/creationTime", (_req, res) => {
   try {
-    
-      const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
-      if (files.length === 0) {
-        return res.status(404).json({ error: "no json files in data directory" });
-      }
-      const fileName = files[0];
+    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
+    if (files.length === 0) {
+      return res.status(404).json({ error: "no json files in data directory" });
+    }
+    const fileName = files[0];
 
     const filePath = path.join(DATA_DIR, fileName);
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
app.get("/creationTime", (req, res) => {
  try {

      const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
      if (files.length === 0) {
        return res.status(404).json({ error: "no json files in data directory" });
      }
      const fileName = files[0];

    const filePath = path.join(DATA_DIR, fileName);

    const stat = fs.statSync(filePath);
    const tsMs = stat.birthtimeMs || stat.ctimeMs || stat.mtimeMs;
    const creationTime = new Date(tsMs).toISOString();

    res.json({ creationTime });
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: String(err) });
  }
});
app.get("/creationTime", (_req, res) => {
  try {
    const files = fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json"));
    if (files.length === 0) {
      return res.status(404).json({ error: "no json files in data directory" });
    }
    const fileName = files[0];

    const filePath = path.join(DATA_DIR, fileName);

    const stat = fs.statSync(filePath);
    const tsMs = stat.birthtimeMs || stat.ctimeMs || stat.mtimeMs;
    const creationTime = new Date(tsMs).toISOString();

    res.json({ creationTime });
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: String(err) });
  }
});
🧰 Tools
🪛 Biome (2.3.13)

[error] 30-30: This parameter is unused.

Unused parameters might be the result of an incomplete refactoring.
Unsafe fix: If this is intentional, prepend req with an underscore.

(lint/correctness/noUnusedFunctionParameters)

🤖 Prompt for AI Agents
In `@tools/debugging/version-visualizer/server.js` around lines 30 - 50, The route
handler for app.get("/creationTime") declares an unused req parameter and has
inconsistent indentation in the lines building the files array; update the
handler signature to either remove req or rename it to _req to satisfy the
linter, and fix indentation of the block that reads DATA_DIR (the const files =
fs.readdirSync(DATA_DIR).filter((f) => f.endsWith(".json")); and the subsequent
if (files.length === 0) { ... } and const fileName = files[0]; lines) so they
align with the surrounding code and maintain consistent two-space indentation;
keep the rest of the logic (filePath, stat, creationTime, error handling)
unchanged.

@@ -0,0 +1,3 @@
### Changed
- Segment items can now be edited by multiple users at the same time (when the annotation is set up for collaborative use).
- When proofreading, segment items are now always created for new agglomerates. In the past, this only happend when the automatic meshing feature was enabled.
Contributor

⚠️ Potential issue | 🟡 Minor

Fix typo: "happend" → "happened"

📝 Proposed fix
-- When proofreading, segment items are now always created for new agglomerates. In the past, this only happend when the automatic meshing feature was enabled.
+- When proofreading, segment items are now always created for new agglomerates. In the past, this only happened when the automatic meshing feature was enabled.
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
- When proofreading, segment items are now always created for new agglomerates. In the past, this only happend when the automatic meshing feature was enabled.
- When proofreading, segment items are now always created for new agglomerates. In the past, this only happened when the automatic meshing feature was enabled.
🧰 Tools
🪛 LanguageTool

[grammar] ~3-~3: Ensure spelling is correct
Context: ...ew agglomerates. In the past, this only happend when the automatic meshing feature was ...

(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)

🤖 Prompt for AI Agents
In `@unreleased_changes/9124.md` at line 3, Fix the typo in the release note
sentence that currently reads "In the past, this only happend when the automatic
meshing feature was enabled." by changing "happend" to "happened" so the
sentence reads "...this only happened when the automatic meshing feature was
enabled."; update the string in the unreleased change entry (the sentence
starting "When proofreading, segment items are now always created for new
agglomerates.") accordingly.

Comment on lines +20 to +21
// TODOM: The advantage of this implementation: Only one of the optional fields is set
// Disadvantage: wordiness. Discuss which option to choose
Contributor

⚠️ Potential issue | 🟡 Minor

Resolve TODOM comment before merging.

This TODO marker indicates a pending design decision about the implementation approach. Please resolve this discussion and remove the comment before merging.

🤖 Prompt for AI Agents
In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/MetadataEntry.scala`
around lines 20 - 21, Remove the TODOM note in MetadataEntry.scala and choose
one implementation approach (keep the current
single-option-with-multiple-optional-fields or switch to a sealed trait/ADT with
one case per metadata type); update the code accordingly and delete the TODOM
comment so the design decision is recorded in code or documentation (referencing
the MetadataEntry type) and no TODO markers remain.

Comment on lines +587 to +612
    val resultSegment = (sourceSegmentOpt, targetSegmentOpt) match {
      case (None, None) => Segment(segmentId = sourceId, creationTime = actionTimestamp, isVisible = Some(true))
      case (Some(sourceSegment), None) => sourceSegment
      case (None, Some(targetSegment)) =>
        Segment(
          segmentId = sourceId,
          creationTime = actionTimestamp,
          isVisible = targetSegment.isVisible,
          metadata = targetSegment.metadata,
          anchorPosition = targetSegment.anchorPosition,
          groupId = targetSegment.groupId,
          name = mergeSegmentNames(sourceSegmentOpt.flatMap(_.name), targetSegment.name)
        )
      case (Some(sourceSegment), Some(targetSegment)) =>
        sourceSegment.copy(
          name = mergeSegmentNames(sourceSegmentOpt.flatMap(_.name), targetSegment.name),
          metadata = mergeSegmentMetadata(sourceSegment, targetSegment)
        )
    }

    val withResultSegment =
      if (sourceSegmentOpt.isDefined) tracing.segments.map { segment: Segment =>
        if (segment.segmentId == sourceId) resultSegment else segment
      } else tracing.segments :+ resultSegment

    tracing.withSegments(withResultSegment.filter(_.segmentId != targetId))
Contributor

⚠️ Potential issue | 🟠 Major

Merge fallback drops target fields when source is missing.

In the (None, Some(target)) case the new segment only copies a subset of fields, so color/additional coordinates/other attributes are lost. Prefer basing on targetSegment.copy(...) to preserve all data.

🛠️ Suggested fix
-      case (None, Some(targetSegment)) =>
-        Segment(
-          segmentId = sourceId,
-          creationTime = actionTimestamp,
-          isVisible = targetSegment.isVisible,
-          metadata = targetSegment.metadata,
-          anchorPosition = targetSegment.anchorPosition,
-          groupId = targetSegment.groupId,
-          name = mergeSegmentNames(sourceSegmentOpt.flatMap(_.name), targetSegment.name)
-        )
+      case (None, Some(targetSegment)) =>
+        targetSegment.copy(
+          segmentId = sourceId,
+          creationTime = actionTimestamp.orElse(targetSegment.creationTime),
+          name = mergeSegmentNames(None, targetSegment.name)
+        )
🤖 Prompt for AI Agents
In
`@webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeUpdateActions.scala`
around lines 587 - 612, The (None, Some(targetSegment)) branch constructs a new
Segment and drops fields from targetSegment; change it to start from
targetSegment.copy(...) so all existing fields (color, coordinates, etc.) are
preserved, overriding only segmentId to sourceId, creationTime to
actionTimestamp, isVisible/name/metadata as needed (use mergeSegmentNames(...)
for name and keep targetSegment.isVisible), i.e., replace the manual
Segment(...) with targetSegment.copy(...) updating those specific fields so the
rest of targetSegment's data remains intact.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Fix all issues with AI agents
In
`@frontend/javascripts/test/sagas/proofreading/proofreading_multi_user.spec.ts`:
- Line 1169: Remove the leftover debug logging call
ColoredLogger.logGreen("expectMapping") from the proofreading_multi_user.spec.ts
test; locate the call inside the test/sagas/proofreading suite (search for
ColoredLogger.logGreen) and delete that single statement so the test no longer
produces extraneous console output, ensuring no other test logic is altered.
🧹 Nitpick comments (2)
frontend/javascripts/test/helpers/agglomerate_mapping_helper.ts (1)

125-129: Minor: Consider using previous instead of this.versions.at(-1)! for consistency.

When no edge is removed, line 127 commits this.versions.at(-1)!. Since previous was just retrieved from this.versions[this.currentVersion] and no modifications occurred, using previous would be clearer and more consistent with the rest of the method.

💡 Suggested change
     if (!didRemove) {
       // Commit the current version again.
-      this.commit(this.versions.at(-1)!, bumpVersion);
+      this.commit(previous, bumpVersion);
       return;
     }
frontend/javascripts/test/reducers/update_action_application/volume.spec.ts (1)

291-291: Nitpick: Variables declared with let but never reassigned.

reappliedNewState is declared with let but never reassigned in any of the three new test cases. Consider using const for clarity.

💡 Suggested change (apply to all three occurrences)
-    let reappliedNewState = transformStateAsReadOnly(state1, (state) =>
+    const reappliedNewState = transformStateAsReadOnly(state1, (state) =>

Also applies to: 317-317, 343-343

yield call(() => api.tracing.save()); // Also pulls newest version from backend.
}

// TODO: this test creates a segment item 6 (for agglomerate 6) with an incorrect anchor position (1337^3).
Member Author

@philippotto philippotto Feb 5, 2026

needs to be fixed in #9036.


it("should split two agglomerate skeletons if interfering merge makes it an no-op.", async (context: WebknossosTestContext) => {
const backendMock = mockInitialBucketAndAgglomerateData(context);
// TODO: the same split update action is sent twice to the server
Member Author

@philippotto philippotto Feb 5, 2026

needs to be fixed in #9036.

await task.toPromise();
}, 8000);

// TODO: the loaded agglomerate skeleton doesnt have the same amount of edges as the actual agglomerate 1
Member Author

@philippotto philippotto Feb 5, 2026

needs to be fixed in #9036.


it("should try to min cut agglomerate via node ids but interfering merge adds new edge which is not cut. Resulting mapping should be correct.", async (context: WebknossosTestContext) => {
const backendMock = mockInitialBucketAndAgglomerateData(context);
// TODO: this test succeeds even though the final agglomerate skeleton is incorrect.
Member Author

@philippotto philippotto Feb 5, 2026

needs to be fixed in #9036.

this.name,
);
for (const updateAction of cachedDiffSegmentLists(this.name, prevSegments, newSegments)) {
// todop: passing [] as the segment journals doesn't make optimal use of caching (the
Member Author

todo

@@ -0,0 +1,389 @@
import { diffDiffableMaps } from "libs/diffable_map";
Member Author

lots of code was simply moved.

@@ -0,0 +1,98 @@
<!DOCTYPE html>
Member Author

no need to review since it's a only a debugging tool.

@@ -0,0 +1,525 @@
/* ================================
Member Author

no need to review since it's a only a debugging tool.

@@ -0,0 +1,54 @@
import express from "express";
Member Author

no need to review since it's a only a debugging tool.

stringListValue.getOrElse(Seq.empty)
)

// TODOM: The advantage of this implementation: Only one of the optional fields is set
Member Author

@philippotto
Member Author

@MichaelBuessemeyer there are still some smaller todo comments in code, but I think you can already start reviewing, as the PR is quite large. the open checkboxes in the PR description could be done in another PR. That way, we can tackle the merge conflicts with your PR a bit sooner.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 7

🤖 Fix all issues with AI agents
In
`@frontend/javascripts/test/sagas/proofreading/proofreading_multi_user.spec.ts`:
- Around line 1160-1163: The test contains a duplicated yield
take("FINISH_MAPPING_INITIALIZATION") call; in the proofreading_multi_user.spec
test remove the extra consecutive yield take("FINISH_MAPPING_INITIALIZATION") so
only a single take remains (i.e., keep one yield
take("FINISH_MAPPING_INITIALIZATION") after the yield call(waitUntilNotBusy) and
delete the duplicate).

In
`@frontend/javascripts/viewer/model/reducers/update_action_application/volume.ts`:
- Around line 78-103: The updateMetadataOfSegment case currently assumes
segment.metadata exists and calls metadata.map, which crashes for segments
without metadata; change the extraction so metadata defaults to an empty array
(e.g., replace "const { metadata } = segment;" with a guarded/default assignment
such as const metadata = segment.metadata ?? [] or
Array.isArray(segment.metadata) ? segment.metadata : []), then proceed with the
same removeKeySet/upsertDict logic and return VolumeTracingReducer(state,
updateSegmentAction(...)) as before; ensure you reference the same identifiers
(case "updateMetadataOfSegment", getSegmentsForLayer, metadata,
VolumeTracingReducer, updateSegmentAction) so behavior and types remain
consistent.
- Around line 117-149: The upsertSegmentGroup case passes raw newParentId
(null/undefined) into moveGroupsHelper which causes moves-to-root to fail
because moveGroupsHelper compares against the virtual root id MISSING_GROUP_ID;
update moveGroupsHelper to normalize its targetParentId (or when calling it from
the "upsertSegmentGroup" branch) by converting null/undefined to
MISSING_GROUP_ID (same normalization used in createGroupHelper) so comparisons
like parentGroup.groupId === targetGroupId succeed and groups attach to the
virtual root as expected.
- Around line 67-76: The createSegment and updateSegmentPartial cases call
VolumeTracingReducer with updateSegmentAction and currently pass Date.now() for
the timestamp, which overwrites server-provided timestamps; change both branches
to read const { actionTracingId, creationTime, ...segment } = ua.value and pass
creationTime ?? Date.now() to updateSegmentAction (i.e., use
ua.value.creationTime when present, otherwise fallback to Date.now()) so
VolumeTracingReducer and updateSegmentAction preserve server timestamps.

In `@frontend/javascripts/viewer/model/reducers/volumetracing_reducer_helpers.ts`:
- Around line 415-449: The code in handleUpdateSegment uses
getSelectedIds(state)[0].group which will throw when getSelectedIds(state)
returns an empty array; to fix, capture the selected entry safely and set
groupId from it with a fallback (e.g., null or undefined) before building
newSegment: call const selected = getSelectedIds(state)[0] (or use optional
chaining), then set groupId: selected ? selected.group : null (or
selected?.group) when creating newSegment so updateSegments/handleUpdateSegment
no longer dereferences undefined.

In `@frontend/javascripts/viewer/model/sagas/saving/rebasing_helpers_sagas.ts`:
- Around line 292-338: partialMapping is currently built from all backend merge
actions and can remap segment ids from other tracings; restrict it to the
current tracing by filtering mergeActions using the tracing identifier. Update
the mergeActions construction (from
appliedBackendUpdateActions.flatMap(...).filter(...)) to also check that each
mergeAction's actionTracingId equals the current action/tracing id (e.g.,
mergeAction.actionTracingId === action.actionTracingId or === currentTracingId)
so partialMapping only includes merges from the same tracing before you iterate
and adapt values.

In `@frontend/javascripts/viewer/model/sagas/volume/volume_diffing.ts`:
- Around line 123-141: In diffSegmentJournals the code uses sortedIndexBy to
compute splitIndex then returns segmentJournal.slice(splitIndex + 1), which will
skip the first newer entry when the previous journal entry is absent or not
found; change the slice call to segmentJournal.slice(splitIndex) (i.e., remove
the +1) so that entries from the correct insertion point are included; keep the
existing guard on prevLatestJournalEntryIndex and the use of sortedIndexBy
(referencing prevLatestJournalEntryIndex, sortedIndexBy, splitIndex, and
segmentJournal.slice) to locate and return the proper diff.
🧹 Nitpick comments (2)
frontend/javascripts/test/sagas/proofreading/proofreading_single_user.spec.ts (2)

106-106: Unused variable can be removed.

_backendMock is captured but never used in this test. Unlike the multi-user tests where backendMock is used for planVersionInjection, this single-user test doesn't need it.

🧹 Suggested cleanup
-    const _backendMock = mockInitialBucketAndAgglomerateData(context);
+    mockInitialBucketAndAgglomerateData(context);

148-150: Verify necessity of double save pattern.

The pattern of calling save() twice with waitUntilNotBusy in between appears intentional (similar pattern exists in multi-user tests with a comment about it). However, it would be helpful to add a brief comment explaining why two saves are needed here for the split scenario.

📝 Suggested documentation
+      // First save triggers the split action processing, second save ensures all actions are persisted.
       yield call(() => api.tracing.save());
       yield call(waitUntilNotBusy);
       yield call(() => api.tracing.save());


Successfully merging this pull request may close these issues.

Milestone 4: Proofreading with segment list (but without undo)

4 participants