Conversation
Caution: Review failed. The pull request is closed.

Walkthrough

This pull request introduces an epoch-aligned snapshot system for tracking entity state changes at regular intervals.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    actor Handler as Event Handler
    participant Aggregator as Aggregator (e.g., LiquidityPoolAggregator)
    participant Shared as Snapshots/Shared
    participant Snapshot as Snapshots/{Entity}Snapshot
    participant Context as Handler Context
    Handler->>Aggregator: updateLiquidityPoolAggregator(timestamp, ...)
    Aggregator->>Shared: shouldSnapshot(lastSnapshotTimestamp, timestamp)
    Shared-->>Aggregator: boolean (new epoch?)
    alt New Epoch
        Aggregator->>Shared: getSnapshotEpoch(timestamp)
        Shared-->>Aggregator: epochTime: Date
        Aggregator->>Aggregator: lastSnapshotTimestamp = epochTime
        Aggregator->>Snapshot: setLiquidityPoolAggregatorSnapshot(entity, timestamp, context)
        Snapshot->>Snapshot: snapshotId = LiquidityPoolAggregatorSnapshotId(chainId, poolAddress, epochMs)
        Snapshot->>Snapshot: snapshot = {id, poolAddress, fields from entity, timestamp (epoch), ...}
        Snapshot->>Context: LiquidityPoolAggregatorSnapshot.set(snapshot)
        Context-->>Snapshot: void
    else Same Epoch
        Aggregator->>Aggregator: Keep lastSnapshotTimestamp unchanged
    end
    Aggregator-->>Handler: Updated entity with lastSnapshotTimestamp
```
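In code terms, the flow in the diagram looks roughly like this (a simplified sketch, not the exact handler code; the entity and context types come from the generated schema, and the helpers from the Snapshots module):

```ts
// Simplified sketch of the epoch-aligned snapshot flow shown above.
function updateLiquidityPoolAggregatorSketch(
  current: LiquidityPoolAggregator,
  timestamp: Date,
  context: handlerContext,
): LiquidityPoolAggregator {
  if (shouldSnapshot(current.lastSnapshotTimestamp, timestamp)) {
    // New epoch: persist a snapshot and advance the marker to the epoch-aligned
    // time so repeated events within the same hour don't drift it forward.
    const epoch = getSnapshotEpoch(timestamp);
    setLiquidityPoolAggregatorSnapshot(current, timestamp, context);
    return { ...current, lastSnapshotTimestamp: epoch };
  }
  // Same epoch: keep lastSnapshotTimestamp unchanged.
  return current;
}
```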
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
🚥 Pre-merge checks: ✅ Passed checks (3 passed)
modified a bit this file
```ts
/**
 * Creates and stores a snapshot of the current state of a LiquidityPoolAggregator.
 *
 * This function is used to capture the state of a liquidity pool aggregator at a specific
 * point in time. The snapshot includes the pool's ID, a unique snapshot ID, and the timestamp
 * of the last update.
 *
 * @param liquidityPoolAggregator - The current state of the liquidity pool aggregator.
 * @param timestamp - The current timestamp when the snapshot is taken.
 * @param context - The handler context used to store the snapshot.
 */
export function setLiquidityPoolAggregatorSnapshot(
  liquidityPoolAggregator: LiquidityPoolAggregator,
  timestamp: Date,
  context: handlerContext,
) {
  const chainId = liquidityPoolAggregator.chainId;

  const snapshot: LiquidityPoolAggregatorSnapshot = {
    ...liquidityPoolAggregator,
    id: `${PoolId(chainId, liquidityPoolAggregator.poolAddress)}-${timestamp.getTime()}`,
    pool: liquidityPoolAggregator.poolAddress,
    timestamp: timestamp,
  };

  context.LiquidityPoolAggregatorSnapshot.set(snapshot);
}
```
this was moved to the appropriate Snapshot folder module
```ts
  UPDATE_INTERVAL
) {
  // Snapshot only when we've entered a new epoch (hour); use epoch-aligned timestamp so we don't drift
  if (shouldSnapshot(current.lastSnapshotTimestamp, timestamp)) {
```
this was the only snapshot that I needed to modify outside of the Snapshot folder scope, because it was already implemented in an older version of the repo
```ts
    epoch.getTime(),
  );

  const snapshot: LiquidityPoolAggregatorSnapshot = {
```
the explicit type definition is kept here to make sure that TypeScript detects any divergence between the underlying parent entity and the corresponding snapshot
makes it a little more verbose, but safer
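For illustration, a minimal sketch (with hypothetical, simplified types standing in for the generated entity/snapshot types) of how the explicit annotation turns a missed field into a compile-time error:

```ts
// Hypothetical simplified types, not the real generated ones.
type PoolAggregator = { id: string; reserve0: bigint; totalFeesUSD: bigint };
type PoolAggregatorSnapshot = {
  id: string;
  reserve0: bigint;
  totalFeesUSD: bigint;
  timestamp: Date;
};

function toSnapshot(entity: PoolAggregator, epoch: Date): PoolAggregatorSnapshot {
  // Because the literal is checked against PoolAggregatorSnapshot, omitting a field
  // (e.g. dropping totalFeesUSD below) fails to compile instead of silently
  // producing an incomplete snapshot.
  return {
    id: `${entity.id}-${epoch.getTime()}`,
    reserve0: entity.reserve0,
    totalFeesUSD: entity.totalFeesUSD,
    timestamp: epoch,
  };
}
```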
once again, centralised ID construction here (this time for snapshots)
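As a rough sketch of what the centralised construction looks like (helper names and exact formats are assumptions based on this review discussion; the real helpers live in src/Constants.ts):

```ts
// Assumed shapes, for illustration only.
const PoolId = (chainId: number, poolAddress: string) => `${chainId}-${poolAddress}`;

const LiquidityPoolAggregatorSnapshotId = (
  chainId: number,
  poolAddress: string,
  epochMs: number,
) => `${PoolId(chainId, poolAddress)}-${epochMs}`; // snapshot ID composes the base pool ID
```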
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (7)
test/EventHandlers/Voter/VoterCommonLogic.test.ts (1)
227-236: ⚠️ Potential issue | 🟡 Minor

mockVeNFTState is missing lastSnapshotTimestamp, silently hidden by the as VeNFTState cast.

The other mocks updated in this PR (e.g., NonFungiblePosition in NFPMDecreaseLiquidityLogic.test.ts and NFPMCommonLogic.test.ts) explicitly add lastSnapshotTimestamp: undefined to stay in sync with the updated type. This mock should follow suit — particularly since the as VeNFTState cast at line 236 would suppress any TypeScript error if the field becomes required.

🛡️ Proposed fix
const mockVeNFTState: VeNFTState = { id: "10-1", chainId: 10, tokenId: 1n, owner: "0x2222222222222222222222222222222222222222", locktime: 100n, lastUpdatedTimestamp: new Date(1000), totalValueLocked: 1000n, isAlive: true, + lastSnapshotTimestamp: undefined, } as VeNFTState;🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/Voter/VoterCommonLogic.test.ts` around lines 227 - 236, The mock object mockVeNFTState is missing the lastSnapshotTimestamp property and is being force-cast to VeNFTState (which hides missing-field errors); update the mockVeNFTState initializer to include lastSnapshotTimestamp: undefined so its shape matches the updated VeNFTState type (keep the rest of fields the same) and remove reliance on the cast hiding type mismatches in VoterCommonLogic.test.ts.test/EventHandlers/NFPM/NFPMCommonLogic.test.ts (1)
25-41: ⚠️ Potential issue | 🟡 Minor

Update id format at lines 26 and 45, and use a fixed timestamp for lastUpdatedTimestamp at line 39.

Three fixes needed in mockPosition and mockPositionDifferentChain:

- lastUpdatedTimestamp: new Date() (line 39) violates the testing guideline to avoid new Date() in fixtures. Use a fixed timestamp instead.
- id at line 26 and line 45 both use underscore-delimited format (chainId_poolAddress_tokenId), but NonFungiblePositionId() from Constants produces hyphen-delimited format (chainId-poolAddress-tokenId). This mismatch is inconsistent with every other NFPM/ALM test file. Use the NonFungiblePositionId() function to construct the ID.

Proposed fix
-import type { NonFungiblePosition, handlerContext } from "generated"; +import type { NonFungiblePosition, handlerContext } from "generated"; +import { NonFungiblePositionId } from "../../../src/Constants"; const mockPosition: NonFungiblePosition = { - id: `${chainId}_${poolAddress}_${tokenId}`, + id: NonFungiblePositionId(chainId, poolAddress, tokenId), chainId: chainId, tokenId: tokenId, owner: "0x1DFAb7699121fEF702d07932a447868dCcCFb029", pool: poolAddress, tickUpper: 0n, tickLower: -4n, token0: "0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85", token1: "0x7F5c764cBc14f9669B88837ca1490cCa17c31607", liquidity: 26679636922854n, mintTransactionHash: "0xaaa36689c538fcfee2e665f2c7b30bcf2f28ab898050252f50ec1f1d05a5392c", mintLogIndex: 42, - lastUpdatedTimestamp: new Date(), + lastUpdatedTimestamp: new Date(1711601595000), lastSnapshotTimestamp: undefined, }; const mockPositionDifferentChain: NonFungiblePosition = { ...mockPosition, - id: `8453_${poolAddress}_${tokenId}`, + id: NonFungiblePositionId(8453, poolAddress, tokenId), chainId: 8453, };🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/NFPM/NFPMCommonLogic.test.ts` around lines 25 - 41, Update the fixtures: replace the dynamic lastUpdatedTimestamp in mockPosition (and mockPositionDifferentChain) with a fixed timestamp (e.g., a constant Date object such as a specific ISO date) and rebuild the id fields to use the canonical NonFungiblePositionId(...) format instead of the underscore-delimited string; specifically change the id assignments in mockPosition and mockPositionDifferentChain to call NonFungiblePositionId(chainId, poolAddress, tokenId) (or the appropriate args used in those objects) and set lastUpdatedTimestamp to a deterministic Date value.test/EventHandlers/VeNFT.test.ts (1)
381-383: ⚠️ Potential issue | 🟡 Minor

Invalid Ethereum address: poolAddress has 21 bytes.
"0xdddddddddddddddddddddddddddddddddddddddd"contains 42 hex digits — one byte too many. Same issue appears inLPWrapperV2.test.ts.🐛 Fix
- const poolAddress = toChecksumAddress( - "0xdddddddddddddddddddddddddddddddddddddddd", - ); + const poolAddress = toChecksumAddress( + "0xdddddddddddddddddddddddddddddddddddddd", + );🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/VeNFT.test.ts` around lines 381 - 383, The poolAddress string passed into toChecksumAddress is one hex byte too long; replace "0xdddd...dddd" used to set poolAddress in VeNFT.test.ts (and the similar instance in LPWrapperV2.test.ts) with a valid 20-byte (40-hex-character) address (e.g. trim one pair of hex digits so the string after 0x is 40 characters) so toChecksumAddress receives a properly sized 20-byte Ethereum address.test/EventHandlers/ALM/LPWrapperV2.test.ts (1)
23-23: ⚠️ Potential issue | 🟡 Minor

Invalid Ethereum address: userB has 21 bytes (42 hex chars) instead of 20.
"0xdddddddddddddddddddddddddddddddddddddddd"contains 42 hex digits after0x, making it a 21-byte value. Ethereum addresses must be 20 bytes (40 hex digits). This may silently passtoChecksumAddressbut produce non-standard IDs, leading to test fragility or masked bugs.🐛 Fix
- const userB = toChecksumAddress("0xdddddddddddddddddddddddddddddddddddddddd"); + const userB = toChecksumAddress("0xdddddddddddddddddddddddddddddddddddddddd".slice(0, 42));Or simply use a valid 40-hex-char address:
- const userB = toChecksumAddress("0xdddddddddddddddddddddddddddddddddddddddd"); + const userB = toChecksumAddress("0xdddddddddddddddddddddddddddddddddddddd");🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/ALM/LPWrapperV2.test.ts` at line 23, The test defines userB using toChecksumAddress with an invalid 42-hex-digit string (21 bytes); replace the hex string passed to toChecksumAddress with a valid 40-hex-digit (20-byte) address so userB is a standard Ethereum address (e.g., "0x" + 40 hex chars) to avoid non-standard IDs and fragile tests; update the toChecksumAddress call that sets userB accordingly.test/EventHandlers/ALM/LPWrapperV1.test.ts (1)
25-27: ⚠️ Potential issue | 🟡 Minor

Invalid Ethereum address: senderAddress has 21 bytes (42 hex chars).

Same issue as
userBinLPWrapperV2.test.ts—"0xdddddddddddddddddddddddddddddddddddddddd"is not a valid 20-byte address.🐛 Fix
- const senderAddress = toChecksumAddress( - "0xdddddddddddddddddddddddddddddddddddddddd", - ); + const senderAddress = toChecksumAddress( + "0xdddddddddddddddddddddddddddddddddddddd", + );🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/ALM/LPWrapperV1.test.ts` around lines 25 - 27, The test uses an invalid Ethereum address for senderAddress (it has 42 hex chars / 21 bytes); replace the string passed to toChecksumAddress in the senderAddress declaration with a valid 20-byte (40 hex char) address (for example use a 40-hex-digit string after 0x or reuse the correctly-sized one used for userB in LPWrapperV2.test.ts), then regenerate the checksum via toChecksumAddress so senderAddress is a valid checksummed 20-byte address.test/EventHandlers/CLFactory/CLFactoryPoolCreatedLogic.test.ts (1)
207-214: ⚠️ Potential issue | 🟡 Minor

Stale inline ID format — uses underscore separator instead of dash.

Line 209 uses `${CHAIN_ID}_200`, which produces "10_200", but FeeToTickSpacingMappingId(8453, TICK_SPACING) at line 312 and FeeToTickSpacingMappingId(CHAIN_ID, tickSpacing) at line 520 produce dash-separated IDs (e.g., "10-200"). This line was missed during the migration to centralized ID helpers.

🐛 Suggested fix

```diff
 const mappingForTickSpacing200: FeeToTickSpacingMapping = {
-  id: `${CHAIN_ID}_200`,
+  id: FeeToTickSpacingMappingId(CHAIN_ID, 200n),
   chainId: CHAIN_ID,
   tickSpacing: 200n,
   fee: 300n,
   lastUpdatedTimestamp: new Date(1000000 * 1000),
 };
```
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/CLFactory/CLFactoryPoolCreatedLogic.test.ts` around lines 207 - 214, The test uses a stale underscore ID in mappingForTickSpacing200 (id: `${CHAIN_ID}_200`) which should use the centralized ID format; update the id to use the dash-separated helper or equivalent (e.g., replace `${CHAIN_ID}_200` with FeeToTickSpacingMappingId(CHAIN_ID, 200n) or `${CHAIN_ID}-200`) so it matches other usages like FeeToTickSpacingMappingId(8453, TICK_SPACING) and FeeToTickSpacingMappingId(CHAIN_ID, tickSpacing).test/EventHandlers/CLFactory.test.ts (1)
301-336: ⚠️ Potential issue | 🟠 Major

Bug: Token.set() return values discarded — tokens never added to the mock DB.

Lines 313–314 call .set() but don't capture the returned (updated) mock DB. Since the mock DB is immutable, the tokens are silently lost. The test still passes because the handler likely returns early due to missing tokens before it even checks for the mapping, so the test may not be validating the intended branch.

🐛 Proposed fix

```diff
-  mockDbWithoutMapping.entities.Token.set(token0ForBase);
-  mockDbWithoutMapping.entities.Token.set(token1ForBase);
+  let dbWithoutMapping = mockDbWithoutMapping;
+  dbWithoutMapping = dbWithoutMapping.entities.Token.set(token0ForBase);
+  dbWithoutMapping = dbWithoutMapping.entities.Token.set(token1ForBase);
```

Then use dbWithoutMapping in the CLGaugeConfig.set() call and processEvent() below.
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/CLFactory.test.ts` around lines 301 - 336, The test mistakenly treats the mock DB as mutable: calls to MockDb.createMockDb().entities.Token.set(...) discard the returned updated DB, so tokens are never actually stored; update the code to capture and reuse the immutable DB returned by each .set() call (e.g., dbWithoutMapping = dbWithoutMapping.entities.Token.set(token0ForBase) and then for token1ForBase), then use that updated db to call CLGaugeConfig.set(...) and pass it into CLFactory.PoolCreated.processEvent({ event: mockEvent, mockDb: dbWithoutMapping }); ensure you always assign the result of .set() back to the db variable before further modifications or calling processEvent().
🧹 Nitpick comments (19)
test/EventHandlers/ALM/LPWrapperLogic.test.ts (1)
487-487: Inconsistent ID formats: some tests still use the old ${srcAddress}_${chainId} format.

Lines 404 and 427 (loadALMLPWrapper tests) correctly use ALMLPWrapperId(chainId, srcAddress), but processDepositEvent (Line 487), processWithdrawEvent (Line 628), and several other tests still construct wrapperId using the old underscore format (${srcAddress}_${chainId}). Similarly, userStatsId uses the old format in most tests but the new UserStatsPerPoolId(...) in Line 896.

The tests pass today because mockResolvedValueOnce returns regardless of the argument, but the stale mockWrapper.id may cause subtle mismatches if downstream code relies on the wrapper's ID. Consider migrating these to use the centralized helpers for consistency.

Also applies to: 628-628
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/ALM/LPWrapperLogic.test.ts` at line 487, Replace hardcoded wrapperId and userStatsId string templates with the centralized ID helper functions to ensure consistency: in tests calling processDepositEvent and processWithdrawEvent (and other occurrences using `${srcAddress}_${chainId}`) stop constructing wrapperId as `${srcAddress}_${chainId}` and instead call ALMLPWrapperId(chainId, srcAddress); likewise replace userStatsId string templates with UserStatsPerPoolId(poolId, userAddress) where applicable; update any mocked objects (e.g., mockWrapper.id) to use these helper-generated IDs so mocks match the IDs used by the tested logic.test/Aggregators/VeNFTPoolVote.test.ts (1)
23-25: Mock VeNFTState.id uses old underscore format.

mockVeNFTState.id is ${chainId}_${tokenId} (i.e., 10_1), but the centralized VeNFTId from Constants now produces 10-1 (dash). While this doesn't break the current tests (mocks return unconditionally), it creates an inconsistency. Consider using VeNFTId(chainId, tokenId) here for consistency with the rest of the migration.

Proposed fix

```diff
+import { VeNFTId, VeNFTPoolVoteId } from "../../src/Constants";
-import { VeNFTPoolVoteId } from "../../src/Constants";

 const mockVeNFTState: VeNFTState = {
-  id: `${chainId}_${tokenId}`,
+  id: VeNFTId(chainId, tokenId),
   chainId,
```
Verify each finding against the current code and only fix it if needed. In `@test/Aggregators/VeNFTPoolVote.test.ts` around lines 23 - 25, The mock VeNFTState uses the old underscore id format; replace the hardcoded id `${chainId}_${tokenId}` in mockVeNFTState with the canonical generator VeNFTId(chainId, tokenId) so the test's VeNFTState.id matches the centralized VeNFTId format used elsewhere (update the mockVeNFTState object where VeNFTState.id is set).test/EventHandlers/NFPM/NFPMDecreaseLiquidityLogic.test.ts (1)
31-47: Use the NonFungiblePositionId() helper for mock id consistency with other NFPM tests.

Line 32 constructs the mock ID as ${chainId}_${poolAddress}_${tokenId}, but all other NFPM tests (NFPM.test.ts, NFPMTransferLogic.test.ts) and the Pool test fixtures use the NonFungiblePositionId() helper. Apply the helper for consistency:

♻️ Suggested update

```diff
+import { NonFungiblePositionId } from "../../../src/Constants";

 const mockPosition: NonFungiblePosition = {
-  id: `${chainId}_${poolAddress}_${tokenId}`,
+  id: NonFungiblePositionId(chainId, poolAddress, tokenId),
```
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/NFPM/NFPMDecreaseLiquidityLogic.test.ts` around lines 31 - 47, The mock position's id is hard-coded as `${chainId}_${poolAddress}_${tokenId}`; replace that construction with the shared helper NonFungiblePositionId(...) to match other NFPM tests. Locate the mockPosition object in NFPMDecreaseLiquidityLogic.test.ts and change its id field to call NonFungiblePositionId(chainId, poolAddress, tokenId) so the test uses the same ID format as NFPM.test.ts and NFPMTransferLogic.test.ts.test/Snapshots/TokenPriceSnapshot.test.ts (1)
21-85: Extract the repeated context/invocation boilerplate into a shared helper.

All three tests repeat the same three-step pattern: create context, call setTokenPriceSnapshot, access .mock.calls[0][0]. This violates the "Stop and centralize" rule. Additionally, only the first test asserts toHaveBeenCalledTimes(1).

♻️ Proposed refactor
- it("should set snapshot with id from TokenIdByBlock", () => { - const context = common.createMockContext({ - TokenPriceSnapshot: { set: jest.fn() }, - }); - - setTokenPriceSnapshot( - address, - chainId, - blockNumber, - lastUpdatedTimestamp, - pricePerUSDNew, - isWhitelisted, - context, - ); - - expect(context.TokenPriceSnapshot.set).toHaveBeenCalledTimes(1); - const setArg = (context.TokenPriceSnapshot.set as jest.Mock).mock - .calls[0][0]; - // TokenIdByBlock(chainId, address, blockNumber) => "chainId-address-blockNumber" - expect(setArg.id).toBe(TokenIdByBlock(chainId, address, blockNumber)); - }); - - it("should set snapshot with all passed fields", () => { - const context = common.createMockContext({ - TokenPriceSnapshot: { set: jest.fn() }, - }); - - setTokenPriceSnapshot( - address, - chainId, - blockNumber, - lastUpdatedTimestamp, - pricePerUSDNew, - isWhitelisted, - context, - ); - - const setArg = (context.TokenPriceSnapshot.set as jest.Mock).mock - .calls[0][0]; - expect(setArg.address).toBe(address); - expect(setArg.chainId).toBe(chainId); - expect(setArg.pricePerUSDNew).toBe(pricePerUSDNew); - expect(setArg.isWhitelisted).toBe(isWhitelisted); - expect(setArg.lastUpdatedTimestamp).toBe(lastUpdatedTimestamp); - }); - - it("should handle isWhitelisted false", () => { - const context = common.createMockContext({ - TokenPriceSnapshot: { set: jest.fn() }, - }); - - setTokenPriceSnapshot( - address, - chainId, - blockNumber, - lastUpdatedTimestamp, - pricePerUSDNew, - false, - context, - ); - - const setArg = (context.TokenPriceSnapshot.set as jest.Mock).mock - .calls[0][0]; - expect(setArg.isWhitelisted).toBe(false); - }); + function callSetSnapshot(isWhitelisted: boolean) { + const ctx = common.createMockContext({ + TokenPriceSnapshot: { set: jest.fn() }, + }); + setTokenPriceSnapshot( + address, + chainId, + blockNumber, + lastUpdatedTimestamp, + pricePerUSDNew, + isWhitelisted, + ctx, + ); + expect(ctx.TokenPriceSnapshot.set).toHaveBeenCalledTimes(1); + return (ctx.TokenPriceSnapshot.set as jest.Mock).mock.calls[0][0]; + } + + it("should set snapshot with id from TokenIdByBlock", () => { + const setArg = callSetSnapshot(true); + expect(setArg.id).toBe(TokenIdByBlock(chainId, address, blockNumber)); + }); + + it("should set snapshot with all passed fields", () => { + const setArg = callSetSnapshot(true); + expect(setArg.address).toBe(address); + expect(setArg.chainId).toBe(chainId); + expect(setArg.pricePerUSDNew).toBe(pricePerUSDNew); + expect(setArg.isWhitelisted).toBe(true); + expect(setArg.lastUpdatedTimestamp).toBe(lastUpdatedTimestamp); + }); + + it("should handle isWhitelisted false", () => { + const setArg = callSetSnapshot(false); + expect(setArg.isWhitelisted).toBe(false); + });As per coding guidelines: "If logic repeats across 2+ tests, extract it into a shared helper" and "Apply the 'Stop and centralize' rule."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/Snapshots/TokenPriceSnapshot.test.ts` around lines 21 - 85, Extract the repeated arrange/act/inspect into a test helper that creates the mock context, invokes setTokenPriceSnapshot(...) and returns the captured set argument (the first call's [0] arg) so the three tests call that helper instead of duplicating code; ensure the helper sets up TokenPriceSnapshot: { set: jest.fn() }, calls setTokenPriceSnapshot(address, chainId, blockNumber, lastUpdatedTimestamp, pricePerUSDNew, isWhitelisted, context), and returns (context.TokenPriceSnapshot.set as jest.Mock).mock.calls[0][0]; update each test to use this helper and keep the first test's expect(context.TokenPriceSnapshot.set).toHaveBeenCalledTimes(1) by either returning the context from the helper or asserting the call count before returning the setArg, and keep assertions referencing TokenIdByBlock(chainId, address, blockNumber) and the specific fields (address, chainId, pricePerUSDNew, isWhitelisted, lastUpdatedTimestamp).src/EventHandlers/RootCLPoolFactory.ts (1)
33-38: Verify this ID format change doesn't require a full resync before deploying.

The old ID was ${rootPoolAddress}_${rootChainId}_${leafPoolAddress}_${leafChainId} (underscore-separated, address-first); RootPoolLeafPoolId produces ${rootChainId}-${leafChainId}-${rootPoolAddress}-${leafPoolAddress} (hyphen-separated, chainId-first). These are structurally different strings.

If the indexer is running incrementally with existing RootPool_LeafPool records, deploying this change will cause the handler to write new hyphen-format IDs while old underscore-format records remain in the database — resulting in orphaned records or duplicates that silently diverge from real state. A full resync from genesis is required to avoid this.

Please confirm: is a full resync planned alongside this deployment, or are there mechanisms (e.g., a DB migration/cleanup script) to invalidate old RootPool_LeafPool rows before the new indexer version goes live?
Verify each finding against the current code and only fix it if needed. In `@src/EventHandlers/RootCLPoolFactory.ts` around lines 33 - 38, The ID string format produced by RootPoolLeafPoolId has changed from the old underscore/address-first format to a new hyphen/chainId-first format, which will leave existing RootPool_LeafPool records orphaned; either (A) revert RootPoolLeafPoolId to the previous `${rootPoolAddress}_${rootChainId}_${leafPoolAddress}_${leafChainId}` format to remain backward-compatible, or (B) add a migration/cleanup step that converts or deletes all existing RootPool_LeafPool rows to the new `${rootChainId}-${leafChainId}-${rootPoolAddress}-${leafPoolAddress}` format before deploying, or (C) make the handler accept both formats (lookup by new ID and fallback to old ID) and upsert using the canonical new ID so live data isn’t duplicated; update the code paths that call RootPoolLeafPoolId and any DB upsert/lookup logic to implement one of these options.test/EventHandlers/Pool/Swap.test.ts (1)
133-138: Consider extracting the repeated UserStatsPerPoolId call to a constant.

The same UserStatsPerPoolId(eventData.mockEventData.chainId, eventData.sender, eventData.mockEventData.srcAddress) expression appears in two separate test cases. Per the "Stop and centralize" guideline, a shared constant would reduce maintenance surface.

♻️ Proposed refactor

Add in the outer describe scope (after eventData is defined):

```diff
+  const swapUserStatsId = UserStatsPerPoolId(
+    eventData.mockEventData.chainId,
+    eventData.sender,
+    eventData.mockEventData.srcAddress,
+  );
```

Then replace both call sites with swapUserStatsId.

Also applies to: 203-208
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/Pool/Swap.test.ts` around lines 133 - 138, Extract the repeated UserStatsPerPoolId(...) expression into a shared constant named (for example) swapUserStatsId declared in the outer describe scope immediately after eventData is defined; assign it the value UserStatsPerPoolId(eventData.mockEventData.chainId, eventData.sender, eventData.mockEventData.srcAddress) and then replace both inline call sites (the two occurrences currently invoking UserStatsPerPoolId) with swapUserStatsId so the tests reference the centralized identifier.test/EventHandlers/Pool/Fees.test.ts (1)
76-81: Extract the repeated UserStatsPerPoolId computation to a describe-scoped constant.

UserStatsPerPoolId(10, toChecksumAddress("0x1234..."), toChecksumAddress("0x3333...")) is called with the same arguments in 6 places. Per the coding guidelines' "Stop and centralize" rule, this should be extracted.

♻️ Proposed refactor
Add after the
setupCommon()call on line 12:const { mockToken0Data, mockToken1Data, mockLiquidityPoolData } = setupCommon(); const poolId = mockLiquidityPoolData.id; + const feesSenderAddress = toChecksumAddress( + "0x1234567890123456789012345678901234567890", + ); + const feesUserStatsId = UserStatsPerPoolId( + 10, + feesSenderAddress, + mockLiquidityPoolData.poolAddress, + );Then replace all 6 occurrences with:
- UserStatsPerPoolId( - 10, - toChecksumAddress("0x1234567890123456789012345678901234567890"), - toChecksumAddress("0x3333333333333333333333333333333333333333"), - ), + feesUserStatsId,Based on coding guidelines: "Apply the 'Stop and centralize' rule: when copying large object literals, repeating calculations, repeating assertions, or repeating setup boilerplate."
Also applies to: 113-119, 202-207, 267-272, 332-337, 394-399
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/Pool/Fees.test.ts` around lines 76 - 81, Extract the repeated UserStatsPerPoolId(...) call into a describe-scoped constant: after the setupCommon() invocation create a const (e.g., const expectedUserStats = UserStatsPerPoolId(10, toChecksumAddress("0x1234567890123456789012345678901234567890"), toChecksumAddress("0x3333333333333333333333333333333333333333")) and then replace all six inline calls (the occurrences using UserStatsPerPoolId with those same toChecksumAddress arguments) with that constant; update tests at the cited locations to reference expectedUserStats so the repeated computation is centralized.test/Aggregators/LiquidityPoolAggregator.test.ts (1)
341-349: The snapshot assertion inherits timestamp = new Date() — prefer a fixed timestamp.

getSnapshotEpoch(timestamp) on Line 341 is derived from timestamp = new Date() (Line 212). While the test is internally consistent (creation and assertion both use the same variable), this creates a theoretical epoch-boundary flakiness and violates the guideline.

♻️ Proposed fix

In the outer beforeEach (line 212), replace the wall-clock time with a fixed value:

```diff
-    timestamp = new Date();
+    timestamp = new Date(1_700_000_000_000); // fixed: 2023-11-14T22:13:20Z
```

This makes the epochMs derivation in the new assertion fully deterministic.

The same new Date() pattern at Lines 209–210 (lastUpdatedTimestamp, lastSnapshotTimestamp) and in the snapshot/CL-pool tests (Lines 406–407, 446–447) should be updated to fixed timestamps as well.

Based on coding guidelines: "Avoid new Date() in assertions and fixtures unless using fake timers. Prefer fixed timestamps from shared helpers."
Verify each finding against the current code and only fix it if needed. In `@test/Aggregators/LiquidityPoolAggregator.test.ts` around lines 341 - 349, The test uses live Date() values (timestamp, lastUpdatedTimestamp, lastSnapshotTimestamp) which makes getSnapshotEpoch(timestamp) and the LiquidityPoolAggregatorSnapshotId assertion flaky; replace the Date() constructors in the outer beforeEach and any related fixtures (the variables timestamp, lastUpdatedTimestamp, lastSnapshotTimestamp and the similar ones in the CL-pool tests) with fixed, deterministic timestamps (e.g., a constant epoch ms or a shared test helper timestamp) so that getSnapshotEpoch(timestamp).getTime() yields a stable epochMs and the expect(...) assertions against LiquidityPoolAggregatorSnapshotId and snapshot fields are deterministic.src/Snapshots/ALMLPWrapperSnapshot.ts (1)
26-28: Duplicated address-extraction logic — consider a canonical utility or direct use of entity.chainId.

The identical string-parsing logic already exists as getWrapperAddressFromId in test/Snapshots/helpers.ts. The production code should not re-implement it inline; either:

- Export the utility from src/Constants.ts (or src/Snapshots/Shared.ts) and have the test helper import it, or
- Use entity.chainId directly to avoid the fragile split/join entirely:

♻️ Proposed refactor using `entity.chainId`
Verify each finding against the current code and only fix it if needed. In `@src/Snapshots/ALMLPWrapperSnapshot.ts` around lines 26 - 28, The inline address-extraction in ALMLPWrapperSnapshot (const wrapperAddress = entity.id.includes("-") ? entity.id.split("-").slice(1).join("-") : entity.id) duplicates getWrapperAddressFromId from test/Snapshots/helpers.ts; remove the ad-hoc parsing and either (A) import and use a single exported utility (move/export getWrapperAddressFromId from test helper into src/Constants.ts or src/Snapshots/Shared.ts and update tests to import it), or (B) simply use entity.chainId directly where appropriate (replace wrapperAddress usage with entity.chainId) to avoid fragile string splits; update imports/usages accordingly.test/Snapshots/ALMLPWrapperSnapshot.test.ts (1)
82-100: Optional: prefer toMatchObject for the field-spreading assertion.

Per coding guidelines, toMatchObject is preferred for partial matches when extra fields are noisy. The 8 individual toBe assertions could be consolidated:

♻️ Optional refactor
- expect(setArg.chainId).toBe(entity.chainId); - expect(setArg.pool).toBe(entity.pool); - expect(setArg.token0).toBe(entity.token0); - expect(setArg.token1).toBe(entity.token1); - expect(setArg.lpAmount).toBe(entity.lpAmount); - expect(setArg.liquidity).toBe(entity.liquidity); - expect(setArg.tickLower).toBe(entity.tickLower); - expect(setArg.tickUpper).toBe(entity.tickUpper); + expect(setArg).toMatchObject({ + chainId: entity.chainId, + pool: entity.pool, + token0: entity.token0, + token1: entity.token1, + lpAmount: entity.lpAmount, + liquidity: entity.liquidity, + tickLower: entity.tickLower, + tickUpper: entity.tickUpper, + });🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/Snapshots/ALMLPWrapperSnapshot.test.ts` around lines 82 - 100, Consolidate the eight individual expect(...).toBe(...) assertions into a single partial-match assertion using toMatchObject: after calling setALMLPWrapperSnapshot(...) and extracting setArg from context.ALM_LP_WrapperSnapshot.set.mock.calls[0][0], call expect(setArg).toMatchObject(...) with an object containing the properties chainId, pool, token0, token1, lpAmount, liquidity, tickLower, and tickUpper taken from the entity (common.mockALMLPWrapperData) so the test asserts the spread fields in one concise check.test/Snapshots/Shared.test.ts (1)
6-23: Hoist oneHourMs to the describe("getSnapshotEpoch") scope to reduce repetition.

const oneHourMs = SNAPSHOT_INTERVAL is repeated in three consecutive tests; it can live once at the describe level, just like the shouldSnapshot block does at line 32.

♻️ Proposed refactor
describe("getSnapshotEpoch", () => { + const oneHourMs = SNAPSHOT_INTERVAL; + it("should round timestamp down to the nearest hourly boundary", () => { - // 1 hour = SNAPSHOT_INTERVAL ms - const oneHourMs = SNAPSHOT_INTERVAL; const exactBoundary = new Date(oneHourMs * 3); // 03:00:00.000 expect(getSnapshotEpoch(exactBoundary).getTime()).toBe(oneHourMs * 3); }); it("should floor timestamps within an hour to the start of that hour", () => { - const oneHourMs = SNAPSHOT_INTERVAL; const midHour = new Date(oneHourMs * 2 + 30 * 60 * 1000); // 02:30:00.000 expect(getSnapshotEpoch(midHour).getTime()).toBe(oneHourMs * 2); }); it("should floor timestamps with milliseconds to the hour start", () => { - const oneHourMs = SNAPSHOT_INTERVAL; const withMs = new Date(oneHourMs * 5 + 59 * 60 * 1000 + 999); // 05:59:59.999 expect(getSnapshotEpoch(withMs).getTime()).toBe(oneHourMs * 5); });🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/Snapshots/Shared.test.ts` around lines 6 - 23, Move the repeated const oneHourMs = SNAPSHOT_INTERVAL out of the individual its and into the surrounding describe("getSnapshotEpoch") scope so all three tests reuse the single binding; update the describe block that contains the tests for getSnapshotEpoch to declare oneHourMs once (near existing shouldSnapshot scope) and remove the three per-test declarations inside the it blocks; this affects the getSnapshotEpoch tests and references SNAPSHOT_INTERVAL and oneHourMs only.src/Snapshots/LiquidityPoolAggregatorSnapshot.ts (1)
30-101: Fragile field-by-field copy — new aggregator fields will silently be missed in snapshots.

This manual copy of ~70 fields is a maintenance risk. When a new field is added to LiquidityPoolAggregator, the compiler won't warn if it's omitted here (since LiquidityPoolAggregatorSnapshot is a separate type). Consider a pattern that makes missing fields a compile-time error, e.g.:

♻️ Suggested approach using a mapping helper

```ts
// If the snapshot type is a strict subset of the aggregator, you could use a helper:
const { id: _id, lastUpdatedTimestamp: _ts, lastSnapshotTimestamp: _snap, ...rest } =
  liquidityPoolAggregator;

const snapshot: LiquidityPoolAggregatorSnapshot = {
  ...rest,
  id: snapshotId,
  timestamp: epoch,
};
```

This only works if the snapshot type's fields are an exact subset. If the generated types diverge, consider adding a type-level check (e.g., satisfies) to catch drift at compile time.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/Snapshots/LiquidityPoolAggregatorSnapshot.ts` around lines 30 - 101, The snapshot is built by manually copying many fields from liquidityPoolAggregator into a LiquidityPoolAggregatorSnapshot, which is fragile and will miss new fields; replace the field-by-field copy in the snapshot creation with a safe spread-based pattern that takes all snapshot-compatible fields from liquidityPoolAggregator and then overrides id and timestamp (use destructuring to pull out id/timestamps you want to replace and then do ...rest), and add a TypeScript compile-time check (for example using the satisfies operator or ensuring LiquidityPoolAggregatorSnapshot is a subset of the liquidityPoolAggregator type) so any future field additions cause a type error rather than silent omission; update the code at the snapshot construction site (where snapshotId, epoch, liquidityPoolAggregator and LiquidityPoolAggregatorSnapshot are used) to follow this pattern.test/Snapshots/LiquidityPoolAggregatorSnapshot.test.ts (1)
66-85: Field exclusion list may silently fall out of sync.

The hardcoded exclusion list (id, lastUpdatedTimestamp, lastSnapshotTimestamp, tickSpacing) must be manually updated whenever pool gains a new field that the snapshot intentionally omits. If a new pool-only field is added without updating the filter, the loop will either skip it (if it's undefined on setArg) or assert equality to a snapshot field that shouldn't exist — both silently correct but potentially masking omissions.

Consider extracting the exclusion set into a named constant (e.g., POOL_ONLY_FIELDS) near the top of the file so it's easier to keep in sync with schema changes, or assert the snapshot key count to catch new fields proactively.
Verify each finding against the current code and only fix it if needed. In `@test/Snapshots/LiquidityPoolAggregatorSnapshot.test.ts` around lines 66 - 85, Extract the hardcoded exclusion list used in snapshotKeysFromPool into a named constant (e.g., POOL_ONLY_FIELDS) near the top of the test file and use that constant in the filter instead of inline string checks; additionally, add an assertion validating the number of keys in the derived snapshot (from setArg) matches the expected number after exclusions to catch newly added pool-only fields. Update references to snapshotKeysFromPool to use POOL_ONLY_FIELDS and include the key-count assertion so any schema changes that add a pool-only field will fail the test until the constant is updated.test/EventHandlers/VotingReward/SuperchainIncentiveVotingReward.test.ts (1)
58-68: Consider using the createMockVeNFTState builder for the VeNFT fixture.

The VeNFTState is constructed from scratch with an as VeNFTState cast, which silently omits the new lastSnapshotTimestamp field. Using common.createMockVeNFTState({ chainId, tokenId, owner: userAddress, ... }) would keep it in sync with schema changes and avoid the cast.
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/VotingReward/SuperchainIncentiveVotingReward.test.ts` around lines 58 - 68, The VeNFT fixture is hand-constructed and cast to VeNFTState, which can omit new fields like lastSnapshotTimestamp; replace the manual object for veNFT with the builder common.createMockVeNFTState({ chainId, tokenId, owner: userAddress, locktime: 1000000n, totalValueLocked: 10000n, isAlive: true, lastUpdatedTimestamp: new Date(1000000 * 1000) }) so the created value matches VeNFTState shape (and retains fields such as lastSnapshotTimestamp) and remove the unsafe "as VeNFTState" cast; keep VeNFTId usage if needed to set id.test/EventHandlers/VeNFT/VeNFTLogic.test.ts (2)
432-440: Same inline-ID issue for UserStatsPerPool.

Line 434 uses a string literal for the
UserStatsPerPoolID instead ofUserStatsPerPoolId(10, "0x1111111111111111111111111111111111111111", "0xpool1").🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/VeNFT/VeNFTLogic.test.ts` around lines 432 - 440, The test uses a hardcoded string for the UserStatsPerPool ID which can break ID construction; update the existingUserStats fixture in VeNFTLogic.test.ts (the existingUserStats object) to call UserStatsPerPoolId(10, "0x1111111111111111111111111111111111111111", "0xpool1") instead of the literal "10-0x1111111111111111111111111111111111111111-0xpool1", ensuring the test imports/uses the UserStatsPerPoolId helper used elsewhere so the ID format matches the production constructor.
318-340: Inconsistent ID construction: mix of string literals and centralized helpers.

Line 294 correctly uses VeNFTPoolVoteId(10, 1n, "0xpool1"), but lines 320 and 327 use string literals ("10-1-0xpool0", "10-1-0xpool1"). Similarly, lines 339–340 build UserStatsPerPool IDs with template strings instead of UserStatsPerPoolId(...).

Since this PR's goal is to centralize ID construction, these should also use the helpers for consistency.
♻️ Suggested fix
{ - id: "10-1-0xpool0", + id: VeNFTPoolVoteId(10, 1n, "0xpool0"), poolAddress: "0xpool0", veNFTamountStaked: 0n, veNFTState_id: mockVeNFTState.id, lastUpdatedTimestamp: new Date(0), }, { - id: "10-1-0xpool1", + id: VeNFTPoolVoteId(10, 1n, "0xpool1"), poolAddress: "0xpool1", veNFTamountStaked: 50n, veNFTState_id: mockVeNFTState.id, lastUpdatedTimestamp: new Date(0), }, ] as VeNFTPoolVote[]; ... - const previousOwnerId = `10-${mockVeNFTState.owner}-0xpool1`; - const newOwnerId = `10-${mockTransferEvent.params.to}-0xpool1`; + const previousOwnerId = UserStatsPerPoolId(10, mockVeNFTState.owner, "0xpool1"); + const newOwnerId = UserStatsPerPoolId(10, mockTransferEvent.params.to, "0xpool1");Add
UserStatsPerPoolIdto the import from../../../src/Constants.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/VeNFT/VeNFTLogic.test.ts` around lines 318 - 340, Replace the hard-coded ID strings in the test with the centralized ID helpers: change poolVotes entries and the variables previousOwnerId/newOwnerId to use VeNFTPoolVoteId(10, 1n, "0xpool0") and VeNFTPoolVoteId(10, 1n, "0xpool1") respectively (instead of "10-1-0xpool0"/"10-1-0xpool1"), and replace the template-built UserStatsPerPool IDs with UserStatsPerPoolId(...) calls; also add UserStatsPerPoolId to the import from ../../../src/Constants so the test consistently uses VeNFTPoolVoteId and UserStatsPerPoolId helpers.test/EventHandlers/Pool/common.ts (1)
406-414: Fragile ID parsing in createMockVeNFTPoolVote.

Splitting veNFTStateId on "-" to reverse-engineer chainId and tokenId is brittle — it'll break if the ID format changes or if tokenId itself contains a -. Consider accepting chainId and tokenId as explicit override fields instead of parsing them from the composite string.

♻️ Suggested approach
function createMockVeNFTPoolVote( overrides: Partial<VeNFTPoolVote> = {}, ): VeNFTPoolVote { - const veNFTStateId = overrides.veNFTState_id ?? mockVeNFTStateData.id; - const [chainIdPart, tokenIdPart] = veNFTStateId.split("-"); - const chainId = chainIdPart ? Number(chainIdPart) : CHAIN_ID; - const tokenId = tokenIdPart ? BigInt(tokenIdPart) : 1n; + const chainId = CHAIN_ID; + const tokenId = 1n; + const veNFTStateId = overrides.veNFTState_id ?? VeNFTId(chainId, tokenId); const poolAddress = toChecksumAddress( overrides.poolAddress ?? POOL_ADDRESS, ); const id = overrides.id ?? VeNFTPoolVoteId(chainId, tokenId, poolAddress);If callers need different
chainId/tokenId, they can supplyveNFTState_idandidtogether explicitly.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/Pool/common.ts` around lines 406 - 414, The current createMockVeNFTPoolVote function brittlely parses overrides.veNFTState_id to derive chainId and tokenId; update the function to accept explicit override fields (e.g., overrides.chainId and overrides.tokenId) and use them to compute chainId and tokenId when provided, falling back to the existing veNFTState_id parsing only if those explicit overrides are absent, then use those values when calling VeNFTPoolVoteId; reference the veNFTState_id, chainId, tokenId, and VeNFTPoolVoteId symbols when making this change so callers can supply numeric chainId/BigInt tokenId directly instead of relying on string splitting.test/EventHandlers/Voter/Voter.test.ts (1)
263-265: Use the RootPoolLeafPoolId helper instead of an inline template string.

This PR introduces RootPoolLeafPoolId(rootChainId, leafChainId, rootPoolAddress, leafPoolAddress) in src/Constants.ts (Lines 645–650), but here the ID is still manually constructed via a template literal. The same issue exists at Line 665. Using the helper keeps the test consistent with the centralization goal.

♻️ Proposed fix (both occurrences)
Add to the imports at line 13:
import { CHAIN_CONSTANTS, PoolId, TokenId, UserStatsPerPoolId, VeNFTId, VeNFTPoolVoteId, + RootPoolLeafPoolId, toChecksumAddress, } from "../../../src/Constants";Line 264:
- id: `${rootChainId}-${leafChainId}-${rootPoolAddress}-${leafPoolAddress}`, + id: RootPoolLeafPoolId(rootChainId, leafChainId, rootPoolAddress, leafPoolAddress),Line 665:
- id: `${rootChainId}-${leafChainId}-${rootPoolAddress}-${leafPoolAddress}`, + id: RootPoolLeafPoolId(rootChainId, leafChainId, rootPoolAddress, leafPoolAddress),🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/Voter/Voter.test.ts` around lines 263 - 265, Replace the inline template-string ID construction with the new RootPoolLeafPoolId helper: import RootPoolLeafPoolId from where constants are exported (add to the test imports) and change the id field for the rootPoolLeafPool object (and the second occurrence in the file) from `${rootChainId}-${leafChainId}-${rootPoolAddress}-${leafPoolAddress}` to RootPoolLeafPoolId(rootChainId, leafChainId, rootPoolAddress, leafPoolAddress); keep all other fields the same.test/EventHandlers/ALM/DeployFactoryV1.test.ts (1)
63-78: Consider using createMockNonFungiblePosition from common.ts to reduce duplication.

The NonFungiblePosition object literal is repeated in five test cases with minor variations. The new createMockNonFungiblePosition builder in test/EventHandlers/Pool/common.ts (Lines 292–312) was introduced in this PR and could reduce the boilerplate here.
Verify each finding against the current code and only fix it if needed. In `@test/EventHandlers/ALM/DeployFactoryV1.test.ts` around lines 63 - 78, Replace the repeated NonFungiblePosition object literals (e.g., mockNFPM created using NonFungiblePositionId, chainId, poolAddress, tokenId, etc.) with the new builder createMockNonFungiblePosition from test/EventHandlers/Pool/common.ts; call createMockNonFungiblePosition with the differing fields (owner, token0, token1, liquidity, mintTransactionHash, mintLogIndex, lastUpdatedTimestamp, tickUpper/tickLower) to produce the mock, and update all five test cases to use that helper to remove duplication while preserving the unique values passed into the builder.
```ts
expect(setArg.locktime).toBe(entity.locktime);
expect(setArg.lastUpdatedTimestamp).toEqual(entity.lastUpdatedTimestamp);
expect(setArg.isAlive).toBe(entity.isAlive);
expect(setArg.lastSnapshotTimestamp).toEqual(entity.lastSnapshotTimestamp);
```
Trivially-true assertion provides no coverage.
VeNFTStateSnapshot.ts does not include lastSnapshotTimestamp in the snapshot payload, and createMockVeNFTState() defaults it to undefined. Both sides of this assertion are always undefined, so it can never fail and exercises nothing.
Either:
- If the field should be propagated: add it to the snapshot object in VeNFTStateSnapshot.ts and initialize the mock with a non-undefined value.
- If the field intentionally belongs only on the source entity (not the snapshot record): remove this assertion.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@test/Snapshots/VeNFTStateSnapshot.test.ts` at line 78, The test's assertion
is trivially true because createMockVeNFTState() defaults lastSnapshotTimestamp
to undefined and VeNFTStateSnapshot.ts does not include that field; decide
whether lastSnapshotTimestamp should be part of the snapshot or not: if it
should be propagated, update the snapshot builder in VeNFTStateSnapshot.ts to
include lastSnapshotTimestamp in the snapshot payload and change
createMockVeNFTState() to initialize lastSnapshotTimestamp to a non-undefined
value so
expect(setArg.lastSnapshotTimestamp).toEqual(entity.lastSnapshotTimestamp) is
meaningful; otherwise remove the pointless assertion from the test (the expect
on setArg.lastSnapshotTimestamp) to avoid asserting two undefined values.
```ts
export function shouldSnapshot(
  lastSnapshotTimestamp: Date | undefined,
  currentTimestamp: Date,
): boolean {
  if (!lastSnapshotTimestamp) return true;
  return (
    getSnapshotEpoch(currentTimestamp).getTime() >
    getSnapshotEpoch(lastSnapshotTimestamp).getTime()
  );
}
```
[question] I was of the understanding that a snapshot would be taken at least on regular intervals i.e. every hour or week and so the check should be
getSnapshotEpoch(currentTimestamp).getTime() >
getSnapshotEpoch(lastSnapshotTimestamp).getTime() + SNAPSHOT_INTERVAL (i.e. 1 day or hour)
The current logic seems to take a snapshot on every update - is that intended?
We only take a snapshot when the hour (epoch) of the current event is later than the hour of the last snapshot. Many events in the same hour don’t create new snapshots.
Example
- When we call shouldSnapshot: Some handler has just updated an entity (e.g. pool or user stats) and is about to persist it. Before saving, it calls:
shouldSnapshot(lastSnapshotTimestamp, currentTimestamp)
lastSnapshotTimestamp – value stored on the entity: “last time we wrote a snapshot” (or undefined if we never have).
currentTimestamp – time of the event we’re processing (e.g. block timestamp of the swap/sync).
We only want to create a snapshot when the current event is in a later “hour bucket” than the last time we snapshotted.
- First check: never snapshotted
If lastSnapshotTimestamp is undefined, we return true immediately: take the first snapshot. No getSnapshotEpoch is used here.
- When we do use getSnapshotEpoch: If we have a lastSnapshotTimestamp, we need to compare "which hour" it belongs to vs "which hour" the current event belongs to.
getSnapshotEpoch(currentTimestamp)
Takes the event time and rounds it down to the start of its interval.
With 1-hour intervals (SNAPSHOT_INTERVAL = 60 * 60 * 1000 ms):
e.g. 11:23:45 → 11:00:00.000
So we get the epoch (interval start) of the current event.
getSnapshotEpoch(lastSnapshotTimestamp)
Same thing for the “last snapshot” time: round down to the start of that interval.
e.g. 10:30:00 (last time we snapshotted) → 10:00:00.000
So we get the epoch we last snapshotted in.
We then compare the two epoch times (as numbers):
currentEpochMs > lastEpochMs → we’re in a new interval → return true (take a snapshot).
Otherwise (same interval or earlier) → return false (no snapshot).
So the flow is: enter shouldSnapshot → if we have no last snapshot, snapshot; otherwise reduce both timestamps to their interval starts with getSnapshotEpoch and only snapshot when the current interval is strictly after the last one.
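A minimal sketch of the two helpers as described above (the shouldSnapshot body matches the diff earlier in the thread; the getSnapshotEpoch body is assumed from this description, with a 1-hour SNAPSHOT_INTERVAL):

```ts
const SNAPSHOT_INTERVAL = 60 * 60 * 1000; // 1 hour in ms

// Round a timestamp down to the start of the interval ("epoch") it falls in.
function getSnapshotEpoch(timestamp: Date): Date {
  return new Date(
    Math.floor(timestamp.getTime() / SNAPSHOT_INTERVAL) * SNAPSHOT_INTERVAL,
  );
}

function shouldSnapshot(
  lastSnapshotTimestamp: Date | undefined,
  currentTimestamp: Date,
): boolean {
  if (!lastSnapshotTimestamp) return true; // never snapshotted before
  return (
    getSnapshotEpoch(currentTimestamp).getTime() >
    getSnapshotEpoch(lastSnapshotTimestamp).getTime()
  );
}

// e.g. events at 11:23 and 11:59 share the 11:00 epoch, so only the first one snapshots;
// the first event at or after 12:00 lands in a new epoch and snapshots again.
```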
All good - I misread the wrapper getSnapshotEpoch in the boolean expression 👍
🧹 Nitpick comments (1)
src/Constants.ts (1)
707-712: Inconsistent snapshot ID composition — consider reusing base entity ID helpers.
LiquidityPoolAggregatorSnapshotId(line 698) delegates toPoolId(), butUserStatsPerPoolSnapshotId,ALMLPWrapperSnapshotId, andVeNFTStateSnapshotIdinline their format strings. If a base entity ID format ever changes, these snapshot generators won't pick it up automatically.♻️ Proposed consistent composition
-export const UserStatsPerPoolSnapshotId = ( - chainId: number, - userAddress: string, - poolAddress: string, - epochMs: number, -) => `${chainId}-${userAddress}-${poolAddress}-${epochMs}`; +export const UserStatsPerPoolSnapshotId = ( + chainId: number, + userAddress: string, + poolAddress: string, + epochMs: number, +) => `${UserStatsPerPoolId(chainId, userAddress, poolAddress)}-${epochMs}`; -export const ALMLPWrapperSnapshotId = ( - chainId: number, - wrapperAddress: string, - epochMs: number, -) => `${chainId}-${wrapperAddress}-${epochMs}`; +export const ALMLPWrapperSnapshotId = ( + chainId: number, + wrapperAddress: string, + epochMs: number, +) => `${ALMLPWrapperId(chainId, wrapperAddress)}-${epochMs}`; -export const VeNFTStateSnapshotId = ( - chainId: number, - tokenId: bigint, - epochMs: number, -) => `${chainId}-${tokenId}-${epochMs}`; +export const VeNFTStateSnapshotId = ( + chainId: number, + tokenId: bigint, + epochMs: number, +) => `${VeNFTId(chainId, tokenId)}-${epochMs}`;Also applies to: 732-736, 744-748
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/Constants.ts` around lines 707 - 712, Update the snapshot ID helpers to compose from the existing base entity ID functions instead of inlining format strings: replace the body of UserStatsPerPoolSnapshotId, ALMLPWrapperSnapshotId, and VeNFTStateSnapshotId to call the appropriate base ID helper (e.g., PoolId, UserId, ALMLPWrapperId, VeNFTId or similarly named functions used elsewhere such as in LiquidityPoolAggregatorSnapshotId) and append the epochMs suffix (like `${baseId}-${epochMs}`), so any future changes to base ID formats are picked up automatically.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@src/Constants.ts`:
- Around line 707-712: Update the snapshot ID helpers to compose from the
existing base entity ID functions instead of inlining format strings: replace
the body of UserStatsPerPoolSnapshotId, ALMLPWrapperSnapshotId, and
VeNFTStateSnapshotId to call the appropriate base ID helper (e.g., PoolId,
UserId, ALMLPWrapperId, VeNFTId or similarly named functions used elsewhere such
as in LiquidityPoolAggregatorSnapshotId) and append the epochMs suffix (like
`${baseId}-${epochMs}`), so any future changes to base ID formats are picked up
automatically.
* chore: centralising ID entity creation
* chore: additional ID changes + coderabbit suggestions
* slight changes
* more refactor changes
* Co-authored-by: Woo Sung Dong <88306711+WooSungD@users.noreply.github.com>
* some changes
Implements:
Summary by CodeRabbit
New Features
lastSnapshotTimestamp to indicate when they were last captured in a snapshot.

Documentation