
Add Generative Graphs UI and Data Analysis Tool #464

Open
ngoiyaeric wants to merge 50 commits into main from
feature/generative-graphs-ui-data-preview-17358342468692938721

Conversation

@ngoiyaeric
Collaborator

@ngoiyaeric ngoiyaeric commented Jan 31, 2026

User description

This PR introduces generative graphing capabilities to the QCX platform.

Key changes:

  1. Recharts Integration: Added recharts to dependencies to support Bar, Line, Pie, Area, and Scatter charts.
  2. Data Analysis Tool: Created a new AI tool dataAnalysis that generates structured data for charts based on user prompts or uploaded files.
  3. Enhanced File Support: The chat interface now accepts .csv and .json files, which are processed and made available to the AI for analysis.
  4. Real-time Visualization: Updated the researcher agent and tool pipeline to stream chart components to the UI in real-time.
  5. Geospatial Awareness: If the analysis data contains geographic coordinates, the system automatically updates the map view via the MapQueryHandler.
  6. Robust UI Rendering: Added the GraphSection component, which handles both streaming and static (hydrated) data; an illustrative payload sketch follows this list.
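
For orientation, here is a minimal sketch of the payload the dataAnalysis tool emits and that GraphSection renders. The field names follow the DataAnalysisResult type added in lib/types/index.ts; the sample values, the monthlySales variable, and the '@/lib/types' import path are illustrative assumptions, not code from this PR.

// Illustrative payload only; the shape mirrors DataAnalysisResult from this PR,
// but the values and the import alias are assumptions.
import type { DataAnalysisResult } from '@/lib/types'

const monthlySales: DataAnalysisResult = {
  title: 'Monthly Sales',
  description: 'Totals parsed from an uploaded CSV',
  chartType: 'bar',
  data: [
    { month: 'Jan', sales: 120 },
    { month: 'Feb', sales: 180 }
  ],
  config: {
    xAxisKey: 'month',
    series: [{ key: 'sales', name: 'Sales' }]
  },
  // Optional: when present, the UI also triggers a map update via MapQueryHandler.
  geospatial: [{ latitude: 47.6, longitude: -122.3, label: 'Seattle office' }]
}

// GraphSection accepts a resolved object, a JSON string, or a streamed value:
// <GraphSection result={monthlySales} />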

PR created automatically by Jules for task 17358342468692938721 started by @ngoiyaeric


PR Type

Enhancement


Description

  • Added generative graphing capabilities with Recharts integration

  • Created dataAnalysis tool for structured chart generation from data

  • Extended file upload support to CSV and JSON formats

  • Integrated geospatial awareness for location-based analysis results

  • Added GraphSection component for streaming and static chart rendering


Diagram Walkthrough

flowchart LR
  A["User Input<br/>CSV/JSON Files"] --> B["Chat Panel<br/>File Upload"]
  B --> C["Data Analysis Tool<br/>Zod Schema"]
  C --> D["GraphSection Component<br/>Recharts Charts"]
  D --> E["UI Rendering<br/>Bar/Line/Pie/Area/Scatter"]
  C --> F["Geospatial Data<br/>Coordinates"]
  F --> G["MapQueryHandler<br/>Map Integration"]

File Walkthrough

Relevant files

Enhancement (7 files)
  • lib/types/index.ts: Added DataAnalysisResult type definition (+21/-0)
  • app/actions.tsx: Extended file support and added chart UI handling (+32/-1)
  • components/chat-panel.tsx: Updated file input to accept CSV and JSON (+1/-1)
  • components/graph-section.tsx: New component for rendering Recharts visualizations (+163/-0)
  • lib/agents/tools/data-analysis.tsx: New tool for data analysis and chart generation (+19/-0)
  • lib/agents/tools/index.tsx: Registered dataAnalysis tool in tool pipeline (+5/-0)
  • lib/schema/data-analysis.tsx: Zod schema for data analysis tool parameters (+25/-0)

Documentation (1 file)
  • lib/agents/researcher.tsx: Added dataAnalysis tool guidelines to agent instructions (+11/-4)

Dependencies (1 file)
  • package.json: Added recharts dependency for chart rendering (+1/-0)

Summary by CodeRabbit

Release Notes

  • New Features
    • Added interactive data visualization with bar, line, pie, area, and scatter chart types
    • Extended file upload support to include CSV and JSON file formats
    • Integrated geospatial mapping features with data visualizations for location-based analysis
    • Improved resolution search processing with streamlined results handling

CJWTRUST and others added 14 commits January 20, 2026 01:20
Restored the branch to the expected head at commit 488b47c.
This recovers several missing features and architectural improvements:
- Integrated HistoryToggleProvider and HistorySidebar.
- Integrated UsageToggleProvider and the new UsageView component.
- Removed legacy usage-sidebar.tsx.
- Restored pricing and feature updates in PurchaseCreditsPopup.
- Fixed layout structure in app/layout.tsx to support these new global state providers.

Resolved previous merge conflicts and incorrect force-push state.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Reset branch HEAD to historical recovery point 488b47c.
- Applied latest code fixes from orphan commit 166aee9, including Mapbox rendering optimizations and GeoJSON streaming.
- Improved type safety in app/actions.tsx by removing 'as any' casts and properly typing message arrays.
- Refined getModel utility to support vision-aware model fallback.
- Documented non-blocking background task pattern in server actions.
- Updated feature/billing-integration branch with the restored state.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Reset branch HEAD to historical recovery point 488b47c.
- Applied latest code fixes from orphan commit 166aee9.
- Resolved TypeScript build error in app/actions.tsx by casting GeoJSON data to FeatureCollection.
- Improved type safety for AIState message filtering and content handling.
- Implemented background processing for resolution search with immediate UI streaming.
- Optimized chat history fetching to trigger only when the sidebar is open.
- Ensured Mapbox style loading is robust against re-renders.
- Cleaned up dead code and improved documentation in server actions.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Reset branch HEAD to historical recovery point 488b47c.
- Applied latest code fixes from orphan commit 166aee9.
- Resolved TypeScript build error in app/actions.tsx by casting GeoJSON data to FeatureCollection.
- Improved type safety for AIState message filtering and content handling.
- Implemented background processing for resolution search with immediate UI streaming.
- Re-enabled auto-opening of the pricing popup in components/header.tsx.
- Optimized chat history fetching to trigger only when the sidebar is open.
- Ensured Mapbox style loading is robust against re-renders.
- Cleaned up dead code and improved documentation in server actions.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Restored historical state from 488b47c and merged latest changes.
- Implemented streaming for resolution search in lib/agents/resolution-search.tsx and app/actions.tsx to improve response time.
- Re-enabled auto-opening pricing popup in components/header.tsx.
- Resolved all TypeScript build errors in app/actions.tsx.
- Restored Grok model support for vision tasks in lib/utils/index.ts.
- Optimized chat history loading in components/sidebar/chat-history-client.tsx.
- Improved Mapbox style loading robustness in components/map/mapbox-map.tsx.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Synchronized branch with origin/main, prioritizing main's code structure and latest features.
- Bridged the history with historical recovery point 488b47c.
- Implemented streaming for resolution search in lib/agents/resolution-search.tsx and app/actions.tsx to resolve performance issues.
- Restored the auto-opening Pricing Popup and Usage View in the Header component.
- Integrated the Timezone Clock and time context into the restored resolution search logic.
- Resolved TypeScript build errors with proper type casting and fixed a missing 'use client' directive in components/history.tsx.
- Ensured all required providers (History, Usage, etc.) are correctly wrapped in the root layout.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Synchronized branch with origin/main, prioritizing main's code structure and latest features.
- Bridged the history with historical recovery point 488b47c.
- Integrated UsageView and billing UI into the Chat component, toggled by the tent tree icon.
- Implemented streaming for resolution search in lib/agents/resolution-search.tsx and app/actions.tsx for better performance.
- Restored the auto-opening Pricing Popup in the Header component.
- Improved type safety across server actions and Mapbox components.
- Ensured mutual exclusion between Settings, Usage, and Map views.
- Fixed a missing 'use client' directive in components/history.tsx.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Integrated UsageView into the main Chat component for both mobile and desktop.
- Ensured mutual exclusion between Settings, Usage, and Map views in the UI panel.
- Updated ConditionalLottie to hide the loading animation when the Usage View is open.
- Synchronized with origin/main while prioritizing its code structure.
- Maintained historical recovery and performance optimizations for resolution search.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Synchronized with origin/main, prioritizing main branch code and features.
- Fully restored historical context and missing changes from commit 488b47c.
- Integrated Usage and Billing UI (UsageView) into the Chat component.
- Implemented streaming for resolution search analysis to improve perceived performance.
- Re-enabled auto-opening pricing popup in components/header.tsx.
- Refined ConditionalLottie visibility to avoid overlaps with the Usage UI.
- Ensured mutual exclusion between Settings, Usage, and Map views in the side panel.
- Improved type safety across server actions and Mapbox components.
- Resolved build failures related to missing client directives and type mismatches.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
…ability

- Recovered historical state from commit 488b47c and synchronized with main.
- Optimized resolution search by refactoring to stream partial summaries.
- Fixed Mapbox memory leaks by ensuring all event listeners are removed on cleanup.
- Improved StreamableValue stability by initializing with default values.
- Integrated Usage View with mutual exclusion logic and Lottie player visibility fixes.
- Refined model selection for Grok vision and Gemini 1.5 Pro.
- Integrated timezone-aware analysis using tz-lookup.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Fixed Type error in `app/actions.tsx` by casting `content` to `string` in `createStreamableValue`.
- Addressed ESLint warnings in `components/map/mapbox-map.tsx` and `components/chat-panel.tsx` by adding missing dependencies to `useEffect` and `useCallback` hooks.
- Ensured `relatedQueries` streamable value is initialized with an empty state for stability.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
- Integrated Recharts for data visualization
- Added dataAnalysis tool with Zod schema for chart configuration
- Extended chat file uploads to support CSV and JSON files
- Updated researcher agent with guidelines for data visualization
- Integrated MapQueryHandler for geospatial data in analysis results
- Ensured real-time UI streaming for chart components

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

@vercel
Contributor

vercel bot commented Jan 31, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
qcx Error Error Feb 3, 2026 2:18pm

@charliecreates charliecreates bot requested a review from CharlieHelps January 31, 2026 09:44
@CLAassistant

CLAassistant commented Jan 31, 2026

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

✅ ngoiyaeric
❌ google-labs-jules[bot]
You have signed the CLA already but the status is still pending? Let us recheck it.

@coderabbitai
Contributor

coderabbitai bot commented Jan 31, 2026

Walkthrough

This PR introduces a data analysis and visualization feature with a new GraphSection component for rendering charts, a dataAnalysisTool for agents, and a simplified resolutionSearch signature, while adding support for CSV and JSON file inputs.

Changes

  • Type and Schema Definitions (lib/types/index.ts, lib/schema/data-analysis.tsx): Added new DataAnalysisResult type and corresponding Zod schema with fields for chart configuration, data arrays, axis keys, series definitions, and optional geospatial points.
  • Data Visualization Component (components/graph-section.tsx): New React component using Recharts to render bar, line, pie, area, or scatter charts; handles static, string, and StreamableValue inputs with loading states and responsive containers.
  • Agent Tool and Configuration (lib/agents/tools/data-analysis.tsx, lib/agents/tools/index.tsx): New dataAnalysisTool that streams chart rendering via GraphSection; integrated into the getTools export alongside existing tools.
  • Agent Decision Flow (lib/agents/researcher.tsx): Updated system prompt to introduce data analysis/visualization as step 2 in the decision flow; geospatial queries moved to step 3; adjusted subsequent flow steps and added explicit dataAnalysis examples.
  • Main Application Flow (app/actions.tsx): Simplified the resolutionSearch call signature by removing the drawnFeatures parameter; updated to handle text/csv/json file inputs; integrated GraphSection rendering and MapQueryHandler for geospatial results; replaced streaming aggregation with direct analysisResult consumption.
  • File Input Enhancement (components/chat-panel.tsx): Updated the file input accept attribute to include text/csv and application/json formats.
  • Resolution Search Logic (lib/agents/resolution-search.tsx): Removed the DrawnFeature type and parameter; replaced streamObject with generateObject for full object retrieval; simplified timezone handling and return flow.
  • Dependencies (package.json): Added recharts (^3.7.0) dependency for chart rendering.

Sequence Diagram

sequenceDiagram
    participant User
    participant App as App/Actions
    participant DataAnalysisTool
    participant UIStream
    participant GraphSection
    participant Recharts

    User->>App: Submit data for analysis
    App->>DataAnalysisTool: Execute with data
    activate DataAnalysisTool
    DataAnalysisTool->>UIStream: Create streamable value
    DataAnalysisTool->>UIStream: Append GraphSection component
    DataAnalysisTool->>GraphSection: Pass streamable result
    GraphSection->>GraphSection: Detect and resolve data
    GraphSection->>Recharts: Render chart (bar/line/pie/area/scatter)
    Recharts->>GraphSection: Display visualization
    DataAnalysisTool->>UIStream: Mark stream as done
    deactivate DataAnalysisTool
    UIStream-->>User: Display chart in UI

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

Suggested reviewers

  • CharlieHelps

Poem

🐰✨ A chart so fine, with bars that align,
Data dances in colors divine!
Graphs leap and bound through Recharts' domain,
Visualizing insights without a strain! 📊🎨

🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 16.67%, which is insufficient; the required threshold is 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
  • Description Check (✅ Passed): Check skipped - CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The title directly and clearly describes the main addition to the codebase: a new generative graphs UI component and data analysis tool for creating charts from data.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing touches
  • 📝 Generate docstrings
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Post copyable unit tests in a comment
  • Commit unit tests in branch feature/generative-graphs-ui-data-preview-17358342468692938721

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@qodo-code-review
Contributor

qodo-code-review bot commented Jan 31, 2026

ⓘ You're approaching your monthly quota for Qodo. Upgrade your plan

PR Compliance Guide 🔍

Below is a summary of compliance checks for this PR:

Security Compliance
Unbounded file ingestion

Description: The PR expands accepted uploads to text/plain, text/csv, and application/json and eagerly
reads each file fully into memory (Buffer.from(buffer).toString('utf-8')) and prepends it
into the message text, which can be abused with very large files to cause memory/CPU
exhaustion (availability/DoS) during request handling.
actions.tsx [293-310]

Referred Code
  const dataUrl = `data:${file.type};base64,${Buffer.from(
    buffer
  ).toString('base64')}`
  messageParts.push({
    type: 'image',
    image: dataUrl,
    mimeType: file.type
  })
} else if (
  file.type === 'text/plain' ||
  file.type === 'text/csv' ||
  file.type === 'application/json'
) {
  const textContent = Buffer.from(buffer).toString('utf-8')
  const existingTextPart = messageParts.find(p => p.type === 'text')
  if (existingTextPart) {
    existingTextPart.text = `${textContent}\n\n${existingTextPart.text}`
  } else {
Unvalidated JSON parsing

Description: GraphSection performs JSON.parse on a string result without validation or bounds checks,
so a malicious or unexpectedly large tool/stream payload could trigger client-side crashes
or heavy parsing work (availability/DoS) when rendering charts.
graph-section.tsx [45-48]

Referred Code
const chartData: DataAnalysisResult | undefined = typeof data === 'string'
  ? JSON.parse(data)
  : data as DataAnalysisResult
Ticket Compliance
🎫 No ticket provided
  • Create ticket/issue
Codebase Duplication Compliance
Codebase context is not defined

Follow the guide to enable codebase context checks.

Custom Compliance
🟢
Generic: Secure Logging Practices

Objective: To ensure logs are useful for debugging and auditing without exposing sensitive
information like PII, PHI, or cardholder data.

Status: Passed

Learn more about managing compliance generic rules or creating your own custom rules

🔴
Generic: Robust Error Handling and Edge Case Management

Objective: Ensure comprehensive error handling that provides meaningful context and graceful
degradation

Status:
Unhandled parse/shape: The component calls JSON.parse on result and assumes required fields like config.series[0]
and scatter config.yAxisKey exist without guards, which can crash the UI on
malformed/partial tool output.

Referred Code
const chartData: DataAnalysisResult | undefined = typeof data === 'string'
  ? JSON.parse(data)
  : data as DataAnalysisResult

if (pending && !chartData) {
  return (
    <Section className="py-2">
      <div className="animate-pulse flex space-y-4 flex-col">
        <div className="h-4 bg-muted rounded w-3/4"></div>
        <div className="h-64 bg-muted rounded"></div>
      </div>
    </Section>
  )
}

if (!chartData) return null

const { title, description, chartType, data: plotData, config } = chartData

const renderChart = () => {
  switch (chartType) {


 ... (clipped 75 lines)

Learn more about managing compliance generic rules or creating your own custom rules

Generic: Secure Error Handling

Objective: To prevent the leakage of sensitive system information through error messages while
providing sufficient detail for internal debugging.

Status:
User-visible exceptions: A failed JSON.parse or missing chart config fields will likely surface as a client-side
exception/stack trace to the end-user because no safe error boundary or fallback messaging
is implemented.

Referred Code
const chartData: DataAnalysisResult | undefined = typeof data === 'string'
  ? JSON.parse(data)
  : data as DataAnalysisResult

if (pending && !chartData) {
  return (
    <Section className="py-2">
      <div className="animate-pulse flex space-y-4 flex-col">
        <div className="h-4 bg-muted rounded w-3/4"></div>
        <div className="h-64 bg-muted rounded"></div>
      </div>
    </Section>
  )
}

if (!chartData) return null

Learn more about managing compliance generic rules or creating your own custom rules

Generic: Security-First Input Validation and Data Handling

Objective: Ensure all data inputs are validated, sanitized, and handled securely to prevent
vulnerabilities

Status:
Unvalidated file content: The new CSV/JSON/plaintext upload path converts arbitrary file bytes to UTF-8 and injects
them into the AI message without size/type/content validation or sanitization, increasing
risk of resource abuse and unsafe downstream handling.

Referred Code
} else if (
  file.type === 'text/plain' ||
  file.type === 'text/csv' ||
  file.type === 'application/json'
) {
  const textContent = Buffer.from(buffer).toString('utf-8')
  const existingTextPart = messageParts.find(p => p.type === 'text')
  if (existingTextPart) {
    existingTextPart.text = `${textContent}\n\n${existingTextPart.text}`
  } else {

Learn more about managing compliance generic rules or creating your own custom rules

Generic: Comprehensive Audit Trails

Objective: To create a detailed and reliable record of critical system actions for security analysis
and compliance.

Status:
No audit logging: The newly added CSV/JSON upload handling and geospatial-triggered map updates do not show
any audit logging (user, action, outcome), so it is unclear if these potentially sensitive
actions are captured elsewhere.

Referred Code
  const dataUrl = `data:${file.type};base64,${Buffer.from(
    buffer
  ).toString('base64')}`
  messageParts.push({
    type: 'image',
    image: dataUrl,
    mimeType: file.type
  })
} else if (
  file.type === 'text/plain' ||
  file.type === 'text/csv' ||
  file.type === 'application/json'
) {
  const textContent = Buffer.from(buffer).toString('utf-8')
  const existingTextPart = messageParts.find(p => p.type === 'text')
  if (existingTextPart) {
    existingTextPart.text = `${textContent}\n\n${existingTextPart.text}`
  } else {
    messageParts.push({ type: 'text', text: textContent })
  }
}


 ... (clipped 457 lines)

Learn more about managing compliance generic rules or creating your own custom rules

Generic: Meaningful Naming and Self-Documenting Code

Objective: Ensure all identifiers clearly express their purpose and intent, making code
self-documenting

Status:
Overly generic types: The new DataAnalysisResult uses broadly named fields like data with type any[], which
reduces self-documentation and makes the expected chart payload structure unclear.

Referred Code
export type DataAnalysisResult = {
  title: string;
  description?: string;
  chartType: 'bar' | 'line' | 'pie' | 'area' | 'scatter';
  data: any[];
  config: {
    xAxisKey: string;
    yAxisKey?: string;
    series: {
      key: string;
      name: string;
      color?: string;
    }[];
  };
  geospatial?: {
    latitude: number;
    longitude: number;
    label?: string;
  }[];
};

Learn more about managing compliance generic rules or creating your own custom rules

Compliance status legend 🟢 - Fully Compliant
🟡 - Partial Compliant
🔴 - Not Compliant
⚪ - Requires Further Human Verification
🏷️ - Compliance label

@qodo-code-review
Contributor

qodo-code-review bot commented Jan 31, 2026

ⓘ You're approaching your monthly quota for Qodo. Upgrade your plan

PR Code Suggestions ✨

Explore these optional code suggestions:

Category | Suggestion | Impact
High-level
Decouple charting from map updates

Refactor the code to decouple the data analysis and mapping features. Instead of
the UI manually triggering a map update, the AI agent should orchestrate this by
making a separate, explicit call to the geospatial tool when location data is
present in the analysis results.

Examples:

app/actions.tsx [754-769]
                      {toolOutput.geospatial && toolOutput.geospatial.length > 0 && (
                        <MapQueryHandler
                          toolOutput={{
                            type: 'MAP_QUERY_TRIGGER',
                            originalUserInput: JSON.stringify(toolOutput.geospatial[0]),
                            timestamp: new Date().toISOString(),
                            mcp_response: {
                              location: {
                                latitude: toolOutput.geospatial[0].latitude,
                                longitude: toolOutput.geospatial[0].longitude,

 ... (clipped 6 lines)

Solution Walkthrough:

Before:

// In app/actions.tsx, inside the tool rendering switch statement
case 'dataAnalysis':
  return {
    id,
    component: (
      <>
        <GraphSection result={...} />
        {toolOutput.geospatial && toolOutput.geospatial.length > 0 && (
          // Manually constructing a fake map query response
          <MapQueryHandler
            toolOutput={{
              type: 'MAP_QUERY_TRIGGER',
              mcp_response: { location: { ...toolOutput.geospatial[0] } }
            }}
          />
        )}
      </>
    ),
    ...
  }

After:

// 1. In the AI agent's logic (e.g., lib/agents/researcher.tsx)
// The agent would be responsible for orchestration

// ... after calling dataAnalysis tool
const analysisResult = await tool.execute(args);

// If the result has geo data, the agent makes another tool call
if (analysisResult.geospatial && analysisResult.geospatial.length > 0) {
  const geospatialTool = tools.geospatialQueryTool;
  // This would trigger the map update through the standard tool-use flow
  await geospatialTool.execute({ points: analysisResult.geospatial });
}

// 2. In app/actions.tsx, the UI logic is simplified
case 'dataAnalysis':
  return {
    id,
    component: <GraphSection result={searchResults.value} />,
    ...
  };
Suggestion importance[1-10]: 9


Why: This suggestion correctly identifies a significant architectural flaw, where the UI rendering logic for dataAnalysis is tightly coupled with map updates, making the system brittle and hard to scale.

High
Possible issue
Correctly pass data to scatter chart

Move the data={plotData} prop from the individual Scatter series components to the
parent ScatterChart component to fix data rendering in the scatter chart.

components/graph-section.tsx [125-137]

 case 'scatter':
   return (
-    <ScatterChart>
+    <ScatterChart data={plotData}>
       <CartesianGrid strokeDasharray="3 3" />
       <XAxis type="number" dataKey={config.xAxisKey} name={config.xAxisKey} />
       <YAxis type="number" dataKey={config.yAxisKey} name={config.yAxisKey} />
       <Tooltip cursor={{ strokeDasharray: '3 3' }} />
       <Legend />
       {config.series.map((s, i) => (
-        <Scatter key={s.key} name={s.name} data={plotData} fill={s.color || COLORS[i % COLORS.length]} />
+        <Scatter key={s.key} name={s.name} fill={s.color || COLORS[i % COLORS.length]} />
       ))}
     </ScatterChart>
   )
Suggestion importance[1-10]: 9


Why: This suggestion correctly identifies a bug where the data prop is passed to the wrong component, which would prevent the scatter chart from rendering. Applying this fix is critical for the correct functionality of the new feature.

High
General
Handle JSON parse errors

Add a try...catch block for JSON.parse to prevent crashes from invalid JSON and
display an error message to the user if parsing fails.

components/graph-section.tsx [45-47]

-const chartData: DataAnalysisResult | undefined = typeof data === 'string'
-  ? JSON.parse(data)
-  : data as DataAnalysisResult
+let chartData: DataAnalysisResult | undefined
+if (typeof data === 'string') {
+  try {
+    chartData = JSON.parse(data)
+  } catch (e) {
+    console.error('Failed to parse chart data:', e)
+    return (
+      <Section className="py-2">
+        <div className="text-red-600">Error: Invalid chart data format.</div>
+      </Section>
+    )
+  }
+} else {
+  chartData = data as DataAnalysisResult
+}
Suggestion importance[1-10]: 8


Why: This suggestion improves the component's robustness by adding error handling for JSON parsing. It prevents potential crashes and provides clear feedback to the user by rendering an error message, which is better than failing silently.

Medium
Provide meaningful input to map handler

Update the originalUserInput prop for MapQueryHandler to be a descriptive string
instead of a stringified JSON object for better context.

app/actions.tsx [755-768]

 <MapQueryHandler
   toolOutput={{
     type: 'MAP_QUERY_TRIGGER',
-    originalUserInput: JSON.stringify(toolOutput.geospatial[0]),
+    originalUserInput: `Location from data analysis: ${toolOutput.geospatial[0].label || 'Point'}`,
     timestamp: new Date().toISOString(),
     mcp_response: {
       location: {
         latitude: toolOutput.geospatial[0].latitude,
         longitude: toolOutput.geospatial[0].longitude,
         place_name: toolOutput.geospatial[0].label
       }
     }
   }}
 />
Suggestion importance[1-10]: 6


Why: The suggestion correctly points out that passing a stringified JSON object to originalUserInput is likely incorrect and improves it by providing a more user-friendly string. This enhances the user experience by providing better context on the map.

Low


@charliecreates charliecreates bot left a comment


Main blockers are in components/graph-section.tsx: conditional hook usage (useStreamableValue) is a Rules of Hooks violation and can break rendering, and unguarded JSON.parse can crash the UI. There are also correctness gaps in chart rendering (pie assumes series[0], scatter duplicates identical series), and the tool’s “streaming” path currently adds complexity without actual incremental streaming. Consider tightening the dataAnalysisSchema to reduce runtime failures from malformed model output.

Additional notes (1)
  • Performance | app/actions.tsx:298-308
    The file ingestion now treats application/json uploads as plain text and prepends the entire file content into the model prompt. Large JSON/CSV files can easily blow prompt limits and degrade model performance or cause failures.

At minimum, implement size caps and/or truncation/summarization before injecting into messageParts, and consider attaching metadata (filename, row count) rather than full raw content by default.
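
A minimal sketch of such a cap, assuming the upload handler in app/actions.tsx builds messageParts as shown in the referred code elsewhere in this thread; the MAX_UPLOAD_CHARS constant, the capUploadedText helper, and the truncation notice are illustrative, not part of this PR.

// Illustrative size cap before injecting uploaded text into the model prompt.
// MAX_UPLOAD_CHARS and capUploadedText are assumed names, not defined in this PR.
const MAX_UPLOAD_CHARS = 50_000

function capUploadedText(raw: string, fileName: string): string {
  if (raw.length <= MAX_UPLOAD_CHARS) return raw
  const truncated = raw.slice(0, MAX_UPLOAD_CHARS)
  return `${truncated}\n\n[${fileName}: truncated to ${MAX_UPLOAD_CHARS} of ${raw.length} characters]`
}

// e.g. in the text/csv and application/json branch:
// const textContent = capUploadedText(Buffer.from(buffer).toString('utf-8'), file.name)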

Summary of changes

What this PR changes

  • Adds a new dataAnalysis tool (lib/agents/tools/data-analysis.tsx) backed by a Zod schema (lib/schema/data-analysis.tsx) to generate structured chart specs.
  • Introduces GraphSection UI (components/graph-section.tsx) using Recharts to render bar/line/area/pie/scatter charts, including support for streamed tool output.
  • Extends file uploads to accept text/csv and application/json in the chat UI (components/chat-panel.tsx) and server action parsing (app/actions.tsx).
  • Updates AI → UI mapping (app/actions.tsx) to render GraphSection for dataAnalysis tool results and trigger map updates when geospatial points are present.
  • Updates researcher agent instructions (lib/agents/researcher.tsx) to include a decision path for dataAnalysis.
  • Adds dependency: recharts in package.json.
  • Adds a new exported type DataAnalysisResult (lib/types/index.ts).

Comment on lines 45 to 48
const chartData: DataAnalysisResult | undefined = typeof data === 'string'
  ? JSON.parse(data)
  : data as DataAnalysisResult


JSON.parse(data) is performed without any error handling. If the AI/tool ever emits a partial string (streaming), malformed JSON, or a plain-text error message, this will throw during render and crash the client subtree.

This is especially risky given the component explicitly supports streamed and string inputs.

Suggestion

Guard JSON.parse with a try/catch and render a safe fallback UI (or return null) if parsing fails.

let chartData: DataAnalysisResult | undefined
if (typeof data === 'string') {
  try {
    chartData = JSON.parse(data)
  } catch {
    chartData = undefined
  }
} else {
  chartData = data as DataAnalysisResult
}

Optionally show a small CardDescription like “Unable to render chart: invalid data” for debuggability.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this change.

Comment on lines 41 to 60
const [streamData, error, pending] = useStreamableValue(isStreamable ? (result as any) : undefined)

const data = isStreamable ? streamData : result;

const chartData: DataAnalysisResult | undefined = typeof data === 'string'
  ? JSON.parse(data)
  : data as DataAnalysisResult

if (pending && !chartData) {
  return (
    <Section className="py-2">
      <div className="animate-pulse flex space-y-4 flex-col">
        <div className="h-4 bg-muted rounded w-3/4"></div>
        <div className="h-64 bg-muted rounded"></div>
      </div>
    </Section>
  )
}

if (!chartData) return null


The error value returned by useStreamableValue is computed but ignored. If the stream errors, the user will just see an empty section or partial UI with no explanation.

Also, pending && !chartData only handles the initial loading state; if the stream errors, you should surface that explicitly.

Suggestion

Handle the error case explicitly and render an error state.

if (error) {
  return (
    <Section className="py-2">
      <Card>
        <CardHeader className="pb-2">
          <CardTitle className="text-lg font-medium">Chart unavailable</CardTitle>
          <CardDescription>{String(error)}</CardDescription>
        </CardHeader>
      </Card>
    </Section>
  )
}

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this error handling.

Comment on lines 105 to 123
case 'pie':
  return (
    <PieChart>
      <Pie
        data={plotData}
        dataKey={config.series[0].key}
        nameKey={config.xAxisKey}
        cx="50%"
        cy="50%"
        outerRadius={80}
        label
      >
        {plotData.map((entry, index) => (
          <Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
        ))}
      </Pie>
      <Tooltip />
      <Legend />
    </PieChart>


pie rendering assumes config.series[0] always exists. If the tool returns an empty series array, this will throw and crash rendering.

Even if the schema intends series to exist, model output can still be inconsistent at runtime.

Suggestion

Add a defensive guard for series.length === 0 and render a fallback message.

case 'pie': {
  const first = config.series[0]
  if (!first) return <div>Missing series configuration for pie chart.</div>
  // ...use `first.key`
}

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this guard.

Comment on lines 125 to 137
case 'scatter':
  return (
    <ScatterChart>
      <CartesianGrid strokeDasharray="3 3" />
      <XAxis type="number" dataKey={config.xAxisKey} name={config.xAxisKey} />
      <YAxis type="number" dataKey={config.yAxisKey} name={config.yAxisKey} />
      <Tooltip cursor={{ strokeDasharray: '3 3' }} />
      <Legend />
      {config.series.map((s, i) => (
        <Scatter key={s.key} name={s.name} data={plotData} fill={s.color || COLORS[i % COLORS.length]} />
      ))}
    </ScatterChart>
  )


scatter rendering currently maps config.series but each <Scatter ... data={plotData} /> uses the same plotData and does not use s.key at all. This likely renders duplicated scatter series with identical points, and the legend will be misleading.

If the intention is to support multiple series, each series should typically point to a different dataKey / shape, or you should render a single <Scatter> and use config for axes only.

Suggestion

Either:

  1. Render a single scatter series:
case 'scatter':
  return (
    <ScatterChart>
      ...
      <Scatter name={config.series[0]?.name ?? 'Series'} data={plotData} fill={...} />
    </ScatterChart>
  )

or 2) If you need multi-series scatter, change the schema/transform so each series references a distinct dataset (e.g. series[i].dataKey or separate arrays).

Reply with "@CharlieHelps yes please" if you'd like me to add a commit implementing option (1) with basic guards.

Comment on lines +7 to +18
export const dataAnalysisTool = ({ uiStream }: ToolProps) => ({
  description: 'Analyze data and generate a structured representation for visualization in a graph or chart. Use this tool when the user asks for a chart, graph, or data visualization, or when you have structured data (like from a CSV or search results) that would be better understood visually.',
  parameters: dataAnalysisSchema,
  execute: async (result: DataAnalysisResult) => {
    const streamResults = createStreamableValue<DataAnalysisResult>()

    uiStream.append(<GraphSection result={streamResults.value} />)

    streamResults.done(result)

    return result
  }


dataAnalysisTool.execute creates a streamable value, appends <GraphSection result={streamResults.value} />, and then immediately calls streamResults.done(result). This doesn’t actually stream incremental updates; it just publishes a single final payload.

If the intent is real-time rendering, this should either:

  • append a placeholder and update it over time, or
  • stream partial results (e.g., streamResults.update(...) as analysis progresses).

As-is, the extra stream plumbing adds complexity without benefit.

Suggestion

If you don’t plan to stream partial chart specs, simplify:

  • remove createStreamableValue usage
  • append <GraphSection result={result} /> directly
execute: async (result: DataAnalysisResult) => {
  uiStream.append(<GraphSection result={result} />)
  return result
}

If you do plan to stream, change the tool to progressively update the streamable value with intermediate states.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with the simplification.

Comment on lines +748 to +769
case 'dataAnalysis':
  return {
    id,
    component: (
      <>
        <GraphSection result={searchResults.value} />
        {toolOutput.geospatial && toolOutput.geospatial.length > 0 && (
          <MapQueryHandler
            toolOutput={{
              type: 'MAP_QUERY_TRIGGER',
              originalUserInput: JSON.stringify(toolOutput.geospatial[0]),
              timestamp: new Date().toISOString(),
              mcp_response: {
                location: {
                  latitude: toolOutput.geospatial[0].latitude,
                  longitude: toolOutput.geospatial[0].longitude,
                  place_name: toolOutput.geospatial[0].label
                }
              }
            }}
          />
        )}


This constructs a MAP_QUERY_TRIGGER with originalUserInput: JSON.stringify(toolOutput.geospatial[0]) and uses only the first geospatial point. If multiple points are returned, the map will ignore all but one.

Also, this code is doing a lot of ad-hoc shaping of a map tool payload inside UI-state mapping logic, which makes the contract fragile and hard to evolve.

Suggestion

At minimum, consider passing the entire geospatial array through and letting MapQueryHandler decide what to do, or explicitly document that only the first point is supported.

If MapQueryHandler expects a single location, add a small helper function (in this file) to build the payload so the shape is centralized and testable.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit that factors payload creation into a helper and (optionally) uses the first item explicitly with a comment.
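
A rough sketch of such a helper, under the assumption that MapQueryHandler keeps expecting the single-location payload shape shown above; the buildMapTriggerFromGeospatial name is hypothetical, and the first-point-only behavior is kept explicit rather than hidden.

// Hypothetical helper (name assumed) that centralizes the MAP_QUERY_TRIGGER payload
// currently built inline; only the first geospatial point is forwarded, on purpose.
type GeoPoint = { latitude: number; longitude: number; label?: string }

function buildMapTriggerFromGeospatial(points: GeoPoint[]) {
  const first = points[0]
  if (!first) return null
  return {
    type: 'MAP_QUERY_TRIGGER' as const,
    originalUserInput: first.label ?? 'Location from data analysis',
    timestamp: new Date().toISOString(),
    mcp_response: {
      location: {
        latitude: first.latitude,
        longitude: first.longitude,
        place_name: first.label
      }
    }
  }
}

// Sketch of the call site in the dataAnalysis case:
// const trigger = buildMapTriggerFromGeospatial(toolOutput.geospatial ?? [])
// {trigger && <MapQueryHandler toolOutput={trigger} />}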

Comment on lines +4 to +23
export const dataAnalysisSchema = z.object({
  title: z.string().describe('The title of the chart'),
  description: z.string().optional().describe('A brief description of the chart'),
  chartType: z.enum(['bar', 'line', 'pie', 'area', 'scatter']).describe('The type of chart to render'),
  data: z.array(z.record(z.any())).describe('The data points for the chart'),
  config: z.object({
    xAxisKey: z.string().describe('The key in the data object to use for the X axis'),
    yAxisKey: z.string().optional().describe('The key in the data object to use for the Y axis (for scatter charts)'),
    series: z.array(z.object({
      key: z.string().describe('The key in the data object for this series'),
      name: z.string().describe('The display name for this series'),
      color: z.string().optional().describe('Optional hex color for this series')
    })).describe('The series to be plotted')
  }).describe('Configuration for the chart layout'),
  geospatial: z.array(z.object({
    latitude: z.number(),
    longitude: z.number(),
    label: z.string().optional()
  })).optional().describe('Optional geospatial data points to be displayed on a map')
})


DataAnalysisResult uses any[] and z.record(z.any()) in the schema. This makes the tool output effectively untyped and pushes failures into runtime rendering (e.g., missing keys for axes/series).

Given this is a core integration point between the model and UI, it’s worth tightening validation at least around required keys (xAxisKey, series keys existing in data, etc.).

Suggestion

Improve runtime robustness by refining the schema:

  • use z.record(z.unknown()) instead of z.any() to avoid implicit any
  • add min(1) on data and config.series
  • add a superRefine that verifies xAxisKey and each series.key appear in at least one data row
data: z.array(z.record(z.unknown())).min(1),
config: z.object({
  xAxisKey: z.string(),
  series: z.array(...).min(1),
}).superRefine((cfg, ctx) => { /* validate keys */ })

Reply with "@CharlieHelps yes please" if you'd like me to add a commit that tightens the schema with min(1) and z.unknown() (and optionally key checks).

Comment on lines +4 to +23
export const dataAnalysisSchema = z.object({
  title: z.string().describe('The title of the chart'),
  description: z.string().optional().describe('A brief description of the chart'),
  chartType: z.enum(['bar', 'line', 'pie', 'area', 'scatter']).describe('The type of chart to render'),
  data: z.array(z.record(z.any())).describe('The data points for the chart'),
  config: z.object({
    xAxisKey: z.string().describe('The key in the data object to use for the X axis'),
    yAxisKey: z.string().optional().describe('The key in the data object to use for the Y axis (for scatter charts)'),
    series: z.array(z.object({
      key: z.string().describe('The key in the data object for this series'),
      name: z.string().describe('The display name for this series'),
      color: z.string().optional().describe('Optional hex color for this series')
    })).describe('The series to be plotted')
  }).describe('Configuration for the chart layout'),
  geospatial: z.array(z.object({
    latitude: z.number(),
    longitude: z.number(),
    label: z.string().optional()
  })).optional().describe('Optional geospatial data points to be displayed on a map')
})


dataAnalysisSchema uses z.array(z.record(z.any())) for data. This forces the model/tool pipeline to accept arbitrarily-shaped values and makes it easy to return non-serializable objects (Dates, nested functions, etc.) or huge payloads.

Since the UI expects Recharts-friendly JSON, you should constrain this to JSON-serializable primitives (e.g., string | number | boolean | null) and objects/arrays thereof, or at least z.record(z.union([z.string(), z.number(), z.boolean(), z.null()])) for flat chart rows. This reduces runtime render errors and payload bloat.

Suggestion

Tighten the data schema to JSON-serializable and ideally flat row records.

Example (flat rows):

const cell = z.union([z.string(), z.number(), z.boolean(), z.null()])
export const dataAnalysisSchema = z.object({
  // ...
  data: z.array(z.record(cell)).describe('Flat row records for the chart')
  // ...
})

If you need nested structures, model them explicitly rather than any.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this suggestion.

Comment on lines +28 to +47
export type DataAnalysisResult = {
  title: string;
  description?: string;
  chartType: 'bar' | 'line' | 'pie' | 'area' | 'scatter';
  data: any[];
  config: {
    xAxisKey: string;
    yAxisKey?: string;
    series: {
      key: string;
      name: string;
      color?: string;
    }[];
  };
  geospatial?: {
    latitude: number;
    longitude: number;
    label?: string;
  }[];
};


DataAnalysisResult in lib/types/index.ts uses any[] for data. This is type-valid but defeats type safety across the rendering surface and makes it easy to accidentally pass non-JSON/non-Recharts-friendly values into the chart components.

Given you already have a Zod schema for the tool, consider deriving the TS type from Zod (z.infer<typeof dataAnalysisSchema>) or at least using a safer record type (Record<string, string | number | boolean | null>).

Suggestion

Prefer deriving DataAnalysisResult from the Zod schema to keep runtime/compile-time aligned.

Example:

// lib/types/index.ts
import type { z } from 'zod'
import { dataAnalysisSchema } from '@/lib/schema/data-analysis'
export type DataAnalysisResult = z.infer<typeof dataAnalysisSchema>

Or if you want to avoid importing schema into types, at least replace any[] with Array<Record<string, string | number | boolean | null>>.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this suggestion.

@charliecreates charliecreates bot removed the request for review from CharlieHelps January 31, 2026 09:50
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
lib/agents/researcher.tsx (1)

24-75: ⚠️ Potential issue | 🟠 Major

Resolve conflict between “Geospatial Priority” and the new dataAnalysis step.

The prompt says any location query must use geospatialQueryTool, but the decision flow now prioritizes dataAnalysis for charts. For location‑based charts, the model may pick the wrong tool. Clarify precedence (e.g., “chart requests → dataAnalysis; maps/directions/places → geospatialQueryTool”) to keep behavior deterministic.

🤖 Fix all issues with AI agents
In `@components/graph-section.tsx`:
- Around line 35-60: GraphSection currently ignores the error returned from
useStreamableValue and calls JSON.parse without try/catch; update the component
to first check the error from useStreamableValue (variable error) and render an
appropriate fallback (null or an error UI) when error exists, then wrap the
JSON.parse logic that produces chartData in a try/catch to safely handle
malformed strings and set chartData to undefined (or an error state) on parse
failure, and ensure pending logic still uses the safe chartData variable;
reference symbols: GraphSection, useStreamableValue, streamData, error, pending,
chartData, JSON.parse.

In `@lib/agents/tools/data-analysis.tsx`:
- Around line 10-18: The execute function should be made resilient by wrapping
the UI and stream operations in a try-catch: inside execute (which uses
createStreamableValue(), uiStream.append(<GraphSection ...>) and
streamResults.done(result)), wrap the uiStream.append and streamResults.done
calls in a try block and in the catch log the error (e.g., processLogger.error
or console.error) and call streamResults.error or streamResults.done with a safe
fallback to ensure the stream is completed; finally return the original result
(or a safe fallback) so the caller does not receive an unhandled exception.

In `@lib/schema/data-analysis.tsx`:
- Around line 4-23: Update dataAnalysisSchema to enforce non-empty series and
conditional yAxisKey rules: ensure config.series uses z.array(...).min(1) so
charts that access config.series[0].key (e.g., pie handling) cannot be empty,
and make chartType a discriminated union or add a z.refine on dataAnalysisSchema
that when chartType === 'scatter' requires config.yAxisKey to be
present/non-empty; reference dataAnalysisSchema, chartType, config.series, and
config.yAxisKey when implementing the min(1) and conditional requirement so
runtime accesses are safe.
- Line 25: The current type alias PartialDataAnalysis applies DeepPartial to the
schema instance instead of the inferred data shape; change the alias to apply
DeepPartial to the inferred type by using z.infer on dataAnalysisSchema (e.g.,
update PartialDataAnalysis to DeepPartial<z.infer<typeof dataAnalysisSchema>>),
referencing the existing dataAnalysisSchema and PartialDataAnalysis identifiers
so the type represents the actual data structure.

In `@lib/types/index.ts`:
- Around line 42-46: Rename the optional text field in the geospatial array to
match the MapData.markers interface: change geospatial's label?: string to
title?: string so it aligns with MapData.markers (referencing geospatial and
MapData.markers, and the label?/title? property names) and update any usages or
types that reference geospatial to use title instead of label.
- Line 32: The type declaration uses data: any[] which loses type safety; change
the DataAnalysisResult type to use a safer type such as data: Record<string,
unknown>[] or make it generic (e.g., export type DataAnalysisResult<T extends
Record<string, unknown> = Record<string, unknown>> = { ... data: T[]; ... }) and
update usages of DataAnalysisResult to pass the concrete T where appropriate
(refer to the DataAnalysisResult type name and the data property to locate the
change). A sketch of this generic shape follows below.
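
A possible shape for that generic variant, derived from the DataAnalysisResult type quoted earlier in this review; the default type parameter and field layout are a sketch, not a committed change.

// Sketch of a generic DataAnalysisResult; mirrors the fields added in
// lib/types/index.ts, with `data` typed instead of any[].
export type DataAnalysisResult<
  T extends Record<string, unknown> = Record<string, unknown>
> = {
  title: string;
  description?: string;
  chartType: 'bar' | 'line' | 'pie' | 'area' | 'scatter';
  data: T[];
  config: {
    xAxisKey: string;
    yAxisKey?: string;
    series: { key: string; name: string; color?: string }[];
  };
  geospatial?: { latitude: number; longitude: number; label?: string }[];
};
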
📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 25a50ed and 8c7120d.

⛔ Files ignored due to path filters (1)
  • bun.lock is excluded by !**/*.lock
📒 Files selected for processing (9)
  • app/actions.tsx
  • components/chat-panel.tsx
  • components/graph-section.tsx
  • lib/agents/researcher.tsx
  • lib/agents/tools/data-analysis.tsx
  • lib/agents/tools/index.tsx
  • lib/schema/data-analysis.tsx
  • lib/types/index.ts
  • package.json
🧰 Additional context used
🧬 Code graph analysis (4)
lib/agents/tools/index.tsx (4)
lib/agents/tools/geospatial.tsx (3)
  • tools (318-318)
  • uiStream (158-407)
  • queryType (333-342)
lib/agents/tools/search.tsx (1)
  • ToolProps (9-77)
lib/agents/tools/retrieve.tsx (1)
  • ToolProps (8-67)
lib/agents/tools/video-search.tsx (1)
  • ToolProps (8-50)
lib/types/index.ts (3)
components/map/map-query-handler.tsx (1)
  • GeospatialToolOutput (18-23)
components/map/map-data-context.tsx (2)
  • MapData (15-32)
  • MapDataContextType (34-37)
mapbox_mcp/hooks.ts (1)
  • DistanceResult (22-29)
components/graph-section.tsx (2)
components/video-search-section.tsx (1)
  • VideoSearchSection (14-35)
components/section.tsx (1)
  • icon (24-75)
app/actions.tsx (1)
components/map/map-query-handler.tsx (5)
  • GeospatialToolOutput (18-23)
  • prevData (39-48)
  • toolOutput (33-73)
  • prevData (52-56)
  • MapQueryHandlerProps (25-28)
🔇 Additional comments (6)
components/chat-panel.tsx (1)

213-218: Accept list now covers CSV/JSON uploads.

This aligns with the server-side ingestion logic.

lib/agents/tools/index.tsx (1)

6-33: dataAnalysis tool is correctly wired into the tool registry.

The new tool is integrated consistently with existing tooling patterns.
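
For context, a rough sketch of what that wiring plausibly looks like; the actual lib/agents/tools/index.tsx is not quoted in this review, so everything except the dataAnalysis entry and the getTools/ToolProps names (which appear elsewhere in this PR) is assumed.

// Assumed shape of lib/agents/tools/index.tsx; only the dataAnalysis registration
// is confirmed by this PR, the surrounding structure is illustrative.
import { dataAnalysisTool } from './data-analysis'
// ToolProps and the existing tools would be imported from their respective modules.

export const getTools = (props: ToolProps) => ({
  // ...existing tools (search, retrieve, video search, geospatial query, ...)
  dataAnalysis: dataAnalysisTool(props)
})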

app/actions.tsx (2)

301-305: CSV/JSON file ingestion is handled alongside text/plain.

Good alignment with the expanded upload accept list.


748-773: GraphSection + optional map update wiring looks consistent.

The dataAnalysis tool output now renders graphs and updates the map when geospatial points are provided.

package.json (1)

89-89: No action required. Recharts 3.7.0 explicitly supports React 19 in its peer dependencies (react: ^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0) and is compatible with React 19.1.2 and Next.js 15. This is the latest stable version as of January 2026.

lib/agents/tools/data-analysis.tsx (1)

11-15: Note: Streamable value is immediately completed.

The createStreamableValue is created and immediately marked as done on the next line. While this works correctly for one-shot rendering, it doesn't leverage the progressive streaming capability. If the intent is to support incremental chart updates (e.g., data loading in chunks), you'd need to call streamResults.update() with partial data before done().

If one-shot rendering is the intended behavior, this is fine as-is.
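
If incremental rendering is ever the goal, here is a rough sketch of what progressive emission could look like inside the same execute function; the staged split of the payload is hypothetical, and as other comments note, GraphSection would also need guards for partial payloads before this is safe.

// Sketch only: publish the chart shell first, then complete with the full result.
execute: async (result: DataAnalysisResult) => {
  const streamResults = createStreamableValue<Partial<DataAnalysisResult>>()
  uiStream.append(<GraphSection result={streamResults.value} />)

  try {
    // Hypothetical staged emission; a real implementation would update as
    // the analysis actually progresses.
    streamResults.update({ title: result.title, chartType: result.chartType, config: result.config })
    streamResults.done(result)
  } catch (error) {
    streamResults.error(error)
    throw error
  }

  return result
}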

✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.

Comment on lines +10 to +18
execute: async (result: DataAnalysisResult) => {
  const streamResults = createStreamableValue<DataAnalysisResult>()

  uiStream.append(<GraphSection result={streamResults.value} />)

  streamResults.done(result)

  return result
}
Contributor


🧹 Nitpick | 🔵 Trivial

Add error handling for robustness.

The execute function lacks error handling. If uiStream.append throws or if the result contains invalid data that causes GraphSection to fail, the error will propagate unhandled. Consider wrapping in a try-catch to ensure graceful degradation.

🛡️ Proposed error handling
   execute: async (result: DataAnalysisResult) => {
+    try {
       const streamResults = createStreamableValue<DataAnalysisResult>()
 
       uiStream.append(<GraphSection result={streamResults.value} />)
 
       streamResults.done(result)
 
       return result
+    } catch (error) {
+      console.error('Data analysis tool error:', error)
+      throw error
+    }
   }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
execute: async (result: DataAnalysisResult) => {
  const streamResults = createStreamableValue<DataAnalysisResult>()
  uiStream.append(<GraphSection result={streamResults.value} />)
  streamResults.done(result)
  return result
}
execute: async (result: DataAnalysisResult) => {
  try {
    const streamResults = createStreamableValue<DataAnalysisResult>()
    uiStream.append(<GraphSection result={streamResults.value} />)
    streamResults.done(result)
    return result
  } catch (error) {
    console.error('Data analysis tool error:', error)
    throw error
  }
}
🤖 Prompt for AI Agents
In `@lib/agents/tools/data-analysis.tsx` around lines 10 - 18, The execute
function should be made resilient by wrapping the UI and stream operations in a
try-catch: inside execute (which uses createStreamableValue(),
uiStream.append(<GraphSection ...>) and streamResults.done(result)), wrap the
uiStream.append and streamResults.done calls in a try block and in the catch log
the error (e.g., processLogger.error or console.error) and call
streamResults.error or streamResults.done with a safe fallback to ensure the
stream is completed; finally return the original result (or a safe fallback) so
the caller does not receive an unhandled exception.

Comment on lines +4 to +23
export const dataAnalysisSchema = z.object({
  title: z.string().describe('The title of the chart'),
  description: z.string().optional().describe('A brief description of the chart'),
  chartType: z.enum(['bar', 'line', 'pie', 'area', 'scatter']).describe('The type of chart to render'),
  data: z.array(z.record(z.any())).describe('The data points for the chart'),
  config: z.object({
    xAxisKey: z.string().describe('The key in the data object to use for the X axis'),
    yAxisKey: z.string().optional().describe('The key in the data object to use for the Y axis (for scatter charts)'),
    series: z.array(z.object({
      key: z.string().describe('The key in the data object for this series'),
      name: z.string().describe('The display name for this series'),
      color: z.string().optional().describe('Optional hex color for this series')
    })).describe('The series to be plotted')
  }).describe('Configuration for the chart layout'),
  geospatial: z.array(z.object({
    latitude: z.number(),
    longitude: z.number(),
    label: z.string().optional()
  })).optional().describe('Optional geospatial data points to be displayed on a map')
})

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Search for the files mentioned and check their content
rg -n "series\[0\]|yAxisKey" --max-count=20

Repository: QueueLab/QCX

Length of output: 447


🏁 Script executed:

#!/bin/bash
# Read GraphSection component around the usage lines
cat -n components/graph-section.tsx | sed -n '100,140p'

Repository: QueueLab/QCX

Length of output: 1811


🏁 Script executed:

#!/bin/bash
# Search for where dataAnalysisSchema is used
rg -n "dataAnalysisSchema|DataAnalysis" lib/types/index.ts components/graph-section.tsx -A 2 -B 2

Repository: QueueLab/QCX

Length of output: 1312


🏁 Script executed:

#!/bin/bash
# Get the full DataAnalysisResult type
sed -n '28,60p' lib/types/index.ts

Repository: QueueLab/QCX

Length of output: 650


Enforce series and yAxisKey constraints to prevent runtime errors.

GraphSection accesses config.series[0].key for pie charts (line 110) and config.yAxisKey for scatter charts (line 130) without guards. The current schema allows empty series and missing y-axis keys, which will cause crashes or render blank charts. Add a minimum series length constraint and conditional requirements for scatter and pie chart types.

🛡️ Suggested schema hardening
-export const dataAnalysisSchema = z.object({
+export const dataAnalysisSchema = z
+  .object({
   title: z.string().describe('The title of the chart'),
   description: z.string().optional().describe('A brief description of the chart'),
   chartType: z.enum(['bar', 'line', 'pie', 'area', 'scatter']).describe('The type of chart to render'),
   data: z.array(z.record(z.any())).describe('The data points for the chart'),
   config: z.object({
     xAxisKey: z.string().describe('The key in the data object to use for the X axis'),
     yAxisKey: z.string().optional().describe('The key in the data object to use for the Y axis (for scatter charts)'),
     series: z.array(z.object({
       key: z.string().describe('The key in the data object for this series'),
       name: z.string().describe('The display name for this series'),
       color: z.string().optional().describe('Optional hex color for this series')
-    })).describe('The series to be plotted')
+    })).min(1, 'At least one series is required').describe('The series to be plotted')
   }).describe('Configuration for the chart layout'),
   geospatial: z.array(z.object({
     latitude: z.number(),
     longitude: z.number(),
     label: z.string().optional()
   })).optional().describe('Optional geospatial data points to be displayed on a map')
-})
+  })
+  .superRefine((val, ctx) => {
+    if (val.chartType === 'scatter' && !val.config.yAxisKey) {
+      ctx.addIssue({
+        code: z.ZodIssueCode.custom,
+        path: ['config', 'yAxisKey'],
+        message: 'yAxisKey is required for scatter charts'
+      })
+    }
+    if (val.chartType === 'pie' && val.config.series.length !== 1) {
+      ctx.addIssue({
+        code: z.ZodIssueCode.custom,
+        path: ['config', 'series'],
+        message: 'Pie charts must define exactly one series'
+      })
+    }
+  })
🤖 Prompt for AI Agents
In `@lib/schema/data-analysis.tsx` around lines 4 - 23, Update dataAnalysisSchema
to enforce non-empty series and conditional yAxisKey rules: ensure config.series
uses z.array(...).min(1) so charts that access config.series[0].key (e.g., pie
handling) cannot be empty, and make chartType a discriminated union or add a
z.refine on dataAnalysisSchema that when chartType === 'scatter' requires
config.yAxisKey to be present/non-empty; reference dataAnalysisSchema,
chartType, config.series, and config.yAxisKey when implementing the min(1) and
conditional requirement so runtime accesses are safe.
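For reference, the discriminated-union alternative mentioned in the prompt could look roughly like this. This is an abbreviated sketch: the series, data, title, and geospatial fields are elided, and chartSchemaSketch is a hypothetical name, not the PR's schema.

import { z } from 'zod'

// Scatter charts require a y-axis key; the other chart types leave it optional.
const scatterConfig = z.object({
  xAxisKey: z.string(),
  yAxisKey: z.string()
})

const cartesianConfig = z.object({
  xAxisKey: z.string(),
  yAxisKey: z.string().optional()
})

export const chartSchemaSketch = z.discriminatedUnion('chartType', [
  z.object({ chartType: z.literal('scatter'), config: scatterConfig }),
  z.object({ chartType: z.literal('bar'), config: cartesianConfig }),
  z.object({ chartType: z.literal('line'), config: cartesianConfig }),
  z.object({ chartType: z.literal('area'), config: cartesianConfig }),
  z.object({ chartType: z.literal('pie'), config: cartesianConfig })
])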

})).optional().describe('Optional geospatial data points to be displayed on a map')
})

export type PartialDataAnalysis = DeepPartial<typeof dataAnalysisSchema>

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

# First, let's look at the file structure and content
find . -type f -name "data-analysis.tsx" -o -name "*analysis*" | head -20

Repository: QueueLab/QCX

Length of output: 442


🏁 Script executed:

# Check the current file content around line 25
cat -n lib/schema/data-analysis.tsx | head -40

Repository: QueueLab/QCX

Length of output: 1521


🏁 Script executed:

# Search for DeepPartial definition
rg "DeepPartial" -B 2 -A 2 --type ts --type tsx

Repository: QueueLab/QCX

Length of output: 83


🏁 Script executed:

# Check for PartialDataAnalysis usage
rg "PartialDataAnalysis" -B 2 -A 2

Repository: QueueLab/QCX

Length of output: 219


🏁 Script executed:

# Check the imports to understand Zod usage
cat -n lib/schema/data-analysis.tsx | grep -E "import|from" | head -20

Repository: QueueLab/QCX

Length of output: 126


🏁 Script executed:

# Search for all usages of PartialDataAnalysis
rg "PartialDataAnalysis" --type ts --type tsx -B 3 -A 3

Repository: QueueLab/QCX

Length of output: 83


🏁 Script executed:

# Also check in JavaScript/JSX files
rg "PartialDataAnalysis" -B 3 -A 3

Repository: QueueLab/QCX

Length of output: 334


🌐 Web query:

Vercel AI SDK DeepPartial type utility Zod schema inference

💡 Result:

In the Vercel AI SDK, DeepPartial<T> is used to type “partial” objects that arrive during streaming, not to change what Zod infers.

How it relates to Zod inference

  • Your full, validated type is still the normal Zod inference:
    type MyType = z.infer<typeof mySchema>;
  • The AI SDK uses DeepPartial<MyType> for partial (incomplete) streamed updates. For example, the Output helper returns an Output<OBJECT, DeepPartial<OBJECT>>, where the second type parameter is the partial streaming shape. (ai-sdk.dev)

So if you want to explicitly type streamed partials alongside your Zod-inferred full type:

import type { DeepPartial } from 'ai';
import { z } from 'zod';

const mySchema = z.object({
  user: z.object({
    name: z.string(),
    age: z.number(),
  }),
});

type Full = z.infer<typeof mySchema>;
type StreamedPartial = DeepPartial<Full>; // what you’ll see mid-stream

Tools: getting Zod input inference for execute

If your question is specifically about tool input inference: use the AI SDK’s tool() helper, because it’s what connects inputSchema (Zod) to the execute parameter type. (ai-sdk.dev)

Zod schemas in the AI SDK

You can pass Zod directly; internally the SDK converts it (via zodSchema()), and zodSchema() exists if you need options like useReferences. (ai-sdk.dev)
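A small illustrative sketch of that wiring follows; the tool name and fields are hypothetical, and the schema field is called inputSchema in AI SDK 5 (older 4.x releases call it parameters):

import { tool } from 'ai'
import { z } from 'zod'

// The Zod schema types execute's argument automatically via the tool() helper.
const getCityTemperature = tool({
  description: 'Return a placeholder temperature for a city',
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ city }) => ({ city, temperatureC: 21 })
})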



🌐 Web query:

TypeScript DeepPartial ZodObject type vs inferred data type

💡 Result:

In Zod, a ZodObject is the schema object (a class instance with methods like .parse, .partial, etc.), while z.infer<typeof schema> (or z.output<typeof schema>) is the validated data type produced by that schema.

Why DeepPartial<typeof mySchema> is usually wrong

If you write:

type X = DeepPartial<typeof userSchema>;

you’re making the schema instance type partial (its methods/properties), not the shape of the data you want to validate/handle. That’s almost never what you intend.

The “data” deep-partial type

If you want a deep-partial TypeScript type of the parsed data:

type User = z.infer<typeof userSchema>;      // output type (parsed result) ([npmjs.com](https://www.npmjs.com/package/zod/v/3.0.0-beta.4?utm_source=openai))
type UserPatch = DeepPartial<User>;

If your schema uses transforms, note that z.infer gives the output type; use z.input<typeof schema> if you want the pre-transform input type. (npmjs.com)

The “schema” deep-partial (runtime validation)

If you want a schema that accepts partial data at runtime, create a new schema:

const userPatchSchema = userSchema.deepPartial(); // deep optional at runtime ([npmjs.com](https://www.npmjs.com/package/zod/v/3.14.5?utm_source=openai))
type UserPatch = z.infer<typeof userPatchSchema>;

(For shallow optional fields, use .partial().) (npmjs.com)



Use z.infer to properly type the inferred data shape, not the schema instance.

DeepPartial<typeof dataAnalysisSchema> incorrectly makes the ZodObject schema type partial rather than the data shape. Per Vercel AI SDK and Zod best practices, DeepPartial should be applied to the inferred data type.

✅ Type-safe alias
+export type DataAnalysisInput = z.infer<typeof dataAnalysisSchema>
-export type PartialDataAnalysis = DeepPartial<typeof dataAnalysisSchema>
+export type PartialDataAnalysis = DeepPartial<DataAnalysisInput>
🤖 Prompt for AI Agents
In `@lib/schema/data-analysis.tsx` at line 25, The current type alias
PartialDataAnalysis applies DeepPartial to the schema instance instead of the
inferred data shape; change the alias to apply DeepPartial to the inferred type
by using z.infer on dataAnalysisSchema (e.g., update PartialDataAnalysis to
DeepPartial<z.infer<typeof dataAnalysisSchema>>), referencing the existing
dataAnalysisSchema and PartialDataAnalysis identifiers so the type represents
the actual data structure.

title: string;
description?: string;
chartType: 'bar' | 'line' | 'pie' | 'area' | 'scatter';
data: any[];

🧹 Nitpick | 🔵 Trivial

Consider stronger typing for data.

Using any[] loses type safety and allows malformed data to pass through undetected. Consider using Record<string, unknown>[] or a generic type parameter to provide better type checking while maintaining flexibility.

♻️ Proposed improvement
-  data: any[];
+  data: Record<string, unknown>[];

Alternatively, for maximum flexibility with type safety:

export type DataAnalysisResult<T extends Record<string, unknown> = Record<string, unknown>> = {
  // ...
  data: T[];
  // ...
};
🤖 Prompt for AI Agents
In `@lib/types/index.ts` at line 32, The type declaration uses data: any[] which
loses type safety; change the DataAnalysisResult type to use a safer type such
as data: Record<string, unknown>[] or make it generic (e.g., export type
DataAnalysisResult<T extends Record<string, unknown> = Record<string, unknown>>
= { ... data: T[]; ... }) and update usages of DataAnalysisResult to pass the
concrete T where appropriate (refer to the DataAnalysisResult type name and the
data property to locate the change).

Comment on lines +42 to +46
geospatial?: {
  latitude: number;
  longitude: number;
  label?: string;
}[];

🧹 Nitpick | 🔵 Trivial

Naming inconsistency with existing MapData.markers interface.

The geospatial array uses label?: string for the optional text field, but the existing MapData.markers interface in components/map/map-data-context.tsx uses title?: string for the same purpose. This inconsistency may cause confusion or require mapping logic when integrating with the map system.

Consider aligning on a single naming convention to simplify integration.

♻️ Proposed alignment with existing interface
   geospatial?: {
     latitude: number;
     longitude: number;
-    label?: string;
+    title?: string;
   }[];
🤖 Prompt for AI Agents
In `@lib/types/index.ts` around lines 42 - 46, Rename the optional text field in
the geospatial array to match the MapData.markers interface: change geospatial's
label?: string to title?: string so it aligns with MapData.markers (referencing
geospatial and MapData.markers, and the label?/title? property names) and update
any usages or types that reference geospatial to use title instead of label.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
app/actions.tsx (1)

319-331: 🧹 Nitpick | 🔵 Trivial

Consider adding file size validation for text-based uploads to prevent excessive token usage.

While Next.js enforces a 200MB bodySizeLimit for server actions (configured in next.config.mjs), text files (CSV, JSON, plain text) should have more restrictive limits. Large files can exceed LLM token limits, causing failures or unexpected behavior. Consider adding a size check before processing:

🛡️ Proposed validation
+    const MAX_TEXT_FILE_SIZE = 1024 * 1024; // 1MB limit for text files
     } else if (
       file.type === 'text/plain' ||
       file.type === 'text/csv' ||
       file.type === 'application/json'
     ) {
+      if (buffer.byteLength > MAX_TEXT_FILE_SIZE) {
+        throw new Error(`File too large. Maximum size is ${MAX_TEXT_FILE_SIZE / 1024}KB.`);
+      }
       const textContent = Buffer.from(buffer).toString('utf-8')
🤖 Fix all issues with AI agents
In `@app/actions.tsx`:
- Around line 772-787: The code passes potentially undefined
latitude/longitude/label from toolOutput.geospatial[0] into MapQueryHandler; add
defensive checks before constructing the MAP_QUERY_TRIGGER payload: verify
toolOutput.geospatial exists and is a non-empty array, and validate that
toolOutput.geospatial[0] has numeric latitude and longitude and a non-empty
label (or provide safe defaults or skip rendering). Update the conditional
around MapQueryHandler (referencing toolOutput.geospatial and MapQueryHandler)
to only render when these validated properties are present, or transform the
values to safe fallbacks before passing them into the mcp_response.location
fields.
- Around line 766-791: The dataAnalysis branch reads toolOutput and accesses
nested fields without runtime validation; use the existing dataAnalysisSchema
(from lib/schema/data-analysis) to validate/parse toolOutput (e.g.,
dataAnalysisSchema.parse or safeParse) into a typed DataAnalysisResult before
rendering, then conditionally render MapQueryHandler only when the parsed result
has a non-empty geospatial array and use
parsed.geospatial[0].latitude/longitude/label for the mcp_response;
alternatively add explicit type guards for toolOutput.geospatial and its
elements to avoid direct indexing of potentially invalid data.
📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8c7120d and 38704ca.

📒 Files selected for processing (1)
  • app/actions.tsx


Comment on lines +766 to +791
case 'dataAnalysis':
  return {
    id,
    component: (
      <>
        <GraphSection result={searchResults.value} />
        {toolOutput.geospatial && toolOutput.geospatial.length > 0 && (
          <MapQueryHandler
            toolOutput={{
              type: 'MAP_QUERY_TRIGGER',
              originalUserInput: JSON.stringify(toolOutput.geospatial[0]),
              timestamp: new Date().toISOString(),
              mcp_response: {
                location: {
                  latitude: toolOutput.geospatial[0].latitude,
                  longitude: toolOutput.geospatial[0].longitude,
                  place_name: toolOutput.geospatial[0].label
                }
              }
            }}
          />
        )}
      </>
    ),
    isCollapsed: isCollapsed.value
  }

🧹 Nitpick | 🔵 Trivial

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Find the DataAnalysisResult type definition
echo "=== Searching for DataAnalysisResult type ==="
rg -n -A 20 'type DataAnalysisResult|DataAnalysisResult = ' --type=ts -g '!node_modules' | head -100

echo -e "\n=== Searching for schema definitions ==="
rg -n -B 2 -A 15 'dataAnalysisSchema' --type=ts -g '!node_modules' | head -150

echo -e "\n=== Checking app/actions.tsx context (around line 766) ==="
wc -l app/actions.tsx

Repository: QueueLab/QCX

Length of output: 4460


🏁 Script executed:

#!/bin/bash
echo "=== Reading app/actions.tsx around line 766 ==="
sed -n '740,810p' app/actions.tsx | cat -n

echo -e "\n=== Checking broader context (line 700-750) ==="
sed -n '700,750p' app/actions.tsx | cat -n

Repository: QueueLab/QCX

Length of output: 5172


Add runtime type validation for toolOutput in the dataAnalysis case.

The toolOutput is parsed from JSON without schema validation. While a DataAnalysisResult type and dataAnalysisSchema already exist (in lib/types and lib/schema/data-analysis.tsx), they are only used in the tool definition. Here, the code accesses nested properties like toolOutput.geospatial[0].latitude without confirming the structure at runtime. Consider reusing dataAnalysisSchema with parse() to validate before access, or add explicit type guards for the geospatial property structure.

🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 766 - 791, The dataAnalysis branch reads
toolOutput and accesses nested fields without runtime validation; use the
existing dataAnalysisSchema (from lib/schema/data-analysis) to validate/parse
toolOutput (e.g., dataAnalysisSchema.parse or safeParse) into a typed
DataAnalysisResult before rendering, then conditionally render MapQueryHandler
only when the parsed result has a non-empty geospatial array and use
parsed.geospatial[0].latitude/longitude/label for the mcp_response;
alternatively add explicit type guards for toolOutput.geospatial and its
elements to avoid direct indexing of potentially invalid data.
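A hedged sketch of the safeParse approach described above, assuming the schema's import path from this PR's layout ('@/lib/schema/data-analysis'); the firstGeoPoint helper is a hypothetical name:

import { dataAnalysisSchema } from '@/lib/schema/data-analysis'

// Validate the raw tool output once and return the first geospatial point only if it parses.
function firstGeoPoint(toolOutput: unknown) {
  const parsed = dataAnalysisSchema.safeParse(toolOutput)
  return parsed.success ? parsed.data.geospatial?.[0] : undefined
}

Inside the 'dataAnalysis' case, the conditional render would then become:

const geo = firstGeoPoint(toolOutput)
// ...
{geo && (
  <MapQueryHandler
    toolOutput={{
      type: 'MAP_QUERY_TRIGGER',
      originalUserInput: JSON.stringify(geo),
      timestamp: new Date().toISOString(),
      mcp_response: {
        location: {
          latitude: geo.latitude,
          longitude: geo.longitude,
          place_name: geo.label ?? ''
        }
      }
    }}
  />
)}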

Comment on lines +772 to +787
{toolOutput.geospatial && toolOutput.geospatial.length > 0 && (
  <MapQueryHandler
    toolOutput={{
      type: 'MAP_QUERY_TRIGGER',
      originalUserInput: JSON.stringify(toolOutput.geospatial[0]),
      timestamp: new Date().toISOString(),
      mcp_response: {
        location: {
          latitude: toolOutput.geospatial[0].latitude,
          longitude: toolOutput.geospatial[0].longitude,
          place_name: toolOutput.geospatial[0].label
        }
      }
    }}
  />
)}

⚠️ Potential issue | 🟡 Minor

Add defensive checks for geospatial item properties.

The code accesses toolOutput.geospatial[0].latitude, .longitude, and .label without validating these properties exist. If the geospatial array contains a malformed item, undefined values will be passed to MapQueryHandler.

🛡️ Proposed defensive check
-                      {toolOutput.geospatial && toolOutput.geospatial.length > 0 && (
+                      {toolOutput.geospatial &&
+                        toolOutput.geospatial.length > 0 &&
+                        toolOutput.geospatial[0].latitude != null &&
+                        toolOutput.geospatial[0].longitude != null && (
                         <MapQueryHandler
                           toolOutput={{
                             type: 'MAP_QUERY_TRIGGER',
                             originalUserInput: JSON.stringify(toolOutput.geospatial[0]),
                             timestamp: new Date().toISOString(),
                             mcp_response: {
                               location: {
                                 latitude: toolOutput.geospatial[0].latitude,
                                 longitude: toolOutput.geospatial[0].longitude,
-                                place_name: toolOutput.geospatial[0].label
+                                place_name: toolOutput.geospatial[0].label ?? ''
                               }
                             }
                           }}
                         />
                       )}
🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 772 - 787, The code passes potentially
undefined latitude/longitude/label from toolOutput.geospatial[0] into
MapQueryHandler; add defensive checks before constructing the MAP_QUERY_TRIGGER
payload: verify toolOutput.geospatial exists and is a non-empty array, and
validate that toolOutput.geospatial[0] has numeric latitude and longitude and a
non-empty label (or provide safe defaults or skip rendering). Update the
conditional around MapQueryHandler (referencing toolOutput.geospatial and
MapQueryHandler) to only render when these validated properties are present, or
transform the values to safe fallbacks before passing them into the
mcp_response.location fields.
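Alternatively, a plain type guard keeps the Zod schema out of this file; GeoPoint and isGeoPoint are illustrative names, not existing identifiers in the repo:

type GeoPoint = { latitude: number; longitude: number; label?: string }

// Narrow an unknown value to a usable geospatial point before rendering the map.
function isGeoPoint(value: unknown): value is GeoPoint {
  if (typeof value !== 'object' || value === null) return false
  const v = value as Record<string, unknown>
  return typeof v.latitude === 'number' && typeof v.longitude === 'number'
}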

@ngoiyaeric
Collaborator Author

@coderabbitai can you make the Recharts graph a generative UI graph driven by queries? What queries would be able to trigger that?

@ngoiyaeric ngoiyaeric added the 1.0.0 Standard release label Feb 1, 2026
CJWTRUST and others added 22 commits February 2, 2026 01:48
- Optimized resolution search with streaming.
- Improved timezone handling and AI context.
- Fixed UI mutual exclusion and loading state logic.
- Stabilized Mapbox cleanup and event listeners.
- Resolved build errors and ESLint warnings.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
…head features

- Reverted branch to commit 6acfbe5.
- Merged changes from main (e95a25b).
- Preserved Tentree Icon (UsageView logic), History layout (HistoryContainer), and PurchaseCreditsPopup from 6acfbe5 as requested.
- Updated Stripe checkout URL in PurchaseCreditsPopup to the official one (14A3cv7K72TR3go14Nasg02).
- Re-applied bug fixes from 6acfbe5 (type casting in app/actions.tsx, ESLint dependencies in components/map/mapbox-map.tsx and components/chat-panel.tsx, and relatedQueries initialization).
- Verified build and synchronization with main branch features (e.g., resolution search enhancements).

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
…ranch head features

- Reverted branch to commit 6acfbe5 and merged with main (e95a25b).
- Preserved Tentree Icon (Usage View), history layout, and PurchaseCreditsPopup from 6acfbe5.
- Restored branch-specific UI components: Header, History button, Profile toggle, and Usage integration in Chat.
- Synchronized all other logic with main, including resolution search enhancements.
- Fixed an infinite re-render loop in Mapbox map by optimizing useEffect dependencies (removed cameraState from initialization effect).
- Re-applied critical bug fixes from 6acfbe5:
  - Type casting in app/actions.tsx (getUIStateFromAIState).
  - Streamable value initialization for stability.
  - setSuggestions dependency in ChatPanel.
- Updated Stripe checkout URL in PurchaseCreditsPopup to the official link.
- Verified successful build with 'bun run build'.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
Replace all occurrences of the old Stripe checkout link with https://buy.stripe.com/14A3cv7K72TR3go14Nasg02 in:
- components/mobile-icons-bar.tsx
- components/purchase-credits-popup.tsx
- components/usage-view.tsx

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
…514914606268144687

Update Stripe payment links
- Updated 'Daily refresh credits' to 'Yearly refresh credits'
- Updated refresh credit value from 300 to 500
- Updated subtext to 'Refresh to 500 every year.'
- Replaced task list with 'QCX-TERRA Crop yield Analysis', 'QCX-TERRA Flood predictions', and 'Green OS climate synchronization'
- Set task dates to 'upcoming' and credit changes to single digits (-7, -5, -3)

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
…resh-14763946411330416327

Update Usage View to Yearly Refresh
This commit refactors the desktop header icon container in `components/header.tsx` to ensure all icons are equidistant.

Key changes:
- Changed the icon container from `justify-between w-1/2` to `justify-center flex-1` with a fixed `gap-10`.
- Removed the unused `<div id="timezone-clock-portal" />` which was causing irregular spacing between the TentTree and ModeToggle icons.
- Applied `className="contents"` to the `<div id="header-search-portal" />` so that it doesn't affect the flex layout when empty.
- These changes ensure that icons are always perfectly centered and equidistant regardless of the number of active icons.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
…1736717079589

Standardize Header Icon Spacing
- Modified `mapbox-map.tsx` to only show `NavigationControl` when in `DrawingMode` on desktop.
- Fixed a bug in the mode change `useEffect` where previous state was being updated prematurely, preventing cleanup logic from executing correctly.
- Updated `tests/map.spec.ts` to switch to `DrawingMode` before verifying zoom control visibility.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
…94765400062008

Conditional Map Zoom Controls and Mode Transition Fix
@ngoiyaeric ngoiyaeric force-pushed the feature/generative-graphs-ui-data-preview-17358342468692938721 branch from 6b80d53 to fdc7c5e Compare February 3, 2026 14:11
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

🤖 Fix all issues with AI agents
In `@app/actions.tsx`:
- Around line 301-305: Add a server-side size guard for uploads matching the
file.type checks ('text/plain', 'text/csv', 'application/json') to prevent
large-memory malicious uploads: define a MAX_UPLOAD_BYTES constant and in the
same handler where file.type is inspected (the branch using file.type ===
'text/plain' || 'text/csv' || 'application/json') reject the request if
file.size (or a streamed byte counter if using streams) exceeds that limit,
returning/throwing an appropriate error before attempting to parse or buffer the
file; ensure the guard runs unconditionally on the server side so client-side
checks cannot be bypassed.

In `@components/graph-section.tsx`:
- Around line 216-242: The scatter branch currently assumes config.yAxisKey
exists and binds YAxis to undefined; update the 'scatter' case to validate that
config.yAxisKey is present before rendering the ScatterChart (e.g., early return
a fallback/null or render an informative placeholder/error UI), and ensure the
YAxis and any bindings (YAxis dataKey and Tooltip/XAxis names that reference
config.yAxisKey) only use config.yAxisKey when defined; locate the scatter
rendering in the switch case (the JSX using ScatterChart, XAxis, YAxis, Tooltip,
and the map over config.series) and add the guard/conditional rendering around
it using config.yAxisKey to prevent binding to undefined.
- Around line 29-241: Replace the static COLORS palette with a theme-aware
palette derived from CSS variables (e.g., --primary, --secondary, --accent,
--muted) inside GraphCard (or top-level used by GraphCard); read values via
getComputedStyle(document.documentElement).getPropertyValue, convert returned
CSS color strings to hex (handle hex, rgb(a), and hsl formats), build an array
like themePalette = [primaryHex, secondaryHex, accentHex, mutedHex] with a
fallback to the original COLORS, and then use s.color || themePalette[i %
themePalette.length] (and for Pie Cells) and for stroke/fill defaults across
Bar, Line, Area, Scatter and Cell renderings so charts adapt to light/dark/earth
themes while preserving existing fallbacks.
📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 37d772b and c5aae97.

⛔ Files ignored due to path filters (1)
  • verification/fix_verification.png is excluded by !**/*.png
📒 Files selected for processing (3)
  • app/actions.tsx
  • components/chat-panel.tsx
  • components/graph-section.tsx
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: ngoiyaeric
Repo: QueueLab/QCX PR: 0
File: :0-0
Timestamp: 2026-01-31T16:53:16.122Z
Learning: The GraphSection component in components/graph-section.tsx should use theme-aware colors that match the site's current theme (light, dark, or earth) by reading CSS variables like --primary, --secondary, --accent, and --muted, and converting them to hex colors for Recharts compatibility.
📚 Learning: 2026-01-31T16:53:16.122Z
Learnt from: ngoiyaeric
Repo: QueueLab/QCX PR: 0
File: :0-0
Timestamp: 2026-01-31T16:53:16.122Z
Learning: The GraphSection component in components/graph-section.tsx should use theme-aware colors that match the site's current theme (light, dark, or earth) by reading CSS variables like --primary, --secondary, --accent, and --muted, and converting them to hex colors for Recharts compatibility.

Applied to files:

  • components/graph-section.tsx
  • app/actions.tsx
📚 Learning: 2026-01-16T10:02:40.220Z
Learnt from: ngoiyaeric
Repo: QueueLab/QCX PR: 0
File: :0-0
Timestamp: 2026-01-16T10:02:40.220Z
Learning: The router agent and resolution search agent must implement a feedback loop where tool execution results (including GeoTIFF URLs, embeddings, and satellite analysis data) are fed back to the AI model's context through enriched messages before generating the final response, ensuring the foundational model receives and can synthesize all satellite intelligence data.

Applied to files:

  • app/actions.tsx
🧬 Code graph analysis (1)
components/graph-section.tsx (5)
lib/types/index.ts (1)
  • DataAnalysisResult (28-47)
components/section.tsx (1)
  • Section (24-75)
components/ui/tooltip.tsx (1)
  • Tooltip (30-30)
components/tool-badge.tsx (1)
  • ToolBadge (11-26)
components/ui/card.tsx (5)
  • Card (79-79)
  • CardHeader (79-79)
  • CardTitle (79-79)
  • CardDescription (79-79)
  • CardContent (79-79)
🔇 Additional comments (7)
app/actions.tsx (3)

15-15: LGTM — agent imports are consistent.


30-30: LGTM — GraphSection import is correctly wired.


91-92: LGTM — direct resolutionSearch invocation reads cleanly.

components/graph-section.tsx (3)

35-51: LGTM — static vs. streamable detection is clear.


70-85: LGTM — JSON parsing is safely guarded.


254-270: LGTM — card layout and metadata rendering look solid.

components/chat-panel.tsx (1)

217-217: LGTM — file input now matches supported CSV/JSON uploads.


Comment on lines +301 to +305
} else if (
  file.type === 'text/plain' ||
  file.type === 'text/csv' ||
  file.type === 'application/json'
) {

⚠️ Potential issue | 🟠 Major

Add a server-side file size guard for csv/json uploads.

Client-side checks are not sufficient; a malicious request can bypass them and force large memory allocations on the server.

🔧 Proposed fix
-  if (file) {
-    const buffer = await file.arrayBuffer()
+  if (file) {
+    const MAX_FILE_SIZE = 10 * 1024 * 1024
+    if (file.size > MAX_FILE_SIZE) {
+      throw new Error('File size must be less than 10MB')
+    }
+    const buffer = await file.arrayBuffer()
🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 301 - 305, Add a server-side size guard for
uploads matching the file.type checks ('text/plain', 'text/csv',
'application/json') to prevent large-memory malicious uploads: define a
MAX_UPLOAD_BYTES constant and in the same handler where file.type is inspected
(the branch using file.type === 'text/plain' || 'text/csv' ||
'application/json') reject the request if file.size (or a streamed byte counter
if using streams) exceeds that limit, returning/throwing an appropriate error
before attempting to parse or buffer the file; ensure the guard runs
unconditionally on the server side so client-side checks cannot be bypassed.

Comment on lines +29 to +241
const COLORS = ['#0088FE', '#00C49F', '#FFBB28', '#FF8042', '#8884d8', '#82ca9d']

interface GraphSectionProps {
result: DataAnalysisResult | string | StreamableValue<DataAnalysisResult>
}

export function GraphSection({ result }: GraphSectionProps) {
if (!result) return null;

// Check if result is a static DataAnalysisResult object
// A StreamableValue is an opaque object and shouldn't have these properties
const isStatic = typeof result === 'object' && result !== null &&
('chartType' in (result as any) || 'title' in (result as any) || 'data' in (result as any));
const isString = typeof result === 'string';

if (isStatic || isString) {
return <GraphCard data={result as any} />;
}

// Handle case where it might be a streamable value or something else
// We use a safe wrapper to avoid crashing if useStreamableValue throws
return <StreamedGraphSection result={result as any} />;
}

function StreamedGraphSection({ result }: { result: StreamableValue<any> }) {
const [data, error, pending] = useStreamableValue(result);

if (pending && !data) {
return (
<Section className="py-2">
<div className="animate-pulse flex space-y-4 flex-col">
<div className="h-4 bg-muted rounded w-3/4"></div>
<div className="h-64 bg-muted rounded"></div>
</div>
</Section>
);
}

return <GraphCard data={data} />;
}

function GraphCard({ data, pending }: { data: any, pending?: boolean }) {
const chartData: DataAnalysisResult | undefined = React.useMemo(() => {
if (!data) return undefined;
if (typeof data === 'string') {
try {
return JSON.parse(data);
} catch (e) {
console.error('Error parsing graph data:', e);
return undefined;
}
}
return data as DataAnalysisResult;
}, [data]);

if (!chartData) return null;

const { title, description, chartType, data: plotData, config } = chartData;

const renderChart = () => {
if (!plotData || !config) return <div className="flex items-center justify-center h-full text-muted-foreground italic">Missing chart data or configuration</div>;

const themeColors = {
text: 'hsl(var(--foreground))',
grid: 'hsl(var(--border))',
tooltip: {
bg: 'hsl(var(--card))',
text: 'hsl(var(--card-foreground))',
border: 'hsl(var(--border))'
}
}

const commonAxisProps = {
stroke: themeColors.text,
fontSize: 12,
tickLine: false,
axisLine: false,
}

switch (chartType) {
case 'bar':
return (
<ResponsiveContainer width="100%" height="100%">
<BarChart data={plotData} margin={{ top: 10, right: 10, left: -20, bottom: 20 }}>
<CartesianGrid strokeDasharray="3 3" stroke={themeColors.grid} vertical={false} />
<XAxis
dataKey={config.xAxisKey}
{...commonAxisProps}
dy={10}
/>
<YAxis {...commonAxisProps} />
<Tooltip
contentStyle={{
backgroundColor: themeColors.tooltip.bg,
color: themeColors.tooltip.text,
borderColor: themeColors.tooltip.border,
borderRadius: '8px'
}}
/>
<Legend wrapperStyle={{ paddingTop: '20px' }} />
{config.series?.map((s, i) => (
<Bar key={s.key} dataKey={s.key} name={s.name} fill={s.color || COLORS[i % COLORS.length]} radius={[4, 4, 0, 0]} />
))}
</BarChart>
</ResponsiveContainer>
);
case 'line':
return (
<ResponsiveContainer width="100%" height="100%">
<LineChart data={plotData} margin={{ top: 10, right: 10, left: -20, bottom: 20 }}>
<CartesianGrid strokeDasharray="3 3" stroke={themeColors.grid} vertical={false} />
<XAxis
dataKey={config.xAxisKey}
{...commonAxisProps}
dy={10}
/>
<YAxis {...commonAxisProps} />
<Tooltip
contentStyle={{
backgroundColor: themeColors.tooltip.bg,
color: themeColors.tooltip.text,
borderColor: themeColors.tooltip.border,
borderRadius: '8px'
}}
/>
<Legend wrapperStyle={{ paddingTop: '20px' }} />
{config.series?.map((s, i) => (
<Line key={s.key} type="monotone" dataKey={s.key} name={s.name} stroke={s.color || COLORS[i % COLORS.length]} strokeWidth={2} dot={{ r: 4 }} activeDot={{ r: 6 }} />
))}
</LineChart>
</ResponsiveContainer>
);
case 'area':
return (
<ResponsiveContainer width="100%" height="100%">
<AreaChart data={plotData} margin={{ top: 10, right: 10, left: -20, bottom: 20 }}>
<CartesianGrid strokeDasharray="3 3" stroke={themeColors.grid} vertical={false} />
<XAxis
dataKey={config.xAxisKey}
{...commonAxisProps}
dy={10}
/>
<YAxis {...commonAxisProps} />
<Tooltip
contentStyle={{
backgroundColor: themeColors.tooltip.bg,
color: themeColors.tooltip.text,
borderColor: themeColors.tooltip.border,
borderRadius: '8px'
}}
/>
<Legend wrapperStyle={{ paddingTop: '20px' }} />
{config.series?.map((s, i) => (
<Area key={s.key} type="monotone" dataKey={s.key} name={s.name} stroke={s.color || COLORS[i % COLORS.length]} fill={s.color || COLORS[i % COLORS.length]} fillOpacity={0.3} />
))}
</AreaChart>
</ResponsiveContainer>
);
case 'pie':
return (
<ResponsiveContainer width="100%" height="100%">
<PieChart>
<Pie
data={plotData}
dataKey={config.series?.[0]?.key}
nameKey={config.xAxisKey}
cx="50%"
cy="50%"
outerRadius={80}
label={{ fill: themeColors.text, fontSize: 12 }}
>
{plotData.map((entry, index) => (
<Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
))}
</Pie>
<Tooltip
contentStyle={{
backgroundColor: themeColors.tooltip.bg,
color: themeColors.tooltip.text,
borderColor: themeColors.tooltip.border,
borderRadius: '8px'
}}
/>
<Legend />
</PieChart>
</ResponsiveContainer>
);
case 'scatter':
return (
<ResponsiveContainer width="100%" height="100%">
<ScatterChart margin={{ top: 10, right: 10, left: -20, bottom: 20 }}>
<CartesianGrid strokeDasharray="3 3" stroke={themeColors.grid} />
<XAxis
type="number"
dataKey={config.xAxisKey}
name={config.xAxisKey}
{...commonAxisProps}
dy={10}
/>
<YAxis type="number" dataKey={config.yAxisKey} name={config.yAxisKey} {...commonAxisProps} />
<Tooltip
cursor={{ strokeDasharray: '3 3' }}
contentStyle={{
backgroundColor: themeColors.tooltip.bg,
color: themeColors.tooltip.text,
borderColor: themeColors.tooltip.border,
borderRadius: '8px'
}}
/>
<Legend wrapperStyle={{ paddingTop: '20px' }} />
{config.series?.map((s, i) => (
<Scatter key={s.key} name={s.name} data={plotData} fill={s.color || COLORS[i % COLORS.length]} />
))}

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, explore the repository structure to understand the theme setup
find . -type f -name "*.css" -o -name "*.ts" -o -name "*.tsx" | head -20
git ls-files | grep -E "(theme|global|config)" | head -20

Repository: QueueLab/QCX

Length of output: 818


🏁 Script executed:

# Search for CSS variables definition - common locations
find . -type f \( -name "*.css" -o -name "globals.css" -o -name "*.scss" \) | xargs grep -l "primary\|secondary\|accent\|muted" 2>/dev/null | head -10

Repository: QueueLab/QCX

Length of output: 73


🏁 Script executed:

# Check if --primary, --secondary, --accent, --muted CSS variables are defined
rg "--primary|--secondary|--accent|--muted" --type css --type tsx --type ts | head -30

Repository: QueueLab/QCX

Length of output: 496


🏁 Script executed:

# Verify the exact COLORS usage in graph-section.tsx
rg "COLORS" components/graph-section.tsx -A 2 -B 2

Repository: QueueLab/QCX

Length of output: 1666


🏁 Script executed:

# Examine globals.css for CSS variables
cat app/globals.css | head -100

Repository: QueueLab/QCX

Length of output: 2468


🏁 Script executed:

# Check tailwind.config.ts for theme colors
cat tailwind.config.ts

Repository: QueueLab/QCX

Length of output: 2482


🏁 Script executed:

# Search for primary, secondary, accent, muted in CSS variables (correct rg syntax)
rg "primary|secondary|accent|muted" --type css -C 3

Repository: QueueLab/QCX

Length of output: 3378


Use theme-aware series colors instead of hardcoded palette.

The COLORS array is static and won't adapt to light/dark/earth themes. Extract --primary, --secondary, --accent, and --muted from CSS variables, convert them to hex (since Recharts requires hex for SVG), and apply them to series rendering across all chart types.

🎨 Proposed fix (theme palette + hex conversion)
-const COLORS = ['#0088FE', '#00C49F', '#FFBB28', '#FF8042', '#8884d8', '#82ca9d']
+const FALLBACK_COLORS = ['#0088FE', '#00C49F', '#FFBB28', '#FF8042', '#8884d8', '#82ca9d']
+const THEME_COLOR_VARS = ['--primary', '--secondary', '--accent', '--muted']
+
+const hslToHex = (hsl: string) => {
+  const [h, s, l] = hsl.replace(/%/g, '').split(/\s+/).map(Number)
+  if ([h, s, l].some(Number.isNaN)) return ''
+  const sN = s / 100
+  const lN = l / 100
+  const c = (1 - Math.abs(2 * lN - 1)) * sN
+  const x = c * (1 - Math.abs((h / 60) % 2 - 1))
+  const m = lN - c / 2
+  let [r, g, b] = [0, 0, 0]
+  if (h < 60) [r, g, b] = [c, x, 0]
+  else if (h < 120) [r, g, b] = [x, c, 0]
+  else if (h < 180) [r, g, b] = [0, c, x]
+  else if (h < 240) [r, g, b] = [0, x, c]
+  else if (h < 300) [r, g, b] = [x, 0, c]
+  else [r, g, b] = [c, 0, x]
+  const toHex = (v: number) => Math.round((v + m) * 255).toString(16).padStart(2, '0')
+  return `#${toHex(r)}${toHex(g)}${toHex(b)}`
+}
 function GraphCard({ data, pending }: { data: any, pending?: boolean }) {
+  const [seriesColors, setSeriesColors] = React.useState(FALLBACK_COLORS)
+  React.useEffect(() => {
+    if (typeof window === 'undefined') return
+    const styles = getComputedStyle(document.documentElement)
+    const palette = THEME_COLOR_VARS
+      .map(v => hslToHex(styles.getPropertyValue(v).trim()))
+      .filter(Boolean)
+    if (palette.length) setSeriesColors(palette)
+  }, [])
-                <Bar key={s.key} dataKey={s.key} name={s.name} fill={s.color || COLORS[i % COLORS.length]} radius={[4, 4, 0, 0]} />
+                <Bar key={s.key} dataKey={s.key} name={s.name} fill={s.color || seriesColors[i % seriesColors.length]} radius={[4, 4, 0, 0]} />
-                <Line key={s.key} type="monotone" dataKey={s.key} name={s.name} stroke={s.color || COLORS[i % COLORS.length]} strokeWidth={2} dot={{ r: 4 }} activeDot={{ r: 6 }} />
+                <Line key={s.key} type="monotone" dataKey={s.key} name={s.name} stroke={s.color || seriesColors[i % seriesColors.length]} strokeWidth={2} dot={{ r: 4 }} activeDot={{ r: 6 }} />
-                <Area key={s.key} type="monotone" dataKey={s.key} name={s.name} stroke={s.color || COLORS[i % COLORS.length]} fill={s.color || COLORS[i % COLORS.length]} fillOpacity={0.3} />
+                <Area key={s.key} type="monotone" dataKey={s.key} name={s.name} stroke={s.color || seriesColors[i % seriesColors.length]} fill={s.color || seriesColors[i % seriesColors.length]} fillOpacity={0.3} />
-                  <Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
+                  <Cell key={`cell-${index}`} fill={seriesColors[index % seriesColors.length]} />
-                <Scatter key={s.key} name={s.name} data={plotData} fill={s.color || COLORS[i % COLORS.length]} />
+                <Scatter key={s.key} name={s.name} data={plotData} fill={s.color || seriesColors[i % seriesColors.length]} />
🤖 Prompt for AI Agents
In `@components/graph-section.tsx` around lines 29 - 241, Replace the static
COLORS palette with a theme-aware palette derived from CSS variables (e.g.,
--primary, --secondary, --accent, --muted) inside GraphCard (or top-level used
by GraphCard); read values via
getComputedStyle(document.documentElement).getPropertyValue, convert returned
CSS color strings to hex (handle hex, rgb(a), and hsl formats), build an array
like themePalette = [primaryHex, secondaryHex, accentHex, mutedHex] with a
fallback to the original COLORS, and then use s.color || themePalette[i %
themePalette.length] (and for Pie Cells) and for stroke/fill defaults across
Bar, Line, Area, Scatter and Cell renderings so charts adapt to light/dark/earth
themes while preserving existing fallbacks.

Comment on lines +216 to +242
case 'scatter':
  return (
    <ResponsiveContainer width="100%" height="100%">
      <ScatterChart margin={{ top: 10, right: 10, left: -20, bottom: 20 }}>
        <CartesianGrid strokeDasharray="3 3" stroke={themeColors.grid} />
        <XAxis
          type="number"
          dataKey={config.xAxisKey}
          name={config.xAxisKey}
          {...commonAxisProps}
          dy={10}
        />
        <YAxis type="number" dataKey={config.yAxisKey} name={config.yAxisKey} {...commonAxisProps} />
        <Tooltip
          cursor={{ strokeDasharray: '3 3' }}
          contentStyle={{
            backgroundColor: themeColors.tooltip.bg,
            color: themeColors.tooltip.text,
            borderColor: themeColors.tooltip.border,
            borderRadius: '8px'
          }}
        />
        <Legend wrapperStyle={{ paddingTop: '20px' }} />
        {config.series?.map((s, i) => (
          <Scatter key={s.key} name={s.name} data={plotData} fill={s.color || COLORS[i % COLORS.length]} />
        ))}
      </ScatterChart>

⚠️ Potential issue | 🟡 Minor

Guard missing yAxisKey for scatter charts.

yAxisKey is optional in the type but required for a scatter chart. Without it, the axis binds to undefined.

🛡️ Proposed fix
       case 'scatter':
+        if (!config.yAxisKey) {
+          return (
+            <div className="flex items-center justify-center h-full text-muted-foreground italic">
+              Missing y-axis key for scatter chart
+            </div>
+          )
+        }
         return (
🤖 Prompt for AI Agents
In `@components/graph-section.tsx` around lines 216 - 242, The scatter branch
currently assumes config.yAxisKey exists and binds YAxis to undefined; update
the 'scatter' case to validate that config.yAxisKey is present before rendering
the ScatterChart (e.g., early return a fallback/null or render an informative
placeholder/error UI), and ensure the YAxis and any bindings (YAxis dataKey and
Tooltip/XAxis names that reference config.yAxisKey) only use config.yAxisKey
when defined; locate the scatter rendering in the switch case (the JSX using
ScatterChart, XAxis, YAxis, Tooltip, and the map over config.series) and add the
guard/conditional rendering around it using config.yAxisKey to prevent binding
to undefined.
