Merged
58 changes: 30 additions & 28 deletions src/core/llm/gemini.ts
@@ -49,11 +49,13 @@ export class GeminiProvider extends BaseLLMProvider<

   constructor(provider: Extract<LLMProvider, { type: 'gemini' }>) {
     super(provider)
-    if (provider.baseUrl) {
-      throw new Error('Gemini does not support custom base URL')
-    }
-
-    this.client = new GoogleGenAI({ apiKey: provider.apiKey ?? '' })
+    this.client = new GoogleGenAI({
+      apiKey: provider.apiKey ?? '',
+      httpOptions: provider.baseUrl
+        ? { baseUrl: provider.baseUrl.replace(/\/+$/, '') }
+        : undefined,
+    })
Comment on lines +53 to +58
⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

# First, let's look at the actual code in the file
cat -n src/core/llm/gemini.ts | head -70

Repository: glowingjade/obsidian-smart-composer

Length of output: 2442


🌐 Web query:

@google/genai baseUrl httpOptions configuration requirements trailing slash

💡 Result:

In @google/genai, you configure request routing + low-level HTTP behavior via GoogleGenAIOptions.httpOptions, whose type is types.HttpOptions. [1][2]

Where baseUrl goes

import { GoogleGenAI } from "@google/genai";

const ai = new GoogleGenAI({
  apiKey: process.env.GEMINI_API_KEY,          // Gemini API
  // vertexai: true, project: "...", location: "...", // Vertex AI
  httpOptions: {
    baseUrl: "https://your-proxy.example.com/", // custom endpoint / gateway
    headers: { "X-Foo": "bar" },
    timeout: 120_000, // ms
    // extraBody: {...}
  },
});

HttpOptions supports: apiVersion, baseUrl, headers, timeout (ms), and extraBody. [1]

Trailing-slash requirement (important)

If your baseUrl includes a path prefix (e.g. https://example.com/api/v1), add a trailing slash (.../api/v1/). This is because URL resolution with a “base” is not string concatenation—a base URL without a trailing slash treats the last path segment like a “file” and will drop it when resolving relative paths. [3]

So, prefer:

  • https://example.com/api/v1/ (good)
  • not https://example.com/api/v1 (often causes /api/v1 to be lost) [3]
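
The resolution behavior described above can be checked directly with the WHATWG URL API (the host and paths here are illustrative):

```typescript
// A base URL without a trailing slash treats its last path segment ("v1")
// as a file-like segment, so relative resolution drops it.
const withoutSlash = new URL("models/gemini-pro", "https://example.com/api/v1");
const withSlash = new URL("models/gemini-pro", "https://example.com/api/v1/");

console.log(withoutSlash.href); // "https://example.com/api/models/gemini-pro" — /v1 was lost
console.log(withSlash.href);    // "https://example.com/api/v1/models/gemini-pro"
```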

“Configuration requirements” summary

  • Gemini API: apiKey is required (and “Required on browser runtimes”). [2]
  • Vertex AI (Node): use vertexai: true plus project and location (and optionally googleAuthOptions). [2]
  • httpOptions is optional and is where you set baseUrl/headers/timeout. [1][2]

Sources: [1] types.HttpOptions reference, [2] client.GoogleGenAIOptions reference, [3] MDN on resolving relative URLs with new URL(url, base) and how missing trailing slash changes the effective base path.


Guard against whitespace-only baseUrl values.

If provider.baseUrl is a whitespace-only string like ' ', the current ternary check passes it through because whitespace is truthy. The replace(/\/+$/, '') only removes trailing slashes, leaving the whitespace intact. This results in an invalid URL being passed to the SDK and can cause crashes. Trim and validate that the normalized value is non-empty before setting httpOptions.

💡 Proposed fix
-    this.client = new GoogleGenAI({
-      apiKey: provider.apiKey ?? '',
-      httpOptions: provider.baseUrl
-        ? { baseUrl: provider.baseUrl.replace(/\/+$/, '') }
-        : undefined,
-    })
+    const normalizedBaseUrl =
+      provider.baseUrl?.trim().replace(/\/+$/, '') || undefined
+    this.client = new GoogleGenAI({
+      apiKey: provider.apiKey ?? '',
+      httpOptions: normalizedBaseUrl ? { baseUrl: normalizedBaseUrl } : undefined,
+    })
🤖 Prompt for AI Agents
In `@src/core/llm/gemini.ts` around lines 53 - 58, the ternary that sets
httpOptions for the GoogleGenAI client uses provider.baseUrl as-is, which treats
whitespace-only strings as truthy; trim provider.baseUrl first and only pass
httpOptions when the trimmed value is non-empty. Update the initialization of
this.client (GoogleGenAI) to compute a normalizedBaseUrl =
provider.baseUrl?.trim(), verify normalizedBaseUrl && normalizedBaseUrl.length >
0, and then set httpOptions: { baseUrl: normalizedBaseUrl.replace(/\/+$/, '') }
only in that case; otherwise leave httpOptions undefined.

this.apiKey = provider.apiKey ?? ''
}

@@ -199,24 +201,24 @@ export class GeminiProvider extends BaseLLMProvider<
        case 'user': {
          const contentParts: Part[] = Array.isArray(message.content)
            ? message.content.map((part) => {
-              switch (part.type) {
-                case 'text':
-                  return { text: part.text }
-                case 'image_url': {
-                  const { mimeType, base64Data } = parseImageDataUrl(
-                    part.image_url.url,
-                  )
-                  GeminiProvider.validateImageType(mimeType)
-
-                  return {
-                    inlineData: {
-                      data: base64Data,
-                      mimeType,
-                    },
-                  }
-                }
-              })
+                switch (part.type) {
+                  case 'text':
+                    return { text: part.text }
+                  case 'image_url': {
+                    const { mimeType, base64Data } = parseImageDataUrl(
+                      part.image_url.url,
+                    )
+                    GeminiProvider.validateImageType(mimeType)
+
+                    return {
+                      inlineData: {
+                        data: base64Data,
+                        mimeType,
+                      },
+                    }
+                  }
+                })
: [{ text: message.content }]

return {
@@ -324,10 +326,10 @@ export class GeminiProvider extends BaseLLMProvider<
       object: 'chat.completion',
       usage: response.usageMetadata
         ? {
-          prompt_tokens: response.usageMetadata.promptTokenCount ?? 0,
-          completion_tokens: response.usageMetadata.candidatesTokenCount ?? 0,
-          total_tokens: response.usageMetadata.totalTokenCount ?? 0,
-        }
+            prompt_tokens: response.usageMetadata.promptTokenCount ?? 0,
+            completion_tokens: response.usageMetadata.candidatesTokenCount ?? 0,
+            total_tokens: response.usageMetadata.totalTokenCount ?? 0,
+          }
         : undefined,
}
}
Expand Down Expand Up @@ -363,10 +365,10 @@ export class GeminiProvider extends BaseLLMProvider<
       object: 'chat.completion.chunk',
       usage: chunk.usageMetadata
         ? {
-          prompt_tokens: chunk.usageMetadata.promptTokenCount ?? 0,
-          completion_tokens: chunk.usageMetadata.candidatesTokenCount ?? 0,
-          total_tokens: chunk.usageMetadata.totalTokenCount ?? 0,
-        }
+            prompt_tokens: chunk.usageMetadata.promptTokenCount ?? 0,
+            completion_tokens: chunk.usageMetadata.candidatesTokenCount ?? 0,
+            total_tokens: chunk.usageMetadata.totalTokenCount ?? 0,
+          }
         : undefined,
}
}