
Conversation

@omChauhanDev (Contributor) commented Feb 3, 2026


Fixes #3549

When LLMSetToolsFrame was processed, the code called _update_settings() without its required argument, causing a TypeError. Since Gemini Live does not currently support mid-session tool updates, this PR replaces the broken call with a warning log.

Before: (screenshot)

After: (screenshot)
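
For reference, the change amounts to something like the sketch below. This is illustrative only: the surrounding class, the exact warning text, and the import paths are assumptions, not copied from the actual diff.

```python
from loguru import logger  # pipecat logs via loguru

from pipecat.frames.frames import LLMSetToolsFrame


class GeminiLiveLLMServiceSketch:
    """Illustrative stand-in for the real GeminiLiveLLMService."""

    async def process_frame(self, frame, direction):
        if isinstance(frame, LLMSetToolsFrame):
            # Previously this branch called self._update_settings() with no
            # arguments, which raised a TypeError. Gemini Live cannot apply
            # tool changes mid-session, so the fix logs a warning instead.
            logger.warning(
                "Gemini Live does not support updating tools mid-session; "
                "ignoring LLMSetToolsFrame."
            )
            return
        # Other frames would continue through the service's normal handling.
```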

codecov bot commented Feb 3, 2026

Codecov Report

❌ Patch coverage is 0% with 1 line in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/pipecat/services/google/gemini_live/llm.py | 0.00% | 1 Missing ⚠️ |

| Files with missing lines | Coverage Δ |
|---|---|
| src/pipecat/services/google/gemini_live/llm.py | 19.72% <0.00%> (ø) |

```python
if frame.tools is not None:
    self._tools_from_init = frame.tools
    if self._session:
        await self._reconnect()
```
A pipecat contributor commented:

We might not want to do this for now; reconnects are bad and slow. I believe we will wait until Gemini supports it. In the meantime we probably want to remove the _update_settings call and add a warning instead, saying that this is not supported in Gemini Live.

@kompfner has been looking into this and can comment.

Another commenter replied:

What I did as a workaround was to monkey-patch the Gemini class locally:

```python
import asyncio

from loguru import logger

# Import paths below are a best guess and may need adjusting for your
# pipecat version.
from pipecat.frames.frames import Frame, LLMSetToolsFrame
from pipecat.processors.frame_processor import FrameDirection
from pipecat.services.google.gemini_live.llm import (
    GeminiLiveVertexLLMService as BaseGeminiLiveVertexLLMService,
)


class GeminiLiveVertexLLMService(BaseGeminiLiveVertexLLMService):
    """Patched Gemini Live service that fixes bugs in the base implementation.

    Fixes:
    1. _update_settings bug: https://github.com/pipecat-ai/pipecat/blob/main/src/pipecat/services/google/gemini_live/llm.py#L953
       The base class calls _update_settings() without the required 'settings' argument.
    2. Transcription timeout task not being properly awaited when cancelled.
    """

    async def process_frame(self, frame: Frame, direction: FrameDirection):
        """Process frames with a fix for LLMSetToolsFrame handling."""
        if isinstance(frame, LLMSetToolsFrame):
            # Pass the current settings explicitly, avoiding the TypeError
            # raised by the base class's argument-less call, and skip the
            # parent's broken LLMSetToolsFrame handling.
            await self._update_settings(self._settings)
            return

        # All other frames go through normal processing.
        await super().process_frame(frame, direction)

    async def cancel_task(self, task, timeout: float = 1.0):
        """Override to properly await cancelled transcription timeout tasks."""
        if task and not task.done():
            task.cancel()
            try:
                # Properly await the cancelled task to avoid a RuntimeWarning.
                await asyncio.wait_for(task, timeout=timeout)
            except (asyncio.CancelledError, asyncio.TimeoutError):
                # Expected when the task is cancelled.
                pass
            except Exception as e:
                logger.debug(f"Error cancelling task: {e}")
```

Another contributor commented:

I have an open PR that adds the following (per the changelog entry):

- Added support to Gemini Live (`GeminiLiveLLMService`) for programmatically swapping tools or editing context at runtime; now you can use `LLMMessagesAppendFrame`, `LLMMessagesUpdateFrame`, `LLMMessagesTransformFrame`, and `LLMSetToolsFrame` with Gemini Live, like you would with text-to-text services. Note that this new functionality only works if you're using `LLMContext` and `LLMContextAggregatorPair` rather than the deprecated `OpenAILLMContext` and associated aggregators.

So we probably won't need this PR when that one lands.

Good catch, though, that the LLMSetToolsFrame handling in Gemini Live was never quite working as intended 🤦.

Do note, however, that—as @aconchillo pointed out—a reconnect is necessary to make tool or conversation history changes at runtime. It's not too slow, but it might be noticeable (though not always).
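
For context, once that PR lands, swapping tools at runtime would presumably look something like the sketch below. The task setup, the `new_tools` placeholder, and the exact schema type expected by `tools` are assumptions based on the changelog wording, not confirmed API details.

```python
from pipecat.frames.frames import LLMSetToolsFrame

# Assuming `task` is a running PipelineTask whose pipeline was built with
# LLMContext / LLMContextAggregatorPair (not the deprecated OpenAILLMContext),
# and `new_tools` holds whatever tool/function schemas the app wants to switch to.
await task.queue_frame(LLMSetToolsFrame(tools=new_tools))
```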

@omChauhanDev (PR author) replied:

Makes sense, I will close this one then.

@omChauhanDev force-pushed the fix/gemini-live-set-tools branch from 2c40767 to cd4e1f2 on February 9, 2026.
Linked issue: GeminiLiveVertexLLMService Vertex AI issue with update_settings