
15 Model response is weird when using azure open ai models #16

Merged
Louis-7 merged 2 commits into main from
15-modal-response-is-weird-when-using-azure-open-ai-modals
Jan 29, 2026

Conversation


Louis-7 (Contributor) commented Jan 29, 2026

Pull Request Template

Description

  • Fix the issue where the selected model's ID is not unique on the chat popup page.
  • Send the initial prompt with the user role instead of the system role when using Azure OpenAI models (see the sketch below).
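
A minimal sketch of the second point, assuming the provider is identified by the string azure-openai and that messages follow the usual role/content shape; the names here are illustrative, not the exact code in TextSelectionService.

```typescript
// Sketch only: the 'azure-openai' provider ID string and the message shape are
// assumptions drawn from this PR's summary, not the exact repository code.
type ChatRole = 'system' | 'user' | 'assistant';

interface ChatMessage {
  role: ChatRole;
  content: string;
}

// Some Azure OpenAI deployments respond oddly when the conversation is seeded
// with a system message, so the initial prompt is sent as a user message for
// that provider and as a system message for everyone else.
function buildInitialMessage(initPrompt: string, providerId?: string): ChatMessage {
  const role: ChatRole = providerId === 'azure-openai' ? 'user' : 'system';
  return { role, content: initPrompt };
}
```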

Related Issue

Closes #15

Type of Change

  • Bug fix
  • New feature
  • Documentation update
  • Refactor
  • Other

Checklist

  • My code follows the style guidelines of this project
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes


Copilot AI left a comment


Pull request overview

This PR makes chat model selection provider-aware and improves how Azure OpenAI prompts are initialized, fixing ambiguous model IDs and odd Azure responses.

Changes:

  • Extend the chat settings schema with a defaultProvider field, update defaults, and normalize defaultProvider/defaultModel on load in SettingsService.
  • Update AIService to resolve the active provider/model using both defaultProvider and defaultModel, with sensible fallbacks and clearer error handling (a resolution sketch follows this list).
  • Fix the ChatPopup model selector to use unique per-provider model keys and persist both provider and model; plumb defaultProvider through main.ts into TextSelectionService so Azure OpenAI gets an appropriate initial message role.
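
A rough sketch of the resolution order described above, under the assumption that each provider exposes an id and a list of model IDs; the type names and error text are placeholders rather than the actual AIService code.

```typescript
// Sketch only: the type names, field names, and error text below are
// placeholders based on the PR summary, not the actual AIService implementation.
interface ProviderConfig {
  id: string;
  models: string[];
}

interface ChatSettings {
  defaultProvider: string | null;
  defaultModel: string | null;
}

// Resolution order: the exact provider + model pair, then any provider that
// offers the default model, then the first configured provider and its first
// model; a single consolidated error is thrown only when nothing resolves.
function resolveActiveModel(
  providers: ProviderConfig[],
  settings: ChatSettings
): { providerId: string; modelId: string } {
  const { defaultProvider, defaultModel } = settings;

  const exact = providers.find((p) => p.id === defaultProvider);
  if (exact && defaultModel && exact.models.includes(defaultModel)) {
    return { providerId: exact.id, modelId: defaultModel };
  }

  const byModel = defaultModel
    ? providers.find((p) => p.models.includes(defaultModel))
    : undefined;
  if (byModel && defaultModel) {
    return { providerId: byModel.id, modelId: defaultModel };
  }

  const first = providers.find((p) => p.models.length > 0);
  if (first) {
    return { providerId: first.id, modelId: first.models[0] };
  }

  throw new Error('No chat provider/model is configured in settings.');
}
```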

Reviewed changes

Copilot reviewed 6 out of 7 changed files in this pull request and generated no comments.

Summary per file:

  • src/types/setting.d.ts: Adds chat.defaultProvider to the settings type so the provider choice is tracked alongside the default model.
  • settings.default.json: Adds defaultProvider to the default chat config and initializes it to null for backwards compatibility.
  • electron/SettingsService.ts: Normalizes settings on startup so that chat.defaultProvider/defaultModel always point to a valid provider/model pair, falling back to the first available model when necessary.
  • src/services/AIService.ts: Uses defaultProvider + defaultModel to select the active provider/model, with fallback resolution by model ID and finally the first configured provider/model, and throws a single consolidated error when resolution fails.
  • src/pages/ChatPopup/ChatPopup.tsx: Constructs unique model selection keys from providerId::modelId, computes a consistent selected key from defaultProvider/defaultModel, and updates both values on selection change (see the key-handling sketch below).
  • main.ts: Passes the current chat.defaultProvider into TextSelectionService.handleTextSelection for both hotkey execution and manual text selection triggers.
  • electron/TextSelectionService.ts: Accepts an optional providerId and sets the initial prompt message role to user specifically for the azure-openai provider, otherwise keeping it as system.
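
A short sketch of the providerId::modelId key scheme mentioned for ChatPopup.tsx; only the key format comes from this PR's summary, and the helper names are hypothetical.

```typescript
// Sketch only: helper names are hypothetical; only the "providerId::modelId"
// key format comes from this PR's summary.
function toModelKey(providerId: string, modelId: string): string {
  return `${providerId}::${modelId}`;
}

function fromModelKey(key: string): { providerId: string; modelId: string } {
  // Split on the first '::'; keys are always built by toModelKey, so the
  // separator is expected to be present.
  const separator = key.indexOf('::');
  if (separator === -1) {
    throw new Error(`Malformed model key: ${key}`);
  }
  return {
    providerId: key.slice(0, separator),
    modelId: key.slice(separator + 2),
  };
}

// On selection change, persist both halves so two providers exposing a model
// with the same name (e.g. "gpt-4o") no longer collide in the dropdown.
function onModelSelected(
  key: string,
  save: (update: { defaultProvider: string; defaultModel: string }) => void
): void {
  const { providerId, modelId } = fromModelKey(key);
  save({ defaultProvider: providerId, defaultModel: modelId });
}
```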

Louis-7 merged commit 2b5b199 into main Jan 29, 2026
11 checks passed
Louis-7 deleted the 15-modal-response-is-weird-when-using-azure-open-ai-modals branch January 29, 2026 06:16

Development

Successfully merging this pull request may close these issues.

Model response is weird when using azure open ai models.
