Conversation
Co-authored-by: swissspidy <841956+swissspidy@users.noreply.github.com>
- Fix dependency conflicts by upgrading wp-cli-tests to ^5.0 and downgrading phpstan to ^1.12 (required by wp-cli ecosystem)
- Remove all MCP-related code per PR mcp-wp#72 requirements:
  - Delete src/MCP/ directory and related commands
  - Remove logiscape/mcp-sdk-php and mcp-wp/mcp-server dependencies
  - Remove wpackagist-plugin/ai-services (no longer needed)
- Clean up WpAiClient: remove unused tool-related code
- Rename AiCommand.__invoke() to prompt() to allow subcommands
- Update Behat tests to match new behavior
- All tests pass (lint, phpcs, phpstan, phpunit, behat)
Summary of Changes
Hello @pwtyler, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request finalizes the integration of the WordPress AI Client, establishing it as the sole backend for AI capabilities within the project. It addresses several critical technical issues, including dependency conflicts and WP-CLI command registration problems, which had previously hindered development. A significant architectural change involves the complete removal of the Model Context Protocol (MCP) and its associated codebase, streamlining the project's design. Additionally, it introduces robust credential management features for AI providers and enhances overall code quality.
Heya, thanks a bunch! I would love your input on https://github.com/wp-cli/ai-command where I started from scratch without any MCP or REPL logic in there whatsoever, just a light wrapper around the AI client library.
Code Review
This pull request is a major refactoring to integrate the WordPress AI Client, removing the previous MCP-based implementation. The changes are well-structured and significantly simplify the codebase. I've found a critical issue in the new conversation handling logic that prevents it from maintaining context between prompts. I've also noted some minor inconsistencies in documentation and error messages. Overall, great work on this simplification!
/**
 * Calls AI service with a prompt.
 *
 * @param string $prompt The prompt to send.
 */
public function call_ai_service_with_prompt( string $prompt ): void {
	try {
		// Initialize WP AI Client if not already done.
		if ( ! did_action( 'init' ) ) {
			// phpcs:ignore WordPress.NamingConventions.PrefixAllGlobals.NonPrefixedHooknameFound -- This is a core WordPress hook.
			do_action( 'init' );
		}

		\WordPress\AI_Client\AI_Client::init();

		// Create a prompt builder.
		$prompt_builder = \WordPress\AI_Client\AI_Client::prompt( $prompt );

		// Apply model preference if specified.
		if ( $this->service && $this->model ) {
			$prompt_builder = $prompt_builder->using_model_preference( [ $this->service, $this->model ] );
		} elseif ( $this->model ) {
			// If only model is specified without a service, try common providers.
			// This provides a reasonable fallback that works with most configurations.
			// The WP AI Client will automatically use the first available provider
			// that has the specified model and is properly configured.
			$prompt_builder = $prompt_builder->using_model_preference(
				[ 'anthropic', $this->model ],
				[ 'openai', $this->model ],
				[ 'google', $this->model ]
			);
		}

		// Generate text response.
		$text = $prompt_builder->generate_text();

		// Output the response.
		WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );

		// Keep the session open for follow-up questions.
		$this->continue_conversation();
	} catch ( Exception $e ) {
		WP_CLI::error( $e->getMessage() );
	}
}

/**
 * Continues the conversation with follow-up prompts.
 */
private function continue_conversation(): void {
	$user_response = prompt( '', false, '' );

	if ( empty( $user_response ) ) {
		return;
	}

	try {
		$prompt_builder = \WordPress\AI_Client\AI_Client::prompt( $user_response );

		if ( $this->service && $this->model ) {
			$prompt_builder = $prompt_builder->using_model_preference( [ $this->service, $this->model ] );
		}

		$text = $prompt_builder->generate_text();

		WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );

		$this->continue_conversation();
	} catch ( Exception $e ) {
		WP_CLI::error( $e->getMessage() );
	}
}
}
The current implementation of the conversation flow has a critical bug: it doesn't maintain the conversation context between turns. A new conversation is started with every follow-up prompt because \WordPress\AI_Client\AI_Client::prompt() is called again for each one, which creates a new Prompt object each time.
Additionally, the continue_conversation method uses recursion, which could lead to a stack overflow on long conversations. An iterative approach using a while loop would be more robust.
I suggest refactoring this to correctly maintain the conversation state by reusing the same Prompt object and calling its generate_text() method with the new user input. This also resolves an issue where model preferences are not applied to follow-up prompts.
/**
 * Calls AI service with a prompt and enters a conversation loop.
 *
 * @param string $prompt The initial prompt to send.
 */
public function call_ai_service_with_prompt( string $prompt ): void {
	try {
		// Initialize WP AI Client if not already done.
		if ( ! did_action( 'init' ) ) {
			// phpcs:ignore WordPress.NamingConventions.PrefixAllGlobals.NonPrefixedHooknameFound -- This is a core WordPress hook.
			do_action( 'init' );
		}

		\WordPress\AI_Client\AI_Client::init();

		// Create a prompt builder for the conversation.
		$prompt_builder = \WordPress\AI_Client\AI_Client::prompt( $prompt );

		// Apply model preference if specified.
		if ( $this->service && $this->model ) {
			$prompt_builder->using_model_preference( [ $this->service, $this->model ] );
		} elseif ( $this->model ) {
			// If only model is specified without a service, try common providers.
			$prompt_builder->using_model_preference(
				[ 'anthropic', $this->model ],
				[ 'openai', $this->model ],
				[ 'google', $this->model ]
			);
		}

		// Generate the first response.
		$text = $prompt_builder->generate_text();
		WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );

		// Continue the conversation in a loop.
		$this->continue_conversation( $prompt_builder );
	} catch ( Exception $e ) {
		WP_CLI::error( $e->getMessage() );
	}
}

/**
 * Continues the conversation with follow-up prompts.
 *
 * @param \WordPress\AI_Client\Prompt $prompt_builder The prompt object for the conversation.
 */
private function continue_conversation( \WordPress\AI_Client\Prompt $prompt_builder ): void {
	while ( true ) {
		$user_response = prompt( '', false, '' );

		if ( empty( $user_response ) ) {
			return;
		}

		try {
			// By calling generate_text() on the same prompt_builder object, we maintain the conversation history.
			$text = $prompt_builder->generate_text( $user_response );
			WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );
		} catch ( Exception $e ) {
			WP_CLI::error( $e->getMessage() );
		}
	}
}
}

# WP-CLI AI Command

A WP-CLI package that enables AI interactions with WordPress via the Model Context Protocol (MCP).

## Project Overview

- **Type**: WP-CLI package
- **PHP Version**: 8.2+
- **Namespace**: `McpWp\AiCommand`
- **License**: Apache-2.0

## Architecture

```
src/
├── AI/                     # AI client implementations (WpAiClient, AiClient)
├── MCP/                    # MCP protocol implementation
│   ├── Servers/            # MCP servers (WP_CLI tools)
│   └── Client.php          # MCP client
├── Utils/                  # Utilities (logging, config)
├── AiCommand.php           # Main `wp ai` command
├── CredentialsCommand.php  # `wp ai credentials` subcommand
├── McpCommand.php          # `wp mcp` command
└── McpServerCommand.php    # `wp mcp server` subcommand
```

## Development Commands

```bash
# Run all tests
composer test

# Individual test suites
composer phpunit  # PHPUnit tests
composer behat    # Behat integration tests
composer phpcs    # Code style checks
composer phpstan  # Static analysis
composer lint     # Linter

# Fix code style
composer format   # or: composer phpcbf

# Prepare test environment
composer prepare-tests
```

## Code Style

- Uses WP_CLI_CS ruleset (WordPress coding standards)
- Global namespace prefix: `McpWp\AiCommand` (classes) or `ai_command` (functions/variables)
- Run `composer format` to auto-fix style issues

## Key Dependencies

- `logiscape/mcp-sdk-php`: MCP SDK for PHP
- `mcp-wp/mcp-server`: MCP server implementation
- `wp-cli/wp-cli`: WP-CLI framework
- `wordpress/wp-ai-client`: WordPress AI client (dev dependency for testing)

## WP-CLI Commands

- `wp ai` - Main AI interaction command
- `wp ai credentials list|set|delete` - Manage AI provider API keys
- `wp mcp prompt` - MCP prompt handling
- `wp mcp server add|list|remove|update` - Manage MCP servers

## Testing Notes

- PHPUnit tests are in `tests/phpunit/`
- Behat feature tests are in `features/`
- PHPStan config: `phpstan.neon.dist`
This documentation file seems to be outdated and doesn't reflect the significant changes made in this pull request. Several sections are now incorrect:
- Line 3: The project no longer uses the Model Context Protocol (MCP). It now uses the WP AI Client.
- Lines 15-25 (Architecture): The diagram includes the `MCP/` directory, `McpCommand.php`, and `McpServerCommand.php`, all of which have been removed.
- Lines 55-56 (Key Dependencies): It lists MCP-related dependencies that are no longer in the project.
- Lines 64-65 (WP-CLI Commands): It lists `wp mcp` commands that have been removed.
Please update this file to accurately describe the current state of the project, focusing on the integration with WP AI Client.
private function ensure_wp_ai_client_available(): void {
	if ( ! class_exists( '\WordPress\AI_Client\AI_Client' ) ) {
		WP_CLI::error( 'The WP AI Client is not available. Please ensure the WP AI plugin is installed and activated.' );
	}
}
The error message here is slightly different and less informative than the one in AiCommand.php. For consistency and to provide better guidance to the user, I recommend using the same error message in both places. The message in AiCommand.php is more helpful as it mentions that WP AI Client is also available in WordPress 7.0+.
private function ensure_wp_ai_client_available(): void {
	if ( ! class_exists( '\WordPress\AI_Client\AI_Client' ) ) {
		WP_CLI::error( 'This command requires the WP AI Client. Please ensure WordPress 7.0+ or the AI plugin is installed and activated.' );
	}
}
Ah nice, I finally got around to poking at this project; I've been meaning to basically since you shared it. I was trying to get a PoC up and running, so I'll take a look over there too/instead.
Summary
This PR completes the work started in #72 to integrate the WordPress AI Client as the backend for AI functionality. It also fixes several issues that prevented #72 from working correctly.
What's Included from #72
The original Copilot PR implemented:
- `wp ai credentials` subcommand with list/set/delete operations

Additional Fixes (beyond #72)

1. Fixed `composer install` failure

The original PR had dependency conflicts preventing installation:

- `phpcompatibility/php-compatibility` (from wp-cli-tests v4.x) required `squizlabs/php_codesniffer ^4.0`
- `wp-coding-standards/wpcs` required `squizlabs/php_codesniffer ^3.x`

Fix: Upgraded `wp-cli-tests` to `^5.0` and downgraded `phpstan` to `^1.12` (required by the wp-cli ecosystem).

2. Fixed WP-CLI subcommand registration bug

The original PR used `__invoke` for `AiCommand`, then tried to register `ai credentials` as a subcommand. WP-CLI doesn't allow subcommands on callable commands, so registration failed with an error.

Fix: Renamed `__invoke` to `prompt()`, so the command structure is now:

- `wp ai prompt "<text>"` - Run AI prompts
- `wp ai credentials list|set|delete` - Manage credentials
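For reference, here is a minimal sketch of what the class-based structure looks like once registered with WP-CLI. The `WP_CLI::add_command()` call is standard WP-CLI API; the method body, docblock, and the `CredentialsCommand` reference are illustrative assumptions, not the PR's actual code.

```php
<?php

namespace McpWp\AiCommand;

use WP_CLI;

class AiCommand {

	/**
	 * Sends a prompt to the configured AI service.
	 *
	 * ## OPTIONS
	 *
	 * <prompt>
	 * : The prompt text to send.
	 */
	public function prompt( array $args, array $assoc_args ): void {
		// The logic that previously lived in __invoke() moves here; using a
		// named method leaves the `ai` namespace free for subcommands.
		list( $prompt ) = $args;
		WP_CLI::line( "Prompt received: {$prompt}" );
	}
}

// Registering the class (rather than an invokable callable) allows
// `wp ai credentials ...` to be registered alongside `wp ai prompt ...`.
if ( class_exists( 'WP_CLI' ) ) {
	WP_CLI::add_command( 'ai', AiCommand::class );
	// Hypothetical: WP_CLI::add_command( 'ai credentials', CredentialsCommand::class );
}
```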
3. Removed MCP code per PR requirements

The PR description stated "MCP tool integration has been temporarily removed" and the original issue said "All the `wp mcp` commands are not really needed anymore", but the code was still present.

Fix: Removed ~1,800 lines of MCP-related code:

- `src/MCP/` directory
- `McpCommand.php`, `McpServerCommand.php`
- `logiscape/mcp-sdk-php` and `mcp-wp/mcp-server` dependencies
- `wpackagist-plugin/ai-services` (no longer needed)

4. Fixed PHPStan/PHPCS errors

- `WpAiClient` (tool-related methods, unused parameters)
- `WP_CLI::error()` calls
- `strictRules` config options not available in PHPStan 1.x

Usage

With the renamed entry point, prompts are run via `wp ai prompt "<text>"` and provider API keys are managed via `wp ai credentials list|set|delete`.
Testing
All tests pass:
- `composer lint` ✅
- `composer phpcs` ✅
- `composer phpstan` ✅
- `composer phpunit` ✅
- `composer behat` ✅