
Leverage WP AI Client #73

Draft

pwtyler wants to merge 6 commits into mcp-wp:main from pwtyler:fix/composer-and-cleanup

Conversation

@pwtyler commented Jan 13, 2026

Summary

This PR completes the work started in #72 to integrate the WordPress AI Client as the backend for AI functionality. It also fixes several issues that prevented #72 from working correctly.

What's Included from #72

The original Copilot PR implemented:

  • WP AI Client integration: the command requires the WP AI Client (WordPress 7.0+ or the AI plugin)
  • Credentials management: wp ai credentials subcommand with list/set/delete operations
  • Simplified approach: Focused on core text generation functionality

Additional Fixes (beyond #72)

1. Fixed composer install failure

The original PR had dependency conflicts preventing installation:

  • phpcompatibility/php-compatibility (from wp-cli-tests v4.x) required squizlabs/php_codesniffer ^4.0
  • wp-coding-standards/wpcs required squizlabs/php_codesniffer ^3.x

Fix: Upgraded wp-cli-tests to ^5.0 and downgraded phpstan to ^1.12 (as required by the wp-cli ecosystem).

2. Fixed WP-CLI subcommand registration bug

The original PR used __invoke for AiCommand, then tried to register ai credentials as a subcommand. WP-CLI doesn't allow subcommands on callable commands, causing:

Fatal error: 'wp ai' can't have subcommands

Fix: Renamed __invoke to prompt(). The command structure is now as follows (a registration sketch appears after the list):

  • wp ai prompt "<text>" - Run AI prompts
  • wp ai credentials list|set|delete - Manage credentials
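
For illustration, here is a minimal sketch of how that registration might look in the package bootstrap (likely ai-command.php; the exact wiring is an assumption, not a quote from this PR's diff). An invokable class makes wp ai itself a leaf command, while registering a class with a prompt() method leaves wp ai as a command namespace that ai credentials can hang off:

```php
<?php
// Hypothetical registration sketch; the class names come from this PR,
// but the bootstrap code itself is assumed, not copied from the diff.
if ( ! class_exists( 'WP_CLI' ) ) {
	return;
}

// Registering a class whose public methods become subcommands
// yields `wp ai prompt` rather than a bare, invokable `wp ai`.
WP_CLI::add_command( 'ai', \McpWp\AiCommand\AiCommand::class );

// Because `wp ai` is no longer a single callable command,
// a nested command can now be attached beneath it.
WP_CLI::add_command( 'ai credentials', \McpWp\AiCommand\CredentialsCommand::class );
```

Had AiCommand kept __invoke(), the second add_command() call would be what triggers the fatal error quoted above.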

3. Removed MCP code per PR requirements

The PR description stated "MCP tool integration has been temporarily removed" and the original issue said "All the wp mcp commands are not really needed anymore", but the code was still present.

Fix: Removed ~1,800 lines of MCP-related code:

  • Deleted src/MCP/ directory
  • Deleted McpCommand.php, McpServerCommand.php
  • Removed logiscape/mcp-sdk-php and mcp-wp/mcp-server dependencies
  • Removed wpackagist-plugin/ai-services (no longer needed)

4. Fixed PHPStan/PHPCS errors

  • Removed unused code in WpAiClient (tool-related methods, unused parameters)
  • Fixed PHPDoc mismatches
  • Removed unreachable statements after WP_CLI::error() calls (illustrated in the sketch after this list)
  • Removed strictRules config options not available in PHPStan 1.x
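
To illustrate the unreachable-statement case, here is a hedged sketch with made-up names (not the actual diff): WP_CLI::error() halts execution by default, so a statement that follows it in the same branch can never run, and PHPStan reports it.

```php
<?php
// Hypothetical helper; the function name, parameter, and message are illustrative only.
function require_api_key( ?string $api_key ): void {
	if ( null === $api_key || '' === $api_key ) {
		WP_CLI::error( 'No API key configured.' );
		return; // Unreachable: WP_CLI::error() exits unless its second argument ($exit) is false.
	}
	// ... continue with the configured key.
}
```

The fix is simply to drop the trailing return (or pass false as the second argument to WP_CLI::error() when execution really should continue).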

Usage

```bash
# Configure credentials
wp ai credentials set openai sk-proj-...
wp ai credentials set anthropic sk-ant-...
wp ai credentials list

# Run prompts
wp ai prompt "Explain WordPress in one sentence"
wp ai prompt "Summarize WordPress history" --model=gpt-4o --service=openai

# Delete credentials
wp ai credentials delete openai
```

Testing

All tests pass:

  • composer lint
  • composer phpcs
  • composer phpstan
  • composer phpunit
  • composer behat

Copilot AI and others added 6 commits December 7, 2025 14:55
Co-authored-by: swissspidy <841956+swissspidy@users.noreply.github.com>
- Fix dependency conflicts by upgrading wp-cli-tests to ^5.0 and
  downgrading phpstan to ^1.12 (required by wp-cli ecosystem)
- Remove all MCP-related code per PR mcp-wp#72 requirements:
  - Delete src/MCP/ directory and related commands
  - Remove logiscape/mcp-sdk-php and mcp-wp/mcp-server dependencies
  - Remove wpackagist-plugin/ai-services (no longer needed)
- Clean up WpAiClient: remove unused tool-related code
- Rename AiCommand.__invoke() to prompt() to allow subcommands
- Update Behat tests to match new behavior
- All tests pass (lint, phpcs, phpstan, phpunit, behat)
@gemini-code-assist

Summary of Changes

Hello @pwtyler, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request finalizes the integration of the WordPress AI Client, establishing it as the sole backend for AI capabilities within the project. It addresses several critical technical issues, including dependency conflicts and WP-CLI command registration problems, which had previously hindered development. A significant architectural change involves the complete removal of the Model Context Protocol (MCP) and its associated codebase, streamlining the project's design. Additionally, it introduces robust credential management features for AI providers and enhances overall code quality.

Highlights

  • WordPress AI Client Integration: The project now fully utilizes the official WordPress AI Client for all AI functionality, replacing the previous AI Services plugin and Model Context Protocol (MCP) integration.
  • Dependency Resolution: Critical composer install failures due to conflicting php-codesniffer requirements have been resolved by upgrading wp-cli-tests to ^5.0 and downgrading phpstan to ^1.12.
  • WP-CLI Command Structure Refinement: The wp ai command now correctly supports subcommands by renaming the main AI prompt method from __invoke to prompt, allowing for wp ai prompt and wp ai credentials commands.
  • Complete MCP Removal: All Model Context Protocol (MCP) related code, dependencies, and commands (wp mcp, wp mcp server) have been entirely removed from the project, simplifying the architecture.
  • AI Credentials Management: A new wp ai credentials subcommand has been introduced, enabling users to easily list, set, and delete API keys for various AI providers directly via the CLI.
  • Code Quality Improvements: Various PHPStan and PHPCS errors have been addressed, including removal of unused code, PHPDoc mismatches, unreachable statements, and outdated PHPStan configurations.


@codecov bot commented Jan 13, 2026

Codecov Report

❌ Patch coverage is 66.66667% with 25 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/AI/WpAiClient.php | 35.48% | 20 Missing ⚠️ |
| ai-command.php | 0.00% | 2 Missing ⚠️ |
| src/CredentialsCommand.php | 94.44% | 2 Missing ⚠️ |
| src/AiCommand.php | 83.33% | 1 Missing ⚠️ |


@swissspidy (Collaborator) commented

Heya, thanks a bunch! I would love your input on https://github.com/wp-cli/ai-command where I started from scratch without any MCP or REPL logic in there whatsoever, just a light wrapper around the AI client library.

@gemini-code-assist bot left a comment

Code Review

This pull request is a major refactoring to integrate the WordPress AI Client, removing the previous MCP-based implementation. The changes are well-structured and significantly simplify the codebase. I've found a critical issue in the new conversation handling logic that prevents it from maintaining context between prompts. I've also noted some minor inconsistencies in documentation and error messages. Overall, great work on this simplification!

Comment on lines +24 to +97
```php
/**
* Calls AI service with a prompt.
*
* @param string $prompt The prompt to send.
*/
public function call_ai_service_with_prompt( string $prompt ): void {
try {
// Initialize WP AI Client if not already done.
if ( ! did_action( 'init' ) ) {
// phpcs:ignore WordPress.NamingConventions.PrefixAllGlobals.NonPrefixedHooknameFound -- This is a core WordPress hook.
do_action( 'init' );
}

\WordPress\AI_Client\AI_Client::init();

// Create a prompt builder.
$prompt_builder = \WordPress\AI_Client\AI_Client::prompt( $prompt );

// Apply model preference if specified.
if ( $this->service && $this->model ) {
$prompt_builder = $prompt_builder->using_model_preference( [ $this->service, $this->model ] );
} elseif ( $this->model ) {
// If only model is specified without a service, try common providers.
// This provides a reasonable fallback that works with most configurations.
// The WP AI Client will automatically use the first available provider
// that has the specified model and is properly configured.
$prompt_builder = $prompt_builder->using_model_preference(
[ 'anthropic', $this->model ],
[ 'openai', $this->model ],
[ 'google', $this->model ]
);
}

// Generate text response.
$text = $prompt_builder->generate_text();

// Output the response.
WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );

// Keep the session open for follow-up questions.
$this->continue_conversation();

} catch ( Exception $e ) {
WP_CLI::error( $e->getMessage() );
}
}

/**
* Continues the conversation with follow-up prompts.
*/
private function continue_conversation(): void {
$user_response = prompt( '', false, '' );

if ( empty( $user_response ) ) {
return;
}

try {
$prompt_builder = \WordPress\AI_Client\AI_Client::prompt( $user_response );

if ( $this->service && $this->model ) {
$prompt_builder = $prompt_builder->using_model_preference( [ $this->service, $this->model ] );
}

$text = $prompt_builder->generate_text();

WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );

$this->continue_conversation();
} catch ( Exception $e ) {
WP_CLI::error( $e->getMessage() );
}
}
}
```


critical

The current implementation of the conversation flow has a critical bug: it doesn't maintain the conversation context between turns. A new conversation is started with every follow-up prompt because \WordPress\AI_Client\AI_Client::prompt() is called in a loop, which creates a new Prompt object each time.

Additionally, the continue_conversation method uses recursion, which could lead to a stack overflow on long conversations. An iterative approach using a while loop would be more robust.

I suggest refactoring this to correctly maintain the conversation state by reusing the same Prompt object and calling its generate_text() method with the new user input. This also resolves an issue where model preferences are not applied to follow-up prompts.

```php
	/**
	 * Calls AI service with a prompt and enters a conversation loop.
	 *
	 * @param string $prompt The initial prompt to send.
	 */
	public function call_ai_service_with_prompt( string $prompt ): void {
		try {
			// Initialize WP AI Client if not already done.
			if ( ! did_action( 'init' ) ) {
				// phpcs:ignore WordPress.NamingConventions.PrefixAllGlobals.NonPrefixedHooknameFound -- This is a core WordPress hook.
				do_action( 'init' );
			}

			\WordPress\AI_Client\AI_Client::init();

			// Create a prompt builder for the conversation.
			$prompt_builder = \WordPress\AI_Client\AI_Client::prompt( $prompt );

			// Apply model preference if specified.
			if ( $this->service && $this->model ) {
				$prompt_builder->using_model_preference( [ $this->service, $this->model ] );
			} elseif ( $this->model ) {
				// If only model is specified without a service, try common providers.
				$prompt_builder->using_model_preference(
					[ 'anthropic', $this->model ],
					[ 'openai', $this->model ],
					[ 'google', $this->model ]
				);
			}

			// Generate the first response.
			$text = $prompt_builder->generate_text();
			WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );

			// Continue the conversation in a loop.
			$this->continue_conversation( $prompt_builder );

		} catch ( Exception $e ) {
			WP_CLI::error( $e->getMessage() );
		}
	}

	/**
	 * Continues the conversation with follow-up prompts.
	 *
	 * @param \WordPress\AI_Client\Prompt $prompt_builder The prompt object for the conversation.
	 */
	private function continue_conversation( \WordPress\AI_Client\Prompt $prompt_builder ): void {
		while ( true ) {
			$user_response = prompt( '', false, '' );

			if ( empty( $user_response ) ) {
				return;
			}

			try {
				// By calling generate_text() on the same prompt_builder object, we maintain the conversation history.
				$text = $prompt_builder->generate_text( $user_response );
				WP_CLI::line( WP_CLI::colorize( "%G$text%n" ) );
			} catch ( Exception $e ) {
				WP_CLI::error( $e->getMessage() );
			}
		}
	}
}
```

Comment on lines +1 to +71
# WP-CLI AI Command

A WP-CLI package that enables AI interactions with WordPress via the Model Context Protocol (MCP).

## Project Overview

- **Type**: WP-CLI package
- **PHP Version**: 8.2+
- **Namespace**: `McpWp\AiCommand`
- **License**: Apache-2.0

## Architecture

```
src/
├── AI/ # AI client implementations (WpAiClient, AiClient)
├── MCP/ # MCP protocol implementation
│ ├── Servers/ # MCP servers (WP_CLI tools)
│ └── Client.php # MCP client
├── Utils/ # Utilities (logging, config)
├── AiCommand.php # Main `wp ai` command
├── CredentialsCommand.php # `wp ai credentials` subcommand
├── McpCommand.php # `wp mcp` command
└── McpServerCommand.php # `wp mcp server` subcommand
```

## Development Commands

```bash
# Run all tests
composer test

# Individual test suites
composer phpunit # PHPUnit tests
composer behat # Behat integration tests
composer phpcs # Code style checks
composer phpstan # Static analysis
composer lint # Linter

# Fix code style
composer format # or: composer phpcbf

# Prepare test environment
composer prepare-tests
```

## Code Style

- Uses WP_CLI_CS ruleset (WordPress coding standards)
- Global namespace prefix: `McpWp\AiCommand` (classes) or `ai_command` (functions/variables)
- Run `composer format` to auto-fix style issues

## Key Dependencies

- `logiscape/mcp-sdk-php`: MCP SDK for PHP
- `mcp-wp/mcp-server`: MCP server implementation
- `wp-cli/wp-cli`: WP-CLI framework
- `wordpress/wp-ai-client`: WordPress AI client (dev dependency for testing)

## WP-CLI Commands

- `wp ai` - Main AI interaction command
- `wp ai credentials list|set|delete` - Manage AI provider API keys
- `wp mcp prompt` - MCP prompt handling
- `wp mcp server add|list|remove|update` - Manage MCP servers

## Testing Notes

- PHPUnit tests are in `tests/phpunit/`
- Behat feature tests are in `features/`
- PHPStan config: `phpstan.neon.dist`


medium

This documentation file seems to be outdated and doesn't reflect the significant changes made in this pull request. Several sections are now incorrect:

  • Line 3: The project no longer uses the Model Context Protocol (MCP). It now uses the WP AI Client.
  • Lines 15-25 (Architecture): The diagram includes the MCP/ directory, McpCommand.php, and McpServerCommand.php, all of which have been removed.
  • Lines 55-56 (Key Dependencies): It lists MCP-related dependencies that are no longer in the project.
  • Lines 64-65 (WP-CLI Commands): It lists wp mcp commands that have been removed.

Please update this file to accurately describe the current state of the project, focusing on the integration with WP AI Client.

Comment on lines +137 to +141
```php
private function ensure_wp_ai_client_available(): void {
if ( ! class_exists( '\WordPress\AI_Client\AI_Client' ) ) {
WP_CLI::error( 'The WP AI Client is not available. Please ensure the WP AI plugin is installed and activated.' );
}
}
```


medium

The error message here is slightly different and less informative than the one in AiCommand.php. For consistency and to provide better guidance to the user, I recommend using the same error message in both places. The message in AiCommand.php is more helpful as it mentions that WP AI Client is also available in WordPress 7.0+.

```php
	private function ensure_wp_ai_client_available(): void {
		if ( ! class_exists( '\WordPress\AI_Client\AI_Client' ) ) {
			WP_CLI::error( 'This command requires the WP AI Client. Please ensure WordPress 7.0+ or the AI plugin is installed and activated.' );
		}
	}
```

@pwtyler (Author) commented Jan 13, 2026

> Heya, thanks a bunch! I would love your input on https://github.com/wp-cli/ai-command where I started from scratch without any MCP or REPL logic in there whatsoever, just a light wrapper around the AI client library.

Ah nice, I finally got around to poking at this project; I'd been meaning to basically since you shared it. I was trying to get a PoC up and running, and will take a look over there too/instead.
