
Commit 4bb5785

feat: interactive onboard wizard — pick provider and enter API key in one step
Previously, `v1claw onboard` generated a config with a hardcoded Chinese model (glm-4.7) and told users to manually edit JSON. Now it interactively asks which provider to use (Gemini recommended as free), prompts for the API key, and sets the correct model automatically. The config is ready to use immediately — no manual editing needed.

Changes:
- Add interactive provider selection + API key prompt to the onboard command
- Change the default model from `glm-4.7` to empty (set during onboard)
- Update config.example.json to show gemini-2.0-flash
- Simplify README setup steps across all 4 platforms (remove the manual JSON editing step)
- Update tests to reflect the new empty default model

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
1 parent 529dce9 commit 4bb5785

6 files changed: +120 −115 lines

README.md

Lines changed: 20 additions & 100 deletions
@@ -197,91 +197,31 @@ This compiles V1Claw into a single file. It takes 2-5 minutes on a phone. When i
 ./build/v1claw-linux-arm64 onboard
 ```
 
-This creates your config file at `~/.v1claw/config.json`.
+The setup wizard will ask you to:
+1. **Pick your AI provider** (Gemini is free and recommended)
+2. **Paste your API key**
 
-#### Step 7: Add your API key
+That's it — your config is ready at `~/.v1claw/config.json`.
 
-Open the config file in a text editor:
+> **Don't have an API key yet?** Press Enter to pick Gemini, then get a free key at [aistudio.google.com/apikey](https://aistudio.google.com/apikey). You can re-run `onboard` anytime, or edit the config manually with `nano ~/.v1claw/config.json`.
 
-```bash
-nano ~/.v1claw/config.json
-```
-
-Find the `"agents"` section and change the model name. Find the `"providers"` section and add your API key.
-
-**If you're using Google Gemini** (free), change it to look like this:
-
-```json
-{
-  "agents": [
-    {
-      "name": "v1claw",
-      "model": "gemini-2.0-flash"
-    }
-  ],
-  "providers": {
-    "gemini": {
-      "api_key": "YOUR_GEMINI_API_KEY_HERE"
-    }
-  }
-}
-```
-
-**If you're using OpenAI:**
-
-```json
-{
-  "agents": [
-    {
-      "name": "v1claw",
-      "model": "gpt-4o"
-    }
-  ],
-  "providers": {
-    "openai": {
-      "api_key": "sk-YOUR_OPENAI_KEY_HERE"
-    }
-  }
-}
-```
-
-**If you're using Anthropic Claude:**
-
-```json
-{
-  "agents": [
-    {
-      "name": "v1claw",
-      "model": "claude-sonnet-4-20250514"
-    }
-  ],
-  "providers": {
-    "anthropic": {
-      "api_key": "sk-ant-YOUR_KEY_HERE"
-    }
-  }
-}
-```
-
-Save the file: press `Ctrl+O`, then `Enter`, then `Ctrl+X`.
-
-#### Step 8: Test it!
+#### Step 7: Test it!
 
 ```bash
 ./build/v1claw-linux-arm64 agent -m "Hello! What can you do?"
 ```
 
 You should see the AI respond. **If it does — congratulations, V1Claw is working on your phone!** 🎉
 
-#### Step 9: Start chatting
+#### Step 8: Start chatting
 
 ```bash
 ./build/v1claw-linux-arm64 agent
 ```
 
 This opens an interactive chat. Type anything and press Enter. Type `exit` or press `Ctrl+C` to quit.
 
-#### Step 10: Enable phone hardware (optional)
+#### Step 9: Enable phone hardware (optional)
 
 Want V1Claw to use your mic, camera, or read notifications? Edit the config again:
 
@@ -309,7 +249,7 @@ Add a `"permissions"` section (you can turn each feature on or off individually)
 
 > 🔒 **Every feature is OFF by default.** Only turn on what you need. You can change these anytime by editing the config and restarting.
 
-#### Step 11: Run V1Claw 24/7 in the background (optional)
+#### Step 10: Run V1Claw 24/7 in the background (optional)
 
 ```bash
 nohup ./build/v1claw-linux-arm64 gateway > v1claw.log 2>&1 &
@@ -391,31 +331,23 @@ The binary will appear at `build/v1claw-darwin-arm64` (Apple Silicon) or `build/
 ./build/v1claw-darwin-* onboard
 ```
 
-#### Step 5: Add your API key
-
-```bash
-nano ~/.v1claw/config.json
-```
-
-Change the model and add your API key (see the Android Step 7 above for examples with Gemini, OpenAI, or Claude).
-
-Save: `Ctrl+O` → Enter → `Ctrl+X`.
+The setup wizard will ask you to pick a provider and enter your API key. Your config is ready immediately.
 
-#### Step 6: Test it
+#### Step 5: Test it
 
 ```bash
 ./build/v1claw-darwin-* agent -m "Hello! Tell me a fun fact."
 ```
 
 If you see a response — **it's working!** 🎉
 
-#### Step 7: Interactive chat
+#### Step 6: Interactive chat
 
 ```bash
 ./build/v1claw-darwin-* agent
 ```
 
-#### Step 8: Run as a 24/7 service (optional)
+#### Step 7: Run as a 24/7 service (optional)
 
 ```bash
 # Quick background mode
@@ -478,23 +410,15 @@ The binary will appear at `build/v1claw-linux-amd64` or `build/v1claw-linux-arm6
 ./build/v1claw-linux-* onboard
 ```
 
-#### Step 5: Add your API key
-
-```bash
-nano ~/.v1claw/config.json
-```
-
-Change the model and add your API key (see the Android Step 7 above for examples with Gemini, OpenAI, or Claude).
-
-Save: `Ctrl+O` → Enter → `Ctrl+X`.
+The setup wizard will ask you to pick a provider and enter your API key. Your config is ready immediately.
 
-#### Step 6: Test it
+#### Step 5: Test it
 
 ```bash
 ./build/v1claw-linux-* agent -m "Hello! What can you do?"
 ```
 
-#### Step 7: Run as a 24/7 system service (optional)
+#### Step 6: Run as a 24/7 system service (optional)
 
 ```bash
 # Install the binary
@@ -572,19 +496,15 @@ go build -o build/v1claw.exe ./cmd/v1claw
 build\v1claw.exe onboard
 ```
 
-#### Step 5: Add your API key
-
-Open the config file at `%USERPROFILE%\.v1claw\config.json` in Notepad or any text editor.
-
-Change the model and add your API key (see the Android Step 7 above for examples with Gemini, OpenAI, or Claude).
+The setup wizard will ask you to pick a provider and enter your API key. Your config is ready immediately.
 
-#### Step 6: Test it
+#### Step 5: Test it
 
 ```bash
 build\v1claw.exe agent -m "Hello! What can you do?"
 ```
 
-#### Step 7: Interactive chat
+#### Step 6: Interactive chat
 
 ```bash
 build\v1claw.exe agent
@@ -608,7 +528,7 @@ cd V1Claw
 
 # Copy the example config and edit it
 cp config/config.example.json config/config.json
-nano config/config.json  # Add your API key (see Android Step 7 above)
+nano config/config.json  # Add your provider API key
 
 # Run a one-shot query
 docker compose run --rm v1claw-agent -m "Hello V1Claw!"
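Since the wizard now writes the config directly instead of sending users into `nano`, a quick sanity check before chatting is to run the file through a JSON parser. A minimal sketch (the real file lives at `~/.v1claw/config.json`; a stand-in file is used here, and `python3` is assumed to be available):

```shell
# Stand-in for ~/.v1claw/config.json, using key names from config.example.json
cfg=./config.json
cat > "$cfg" <<'EOF'
{
  "agents": {
    "defaults": { "model": "gemini-2.0-flash", "max_tokens": 8192 }
  }
}
EOF

# json.tool exits nonzero with a parse error if the file is not valid JSON,
# so "config OK" only prints for a well-formed file
python3 -m json.tool "$cfg" > /dev/null && echo "config OK"
```

The same one-liner is handy after any manual edit with `nano`, before restarting the gateway.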

cmd/v1claw/main.go

Lines changed: 92 additions & 5 deletions
@@ -247,6 +247,78 @@ func onboard() {
 	}
 
 	cfg := config.DefaultConfig()
+
+	// Interactive provider setup
+	scanner := bufio.NewScanner(os.Stdin)
+
+	fmt.Println("\n🔧 Let's set up your AI provider.")
+	fmt.Println("")
+	fmt.Println("Pick a provider:")
+	fmt.Println("  1. Google Gemini (free — recommended)")
+	fmt.Println("  2. OpenAI (GPT-5, GPT-4)")
+	fmt.Println("  3. Anthropic (Claude)")
+	fmt.Println("  4. Groq (fast, free tier)")
+	fmt.Println("  5. DeepSeek")
+	fmt.Println("  6. OpenRouter (100+ models)")
+	fmt.Println("  7. Ollama (local, no API key)")
+	fmt.Println("  8. Skip (configure later)")
+	fmt.Print("\nEnter number [1]: ")
+
+	choice := "1"
+	if scanner.Scan() {
+		t := strings.TrimSpace(scanner.Text())
+		if t != "" {
+			choice = t
+		}
+	}
+
+	type providerInfo struct {
+		name    string
+		model   string
+		keyHint string
+		keyURL  string
+	}
+
+	providerMap := map[string]providerInfo{
+		"1": {name: "gemini", model: "gemini-2.0-flash", keyHint: "Gemini API key", keyURL: "https://aistudio.google.com/apikey"},
+		"2": {name: "openai", model: "gpt-4o", keyHint: "OpenAI API key (starts with sk-)", keyURL: "https://platform.openai.com/api-keys"},
+		"3": {name: "anthropic", model: "claude-sonnet-4-20250514", keyHint: "Anthropic API key (starts with sk-ant-)", keyURL: "https://console.anthropic.com/keys"},
+		"4": {name: "groq", model: "llama-3.3-70b-versatile", keyHint: "Groq API key", keyURL: "https://console.groq.com/keys"},
+		"5": {name: "deepseek", model: "deepseek-chat", keyHint: "DeepSeek API key", keyURL: "https://platform.deepseek.com/api_keys"},
+		"6": {name: "openrouter", model: "google/gemini-2.0-flash-exp:free", keyHint: "OpenRouter API key", keyURL: "https://openrouter.ai/keys"},
+		"7": {name: "ollama", model: "llama3.2"},
+	}
+
+	if info, ok := providerMap[choice]; ok {
+		cfg.Agents.Defaults.Model = info.model
+
+		if info.name == "ollama" {
+			cfg.Providers.Ollama.APIBase = "http://localhost:11434/v1"
+			fmt.Println("\n✓ Ollama selected. Make sure Ollama is running locally.")
+			fmt.Printf("  Model: %s (change with: ollama pull <model>)\n", info.model)
+		} else {
+			fmt.Printf("\nGet your key at: %s\n", info.keyURL)
+			fmt.Printf("Enter your %s: ", info.keyHint)
+
+			apiKey := ""
+			if scanner.Scan() {
+				apiKey = strings.TrimSpace(scanner.Text())
+			}
+
+			if apiKey == "" {
+				fmt.Println("\n⚠ No API key entered. You can add it later in:", configPath)
+			} else {
+				setProviderKey(cfg, info.name, apiKey)
+				fmt.Println("\n✓ API key saved.")
+			}
+			fmt.Printf("  Model: %s\n", cfg.Agents.Defaults.Model)
+		}
+	} else if choice != "8" {
+		fmt.Println("\n⚠ Invalid choice. You can configure manually in:", configPath)
+	} else {
+		fmt.Println("\n⚠ Skipped. Add your provider and API key in:", configPath)
+	}
+
 	if err := config.SaveConfig(configPath, cfg); err != nil {
 		fmt.Printf("Error saving config: %v\n", err)
 		os.Exit(1)
@@ -255,11 +327,26 @@ func onboard() {
 	workspace := cfg.WorkspacePath()
 	createWorkspaceTemplates(workspace)
 
-	fmt.Printf("%s V1 is ready!\n", logo)
-	fmt.Println("\nNext steps:")
-	fmt.Println("  1. Add your API key to", configPath)
-	fmt.Println("     Get one at: https://openrouter.ai/keys")
-	fmt.Println("  2. Chat: v1claw agent -m \"Hello!\"")
+	fmt.Printf("\n%s V1 is ready!\n", logo)
+	fmt.Println("\nTry it out:")
+	fmt.Println("  v1claw agent -m \"Hello!\"")
+}
+
+func setProviderKey(cfg *config.Config, provider, key string) {
+	switch provider {
+	case "gemini":
+		cfg.Providers.Gemini.APIKey = key
+	case "openai":
+		cfg.Providers.OpenAI.APIKey = key
+	case "anthropic":
+		cfg.Providers.Anthropic.APIKey = key
+	case "groq":
+		cfg.Providers.Groq.APIKey = key
+	case "deepseek":
+		cfg.Providers.DeepSeek.APIKey = key
+	case "openrouter":
+		cfg.Providers.OpenRouter.APIKey = key
+	}
 }
 
 func copyEmbeddedToTarget(targetDir string) error {

config/config.example.json

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@
     "defaults": {
       "workspace": "~/.v1claw/workspace",
       "restrict_to_workspace": true,
-      "model": "glm-4.7",
+      "model": "gemini-2.0-flash",
       "max_tokens": 8192,
       "temperature": 0.7,
       "max_tool_iterations": 20

pkg/config/config.go

Lines changed: 1 addition & 1 deletion
@@ -271,7 +271,7 @@ func DefaultConfig() *Config {
 			Workspace:           "~/.v1claw/workspace",
 			RestrictToWorkspace: true,
 			Provider:            "",
-			Model:               "glm-4.7",
+			Model:               "",
 			MaxTokens:           8192,
 			Temperature:         0.7,
 			MaxToolIterations:   20,

pkg/config/config_test.go

Lines changed: 4 additions & 6 deletions
@@ -27,12 +27,12 @@ func TestDefaultConfig_WorkspacePath(t *testing.T) {
 	}
 }
 
-// TestDefaultConfig_Model verifies model is set
+// TestDefaultConfig_Model verifies model default is empty (set during interactive onboard)
 func TestDefaultConfig_Model(t *testing.T) {
 	cfg := DefaultConfig()
 
-	if cfg.Agents.Defaults.Model == "" {
-		t.Error("Model should not be empty")
+	if cfg.Agents.Defaults.Model != "" {
+		t.Errorf("Model should be empty by default, got %q", cfg.Agents.Defaults.Model)
 	}
 }
 
@@ -182,9 +182,7 @@ func TestConfig_Complete(t *testing.T) {
 	if cfg.Agents.Defaults.Workspace == "" {
 		t.Error("Workspace should not be empty")
 	}
-	if cfg.Agents.Defaults.Model == "" {
-		t.Error("Model should not be empty")
-	}
+	// Model is empty by default (set during interactive onboard)
 	if cfg.Agents.Defaults.Temperature == 0 {
 		t.Error("Temperature should have default value")
 	}

pkg/migrate/migrate_test.go

Lines changed: 2 additions & 2 deletions
@@ -293,8 +293,8 @@ func TestConvertConfig(t *testing.T) {
 		if len(warnings) != 0 {
 			t.Errorf("expected no warnings, got %v", warnings)
 		}
-		if cfg.Agents.Defaults.Model != "glm-4.7" {
-			t.Errorf("default model should be glm-4.7, got %q", cfg.Agents.Defaults.Model)
+		if cfg.Agents.Defaults.Model != "" {
+			t.Errorf("default model should be empty, got %q", cfg.Agents.Defaults.Model)
 		}
 	})
 }
