Voice AI assistant for Meta smart glasses powered by OpenAI Realtime API.
Talk hands-free through your glasses. The AI hears you, sees what you see, and responds in a natural voice.
- 🎙️ Voice conversations — talk naturally through glasses mic, hear responses in speakers
- 💬 Conversation history — browse and continue past discussions
Tools the AI can use (a sketch of how they are declared follows the list):
- 📷 `take_photo` — see through the glasses camera ("what am I looking at?")
- 🌐 `search_internet` — real-time news, weather, prices, sports scores (requires a Perplexity API key)
- 🧠 `manage_memory` — remember things about you across conversations
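For orientation, here is a rough sketch, not this repo's actual code, of how such a tool is typically declared to the OpenAI Realtime API: the client sends a `session.update` event whose `tools` array holds JSON-schema function definitions.

```swift
import Foundation

// Sketch only: declaring take_photo as a Realtime API function tool.
// The event shape follows OpenAI's public Realtime docs; the project's
// actual declarations live in RealtimeAPIClient.swift and may differ.
let sessionUpdate: [String: Any] = [
    "type": "session.update",
    "session": [
        "tools": [[
            "type": "function",
            "name": "take_photo",
            "description": "Capture a photo from the glasses camera so the model can see what the user sees.",
            "parameters": ["type": "object", "properties": [String: Any]()]
        ]]
    ]
]
let frame = try JSONSerialization.data(withJSONObject: sessionUpdate)
// `frame` would then be sent as a text frame over the Realtime WebSocket.
```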
⚠️ Developer Mode Required
Your Meta glasses must have Developer Mode enabled before this app can connect to them.
Without this, the app will build successfully but won't see your glasses.

How to enable:
1. Open the Meta AI app on your phone
2. Go to Settings → App Info
3. Tap the app version number five times quickly — this reveals the Developer Mode toggle
4. Enable the Developer Mode toggle
5. Tap Enable to confirm
See Meta Wearables Setup Guide for detailed instructions.
```bash
git clone https://github.com/kirill-markin/meta-glasses-ios-openai.git
cd meta-glasses-ios-openai

# Copy config templates
cp Config.xcconfig.example Config.xcconfig
cp meta-glasses-ios-openai/Config.swift.example meta-glasses-ios-openai/Config.swift
```

The project requires two config files (both are gitignored for security):
| Required file | Example template |
|---|---|
| `Config.xcconfig` | `Config.xcconfig.example` |
| `meta-glasses-ios-openai/Config.swift` | `Config.swift.example` |
Copy each `.example` file (remove the `.example` suffix) and fill in your values.

`Config.xcconfig` — Xcode build settings (required):

```
PRODUCT_BUNDLE_IDENTIFIER = com.yourcompany.metaglasses
DEVELOPMENT_TEAM = YOUR_TEAM_ID_HERE
META_APP_ID = YOUR_META_APP_ID_HERE
```
`Config.swift` — API keys (optional at build time):

```swift
static let openAIAPIKey = ""     // Optional: configure in app or set here as default
static let perplexityAPIKey = "" // Optional: leave empty to disable search
```

💡 API keys can be configured in-app. You can leave the `Config.swift` keys empty and add them later in Settings → AI → Models (OpenAI) or AI Tools (Perplexity). Keys set in `Config.swift` are used as defaults on first launch.
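That first-launch default behavior could look roughly like the following; the key name, storage, and function here are illustrative assumptions, not this repo's actual code.

```swift
import Foundation

// Hypothetical sketch: a non-empty key from Config.swift seeds UserDefaults
// once, after which the value edited in Settings → AI wins.
func effectiveOpenAIKey() -> String {
    let defaults = UserDefaults.standard
    if defaults.string(forKey: "openAIAPIKey") == nil, !Config.openAIAPIKey.isEmpty {
        defaults.set(Config.openAIAPIKey, forKey: "openAIAPIKey")
    }
    return defaults.string(forKey: "openAIAPIKey") ?? ""
}
```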
Open `meta-glasses-ios-openai.xcodeproj` in Xcode → Run on a physical iOS device.
⚠️ Simulator won't work — Bluetooth is required for glasses connection.
| What | Where to get |
|---|---|
| Physical iOS device | — |
| Meta smart glasses | Paired via Meta AI app |
| Meta App ID | developer.meta.com |
| OpenAI API key | platform.openai.com/api-keys |
| Perplexity API key (optional) | perplexity.ai/settings/api |
💡 API keys can be added in the app. You don't need to set them at build time — configure them in Settings → AI → Models (OpenAI) or AI Tools → `search_internet` (Perplexity).
💡 Perplexity API key is optional. Without it, the `search_internet` tool will be disabled, but everything else works normally (see the sketch below).
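A plausible sketch of that gating, with the function and tool names as illustrative placeholders rather than the repo's actual code:

```swift
// Sketch: advertise search_internet to the model only when a Perplexity
// key is configured; the other tools are always available.
func enabledTools(perplexityKey: String) -> [String] {
    var tools = ["take_photo", "manage_memory"]
    if !perplexityKey.isEmpty {
        tools.append("search_internet") // empty key → tool never offered
    }
    return tools
}
```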
💡 Don't forget to enable Developer Mode on your glasses — see Before You Start above.
- Swift 5 / SwiftUI
- Meta Wearables SDK (MWDATCore, MWDATCamera)
- OpenAI Realtime API over WebSocket (connection sketched below)
- Bluetooth HFP for glasses audio
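To make the Realtime API item concrete, here is a minimal connection sketch with `URLSessionWebSocketTask`. The endpoint, headers, and model name follow OpenAI's public Realtime docs; the repo's `RealtimeAPIClient.swift` presumably layers audio streaming and event handling on top.

```swift
import Foundation

let apiKey = "sk-..." // your OpenAI API key
var request = URLRequest(url: URL(string:
    "wss://api.openai.com/v1/realtime?model=gpt-4o-realtime-preview")!)
request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
request.setValue("realtime=v1", forHTTPHeaderField: "OpenAI-Beta")

let socket = URLSession.shared.webSocketTask(with: request)
socket.resume()
socket.receive { result in
    if case .success(.string(let event)) = result {
        print(event) // server events arrive as JSON text frames
    }
}
```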
```
meta-glasses-ios-openai/
├── VoiceAgentView.swift      # Main voice UI
├── RealtimeAPIClient.swift   # OpenAI WebSocket + audio
├── GlassesManager.swift      # Meta SDK integration
├── ThreadsManager.swift      # Conversation history
├── SettingsManager.swift     # User prompt & memories
└── AudioManager.swift        # Bluetooth HFP audio
```
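`AudioManager.swift` covers the Bluetooth HFP leg of the stack. As a hedged sketch, assuming standard `AVAudioSession` routing rather than this file's actual contents, HFP audio is typically enabled like this:

```swift
import AVFoundation

do {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord with .allowBluetooth routes both the mic and playback
    // over HFP, which is how headset-class devices like the glasses expose
    // two-way audio.
    try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}
```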
You can start the voice assistant hands-free using Siri:
| Action | What to say |
|---|---|
| Start session | "Hey Siri, start session with Glasses" |
| Stop session | "Stop session" (said to the AI, not Siri) |
Limitation: "Hey Siri" is heard by your iPhone, not the glasses. If your phone is too far away or in a bag, Siri won't hear you. The glasses microphone cannot trigger Siri — this is an iOS limitation.
Future: When Meta enables custom button actions on glasses, this can be improved.
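For reference, a phrase like "Hey Siri, start session with Glasses" is the kind of trigger the App Intents framework's app shortcuts provide. A hedged sketch of how that wiring might look, with the intent name and session call as assumptions:

```swift
import AppIntents

struct StartSessionIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Session"

    @MainActor
    func perform() async throws -> some IntentResult {
        // Kick off the voice session here, e.g. via a shared session manager.
        return .result()
    }
}

struct GlassesShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartSessionIntent(),
            phrases: ["Start session with \(.applicationName)"],
            shortTitle: "Start Session",
            systemImageName: "eyeglasses"
        )
    }
}
```

App Shortcuts require `\(.applicationName)` in each phrase, which is why the spoken trigger ends with the app's name.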
Kirill Markin — github.com/kirill-markin
