Snap a photo. Find your part. A React Native Expo application demonstrating Couchbase Lite mobile database capabilities with visual product search powered by CLIP embeddings.
This demo app showcases a visual product search workflow:
- Capture - Take a photo of a part or select from gallery
- Embed - Generate a 512-dimensional vector embedding using CLIP (on-device or via API)
- Search - Query Couchbase Lite using vector similarity search
- Match - Display similar products sorted by cosine distance
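The Match step above can be sketched in plain TypeScript: rank catalog entries by cosine distance to the query embedding and keep the closest N. This is an illustrative stand-in for what the app delegates to Couchbase Lite's `APPROX_VECTOR_DISTANCE(..., "COSINE")`, not the app's actual code.

```typescript
// A candidate document from the vectors collection (illustrative shape).
interface Candidate { sku: string; embedding: number[]; }

// Cosine distance: 0 = same direction, 1 = orthogonal, 2 = opposite.
function cosineDistance(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return 1 - dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Brute-force top-N ranking, ascending by distance (closest first).
function topN(query: number[], items: Candidate[], n: number) {
  return items
    .map((it) => ({ sku: it.sku, distance: cosineDistance(query, it.embedding) }))
    .sort((x, y) => x.distance - y.distance)
    .slice(0, n);
}

// Toy 2-dim example (real CLIP embeddings are 512-dim):
topN([1, 0], [
  { sku: "A", embedding: [1, 0] },  // distance 0
  { sku: "B", embedding: [0, 1] },  // distance 1
  { sku: "C", embedding: [1, 1] },  // distance ≈ 0.29
], 2); // → A first, then C
```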
```
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│   Camera/Image   │────▶│    CLIP Model    │────▶│   Vector Query   │
│      Input       │     │    (512-dim)     │     │  Couchbase Lite  │
└──────────────────┘     └──────────────────┘     └──────────────────┘
                                                           │
                                                           ▼
                                                  ┌──────────────────┐
                                                  │ Similar Products │
                                                  │     (Top N)      │
                                                  └──────────────────┘
```
- 📱 Mobile Database - Couchbase Lite with offline-first architecture
- 🔄 Sync Gateway Integration - Push/pull replication with Capella
- 🔍 Visual Search - AI-powered image similarity search
- 📸 Camera Integration - Capture or select images for search
- 🧠 On-Device ML - MobileCLIP inference via TensorFlow Lite
- 📦 Product Catalog - Browse synced products and vectors
- 💾 My Items - Save captured items with image blobs
| Home | Products | Visual Search |
|---|---|---|
| Dashboard with collection counts | Browse synced products | Camera → embedding → results |
| Component | Technology | Version |
|---|---|---|
| Framework | React Native (Expo) | 0.76.3 / SDK 52 |
| Language | TypeScript | 5.3.3 |
| Navigation | Expo Router | 4.0.0 |
| Database | Couchbase Lite (cbl-reactnative) | 0.6.3 |
| ML Inference | TensorFlow Lite (react-native-fast-tflite) | 2.0.0 |
| Image Picker | Expo Image Picker | 16.0.6 |
```
cb_cko_demo/
├── cb_cko_demo_app/                    # Main Expo application
│   ├── app/                            # Expo Router pages (file-based routing)
│   │   ├── _layout.tsx                 # Root layout with navigation
│   │   ├── index.tsx                   # Home screen (dashboard)
│   │   ├── products.tsx                # Products collection browser
│   │   ├── vectors.tsx                 # Vectors collection browser
│   │   ├── myitems.tsx                 # User-captured items
│   │   ├── search.tsx                  # Visual search screen
│   │   └── settings.tsx                # Sync & database settings
│   ├── providers/
│   │   ├── DatabaseProvider.tsx        # React context for DB access
│   │   └── SettingsProvider.tsx        # App settings context
│   ├── services/
│   │   ├── database.service.ts         # Couchbase Lite operations
│   │   └── local-embedding.service.ts  # On-device CLIP inference
│   ├── assets/                         # Icons and images
│   ├── ios/                            # iOS native project
│   ├── android/                        # Android native project
│   ├── app.json                        # Expo configuration
│   └── package.json                    # Dependencies
├── patches/                            # Patch files for dependencies
├── mobileclip-s1.tflite                # MobileCLIP TensorFlow Lite model
├── AGENTS.md                           # AI assistant instructions
├── RUN.md                              # Visual search implementation details
└── README.md                           # This file
```
| Property | Value |
|---|---|
| Database | `db` |
| Scope | `catalog` |
| Collection | Description | Key Fields |
|---|---|---|
| `products` | Product catalog data | `sku`, `name`, `type`, `model`, `image_url` |
| `vectors` | 512-dim embeddings | `sku`, `embedding` (float array) |
| `myItems` | User-captured items | `owner`, `image` (blob), `embedding` |
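The table above can be read as the following document shapes, sketched as TypeScript interfaces. Field names come from the table; the exact types and the sample values are assumptions for illustration.

```typescript
// Shapes inferred from the collections table; types are assumptions.
interface ProductDoc {
  sku: string;
  name: string;
  type: string;
  model: string;
  image_url: string;
}

interface VectorDoc {
  sku: string;          // joins back to products by SKU
  embedding: number[];  // 512-dim CLIP embedding
}

interface MyItemDoc {
  owner: string;
  image: unknown;       // stored as a Couchbase Lite blob
  embedding: number[];
}

// Hypothetical sample pair illustrating the products/vectors split:
const product: ProductDoc = {
  sku: "PART-001", name: "Demo Part", type: "widget",
  model: "X1", image_url: "https://example.com/part-001.jpg",
};
const vector: VectorDoc = { sku: product.sku, embedding: new Array(512).fill(0) };
```

Keeping embeddings in a separate `vectors` collection keyed by `sku` lets the vector index stay small while product metadata syncs independently.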
- Node.js 18+
- Xcode 15+ (for iOS)
- Android Studio (for Android)
- CocoaPods (for iOS)
You MUST have these services running to use the ONLINE (API) embedding mode:
| Service | Repository |
|---|---|
| Vector Embedding API | https://github.com/anujsahni/vector-embedding |
| Vector Search API | https://github.com/anujsahni/vector-search |
Clone and run these services before enabling ONLINE mode in Settings.
```bash
# Clone the repository
git clone https://github.com/your-username/cb_cko_demo.git
cd cb_cko_demo/cb_cko_demo_app

# Install dependencies
npm install

# Generate native projects
npx expo prebuild --clean

# Install iOS pods
cd ios && pod install && cd ..

# Start Metro bundler
npm start

# Run on iOS simulator
npm run ios

# Run on Android emulator
npm run android
```
> ⚠️ Note: The `cbl-reactnative` plugin requires native modules and cannot run in Expo Go. Use development builds (`expo run:ios` or `expo run:android`).
The app is pre-configured to sync with a Capella deployment:
| Setting | Value |
|---|---|
| Endpoint | wss://vexlc6f-ekguwo9l.apps.cloud.couchbase.com:4984/cko |
| Auth | Basic (username/password) |
| Mode | Push and pull replication |
Configure credentials in the Settings screen or update the defaults in `services/database.service.ts`.
The app supports multiple embedding generation modes:
| Mode | Description | Latency |
|---|---|---|
| Local (On-Device) | MobileCLIP via TensorFlow Lite | ~50-100ms |
| API (Remote) | Server-side CLIP embedding | ~200-500ms |
Toggle between modes in the Settings screen.
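The mode toggle can be sketched as a simple dispatch. This is a hypothetical outline, not the app's actual exports: the real implementations are async calls into TensorFlow Lite (local) or a POST to the remote embedding API, and the mode is read from the settings context.

```typescript
type EmbeddingMode = "local" | "api";

// Placeholder embedders; names and synchronous signatures are
// illustrative only. Both modes must return the same 512-dim shape
// so the downstream vector query is mode-agnostic.
function embedLocally(_imageUri: string): number[] {
  return new Array(512).fill(0); // stand-in for MobileCLIP inference
}

function embedViaApi(_imageUri: string): number[] {
  return new Array(512).fill(0); // stand-in for a POST to the embedding API
}

function generateEmbedding(mode: EmbeddingMode, imageUri: string): number[] {
  return mode === "local" ? embedLocally(imageUri) : embedViaApi(imageUri);
}
```

Because both paths produce an identical vector shape, the rest of the search pipeline does not need to know which mode produced the embedding.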
- **Image Capture**: Select or capture an image via `expo-image-picker`
- **Embedding Generation**:
  - On-device: MobileCLIP model generates a 512-dim vector
  - Remote: POST to the embedding API endpoint
- **Vector Query**: Query `catalog.vectors` using `APPROX_VECTOR_DISTANCE()`
- **Result Display**: Show matching products with similarity scores
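CLIP embeddings are commonly L2-normalized before cosine comparison, so that distance depends only on direction, not magnitude. A minimal sketch of that step (standard practice, though the app's exact pipeline may differ):

```typescript
// Scale a vector to unit length. With unit vectors, cosine distance
// reduces to 1 minus a plain dot product.
function l2Normalize(v: number[]): number[] {
  const norm = Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  if (norm === 0) throw new Error("cannot normalize a zero vector");
  return v.map((x) => x / norm);
}

const unit = l2Normalize([3, 4]); // → [0.6, 0.8]
```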
The vector index on `catalog.vectors` is configured as:

```typescript
const vectorIndex = new VectorIndex({
  expression: 'embedding',
  dimensions: 512,
  centroids: 1,
  metric: DistanceMetric.Cosine,
  encoding: { type: 'none' },
  isLazy: false,
});
```

Similarity search uses `APPROX_VECTOR_DISTANCE()` in a SQL++ query:

```sql
SELECT META().id AS id, *,
       APPROX_VECTOR_DISTANCE(embedding, $vector, "COSINE") AS distance
FROM catalog.vectors
ORDER BY APPROX_VECTOR_DISTANCE(embedding, $vector, "COSINE")
LIMIT 10
```

| Script | Description |
|---|---|
| `npm start` | Start Metro bundler |
| `npm run ios` | Build and run on iOS |
| `npm run android` | Build and run on Android |
| `npm run web` | Start web version (limited functionality) |
- Create a new file in the `app/` directory (e.g., `app/newscreen.tsx`)
- Export a default React component
- Expo Router automatically creates the route
All Couchbase Lite operations are centralized in `services/database.service.ts`:
- `initializeDatabase()` - Open the database and create collections
- `runOneTimeSync()` - Execute push/pull replication
- `searchSimilarVectors()` - Vector similarity search
- `getDocumentById()` - Fetch a single document
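Centralizing database access behind one service makes it easy to swap in an in-memory stand-in when unit-testing screens without the native `cbl-reactnative` module. A hypothetical sketch of that pattern (method and field names here are illustrative, not the service's real API):

```typescript
// Minimal document shape for the mock; real documents are richer.
interface Doc { id: string; [key: string]: unknown; }

// In-memory stand-in mirroring part of the service contract. The real
// service performs these operations against Couchbase Lite collections.
class InMemoryDatabaseService {
  private docs = new Map<string, Doc>();

  initializeDatabase(): void {
    this.docs.clear(); // real impl opens the DB and creates collections
  }

  saveDocument(doc: Doc): void {
    this.docs.set(doc.id, doc);
  }

  getDocumentById(id: string): Doc | undefined {
    return this.docs.get(id);
  }
}

const svc = new InMemoryDatabaseService();
svc.initializeDatabase();
svc.saveDocument({ id: "vec::PART-001", sku: "PART-001" });
svc.getDocumentById("vec::PART-001"); // returns the saved document
```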
| Phase | Status | Description |
|---|---|---|
| Crawl | ✅ Complete | Basic setup, database init, sync button |
| Walk | ✅ Complete | Collection viewers, settings page, improved UX |
| Run | 🚧 In Progress | Visual product search (camera → vector → results) |
See RUN.md for detailed Run phase implementation.
The vector index requires at least centroids Γ 25 documents with embeddings. With 1 centroid, you need 25+ vectors in the collection.
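That training rule can be written as a quick preflight check before creating the index (helper names are illustrative):

```typescript
// Couchbase Lite's vector index needs at least centroids * 25
// documents with embeddings before it can train.
function minVectorsForIndex(centroids: number): number {
  return centroids * 25;
}

function indexCanTrain(centroids: number, vectorCount: number): boolean {
  return vectorCount >= minVectorsForIndex(centroids);
}

minVectorsForIndex(1);  // → 25
indexCanTrain(1, 30);   // → true
```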
```bash
# Clean and rebuild
cd ios && pod deintegrate && pod install && cd ..
npx expo prebuild --clean
npm run ios
```

- Verify the WebSocket URL includes the `wss://` protocol
- Check username/password credentials
- Ensure network connectivity to the Capella endpoint
The plugin links to a local path. Ensure `cbl-reactnative` is available at `../../cbl-reactnative` or update `package.json`:

```json
"cbl-reactnative": "file:../../cbl-reactnative"
```

- Couchbase Lite React Native SDK
- Couchbase Lite Vector Search Guide
- Expo Router Documentation
- MobileCLIP on TensorFlow Hub
- react-native-fast-tflite
This project is for demonstration purposes.
Built with ❤️ using Couchbase Lite and React Native