A comprehensive full-stack AI application suite showcasing production-ready LangChain and LangGraph implementations.
This monorepo demonstrates advanced AI agent patterns, semantic search, multi-turn conversations, and rich UI experiences, all built with modern TypeScript, Angular, and NestJS.
A complete e-commerce conversational AI that helps users discover products through natural language:
- Semantic Product Search: Vector-powered search with pgvector and embeddings
- Multi-turn Context: Remembers conversation history across interactions
- Custom LangChain Tools: Product search with category filtering + category browsing
- Rich UI: Interactive chat with markdown support and product cards
- 27,752 Products: Full product catalog with vector embeddings ready for semantic search
A versatile conversational AI demonstrating LangChain fundamentals:
- Personalization Tools: Custom LangChain tool for user context
- Conversation Threading: UUID-based conversation management
- Rich Content: Markdown, syntax highlighting, and Mermaid diagrams
- Production Patterns: NgRx Signal Store, immer.js, tapResponse
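As a rough illustration of these patterns, a conversation-threading Signal Store might look like the sketch below; the store name and state fields are assumptions, not the repo's actual chat store:

```typescript
import { signalStore, withState, withMethods, patchState } from "@ngrx/signals";

// Hypothetical state shape -- the real chat store in this repo is more elaborate.
interface ChatState {
  conversationId: string | null;
  messages: { role: "user" | "assistant"; content: string }[];
  loading: boolean;
}

export const ChatStore = signalStore(
  { providedIn: "root" },
  withState<ChatState>({ conversationId: null, messages: [], loading: false }),
  withMethods((store) => ({
    // Start a new UUID-based conversation thread.
    startConversation(): void {
      patchState(store, { conversationId: crypto.randomUUID(), messages: [] });
    },
    // Append a message to the current thread.
    addMessage(role: "user" | "assistant", content: string): void {
      patchState(store, { messages: [...store.messages(), { role, content }] });
    },
  }))
);
```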
An ETL pipeline for generating and storing vector embeddings:
- Batch Processing: Efficiently handles 27K+ products
- Ollama Integration: Local embeddings with nomic-embed-text (768 dimensions)
- pgvector Storage: PostgreSQL with HNSW indexing for fast similarity search
- Migration Support: TypeORM migrations for schema management
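The batch-embedding step can be sketched roughly as follows, assuming products are mapped to LangChain `Document`s and written through `PGVectorStore`; the batch size, table name, and metadata fields are illustrative assumptions:

```typescript
import { Document } from "@langchain/core/documents";
import { OllamaEmbeddings } from "@langchain/ollama";
import { PGVectorStore } from "@langchain/community/vectorstores/pgvector";

const BATCH_SIZE = 100; // assumed batch size; tune for your hardware

async function ingestProducts(
  products: { id: string; name: string; description: string; category: string }[]
) {
  const store = await PGVectorStore.initialize(
    new OllamaEmbeddings({ model: "nomic-embed-text", baseUrl: process.env.OLLAMA_BASE_URL }),
    {
      postgresConnectionOptions: { connectionString: process.env.DATABASE_URL },
      tableName: "product_embeddings", // hypothetical table name
    }
  );

  // Embed and store products in batches so 27K+ items don't overwhelm memory or the embedder.
  for (let i = 0; i < products.length; i += BATCH_SIZE) {
    const batch = products.slice(i, i + BATCH_SIZE).map(
      (p) =>
        new Document({
          pageContent: `${p.name}\n${p.description}`,
          metadata: { id: p.id, category: p.category },
        })
    );
    await store.addDocuments(batch);
    console.log(`Ingested ${Math.min(i + BATCH_SIZE, products.length)} / ${products.length}`);
  }
}
```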
- ✅ Agent Creation: Using `createAgent` with model + tools + memory
- ✅ Custom Tools: Building domain-specific tools with the `tool()` function
- ✅ Structured Output: Using `toolStrategy` with Zod schemas for consistent responses
- ✅ Memory Persistence: LangGraph's `MemorySaver` checkpointer for conversation state
- ✅ Conversation Threading: Managing multiple concurrent conversations with thread IDs
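A minimal sketch of these pieces wired together. The repo's docs mention `createAgent`; this sketch uses LangGraph's prebuilt `createReactAgent`, which combines the same model + tools + checkpointer shape, and the personalization tool's content is invented for illustration:

```typescript
import { ChatMistralAI } from "@langchain/mistralai";
import { MemorySaver } from "@langchain/langgraph";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { tool } from "@langchain/core/tools";
import { HumanMessage } from "@langchain/core/messages";
import { z } from "zod";

// A toy personalization tool, standing in for the repo's user-context tool.
const userContextTool = tool(
  async ({ userId }: { userId: string }) =>
    `User ${userId} prefers concise answers and dark mode.`,
  {
    name: "get_user_context",
    description: "Look up stored preferences for a user.",
    schema: z.object({ userId: z.string() }),
  }
);

const model = new ChatMistralAI({ model: "mistral-large-latest" }); // reads MISTRAL_API_KEY from the environment
const checkpointer = new MemorySaver(); // keeps conversation state per thread

const agent = createReactAgent({
  llm: model,
  tools: [userContextTool],
  checkpointSaver: checkpointer,
});

// Each thread_id is an independent conversation; state is restored between calls.
const threadA = { configurable: { thread_id: "conversation-a" } };
await agent.invoke({ messages: [new HumanMessage("Hi, I'm Alice.")] }, threadA);
const reply = await agent.invoke({ messages: [new HumanMessage("What's my name?")] }, threadA);
console.log(reply.messages.at(-1)?.content); // remembers "Alice" thanks to MemorySaver
```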
- ✅ Embeddings: Generating vector representations with Ollama (nomic-embed-text)
- ✅ Vector Stores: PGVectorStore integration with PostgreSQL
- ✅ Semantic Search: Similarity search with cosine distance
- ✅ Retrieval: Using the `.asRetriever()` pattern with filtering
- ✅ HNSW Indexing: Fast approximate nearest neighbor search
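Building on the ingestion sketch above, retrieval against the same `PGVectorStore` looks roughly like this; the metadata filter shape depends on how products were ingested and is an assumption here:

```typescript
import { PGVectorStore } from "@langchain/community/vectorstores/pgvector";

async function findSimilarProducts(vectorStore: PGVectorStore, query: string, category?: string) {
  // Plain similarity search: top 5 matches by vector distance.
  const hits = await vectorStore.similaritySearch(query, 5, category ? { category } : undefined);

  // Or the retriever pattern, which plugs into chains and agents as a Runnable.
  const retriever = vectorStore.asRetriever({ k: 5, filter: category ? { category } : undefined });
  const docs = await retriever.invoke(query);

  return { hits, docs };
}
```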
- ✅ Error Handling: tapResponse pattern for structured error flows
- ✅ Optimistic Updates: Immediate UI feedback with rollback on failure
- ✅ Type Safety: End-to-end type safety with Zod schemas
- ✅ Model Configuration: Centralized model provider with dependency injection
- ✅ API Documentation: Comprehensive Swagger/OpenAPI specs
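As an illustration of the error-handling and optimistic-update patterns, a store method might combine `rxMethod` and `tapResponse` like this; `ChatApiService` and the state shape are hypothetical:

```typescript
import { inject } from "@angular/core";
import { signalStore, withState, withMethods, patchState } from "@ngrx/signals";
import { rxMethod } from "@ngrx/signals/rxjs-interop";
import { tapResponse } from "@ngrx/operators";
import { pipe, switchMap } from "rxjs";
import { ChatApiService } from "./chat-api.service"; // hypothetical service

interface State { messages: string[]; error: string | null; }

export const MessagesStore = signalStore(
  withState<State>({ messages: [], error: null }),
  withMethods((store, api = inject(ChatApiService)) => ({
    send: rxMethod<string>(
      pipe(
        switchMap((text) => {
          // Optimistic update: show the user's message immediately.
          const previous = store.messages();
          patchState(store, { messages: [...previous, text] });
          return api.send(text).pipe(
            tapResponse({
              next: (reply: string) =>
                patchState(store, { messages: [...store.messages(), reply] }),
              // Roll back to the previous list if the request fails.
              error: (err: Error) =>
                patchState(store, { messages: previous, error: err.message }),
            })
          );
        })
      )
    ),
  }))
);
```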
The workspace is organized into several applications and libraries:
- `chat-api` (apps/chat-api): A NestJS backend providing the AI chat logic.
  - Implements LangGraph `MemorySaver` for conversation threading.
  - Custom LangChain tools for user personalization.
  - Swagger UI for interactive API documentation.
  - 📖 Detailed Documentation
- `chat-ui` (apps/chat-ui): An AnalogJS/Angular frontend.
  - Uses NgRx Signal Store for reactive state management.
  - Shared Zod schemas for validation and type safety.
  - Proxy-based integration with the backend.
  - Includes comprehensive service documentation for the chat store.
- `ecommerce-assistant-api` (apps/ecommerce-assistant-api): A conversational e-commerce assistant API.
  - Semantic product search using pgvector and LangChain (see the product-search tool sketch below).
  - Multi-turn conversations with memory persistence.
  - Custom LangChain tools for product search and category browsing.
  - Structured responses with Markdown support.
  - Swagger UI for interactive API documentation.
  - 📖 Detailed Documentation
- `ecommerce-assistant-ui` (apps/ecommerce-assistant-ui): An AnalogJS/Angular frontend for the shopping assistant.
  - Interactive chat interface with product recommendations.
  - Responses rendered with Markdown and custom product cards.
  - Built with NgRx Signal Store.
  - 📖 Detailed Documentation
- `product-ingest` (apps/product-ingest): An embedding pipeline for e-commerce product data.
  - Processes product catalogs and generates vector embeddings for semantic search.
  - First step in building an intelligent e-commerce agent.
  - Integrates with vector databases for efficient product knowledge retrieval.
  - 📖 Detailed Documentation
- `hello-agent` (apps/hello-agent): A CLI tool built with Nest Commander for quick AI interactions.
- `iac` (iac/): Infrastructure as Code for deployment and vector database setup.
- `chat-components` (libs/chat-components): Reusable Angular UI components (message bubbles, markdown rendering).
  - 📖 Detailed Documentation
  - Features markdown rendering with syntax highlighting.
  - Supports Mermaid diagram visualization.
  - Optimized for performance and accessibility.
- `model-provider` (libs/model-provider): A shared library for Mistral AI configuration and integration.
  - 📖 Detailed Documentation
  - Provides centralized Mistral AI model management.
  - Supports both synchronous and asynchronous configuration.
  - Enables dependency injection across the workspace.
- `communication` (libs/communication): Shared utilities for API communication, validation, and error handling.
  - 📖 Detailed Documentation
  - Zod schema validation with structured error responses.
  - Standardized API error handling wrappers.
  - Type-safe conversation ID validation.
  - Used by chat-ui and ecommerce-assistant-ui server routes.
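To make the tool-plus-vector-store combination concrete, here is a hedged sketch of a product-search tool in the spirit of `ecommerce-assistant-api`, reusing a `PGVectorStore` initialized as in the ingestion sketch above; the tool name, filter shape, and result formatting are assumptions rather than the repo's exact code:

```typescript
import { tool } from "@langchain/core/tools";
import { PGVectorStore } from "@langchain/community/vectorstores/pgvector";
import { z } from "zod";

// Factory so the agent module can pass in an already-initialized PGVectorStore.
export function createProductSearchTool(vectorStore: PGVectorStore) {
  return tool(
    async ({ query, category }: { query: string; category?: string }) => {
      // Metadata filter shape assumes products were ingested with a `category` field.
      const filter = category ? { category } : undefined;
      const results = await vectorStore.similaritySearch(query, 5, filter);
      return results
        .map((doc) => `- ${doc.metadata["name"] ?? doc.metadata["id"]}: ${doc.pageContent}`)
        .join("\n");
    },
    {
      name: "product_search",
      description: "Semantic search over the product catalog, optionally filtered by category.",
      schema: z.object({
        query: z.string().describe("Natural-language description of what the user wants"),
        category: z.string().optional().describe("Optional category filter"),
      }),
    }
  );
}
```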
- LLM: Mistral AI (`mistral-large-latest`)
- Orchestration: LangChain JS - Agent framework with custom tools
- Memory: LangGraph - MemorySaver checkpointer for conversation state
- Embeddings: Ollama - nomic-embed-text (768 dimensions)
- Vector Database: PostgreSQL + pgvector extension
- Framework: NestJS - Enterprise Node.js framework
- Build Tool: Vite - Fast development and production builds
- Validation: Zod schemas + class-validator
- API Docs: Swagger/OpenAPI with interactive UI
- ORM: TypeORM for database migrations
- Framework: Angular + AnalogJS - Meta-framework for Angular
- State: NgRx Signal Store - Reactive state management with signals
- Styling: Tailwind CSS + DaisyUI
- Content: Marked.js (markdown), Prism.js (syntax highlighting), Mermaid.js (diagrams)
- Monorepo: Nx - Smart, fast build system
- Package Manager: npm
- TypeScript: Full type safety across the stack
- Infrastructure: Docker Compose for local development
```mermaid
graph TD;
    A[User] --> B[Chat UI];
    B --> C[Chat API];
    C --> D[LangChain Agent];
    D --> E[Mistral AI];
    D --> F[MemorySaver];
    E --> G[Response];
    F --> G;
    G --> B;

    subgraph Frontend
        B[Chat UI] --> H[NgRx Signal Store];
        B --> I[Chat Components];
        I --> J[Markdown Renderer];
        I --> K[Mermaid Support];
    end

    subgraph Backend
        C[Chat API] --> L[Model Provider];
        C --> M[Custom Tools];
        C --> N[Swagger Docs];
    end

    subgraph Shared
        O[Zod Schemas] --> B;
        O --> C;
        P[Model Provider] --> C;
        Q[Communication Utils] --> B;
        Q --> C;
    end
```
- User Interaction: User sends a message through the chat interface
- State Management: NgRx Signal Store manages conversation state
- API Request: Chat UI sends request to NestJS backend
- AI Processing: LangChain agent processes request with context from MemorySaver
- Model Integration: Mistral AI generates response using the configured model
- Response Handling: Backend returns structured response with conversation context
- Content Rendering: Chat UI displays response with markdown, code highlighting, and diagrams
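A hedged sketch of how steps 3-6 could map onto a NestJS endpoint; the DTO, injection token, and response shape are illustrative, not the repo's actual classes:

```typescript
import { Body, Controller, Inject, Post } from "@nestjs/common";
import { HumanMessage } from "@langchain/core/messages";
import { randomUUID } from "node:crypto";

// Hypothetical request shape and agent contract, for illustration only.
class ChatRequestDto { message!: string; conversationId?: string; }
interface ChatAgent {
  invoke(
    input: { messages: HumanMessage[] },
    config: { configurable: { thread_id: string } }
  ): Promise<{ messages: { content: unknown }[] }>;
}

@Controller("chat")
export class ChatController {
  // "CHAT_AGENT" is a made-up injection token; the agent itself is built as in the earlier sketch.
  constructor(@Inject("CHAT_AGENT") private readonly agent: ChatAgent) {}

  @Post()
  async chat(@Body() dto: ChatRequestDto) {
    const conversationId = dto.conversationId ?? randomUUID();
    // MemorySaver restores earlier turns for this thread before the model runs.
    const state = await this.agent.invoke(
      { messages: [new HumanMessage(dto.message)] },
      { configurable: { thread_id: conversationId } }
    );
    // Structured response: the UI renders the markdown reply and keeps the thread id.
    return { conversationId, reply: state.messages.at(-1)?.content };
  }
}
```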
This workspace uses buildable libraries with npm workspaces for code sharing between applications. This enables:
- โ Type-safe imports across apps
- โ Shared validation logic (Zod schemas)
- โ Reusable utility functions
- โ Nx build caching and optimization
- `@langchain-course-ws/communication`: Validation utilities (`safeParseOrThrow`, `callWithErrorHandling`)
- `@langchain-course-ws/model-provider`: LangChain model configuration (Mistral AI, Ollama embeddings)
- `@langchain-course-ws/chat-components`: Reusable Angular chat UI components
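For example, a chat-ui server route can consume the shared validation helpers directly; the helper signatures below are assumptions based on the names above:

```typescript
// Hypothetical usage -- the signatures of safeParseOrThrow / callWithErrorHandling are assumed.
import { safeParseOrThrow, callWithErrorHandling } from "@langchain-course-ws/communication";
import { z } from "zod";

const chatRequestSchema = z.object({
  message: z.string().min(1),
  conversationId: z.string().uuid().optional(),
});

export async function handleChatRequest(body: unknown) {
  // Validate the incoming payload against the shared Zod schema.
  const request = safeParseOrThrow(chatRequestSchema, body);
  // Wrap the backend call so errors come back in a standardized shape.
  return callWithErrorHandling(() =>
    fetch("http://localhost:3311/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(request),
    }).then((res) => res.json())
  );
}
```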
NPM Workspaces (package.json):
```json
{
  "workspaces": ["dist/libs/*"]
}
```

This creates Node.js-resolvable symlinks:

```
node_modules/@langchain-course-ws/communication → dist/libs/communication
```
Auto-Build Dependencies (nx.json):
```json
{
  "targetDefaults": {
    "@analogjs/platform:vite-dev-server": {
      "dependsOn": ["^build"]
    }
  }
}
```

Libraries are automatically built before serving applications.

```bash
# Libraries auto-build, no manual steps needed
nx serve chat-ui

# If you add a new library, recreate symlinks
npm install --legacy-peer-deps
```

📖 Full Documentation: See docs/BUILDABLE_LIBRARIES.md for complete details on:
- How buildable libraries work with AnalogJS SSR
- Adding new libraries
- Troubleshooting module resolution
- Import guidelines
Required:
- Node.js v18 or higher
- npm (comes with Node.js)
- Mistral AI API Key (Get one here)
For E-Commerce Assistant:
- Docker (for PostgreSQL + pgvector)
- Ollama (Download) with the `nomic-embed-text` model
1. Clone and install dependencies:

   ```bash
   git clone <repository-url>
   cd langchain-course-ws
   npm install
   ```

2. Configure environment:

   ```bash
   # Create .env file in the root
   cat > .env << EOF
   MISTRAL_API_KEY=your_mistral_api_key_here
   EOF
   ```

3. Start the applications:

   ```bash
   # Option 1: Run both API and UI together
   npm run dev

   # Option 2: Run individually
   npm run chat-api:dev   # API on http://localhost:3311
   npm run chat-ui:dev    # UI on http://localhost:4200
   ```

4. Access the applications:

   - Chat UI: http://localhost:4200
   - Chat API Docs: http://localhost:3311/api/docs
Click to expand full setup instructions
```bash
# Install Ollama (see https://ollama.ai/)
# Then pull the embedding model:
ollama pull nomic-embed-text
```

```bash
# Start PostgreSQL with pgvector
docker-compose -f iac/docker-compose.postgres.yml up -d

# WSL2 only: Start Ollama proxy
docker-compose -f iac/docker-compose.nginx.yml up -d
```

```bash
# Update .env with additional settings
cat >> .env << EOF
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/langchain
OLLAMA_BASE_URL=http://localhost:11435
PORT=3312
EOF
```

```bash
npm run product-ingest:migrate
```

```bash
# Build and run the ingestion pipeline
npm run product-ingest:build
node dist/apps/product-ingest/main.js ingest
```

This will process 27,752 products and generate embeddings. Time varies based on your hardware (5-30+ minutes depending on CPU/GPU).

```bash
# Terminal 1: Start API
npm run ecommerce-assistant-api:dev

# Terminal 2: Start UI
npm run ecommerce-assistant-ui:dev
```

- Shopping UI: http://localhost:4200 (AnalogJS dev server)
- API Documentation: http://localhost:3312/api/docs
- Database: `psql postgresql://postgres:postgres@localhost:5432/langchain`
See detailed documentation:
The workspace uses Vitest for unit testing and Playwright for E2E testing.
```bash
# Run all tests
npx nx run-many -t test

# Run specific project tests
npm run chat-api:test
npm run chat-ui:test
npm run chat-components:test
npm run communication:test
```

- Visual Graph: `npx nx graph` - See how projects depend on each other.
- Generate Code: `npx nx g @nx/angular:component my-component --project=chat-ui`
- Linting: `npx nx run-many -t lint`
Applications:
- Chat API - Multi-turn conversation API with LangChain agent
- Chat UI Components - Angular components architecture
- E-Commerce Assistant API - Semantic product search API
- E-Commerce Assistant UI - Shopping assistant interface
- Product Ingest Pipeline - Vector embedding generation
Libraries:
- Chat Components - Reusable Angular chat UI components
- Model Provider - Centralized Mistral AI configuration
- Communication - Shared API utilities, validation, and error handling
Infrastructure & Services:
- Infrastructure Setup - Docker configurations for PostgreSQL and Ollama
- Chat Store - NgRx Signal Store patterns
- Shared Schemas - Zod validation schemas
LangChain & AI:
- LangChain JS Documentation - JavaScript/TypeScript library docs
- LangGraph Documentation - State management for agents
- Mistral AI API - LLM provider documentation
- Ollama Documentation - Local model hosting
- pgvector GitHub - Vector similarity search
Frontend Technologies:
- Angular - Modern web framework
- AnalogJS - Meta-framework for Angular with SSR
- NgRx Signal Store - Reactive state management
- Tailwind CSS - Utility-first CSS
- DaisyUI - Tailwind component library
Backend & Tooling:
- NestJS - Enterprise Node.js framework
- Nx - Smart monorepo tools
- Zod - TypeScript-first schema validation
- Streaming Responses: Implement Server-Sent Events (SSE) for real-time message streaming
- Persistent Memory: Replace in-memory checkpointer with PostgreSQL/Redis storage
- Dynamic Categories: Auto-detect product categories from database instead of hardcoding
- User Authentication: Add login/signup with session management
- Conversation Management: Add UI for viewing, searching, and deleting past conversations
- Hybrid Search: Combine semantic search with keyword search and filters (price, rating)
- Product Comparison: Tool for side-by-side product comparison
- Shopping Cart: Full cart management with checkout flow
- Personalized Recommendations: User preference learning and recommendation engine
- Multi-modal Support: Image understanding for product visuals
- Voice Interface: Speech-to-text and text-to-speech integration
- Rate Limiting: Implement API rate limiting per user/session
- Conversation Summarization: Automatic summarization for long conversations
- Observability: Add logging, metrics, and tracing (OpenTelemetry)
- Increased Test Coverage: More comprehensive unit and integration tests
- CI/CD: Automated deployment pipeline
- Production Deployment: Containerization and cloud deployment (AWS/Azure/GCP)
```
langchain-course-ws/
├── apps/
│   ├── chat-api/                 # NestJS API with LangChain agent
│   ├── chat-ui/                  # AnalogJS/Angular frontend
│   ├── ecommerce-assistant-api/  # E-commerce semantic search API
│   ├── ecommerce-assistant-ui/   # Shopping assistant UI
│   ├── product-ingest/           # Vector embedding pipeline
│   └── hello-agent/              # CLI demo tool
├── libs/
│   ├── chat-components/          # Reusable Angular UI components
│   ├── communication/            # Shared API utilities and validation
│   └── model-provider/           # Mistral AI configuration library
├── iac/
│   ├── docker-compose.postgres.yml  # PostgreSQL + pgvector
│   └── docker-compose.nginx.yml     # Ollama proxy (WSL2)
├── data/                         # Product datasets
└── doc-images/                   # Documentation images
```
Built as part of the LangChain learning journey. Special thanks to the open-source communities behind:
- LangChain & LangGraph
- Angular & AnalogJS
- Nx & NestJS
- Mistral AI & Ollama
Made with ❤️ and AI

