AI microservice exposing the project’s genaiflows as an HTTP API, ready to live in a separate repository.
Access note: To access our main back-end or private resources, please request access at andresr@asharastudios.com.
Network note: Any contract or network references must target SEI EVM. If you see examples for other chains, adapt them to SEI EVM.
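Each genaiflow is wrapped behind a plain HTTP route. The sketch below is illustrative only: it assumes an Express-style server and a hypothetical `townerMessage` flow export, plus the port 8080 exposed by the Dockerfile further down; the actual wiring in this service may differ.

```ts
// Illustrative sketch only. Assumes an Express-style server and a hypothetical
// `townerMessage` genaiflow export; the real route setup in this repo may differ.
import express from "express";
// Hypothetical import path and function name, shown for illustration.
import { townerMessage } from "./genaiflows/townermessage";

const app = express();
app.use(express.json());

// Expose one genaiflow as an HTTP endpoint.
app.post("/towner/message", async (req, res) => {
  try {
    const { towner, language } = req.body;
    const message = await townerMessage(towner, language);
    res.json({ message });
  } catch (err) {
    res.status(500).json({ error: (err as Error).message });
  }
});

// Port 8080 matches the EXPOSE line in the Dockerfile below.
app.listen(8080);
```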
Endpoints:

- `POST /towner/message`
  - body: `{ towner: Towner, language?: string }` (the expected `Towner` structure is defined in `src/genaiflows/townermessage/types.ts`)
  - response: `{ message: string }`
- `POST /nft/analyze`
  - body: `{ metadata: object | string }`
  - response: `{ message: string }`

Both endpoints accept and return JSON; example calls are sketched below.
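A minimal TypeScript client sketch for both endpoints, assuming the service is reachable at `http://localhost:8080` (the port used in the Docker example below); the `towner` argument must satisfy the `Towner` type from `src/genaiflows/townermessage/types.ts`, which is not reproduced here.

```ts
// Hypothetical client sketch; the base URL is an assumption, not part of the service.
const BASE_URL = "http://localhost:8080";

async function townerMessage(towner: unknown, language = "en"): Promise<string> {
  const res = await fetch(`${BASE_URL}/towner/message`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // `towner` must match the Towner type from src/genaiflows/townermessage/types.ts
    body: JSON.stringify({ towner, language }),
  });
  if (!res.ok) throw new Error(`towner/message failed: ${res.status}`);
  const { message } = (await res.json()) as { message: string };
  return message;
}

async function analyzeNft(metadata: object | string): Promise<string> {
  const res = await fetch(`${BASE_URL}/nft/analyze`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ metadata }),
  });
  if (!res.ok) throw new Error(`nft/analyze failed: ${res.status}`);
  const { message } = (await res.json()) as { message: string };
  return message;
}
```

Node 18+ ships a global `fetch`, so no extra HTTP client is needed for a quick smoke test.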
Requirements:
- Node 18+
- `GOOGLE_API_KEY` environment variable for `@langchain/google-genai`
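The key is consumed by the `@langchain/google-genai` integration. Below is a minimal startup-validation sketch, assuming the flows use `ChatGoogleGenerativeAI`; the model name is only an example and constructor options can vary between package versions.

```ts
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";

// Fail fast if the key is missing instead of erroring on the first request.
if (!process.env.GOOGLE_API_KEY) {
  throw new Error("GOOGLE_API_KEY is required for @langchain/google-genai");
}

// Example model name only; the genaiflows may be configured differently.
const model = new ChatGoogleGenerativeAI({
  model: "gemini-1.5-flash",
  apiKey: process.env.GOOGLE_API_KEY,
});
```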
Setup and run:

```bash
npm install
npm run build
npm start

# watch mode
npm run dev
```

Dockerfile:

```dockerfile
# syntax=docker/dockerfile:1
FROM node:20-alpine AS deps
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
FROM node:20-alpine AS build
WORKDIR /app
# Install all dependencies (including dev) needed to compile the sources.
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
FROM node:20-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production
COPY --from=deps /app/node_modules ./node_modules
COPY --from=build /app/dist ./dist
COPY package.json .
EXPOSE 8080
CMD ["node", "dist/index.js"]
```

Build & run:

```bash
docker build -t ai-towner-service:latest .
docker run -e GOOGLE_API_KEY=xxxxx -p 8080:8080 ai-towner-service:latest
```