Overview
The platform is a conversational interface similar to ChatGPT, designed to integrate
LLMs from multiple providers behind a single chat experience. The backend handles real-time
communication with the selected model, manages user sessions, stores contextual memories, and
performs file operations. Uploaded files and their embeddings are also stored so that answers
can draw on user-provided context via RAG (Retrieval-Augmented Generation).
Additionally, pre-built Agents are available for more complex tasks such as Deep Research.
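The RAG flow described above can be sketched as a retrieval step over stored chunk embeddings followed by prompt assembly. This is a minimal illustration, not the platform's actual code: the `Chunk` shape, `retrieveContext`, and `buildPrompt` names are assumptions, and the embeddings are taken as already computed.

```typescript
// Hypothetical shape of a stored file chunk with its embedding.
interface Chunk {
  fileId: string;
  text: string;
  embedding: number[];
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Rank stored chunks against the query embedding and keep the top k.
function retrieveContext(query: number[], chunks: Chunk[], k = 3): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}

// The retrieved text is prepended to the user's question before the
// combined prompt is sent to the selected LLM.
function buildPrompt(question: string, context: Chunk[]): string {
  const ctx = context.map((c) => c.text).join("\n---\n");
  return `Context:\n${ctx}\n\nQuestion: ${question}`;
}
```

In practice the embedding lookup would be a vector query against the store rather than an in-memory sort, but the ranking logic is the same.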
Data Storage Architecture
- Chat History - JSON Blob via Turso:
  All user-generated chats are stored in a structured JSON blob format using Turso, a
  distributed edge database optimized for low-latency reads and writes.
  - Format: chat_id, user_id, name, messages[], timestamp
  - Justification: the JSON format allows flexibility in storing metadata such as token
    usage, model responses, and user edits.
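The blob format listed above can be expressed as a type. This is a sketch inferred from the listed fields; the per-message fields (`role`, `content`, `model`, `tokens`) are assumptions based on the metadata the justification mentions, not the actual schema.

```typescript
// Assumed per-message shape; `model` and `tokens` cover the token-usage
// and model-response metadata the blob is said to carry.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
  model?: string;
  tokens?: number;
}

// The documented blob format: chat_id, user_id, name, messages[], timestamp.
interface ChatBlob {
  chat_id: string;
  user_id: string;
  name: string;
  messages: ChatMessage[];
  timestamp: string; // ISO 8601
}

// Turso stores the whole conversation as one JSON value, so a read or
// write serializes the entire blob in a single round trip.
function serializeChat(chat: ChatBlob): string {
  return JSON.stringify(chat);
}

function deserializeChat(raw: string): ChatBlob {
  return JSON.parse(raw) as ChatBlob;
}
```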
- Users & Sessions - SQL with Clerk:
  User authentication, session persistence, and metadata (such as roles and permissions)
  are managed via Clerk on top of a traditional SQL database.
  - Schema: Users, Sessions, API keys, OAuth tokens
  - Security: built-in 2FA, passwordless auth, session encryption
- Files & Images - Object Storage via UploadThing:
  User-uploaded assets (files, images, media) are stored using UploadThing on object
  storage systems (e.g., AWS S3 under the hood).
  - CDN-enabled for fast access
  - Metadata (file type, size, access history) linked to SQL
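The SQL-linked metadata row might look like the following. This is an illustrative sketch only; the field names are assumptions, not the real schema, and the object itself stays in S3 behind the CDN URL UploadThing returns.

```typescript
// Assumed metadata row kept in SQL for each uploaded object.
interface FileRecord {
  id: string;
  userId: string;
  url: string;          // CDN URL returned by UploadThing
  mimeType: string;     // file type
  sizeBytes: number;    // size
  lastAccessed: string; // access history (ISO timestamp)
}

// Filtering metadata by user lets the backend list or attach a user's
// files without touching object storage until content is actually read.
function filesForUser(rows: FileRecord[], userId: string): FileRecord[] {
  return rows.filter((r) => r.userId === userId);
}
```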
- Database: MongoDB, Redis
- Cloud: AWS (EC2, S3), Docker
Challenges
- Integrating multiple LLMs with varying APIs and response formats.
- Optimizing real-time communication for low latency.
- Managing contextual memory across long user sessions.
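The first challenge, normalizing providers with varying APIs and response formats, is commonly handled with an adapter per provider. The sketch below is illustrative, assuming simplified OpenAI-style and Anthropic-style payload shapes; the `NormalizedReply` type and adapter names are hypothetical.

```typescript
// One shape the rest of the backend sees, regardless of provider.
interface NormalizedReply {
  text: string;
  model: string;
  tokens: number;
}

type Adapter = (raw: any) => NormalizedReply;

// Simplified OpenAI-style payload: { model, choices: [{ message: { content } }], usage }.
const openAiStyle: Adapter = (raw) => ({
  text: raw.choices[0].message.content,
  model: raw.model,
  tokens: raw.usage?.total_tokens ?? 0,
});

// Simplified Anthropic-style payload: { model, content: [{ text }], usage }.
const anthropicStyle: Adapter = (raw) => ({
  text: raw.content[0].text,
  model: raw.model,
  tokens: (raw.usage?.input_tokens ?? 0) + (raw.usage?.output_tokens ?? 0),
});

const adapters: Record<string, Adapter> = {
  openai: openAiStyle,
  anthropic: anthropicStyle,
};

// Single entry point: map any provider's raw response to the common shape.
function normalize(provider: string, raw: unknown): NormalizedReply {
  const adapt = adapters[provider];
  if (!adapt) throw new Error(`unknown provider: ${provider}`);
  return adapt(raw);
}
```

Keeping the mapping at the edge like this means session handling, chat storage, and the RAG layer never branch on which provider answered.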
Outcomes
The FlowGPT platform achieved a 95% user satisfaction rate in beta testing, with an average response time of under 200ms. It successfully handled 10,000+ concurrent users and supported multilingual interactions. The project was recognized for its innovative approach to LLM integration and scalability.