Project Red Queen is a sophisticated full-stack AI chat application that brings intelligent, context-aware conversations to users through a modern web interface. Named after the relentless character from Lewis Carroll's Through the Looking-Glass, the app embodies continuous adaptation and evolution in AI interactions.
- Framework: Django 5.2+ with Django REST Framework
- AI Integration: Google Gemini for natural language processing and multimodal AI responses
- Vector Database: ChromaDB for persistent conversation memory and context retrieval
- Document Processing: LlamaIndex for intelligent document ingestion and querying
- Audio Capabilities: Text-to-speech synthesis using Edge TTS and audio analysis with Librosa
- Deployment: Railway with Nixpacks, supporting Python 3.12 and uv package management
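For a concrete picture of how these pieces fit together, here is a minimal sketch of the core request path, pairing a Gemini completion with ChromaDB context retrieval. The package APIs (`google-generativeai`, `chromadb`) are real, but the model name, collection name, and `generate_reply` helper are illustrative assumptions rather than the project's actual code.

```python
# Illustrative sketch: retrieve prior context from ChromaDB, then ask Gemini.
# The "conversations" collection, model name, and generate_reply() are hypothetical.
import os

import chromadb
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# A persistent client keeps conversation memory on disk between restarts.
chroma = chromadb.PersistentClient(path="./chroma_store")
conversations = chroma.get_or_create_collection("conversations")

def generate_reply(user_message: str) -> str:
    """Ground a Gemini reply in the most relevant stored conversation snippets."""
    hits = conversations.query(query_texts=[user_message], n_results=3)
    context = "\n".join(hits["documents"][0]) if hits["documents"] else ""

    model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name
    prompt = f"Context from earlier turns:\n{context}\n\nUser: {user_message}"
    response = model.generate_content(prompt)

    # Store the new turn so future queries can retrieve it.
    conversations.add(
        documents=[f"User: {user_message}\nAssistant: {response.text}"],
        ids=[f"turn-{conversations.count()}"],
    )
    return response.text
```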
- Framework: Next.js with TypeScript
- UI/UX: Responsive design with real-time chat interface
- Deployment: Vercel for seamless hosting and CDN
- Natural language processing with Google Gemini AI
- Context-aware responses using vector embeddings
- Multimodal input support (text, potentially images/audio)
- Persistent conversation history
- Document intelligence through LlamaIndex
- Audio processing for voice interactions
- Customizable system prompts
- Extensible AI model integrations
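The document-intelligence feature presumably builds on LlamaIndex's standard ingest-and-query pipeline; here is a minimal sketch under that assumption (the `documents/` directory and the question are placeholders, and LlamaIndex still needs an LLM/embedding backend configured, which defaults to OpenAI):

```python
# Sketch of LlamaIndex document ingestion and querying (llama-index-core API).
# The "documents/" directory and the question are placeholders.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("documents").load_data()   # ingest local files
index = VectorStoreIndex.from_documents(documents)           # build a vector index
query_engine = index.as_query_engine()

answer = query_engine.query("Summarize the uploaded documents.")
print(answer)
```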
- HTTPS-enabled secure communications
- CORS configuration for cross-origin requests
- Scalable deployment on Railway
- Environment-based configuration management
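To illustrate the CORS and environment-driven configuration, a Django settings excerpt might look roughly like this. It assumes `django-cors-headers` and mirrors the `.env` variables shown later in this README; treat it as a sketch, not the project's actual `config/settings.py`.

```python
# config/settings.py (excerpt) -- illustrative, not the project's actual file.
# Reads secrets and hosts from the environment and enables CORS for the
# Next.js frontend via django-cors-headers.
import os

SECRET_KEY = os.environ["SECRET_KEY"]
DEBUG = os.environ.get("DEBUG", "False") == "True"
ALLOWED_HOSTS = os.environ.get("ALLOWED_HOSTS", "localhost").split(",")

INSTALLED_APPS = [
    # ... Django defaults ...
    "corsheaders",
    "rest_framework",
    "ai_app",
]

MIDDLEWARE = [
    "corsheaders.middleware.CorsMiddleware",  # must sit high in the stack
    "django.middleware.common.CommonMiddleware",
    # ... remaining middleware ...
]

# Allow the local Next.js dev server; add the Vercel URL in production.
CORS_ALLOWED_ORIGINS = [
    "http://localhost:3000",
]
```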
- Backend: Python 3.12, Django, Google Generative AI, ChromaDB, LlamaIndex, Librosa, Edge TTS
- Frontend: TypeScript, Next.js, React
- Database: SQLite (development), configurable for PostgreSQL/MySQL in production
- Deployment: Railway (backend), Vercel (frontend)
- Package Management: uv for Python dependencies
- Version Control: Git with GitHub
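On the audio side, here is a rough sketch of how Edge TTS synthesis and Librosa analysis could be chained. The voice name and file path are placeholders, and Edge TTS is async, so the call is wrapped in `asyncio.run`.

```python
# Illustrative audio round-trip: synthesize speech with Edge TTS, then
# analyze the result with Librosa. Voice and paths are placeholders.
import asyncio

import edge_tts
import librosa

async def synthesize(text: str, path: str = "reply.mp3") -> str:
    communicate = edge_tts.Communicate(text, voice="en-US-AriaNeural")
    await communicate.save(path)
    return path

path = asyncio.run(synthesize("Hello from the Red Queen."))

# Load the synthesized audio and pull a couple of simple features.
y, sr = librosa.load(path)
duration = librosa.get_duration(y=y, sr=sr)
loudness = librosa.feature.rms(y=y).mean()
print(f"{duration:.1f}s of audio, mean RMS energy {loudness:.4f}")
```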
- Python 3.12+
- Node.js 18+
- Git
- Clone the repository

  ```bash
  git clone https://github.com/eddietal2/Project_Red_Queen.git
  cd Project_Red_Queen
  ```

- Backend Setup

  ```bash
  cd b-e
  uv sync
  cp .env.example .env  # Configure your environment variables
  python manage.py migrate
  python manage.py runserver
  ```

- Frontend Setup

  ```bash
  cd f-e
  npm install
  cp .env.example .env.local  # Set NEXT_PUBLIC_API_URL to backend URL
  npm run dev
  ```

- Access the application
  - Frontend: http://localhost:3000
  - Backend API: http://localhost:8000
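Once both servers are running, a quick reachability check of the backend can be done from Python; the root path below is only a placeholder, since the real routes live in `ai_app`:

```python
# Quick reachability check for the local backend; the path is a placeholder,
# as the actual routes are defined in ai_app's URL configuration.
import requests

resp = requests.get("http://localhost:8000/", timeout=5)
print(resp.status_code)  # any HTTP response (even 404) means Django is up
```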
```
Project_Red_Queen/
├── b-e/ # Backend (Django)
│ ├── config/ # Django settings
│ ├── ai_app/ # Main AI application
│ ├── manage.py
│ ├── pyproject.toml
│ └── uv.lock
├── f-e/ # Frontend (Next.js)
│ ├── app/
│ ├── components/
│ ├── package.json
│ └── next.config.js
├── .gitignore
├── README.md
└── LICENSE
```
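Inside `ai_app`, the chat flow presumably surfaces as a small Django REST Framework endpoint. The sketch below is hypothetical: the view name, URL, and the `generate_reply` helper (from the earlier sketch) are assumptions, not the project's actual code.

```python
# ai_app/views.py (hypothetical) -- a thin DRF wrapper around the AI pipeline.
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView

from .services import generate_reply  # hypothetical helper module

class ChatView(APIView):
    """Accepts a user message and returns the model's reply."""

    def post(self, request):
        message = request.data.get("message", "").strip()
        if not message:
            return Response(
                {"error": "message is required"},
                status=status.HTTP_400_BAD_REQUEST,
            )
        reply = generate_reply(message)
        return Response({"reply": reply})
```

Wiring it up would then be a one-liner in `ai_app/urls.py`, e.g. `path("api/chat/", ChatView.as_view())`.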
Backend (`b-e/.env`):

```
GOOGLE_API_KEY=your_google_api_key
DEBUG=True
SECRET_KEY=your_django_secret_key
ALLOWED_HOSTS=localhost,127.0.0.1
```

Frontend (`f-e/.env.local`):

```
NEXT_PUBLIC_API_URL=http://localhost:8000
```

To deploy the backend on Railway:

- Connect your GitHub repository to Railway
- Set environment variables in Railway dashboard
- Deploy automatically on push
To deploy the frontend on Vercel:

- Connect your GitHub repository to Vercel
- Set `NEXT_PUBLIC_API_URL` to your Railway backend URL
- Deploy automatically
To run the test suites:

```bash
# Backend tests
cd b-e
python manage.py test

# Frontend tests
cd f-e
npm test
```
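As a starting point for backend tests, a minimal Django REST Framework test case might look like this; the `/api/chat/` route is hypothetical and should match whatever `ai_app` actually exposes.

```python
# ai_app/tests.py (illustrative) -- exercises a hypothetical chat endpoint.
from rest_framework.test import APITestCase

class ChatEndpointTests(APITestCase):
    def test_rejects_empty_message(self):
        # The /api/chat/ path is an assumption; adjust to the real route.
        response = self.client.post("/api/chat/", {"message": ""}, format="json")
        self.assertEqual(response.status_code, 400)
```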
We welcome contributions! Please see our Contributing Guidelines for details.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Version: 1.0.0 (Stable Release)
- Development: Active
- License: MIT
- Contributors: Open to contributions
For support, email eddietaylor@example.com or join our Discord community.
Special thanks to the open-source community for the amazing libraries that made this project possible: Django, Next.js, Google AI, ChromaDB, and many more.
The Red Queen is live and ready to chat! 🎭
For more information, visit our documentation or check out the issues page.