Legal Intel Dashboard

A full-stack platform for legal document processing, natural language querying, and intelligent metadata extraction. Built with a FastAPI backend and a React + TypeScript frontend, and structured for production use.

🚀 Features

  • Document Upload: Drag-and-drop interface for multiple PDF/DOCX legal documents
  • Intelligent Processing: Automatic extraction of metadata (agreement types, jurisdictions, industries, geography)
  • Natural Language Querying: Ask questions in plain English across your document collection
  • Interactive Dashboard: Visual insights with charts and analytics
  • Secure Authentication: JWT-based security with rate limiting
  • Scalable Architecture: Built for production with proper error handling and logging

๐Ÿ—๏ธ Architecture

Backend (Python/FastAPI)

  • Framework: FastAPI with Python 3.9+
  • Database: SQLAlchemy ORM with SQLite (configurable for PostgreSQL/MySQL)
  • Authentication: JWT tokens with refresh mechanism
  • Document Processing: Intelligent pattern recognition for metadata extraction
  • API Design: RESTful endpoints with proper validation and error handling
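A minimal sketch of that endpoint style (the route, model, and data below are illustrative, not the project's actual code):

from typing import Optional

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Legal Intel Dashboard")

# In-memory stand-in for the real SQLAlchemy-backed store.
_DOCUMENTS = {1: {"id": 1, "filename": "nda_abudhabi.pdf", "governing_law": "UAE"}}

class DocumentOut(BaseModel):
    id: int
    filename: str
    governing_law: Optional[str] = None

@app.get("/documents/{document_id}", response_model=DocumentOut)
def get_document(document_id: int) -> DocumentOut:
    record = _DOCUMENTS.get(document_id)
    if record is None:
        # Consistent error handling: unknown IDs yield a 404 with a clear message.
        raise HTTPException(status_code=404, detail="Document not found")
    return DocumentOut(**record)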

Frontend (React/TypeScript)

  • Framework: React 18 with TypeScript
  • Styling: Tailwind CSS with responsive design
  • Charts: Recharts for data visualization
  • State Management: React Context API with hooks
  • File Handling: React Dropzone for document uploads

📋 Requirements

  • Python 3.9+
  • Node.js 18+
  • Poetry (Python dependency management)
  • npm or yarn

🛠️ Installation & Setup

Backend Setup

  1. Install Poetry (if not already installed):

    curl -sSL https://install.python-poetry.org | python3 -
  2. Install Dependencies:

    poetry install
  3. Environment Configuration:

    cp env.example .env
    # Edit .env with your configuration
  4. Run Backend:

    poetry run python run.py

Frontend Setup

  1. Install Dependencies:

    cd frontend
    npm install
  2. Run Development Server:

    npm run dev
  3. Build for Production:

    npm run build

🔧 Configuration

Environment Variables (.env)

# Database Configuration
DATABASE_URL=sqlite:///./legal_intel.db

# File Upload Settings
UPLOAD_DIR=uploads
MAX_FILE_SIZE=10485760  # 10 MiB, in bytes

# Security
SECRET_KEY=your-super-secret-key-change-this-in-production
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7

# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_DB=0
REDIS_PASSWORD=

# Rate Limiting
RATE_LIMIT_PER_MINUTE=60
RATE_LIMIT_PER_HOUR=1000
MAX_CONCURRENT_UPLOADS=5

# AI/LLM Configuration
OPENAI_API_KEY=your-openai-api-key
OPENAI_MODEL=gpt-4
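One common way to load these values in a FastAPI project is a pydantic settings class. The sketch below assumes pydantic v1-style BaseSettings (pydantic v2 moves it to the pydantic-settings package) and mirrors a subset of the variables above; it is not necessarily the project's actual settings module:

from pydantic import BaseSettings

class Settings(BaseSettings):
    database_url: str = "sqlite:///./legal_intel.db"
    upload_dir: str = "uploads"
    max_file_size: int = 10485760  # bytes
    secret_key: str = "change-me"
    algorithm: str = "HS256"
    access_token_expire_minutes: int = 30
    redis_host: str = "localhost"
    redis_port: int = 6379
    openai_api_key: str = ""
    openai_model: str = "gpt-4"

    class Config:
        env_file = ".env"  # values from .env override the defaults above

settings = Settings()  # environment variables match field names case-insensitively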

📚 API Endpoints

Authentication

  • POST /auth/register - User registration
  • POST /auth/login - User login
  • POST /auth/refresh - Refresh access token
  • POST /auth/logout - User logout

Documents

  • POST /documents/upload - Upload multiple documents
  • GET /documents/ - List all documents
  • GET /documents/{id} - Get document details
  • DELETE /documents/{id} - Delete document
  • GET /documents/stats/summary - Document statistics

Query

  • POST /query - Natural language query across documents
  • GET /query/history - Query history

Dashboard

  • GET /dashboard - Dashboard analytics and insights
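As an illustration of the request flow, logging in and running a query from Python might look like the following (assuming the API runs on the default port 8000; the payload field names are assumptions, not confirmed from the project):

import requests

BASE_URL = "http://localhost:8000"

# Obtain a bearer token (request/response field names are illustrative).
login = requests.post(f"{BASE_URL}/auth/login",
                      json={"email": "user@example.com", "password": "secret"})
login.raise_for_status()
token = login.json()["access_token"]

# Ask a natural language question across the uploaded documents.
answer = requests.post(f"{BASE_URL}/query",
                       json={"question": "Which agreements are governed by UAE law?"},
                       headers={"Authorization": f"Bearer {token}"})
print(answer.json())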

🔍 Usage Examples

Natural Language Queries

Example 1: Find agreements governed by UAE law

Question: "Which agreements are governed by UAE law?"

Response:

{
  "results": [
    {
      "document": "nda_abudhabi.pdf",
      "governing_law": "UAE"
    },
    {
      "document": "supplier_contract_dubai.docx",
      "governing_law": "UAE"
    }
  ],
  "total_results": 2
}

Example 2: Find technology industry contracts

Question: "Show me all contracts in the technology industry"

Document Upload

  1. Navigate to the Upload page
  2. Drag and drop PDF/DOCX files or click to browse
  3. Files are automatically processed and metadata extracted
  4. View processing status and results
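
The same upload can also be scripted against POST /documents/upload; here is a sketch using the requests library (the multipart field name and file names are assumptions):

import requests

token = "<access token from /auth/login>"

files = [
    ("files", ("nda_abudhabi.pdf", open("nda_abudhabi.pdf", "rb"), "application/pdf")),
    ("files", ("supplier_contract_dubai.docx", open("supplier_contract_dubai.docx", "rb"),
               "application/vnd.openxmlformats-officedocument.wordprocessingml.document")),
]
response = requests.post("http://localhost:8000/documents/upload",
                         files=files,
                         headers={"Authorization": f"Bearer {token}"})
print(response.json())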

🧪 Testing

Backend Tests

poetry run pytest            # run the full suite
poetry run pytest tests/ -v  # verbose run of the tests/ directory
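
A minimal test in that suite might use FastAPI's TestClient; the sketch below assumes the application instance is importable from app.main (adjust to the actual project layout):

from fastapi.testclient import TestClient

from app.main import app  # assumption: the FastAPI instance is exposed here

client = TestClient(app)

def test_list_documents_requires_auth():
    # Protected endpoints should reject unauthenticated requests.
    response = client.get("/documents/")
    assert response.status_code in (401, 403)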

Frontend Tests

cd frontend
npm test

Code Quality

# Backend
poetry run black .
poetry run isort .
poetry run flake8
poetry run mypy .
poetry run pylint app/

# Frontend
cd frontend
npm run lint

🚀 Production Deployment

Backend Deployment

  1. Database: Use PostgreSQL or MySQL for production
  2. Redis: Configure Redis for session management and caching
  3. File Storage: Use cloud storage (AWS S3, Azure Blob) for documents
  4. Environment: Set production environment variables
  5. Process Manager: Use systemd, supervisor, or Docker

Frontend Deployment

  1. Build: npm run build
  2. Serve: Use nginx, Apache, or cloud hosting
  3. CDN: Configure CDN for static assets

Docker Deployment

# Build and run with Docker Compose
docker-compose up -d

🔒 Security Features

  • JWT-based authentication with refresh tokens
  • Rate limiting on API endpoints
  • File type validation and size limits
  • SQL injection protection via SQLAlchemy
  • CORS configuration for frontend integration
  • Secure password hashing with bcrypt
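
A sketch of the password-hashing and token-issuing primitives behind these features, assuming the commonly used passlib and python-jose libraries (the project may use different ones):

from datetime import datetime, timedelta

from jose import jwt
from passlib.context import CryptContext

SECRET_KEY = "your-super-secret-key-change-this-in-production"
ALGORITHM = "HS256"

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

def hash_password(password: str) -> str:
    # bcrypt with a per-password salt; check later with pwd_context.verify().
    return pwd_context.hash(password)

def create_access_token(subject: str, expires_minutes: int = 30) -> str:
    payload = {"sub": subject,
               "exp": datetime.utcnow() + timedelta(minutes=expires_minutes)}
    return jwt.encode(payload, SECRET_KEY, algorithm=ALGORITHM)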

📊 Performance & Scaling

Current Implementation

  • SQLite database (suitable for development/small scale)
  • In-memory processing for document analysis
  • Basic caching with Redis

Production Scaling

  • Database: PostgreSQL with connection pooling
  • Document Processing: Async processing with Celery
  • Caching: Redis cluster for distributed caching
  • Load Balancing: Multiple backend instances
  • CDN: CloudFront/Akamai for static assets

Large Document Handling

For production use with large documents:

  • Implement streaming file processing
  • Use background tasks for document analysis
  • Implement document chunking for very large files
  • Add progress tracking for long-running operations
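
A sketch of the background-processing and chunking idea using Celery with the Redis broker configured above (the task name, chunk size, and progress fields are illustrative, not the project's implementation):

from celery import Celery

celery_app = Celery("legal_intel", broker="redis://localhost:6379/0")

CHUNK_SIZE = 64 * 1024  # read 64 KB at a time instead of loading the whole file

@celery_app.task(bind=True)
def process_document(self, document_id: int, path: str) -> int:
    """Stream a document in chunks and report progress while it is analyzed."""
    bytes_processed = 0
    with open(path, "rb") as handle:
        while True:
            chunk = handle.read(CHUNK_SIZE)
            if not chunk:
                break
            bytes_processed += len(chunk)
            # Real metadata extraction would run here on each chunk.
            self.update_state(state="PROGRESS",
                              meta={"document_id": document_id,
                                    "bytes_processed": bytes_processed})
    return bytes_processed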

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass
  6. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

For support and questions:

  • Create an issue in the GitHub repository
  • Check the documentation
  • Review the API endpoints

🔮 Future Enhancements

  • Advanced AI: Integration with real LLM APIs (OpenAI, Anthropic)
  • Document Indexing: Elasticsearch for advanced search capabilities
  • Real-time Updates: WebSocket support for live document processing
  • Advanced Analytics: Machine learning insights and trend analysis
  • Multi-language Support: Internationalization for global use
  • API Versioning: Proper API versioning for production use
  • Monitoring: Prometheus metrics and Grafana dashboards
  • CI/CD: Automated testing and deployment pipelines
