
feat: OAuth Codex authentication + LiteLLM security upgrade (CVE-2026-35030 & supply chain fix)#14

Closed
NeritonDias wants to merge 2 commits into EvolutionAPI:main from NeritonDias:main

Conversation


@NeritonDias NeritonDias commented Apr 16, 2026

Summary

Adds OAuth Codex (OpenAI) as an alternative authentication method for AI providers in Evo CRM, allowing users with ChatGPT Plus/Pro subscriptions ($20-$200/mo) to use GPT-5.x models without a separate OpenAI API key.

Hybrid approach: OAuth works alongside existing API keys. No current functionality is changed or removed.

Changes

Backend (Processor)

  • Alembic migration: auth_type + oauth_data columns on api_keys table
  • OAuth device code flow service (initiate, poll, token refresh, status, revoke)
  • Thread-safe token refresh with SELECT FOR UPDATE row-level locking
  • Fernet encryption for OAuth token storage
  • 4 new REST endpoints under /agents/oauth/codex/*
  • Model remapping: chatgpt/ to openai/ prefix with custom api_base
  • Constants configurable via CODEX_CLIENT_ID env var
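The column additions and dual-mode constraints described above can be sketched with a runnable SQLite stand-in (the real migration is Alembic on Postgres; constraint names and exact SQL here are assumptions based on this description, not the PR's literal code):

```python
import sqlite3

# Stand-in for the post-migration api_keys shape: encrypted_key becomes
# nullable, auth_type defaults to 'api_key', and CHECK constraints tie each
# auth mode to the credential column it requires.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE api_keys (
    id INTEGER PRIMARY KEY,
    provider TEXT NOT NULL,
    encrypted_key TEXT,                          -- now nullable
    auth_type TEXT NOT NULL DEFAULT 'api_key',
    oauth_data TEXT,
    CHECK (auth_type IN ('api_key', 'oauth_codex')),
    CHECK ((auth_type = 'api_key' AND encrypted_key IS NOT NULL)
        OR (auth_type = 'oauth_codex' AND oauth_data IS NOT NULL))
)
""")

# Existing-style rows keep working via the 'api_key' default.
conn.execute(
    "INSERT INTO api_keys (provider, encrypted_key) VALUES ('openai', 'enc...')")
# OAuth rows carry oauth_data instead of encrypted_key.
conn.execute(
    "INSERT INTO api_keys (provider, auth_type, oauth_data) "
    "VALUES ('openai-codex', 'oauth_codex', '{}')")

# An oauth_codex row without oauth_data is rejected by the CHECK constraint.
try:
    conn.execute(
        "INSERT INTO api_keys (provider, auth_type) VALUES ('openai-codex', 'oauth_codex')")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)  # True
```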

Frontend

  • New provider "ChatGPT (OAuth)" with 10 GPT-5.x models
  • OAuthDeviceCodeFlow component (user_code display, polling, countdown)
  • OAuthStatusBadge component (connected/disconnected status)
  • ApiKeysDialog conditional UI for OAuth vs API key providers

Tests

  • 22 integration tests across 7 test classes covering full OAuth flow

Security: LiteLLM Upgrade v1.68.0 -> v1.83.3

The current pin litellm>=1.68.0,<1.69.0 has known vulnerabilities:

  • CVE-2026-35030 (CRITICAL): OIDC auth bypass - used only the first 20 characters of the JWT as the cache key, allowing session hijacking. Fixed in v1.83.0.
  • Supply Chain Attack (March 2026): v1.82.7 and v1.82.8 were compromised by TeamPCP via Trivy CI/CD poisoning. The payload collected and exfiltrated all environment credentials (AWS/GCP/Azure/K8s/SSH/DB); the affected versions were quarantined within ~40 minutes.
  • v1.83.3-stable built on new CI/CD v2 pipeline with SHA-pinned Actions, OIDC Trusted Publishers, Cosign signing, and SLSA provenance.

Verified SHA-256:

wheel:  eab4d2e1871cac0239799c33eb724d239116bf1bd275e287f92ae76ba8c7a05a
tar.gz: 38a452f708f9bb682fdfc3607aa44d68cfe936bf4a18683b0cdc5fb476424a6f
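One way to enforce the digests above at install time is pip's hash-checking mode. A minimal sketch (illustrative only; verify the hashes against PyPI yourself before relying on them):

```shell
# Pin litellm and attach the SHA-256 digests quoted above.
cat > requirements.txt <<'EOF'
litellm==1.83.3 \
    --hash=sha256:eab4d2e1871cac0239799c33eb724d239116bf1bd275e287f92ae76ba8c7a05a \
    --hash=sha256:38a452f708f9bb682fdfc3607aa44d68cfe936bf4a18683b0cdc5fb476424a6f
EOF

# In hash-checking mode pip refuses any artifact whose digest does not match:
# pip install --require-hashes -r requirements.txt
```

Note that `--require-hashes` demands a hash for every requirement in the file, including transitive dependencies, so in practice the full lock file must be hashed.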

Compatibility note: google-adk==0.3.0 may have issues with Gemini 2.0+ structured output after upgrade (issue google/adk-python#4367). OpenAI/ChatGPT/Anthropic models are not affected.

OAuth Implementation Security

  • Tokens never logged or returned in API responses
  • Encrypted at rest with Fernet (AES-128-CBC + HMAC-SHA256)
  • Thread-safe concurrent refresh with SELECT FOR UPDATE + try/finally rollback
  • JWT Bearer auth (CSRF-immune), client ownership verification on all endpoints
  • verificationUri validated before use as href (prevents javascript: XSS)
  • Device code stored server-side only (never exposed to frontend)
  • Debug sweep: 10 findings identified and fixed by 3 independent audit agents
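The encryption-at-rest point can be illustrated with Fernet's authenticated encryption over a JSON token blob. A minimal sketch, assuming helper names similar to the PR's (the actual key comes from app config, not a fresh `generate_key()`):

```python
import json
from cryptography.fernet import Fernet, InvalidToken

# Demo key only; the service would load its Fernet key from configuration.
fernet = Fernet(Fernet.generate_key())

def encrypt_oauth_data(oauth_dict: dict) -> str:
    """Serialize the token payload and encrypt it (AES-128-CBC + HMAC-SHA256)."""
    return fernet.encrypt(json.dumps(oauth_dict).encode()).decode()

def decrypt_oauth_data(encrypted: str) -> dict:
    """Verify the HMAC and decrypt; raises InvalidToken on tampering."""
    return json.loads(fernet.decrypt(encrypted.encode()))

blob = encrypt_oauth_data({"access_token": "at", "refresh_token": "rt"})
assert decrypt_oauth_data(blob)["access_token"] == "at"
# A tampered blob fails HMAC verification instead of decrypting silently.
```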

Technical Decision: Why openai/ prefix, not chatgpt/

LiteLLM's chatgpt/ provider silently ignores the api_key parameter and reads from a global auth.json file (confirmed in source: litellm/llms/chatgpt/chat/transformation.py). This is incompatible with multi-tenant operation.

The solution uses openai/ prefix with api_base="https://chatgpt.com/backend-api/codex" and passes the OAuth token as api_key. Google ADK's LiteLlm passes all **kwargs via _additional_args (confirmed in source SHA 7d13696c). Each tenant gets their own LiteLlm instance with zero shared global state.
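The remap plus per-tenant wiring can be sketched as pure functions (constant names, the model id, and the exact header set here are assumptions for illustration, not the PR's literal code):

```python
CODEX_API_BASE = "https://chatgpt.com/backend-api/codex"

def remap_codex_model(model: str) -> str:
    """Map the UI-facing chatgpt/ prefix onto LiteLLM's openai/ provider."""
    if model.startswith("chatgpt/"):
        return "openai/" + model[len("chatgpt/"):]
    return model

def codex_llm_kwargs(model: str, access_token: str, account_id: str) -> dict:
    """Per-tenant kwargs for a LiteLlm instance: the OAuth token rides in
    api_key and the Codex backend in api_base, so no global auth.json state
    is involved."""
    return {
        "model": remap_codex_model(model),
        "api_key": access_token,
        "api_base": CODEX_API_BASE,
        "extra_headers": {"ChatGPT-Account-Id": account_id},
    }

print(remap_codex_model("chatgpt/gpt-5.1"))  # openai/gpt-5.1
```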

New Environment Variables

CODEX_ENABLED=true
CODEX_CLIENT_ID=app_EMoamEEZ73f0CkXaXp7hrann

Breaking Changes

None. Fully backward compatible:

  • Existing API keys continue working identically
  • Migration adds columns with safe defaults (auth_type='api_key')
  • Rollback: alembic downgrade -1

Nginx Gateway

OAuth routes must be added before the generic /api/v1/agents/* catch-all:

location ~ ^/api/v1/agents/oauth/ {
    proxy_pass $processor_service$request_uri;
}
location ~ ^/api/v1/agents/apikeys {
    proxy_pass $processor_service$request_uri;
}

Documentation

  • docs/OAUTH-CODEX-pt-BR.md (Portuguese)
  • docs/OAUTH-CODEX-en.md (English)

Files

  • 6 new files (OAuth service, constants, migration, frontend components, types)
  • 9 modified files (models, schemas, crypto, apikey service, routes, agent builder, frontend)
  • 1 config (.env.example)
  • 22 tests, 2 docs (bilingual)

Summary by Sourcery

Add ChatGPT subscription-based OAuth Codex authentication alongside existing API-key auth to support GPT-5.x models without breaking current behavior.

New Features:

  • Introduce OAuth Codex (OpenAI) as an alternative auth_type on API keys, backing a new ChatGPT (OAuth) provider and GPT-5.x model family.
  • Add processor-side OAuth device code flow service and REST endpoints to initiate, poll, inspect, and revoke ChatGPT OAuth connections.
  • Expose frontend OAuth UX including device code flow dialog, connection status badge, and conditional API key dialog behavior for OAuth-backed providers.

Enhancements:

  • Extend API key model, schemas, and crypto utilities to support encrypted OAuth token storage with backward-compatible defaults.
  • Wire AgentBuilder to use OAuth tokens for openai-based ChatGPT Codex calls, including model name remapping and per-tenant LiteLLM configuration.
  • Add configuration constants and env wiring for Codex OAuth settings.

Documentation:

  • Add English and Portuguese documentation describing the OAuth Codex architecture, security considerations, configuration, and deployment steps.

Tests:

  • Add integration tests covering OAuth crypto round-trips, API key auth_type behavior, device code lifecycle, token refresh, status, revocation, model remapping, and migration compatibility.

@sourcery-ai

sourcery-ai bot commented Apr 16, 2026

Reviewer's Guide

Implements a new OAuth Codex (OpenAI) authentication path alongside existing API-key auth so ChatGPT Plus/Pro subscribers can use GPT-5.x models via a multi-tenant-safe OpenAI/LiteLLM integration. Adds frontend UX for starting and monitoring device-code OAuth flows, introduces DB/schema, crypto, and service changes plus tests and docs, and upgrades LiteLLM to 1.83.3 for security reasons.

Sequence diagram for OAuth Codex device-code authentication flow

sequenceDiagram
    actor User
    participant Frontend as Frontend_App
    participant AgentAPI as Processor_AgentRoutes
    participant OAuthSvc as OAuthCodexService
    participant DB as Postgres_DB
    participant OpenAIAuth as OpenAI_Auth_API

    User->>Frontend: Open ApiKeysDialog and select provider=openai-codex
    User->>Frontend: Click "Connect with ChatGPT"

    Frontend->>AgentAPI: POST /agents/oauth/codex/device-code
    AgentAPI->>OAuthSvc: initiate_device_code_flow(db, client_id, name)
    OAuthSvc->>OpenAIAuth: POST CODEX_DEVICE_CODE_URL (client_id)
    OpenAIAuth-->>OAuthSvc: device_auth_id, user_code, interval
    OAuthSvc->>DB: INSERT ApiKey(auth_type=oauth_codex, oauth_data=pending_device_code, is_active=false)
    OAuthSvc-->>AgentAPI: OAuthDeviceCodeResponse(user_code, verification_uri, key_id, interval)
    AgentAPI-->>Frontend: 200 OK + OAuthDeviceCodeResponse
    Frontend-->>User: Display user_code, verification_uri, countdown

    loop Poll until complete/expired
        Frontend->>AgentAPI: POST /agents/oauth/codex/device-poll (key_id)
        AgentAPI->>DB: SELECT ApiKey WHERE id=key_id
        AgentAPI->>OAuthSvc: poll_device_code(db, key_id)
        OAuthSvc->>DB: READ oauth_data (pending_device_code)
        OAuthSvc->>OpenAIAuth: POST CODEX_DEVICE_POLL_URL (device_auth_id)
        alt Pending
            OpenAIAuth-->>OAuthSvc: 403/428
            OAuthSvc-->>AgentAPI: status=pending
            AgentAPI-->>Frontend: OAuthDevicePollResponse(pending)
        else Expired
            OpenAIAuth-->>OAuthSvc: 410
            OAuthSvc->>DB: UPDATE ApiKey.is_active=false
            OAuthSvc-->>AgentAPI: status=expired
            AgentAPI-->>Frontend: OAuthDevicePollResponse(expired)
        else Authorized
            OpenAIAuth-->>OAuthSvc: 200 + authorization_code, code_verifier
            OAuthSvc->>OpenAIAuth: POST CODEX_TOKEN_URL (authorization_code,...)
            OpenAIAuth-->>OAuthSvc: access_token, refresh_token, id_token
            OAuthSvc->>OAuthSvc: _extract_account_id, _extract_token_expiry
            OAuthSvc->>DB: UPDATE ApiKey.oauth_data=encrypted_tokens, is_active=true
            OAuthSvc-->>AgentAPI: status=complete
            AgentAPI-->>Frontend: OAuthDevicePollResponse(complete)
        end
    end

    Frontend-->>User: Show "Connected" success state

Sequence diagram for AgentBuilder using OAuth Codex tokens with LiteLLM

sequenceDiagram
    participant Client as Tenant_Client
    participant Frontend as Frontend_App
    participant Processor as Processor_AgentBuilder
    participant ApiKeySvc as ApiKeyService
    participant OAuthSvc as OAuthCodexService
    participant LiteLlm as LiteLlm_Adapter
    participant ChatGPT as ChatGPT_Codex_API
    participant OpenAIAuth as OpenAI_Auth_API
    participant DB as Postgres_DB

    Client->>Frontend: Invoke agent using provider=openai-codex
    Frontend->>Processor: Agent invocation request (agent_id)

    Processor->>DB: LOAD Agent (includes api_key_id, model="chatgpt/..." )
    Processor->>ApiKeySvc: get_api_key_record(db, api_key_id)
    ApiKeySvc->>DB: SELECT ApiKey WHERE id=api_key_id
    DB-->>ApiKeySvc: ApiKey(auth_type=oauth_codex, oauth_data,...)
    ApiKeySvc-->>Processor: ApiKey record

    alt auth_type == oauth_codex
        Processor->>OAuthSvc: get_fresh_token(db, api_key_id)
        OAuthSvc->>DB: SELECT ApiKey WITH FOR UPDATE
        OAuthSvc->>DB: decrypt_oauth_data(oauth_data)
        alt Token still valid
            OAuthSvc-->>Processor: access_token, account_id
        else Token near expiry or expired
            OAuthSvc->>OpenAIAuth: POST CODEX_TOKEN_URL (grant_type=refresh_token)
            OpenAIAuth-->>OAuthSvc: new access_token, refresh_token, id_token
            OAuthSvc->>DB: UPDATE ApiKey.oauth_data with new tokens
            OAuthSvc-->>Processor: new_access_token, account_id
        end
        Processor->>Processor: Remap model "chatgpt/x" -> "openai/x"
        Processor->>LiteLlm: new LiteLlm(model=openai/x, api_key=access_token, api_base=CODEX_API_BASE, extra_headers={ChatGPT-Account-Id,originator,User-Agent,accept})
    else auth_type == api_key
        Processor->>ApiKeySvc: get_decrypted_api_key(db, api_key_id)
        ApiKeySvc->>DB: SELECT ApiKey, decrypt_api_key(encrypted_key)
        ApiKeySvc-->>Processor: api_key
        Processor->>LiteLlm: new LiteLlm(model=agent.model, api_key=api_key)
    end

    Processor->>LiteLlm: completion(request_payload)
    LiteLlm->>ChatGPT: HTTP request to CODEX_API_BASE with Bearer access_token + headers
    ChatGPT-->>LiteLlm: Model response
    LiteLlm-->>Processor: Completion result
    Processor-->>Frontend: Agent response
    Frontend-->>Client: Display ChatGPT output

Entity relationship diagram for updated ApiKey OAuth Codex support

erDiagram
    CLIENT {
      uuid id PK
      string name
    }

    API_KEY {
      uuid id PK
      uuid client_id FK
      string name
      string provider
      string encrypted_key
      string auth_type
      text oauth_data
      datetime created_at
      datetime updated_at
      boolean is_active
    }

    CLIENT ||--o{ API_KEY : has

    %% Constraints (conceptual)
    API_KEY ||--|| API_KEY_AUTH_TYPE : auth_type_constraint
    API_KEY_AUTH_TYPE {
      string auth_type PK
    }

    API_KEY ||--|| API_KEY_STORAGE_MODE : storage_constraint
    API_KEY_STORAGE_MODE {
      uuid api_key_id PK
      string encrypted_key_nullable
      text oauth_data_nullable
    }

Class diagram for ApiKey, services, and OAuth Codex schemas

classDiagram
    class ApiKey {
      +uuid id
      +uuid client_id
      +string name
      +string provider
      +string encrypted_key
      +string auth_type
      +text oauth_data
      +datetime created_at
      +datetime updated_at
      +bool is_active
    }

    class ApiKeyBase {
      +string name
      +string provider
      +string auth_type
    }

    class ApiKeyCreate {
      +UUID4 client_id
      +string name
      +string provider
      +string auth_type
      +string key_value
      +validate_key_value(v, values) string
    }

    class ApiKeyUpdate {
      +string name
      +string provider
      +string auth_type
      +bool is_active
    }

    class ApiKeyResponse {
      +uuid id
      +UUID4 client_id
      +string name
      +string provider
      +string auth_type
      +bool is_active
      +bool oauth_connected
      +datetime created_at
      +datetime updated_at
    }

    class OAuthDeviceCodeRequest {
      +UUID4 client_id
      +string name
    }

    class OAuthDeviceCodeResponseSchema {
      +string user_code
      +string verification_uri
      +int expires_in
      +int interval
      +UUID4 key_id
    }

    class OAuthDevicePollRequest {
      +UUID4 key_id
    }

    class OAuthDevicePollResponseSchema {
      +string status
      +UUID4 key_id
      +string message
    }

    class OAuthStatusResponseSchema {
      +UUID4 key_id
      +bool connected
      +datetime expires_at
      +string account_id
      +string plan_type
    }

    class ApiKeyService {
      +create_api_key(db, client_id, name, provider, key_value, auth_type) ApiKey
      +get_api_key(db, key_id) ApiKey
      +get_api_keys_by_client(db, client_id) List~ApiKey~
      +update_api_key(db, key_id, update_data) ApiKey
      +delete_api_key(db, key_id) bool
      +get_decrypted_api_key(db, key_id) string
      +get_api_key_record(db, key_id) ApiKey
    }

    class OAuthCodexService {
      +initiate_device_code_flow(db, client_id, name) OAuthDeviceCodeResponseSchema
      +poll_device_code(db, key_id) OAuthDevicePollResponseSchema
      +get_fresh_token(db, key_id) (string,string)
      +get_oauth_status(db, key_id) OAuthStatusResponseSchema
      +revoke_oauth(db, key_id) bool
      +_extract_account_id(id_token) string
      +_extract_token_expiry(access_token) float
    }

    class CryptoUtils {
      +encrypt_api_key(key_value) string
      +decrypt_api_key(encrypted_key) string
      +encrypt_oauth_data(oauth_dict) string
      +decrypt_oauth_data(encrypted_data) dict
    }

    class AgentBuilder {
      +db
      +_create_llm_agent(agent) (LlmAgent, ExitStack)
    }

    class LiteLlmWrapper {
      +string model
      +string api_key
      +string api_base
      +dict extra_headers
      +completion(request_payload) dict
    }

    ApiKey <|-- ApiKeyResponse
    ApiKeyBase <|-- ApiKeyCreate
    ApiKeyBase <|-- ApiKeyUpdate

    ApiKeyService --> ApiKey : manages
    OAuthCodexService --> ApiKey : reads_updates
    ApiKeyService --> CryptoUtils : uses
    OAuthCodexService --> CryptoUtils : uses

    AgentBuilder --> ApiKeyService : uses
    AgentBuilder --> OAuthCodexService : uses
    AgentBuilder --> LiteLlmWrapper : creates

    OAuthDeviceCodeRequest --> OAuthDeviceCodeResponseSchema : initiates
    OAuthDevicePollRequest --> OAuthDevicePollResponseSchema : polls
    OAuthStatusResponseSchema --> ApiKeyResponse : populates_oauth_connected

File-Level Changes

Extend backend data model, crypto utilities, and migrations to support dual auth modes (API key vs OAuth Codex) on api_keys.
  • Add auth_type and oauth_data fields to ApiKey ORM model and make encrypted_key nullable while preserving backwards-compatible defaults.
  • Create Alembic migration that adds the new columns, relaxes encrypted_key NOT NULL, and enforces CHECK constraints tying auth_type to either encrypted_key or oauth_data.
  • Extend crypto utilities with JSON-based Fernet encryption/decryption helpers for OAuth token blobs.
implementation/processor/src/models/models_diff.py
implementation/processor/migrations/versions/a1b2c3d4e5f6_add_oauth_codex_support.py
implementation/processor/src/utils/crypto_diff.py
Introduce a dedicated OAuth Codex service that implements the device-code flow, token storage, secure refresh, status, and revoke operations.
  • Implement initiate_device_code_flow to call OpenAI device-code endpoint, validate names, store the device code server-side in encrypted oauth_data, and create a pending ApiKey row.
  • Implement poll_device_code to retrieve the stored device code, handle pending/expired/error states, exchange authorization_code for tokens, extract account_id/expiry from JWT, and activate the ApiKey with encrypted OAuth payload.
  • Implement get_fresh_token with SELECT FOR UPDATE locking, buffered refresh, stale-token grace handling, and rollback-on-error semantics, plus helpers for status and revocation of OAuth keys.
implementation/processor/src/services/oauth_codex_service.py
implementation/processor/src/config/oauth_constants.py
Wire OAuth-aware behavior into existing API key service, agent routes, and agent builder so LLM calls can use either static API keys or OAuth tokens with correct routing to ChatGPT Codex backend.
  • Update create_api_key to accept auth_type, enforce key_value for api_key but not OAuth, start OAuth keys inactive, and ensure get_decrypted_api_key returns None for OAuth entries while exposing a new get_api_key_record helper.
  • Augment FastAPI schemas to carry auth_type/key_value validation and define request/response models for OAuth device-code, polling, and status flows, including optional oauth_connected on ApiKey responses.
  • Modify agent routes to pass auth_type into create_api_key and add four new OAuth endpoints (device-code, device-poll, status, revoke) with client ownership checks.
  • Adjust AgentBuilder’s _create_llm_agent to branch on auth_type, using get_fresh_token plus model remapping and api_base/extra_headers for OAuth Codex while preserving existing API-key behavior and config fallbacks.
implementation/processor/src/services/apikey_service_diff.py
implementation/processor/src/api/agent_routes_diff.py
implementation/processor/src/services/adk/agent_builder_diff.py
implementation/processor/src/schemas/schemas_diff.py
Add frontend support for configuring ChatGPT (OAuth) as a provider, running the device-code UX, and surfacing connection status within the API key dialog.
  • Extend model metadata so openai-codex appears as a provider with a curated list of chatgpt/* GPT-5.x models.
  • Update agentService types and API layer to track auth_type on ApiKey objects and expose OAuth-specific functions for initiate, poll, status, and revoke.
  • Augment ApiKeysDialog to conditionally hide key entry for openai-codex, trigger OAuthDeviceCodeFlow instead, and render OAuthStatusBadge (or reconnect) for OAuth keys.
  • Introduce OAuthDeviceCodeFlow and OAuthStatusBadge components that handle the full device-code lifecycle, polling, countdown, and status display with refresh controls.
implementation/frontend/types/aiModels_diff.ts
implementation/frontend/services/agentService_diff.ts
implementation/frontend/app/agents/dialogs/ApiKeysDialog_diff.tsx
implementation/frontend/app/agents/dialogs/OAuthDeviceCodeFlow.tsx
implementation/frontend/app/agents/components/OAuthStatusBadge.tsx
implementation/frontend/types/oauth.ts
Document the OAuth Codex feature and cover it with integration tests validating crypto, migrations, flows, and model remapping.
  • Add a comprehensive Python test suite that mocks auth.openai.com and verifies crypto round-trips, API key auth_type defaults, device-code polling, token refresh semantics, OAuth status/revoke behavior, model name remapping, and migration compatibility.
  • Provide bilingual documentation describing architecture, security considerations, deployment steps, environment variables, Nginx routing requirements, and LiteLLM upgrade rationale.
  • Update .env.example (and related config) to surface CODEX_* configuration knobs and LiteLLM versioning where applicable.
implementation/processor/tests/test_oauth_codex.py
docs/OAUTH-CODEX-en.md
docs/OAUTH-CODEX-pt-BR.md
.env.example


@sourcery-ai sourcery-ai bot left a comment


Hey - I've found 2 security issues, 5 other issues, and left some high level feedback:

Security issues:

  • Detected JWT token decoded with 'verify=False'. This bypasses any integrity checks for the token which means the token could be tampered with by malicious actors. Ensure that the JWT token is verified. (link)
  • Detected JWT token decoded with 'verify=False'. This bypasses any integrity checks for the token which means the token could be tampered with by malicious actors. Ensure that the JWT token is verified. (link)

General comments:

  • The chk_auth_data check constraint in the api_keys migration requires oauth_data to be non-null when auth_type='oauth_codex', but revoke_oauth sets oauth_data=None while leaving auth_type unchanged, which will violate this constraint at runtime; consider either relaxing the constraint or updating auth_type/record semantics on revoke so the constraint remains satisfied.
  • get_fresh_token calls db.commit() even in the fast path where no changes are made, which can unintentionally commit unrelated work on the same session; it would be safer to isolate this function’s transaction (e.g., its own session or explicit begin/commit block) so token refresh logic doesn’t affect caller transactions.
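The second concern can be made concrete with a small stand-in (sqlite3 in place of the SQLAlchemy session; function and table names are illustrative): committing on the shared handle in the fast path also commits the caller's unrelated pending work, whereas committing only when the refresh itself wrote something leaves the caller's transaction intact.

```python
import sqlite3

# Toy schema standing in for the real session: api_keys holds tokens,
# audit_log represents unrelated caller work in the same transaction.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE api_keys (id INTEGER PRIMARY KEY, oauth_data TEXT)")
db.execute("CREATE TABLE audit_log (entry TEXT)")
db.commit()

def get_fresh_token_unsafe(db, made_changes=False):
    # ... read token, maybe refresh ...
    db.commit()  # commits even when this function changed nothing

def get_fresh_token_safe(db, made_changes=False):
    # ... read token, maybe refresh ...
    if made_changes:
        db.commit()  # commit only the refresh's own write

db.execute("INSERT INTO audit_log VALUES ('caller work, not ready to commit')")
get_fresh_token_safe(db)   # fast path: caller's pending row stays uncommitted
db.rollback()              # caller can still discard its own work
print(db.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0])  # 0
```

With the unsafe variant, the same sequence would leave the audit row permanently committed despite the caller's rollback, which is the behavior the review flags; a dedicated session for the refresh achieves the same isolation.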
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- The `chk_auth_data` check constraint in the `api_keys` migration requires `oauth_data` to be non-null when `auth_type='oauth_codex'`, but `revoke_oauth` sets `oauth_data=None` while leaving `auth_type` unchanged, which will violate this constraint at runtime; consider either relaxing the constraint or updating `auth_type`/record semantics on revoke so the constraint remains satisfied.
- `get_fresh_token` calls `db.commit()` even in the fast path where no changes are made, which can unintentionally commit unrelated work on the same session; it would be safer to isolate this function’s transaction (e.g., its own session or explicit begin/commit block) so token refresh logic doesn’t affect caller transactions.

## Individual Comments

### Comment 1
<location path="implementation/processor/migrations/versions/a1b2c3d4e5f6_add_oauth_codex_support.py" line_range="35-39" />
<code_context>
+    op.alter_column(
+        "api_keys", "encrypted_key", existing_type=sa.String(), nullable=True,
+    )
+    op.create_check_constraint(
+        "chk_auth_type", "api_keys", "auth_type IN ('api_key', 'oauth_codex')",
+    )
+    op.create_check_constraint(
+        "chk_auth_data", "api_keys",
+        "(auth_type = 'api_key' AND encrypted_key IS NOT NULL) OR "
+        "(auth_type = 'oauth_codex' AND oauth_data IS NOT NULL)",
</code_context>
<issue_to_address>
**issue (bug_risk):** chk_auth_data constraint conflicts with revoke_oauth (oauth_data is set to NULL for oauth_codex keys).

`chk_auth_data` requires `oauth_data` to be NOT NULL when `auth_type = 'oauth_codex'`, but `revoke_oauth` sets `key.oauth_data = None` without changing `auth_type`, so revoking an OAuth key will violate this constraint. The constraint should allow NULL for revoked/inactive OAuth keys, e.g. by including `is_active`:

```sql
(auth_type = 'api_key' AND encrypted_key IS NOT NULL)
OR (auth_type = 'oauth_codex' AND (oauth_data IS NOT NULL OR is_active = FALSE))
```

or by only enforcing non-null when `is_active = TRUE`. As-is, clearing `oauth_data` for an OAuth key will fail at the DB level.
</issue_to_address>

### Comment 2
<location path="implementation/processor/src/services/oauth_codex_service.py" line_range="398-405" />
<code_context>
+# 5. REVOKE OAUTH
+# ---------------------------------------------------------------------------
+
+def revoke_oauth(db: Session, key_id: uuid.UUID) -> bool:
+    """Revoke OAuth connection (soft delete)."""
+    key = db.query(ApiKey).filter(ApiKey.id == key_id).first()
+    if not key:
+        return False
+
+    key.is_active = False
+    key.oauth_data = None
+    db.commit()
+
</code_context>
<issue_to_address>
**issue (bug_risk):** Clearing oauth_data in revoke_oauth will break the DB check constraint for oauth_codex keys.

With the `chk_auth_data` constraint (`auth_type = 'oauth_codex' AND oauth_data IS NOT NULL`), setting `oauth_data = None` while leaving `auth_type='oauth_codex'` will fail the constraint on commit. This means the current revoke logic cannot succeed for OAuth keys. You’ll need to either:

1) Keep `oauth_data` non-null for revoked keys (e.g., a minimal scrubbed JSON), or
2) Relax the constraint to permit `oauth_data` = NULL when `is_active = False`.

Please update either the revoke logic or the constraint so they are consistent and avoid runtime DB errors.
</issue_to_address>

### Comment 3
<location path="docs/OAUTH-CODEX-pt-BR.md" line_range="1" />
<code_context>
+# OAuth Codex (OpenAI) — Autenticacao por Assinatura ChatGPT para Evo CRM
+
+## Visao Geral
</code_context>
<issue_to_address>
**suggestion (typo):** Consider adding Portuguese diacritics consistently throughout the pt-BR document (e.g., "Autenticacao" → "Autenticação").

The pt-BR text seems to consistently omit accents in standard Portuguese words (e.g., "Visao"/"Visão", "implementacao"/"implementação", "Solucao"/"Solução", "Mudancas"/"Mudanças", "Seguranca"/"Segurança", "conexao"/"conexão", "Reversivel"/"Reversível", "NAO sao"/"NÃO são"). Please review the entire document and apply correct diacritics for Brazilian Portuguese spelling.

Suggested implementation:

```
# OAuth Codex (OpenAI) — Autenticação por Assinatura ChatGPT para Evo CRM

```

```
## Visão Geral

```

```
Esta implementação adiciona **OAuth Codex da OpenAI** como método alternativo de autenticação no Evo CRM, permitindo que usuários com assinatura **ChatGPT Plus** ($20/mês) ou **ChatGPT Pro** ($200/mês) utilizem modelos GPT-5.x diretamente, sem necessidade de uma API key separada da OpenAI.

```

```
A abordagem é **híbrida**: OAuth Codex funciona ao lado das API keys existentes. Nenhuma funcionalidade atual é alterada ou removida.

```

```
## Arquitetura da Solução

```

```
### Como funciona hoje (API Keys)

```

Em outras seções deste arquivo `docs/OAUTH-CODEX-pt-BR.md`, revise e aplique diacríticos corretos para termos como:
- "implementacao" → "implementação"
- "Solucao" → "Solução"
- "Mudancas" → "Mudanças"
- "Seguranca" → "Segurança"
- "conexao" → "conexão"
- "Reversivel" → "Reversível"
- "NAO sao" → "NÃO são"

Garanta também consistência em outras palavras acentuadas comuns em pt-BR (e.g., "configuracao" → "configuração", "integracao" → "integração", "sessao" → "sessão", etc.), mantendo termos técnicos em inglês sem alterações.
</issue_to_address>

### Comment 4
<location path="docs/OAUTH-CODEX-pt-BR.md" line_range="47" />
<code_context>
+- `api_key` = token OAuth do tenant (usado como Bearer)
+- `extra_headers` = ChatGPT-Account-Id, originator
+
+O Google ADK `LiteLlm` passa `**kwargs` via `_additional_args` para `litellm.acompletion()`, confirmado no codigo-fonte (SHA 7d13696c). Cada tenant recebe sua propria instancia, sem estado global compartilhado.
+
+---
</code_context>
<issue_to_address>
**nitpick (typo):** Standardize the LiteLLM library name casing ("LiteLlm" → "LiteLLM").

Here you use the backticked name `LiteLlm`, but elsewhere the docs and the project use "LiteLLM". Please update this occurrence to `LiteLLM` for consistency with the official library name.

```suggestion
O Google ADK `LiteLLM` passa `**kwargs` via `_additional_args` para `litellm.acompletion()`, confirmado no codigo-fonte (SHA 7d13696c). Cada tenant recebe sua propria instancia, sem estado global compartilhado.
```
</issue_to_address>

### Comment 5
<location path="docs/OAUTH-CODEX-en.md" line_range="47" />
<code_context>
+- `api_key` = tenant's OAuth token (used as Bearer)
+- `extra_headers` = ChatGPT-Account-Id, originator
+
+Google ADK's `LiteLlm` passes `**kwargs` via `_additional_args` to `litellm.acompletion()`, confirmed in source code (SHA 7d13696c). Each tenant gets their own instance with zero shared global state.
+
+---
</code_context>
<issue_to_address>
**nitpick (typo):** Align the LiteLLM library name casing here with the rest of the document ("LiteLlm" → "LiteLLM").

This line uses `LiteLlm` while the rest of the document and the official project name use `LiteLLM`. Please update the casing here for consistency.

```suggestion
Google ADK's `LiteLLM` passes `**kwargs` via `_additional_args` to `litellm.acompletion()`, confirmed in source code (SHA 7d13696c). Each tenant gets their own instance with zero shared global state.
```
</issue_to_address>

### Comment 6
<location path="implementation/processor/src/services/oauth_codex_service.py" line_range="55" />
<code_context>
        claims = jwt.decode(id_token, options={"verify_signature": False})
</code_context>
<issue_to_address>
**security (python.jwt.security.unverified-jwt-decode):** Detected JWT token decoded with 'verify=False'. This bypasses any integrity checks for the token which means the token could be tampered with by malicious actors. Ensure that the JWT token is verified.

```suggestion
        claims = jwt.decode(id_token, options={"verify_signature": True})
```

*Source: opengrep*
</issue_to_address>

### Comment 7
<location path="implementation/processor/src/services/oauth_codex_service.py" line_range="71" />
<code_context>
        claims = jwt.decode(access_token, options={"verify_signature": False})
</code_context>
<issue_to_address>
**security (python.jwt.security.unverified-jwt-decode):** Detected JWT token decoded with 'verify=False'. This bypasses any integrity checks for the token which means the token could be tampered with by malicious actors. Ensure that the JWT token is verified.

```suggestion
        claims = jwt.decode(access_token, options={"verify_signature": True})
```

*Source: opengrep*
</issue_to_address>
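Why the unverified decode flagged above matters can be shown with the standard library alone. The sketch builds a toy HS256 token (all values hypothetical), swaps the claims segment without updating the signature, and reads the tampered payload back without complaint:

```python
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def decode_unverified(token: str) -> dict:
    # Equivalent in effect to jwt.decode(..., options={"verify_signature": False}):
    # the payload is just base64url -- anyone can read or forge it, no key involved.
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))


# Build a toy HS256 token to demonstrate (secret and claims are placeholders).
header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
claims = b64url(json.dumps({"sub": "tenant-1"}).encode())
sig = b64url(hmac.new(b"secret", f"{header}.{claims}".encode(), hashlib.sha256).digest())
token = f"{header}.{claims}.{sig}"

# Tamper: replace the claims segment but keep the now-invalid signature.
tampered = f"{header}.{b64url(json.dumps({'sub': 'attacker'}).encode())}.{sig}"
print(decode_unverified(tampered)["sub"])  # "attacker" -- payload accepted unchecked
```

Note that flipping `verify_signature` to `True` alone is not sufficient: a verified decode also needs the issuer's signing key (e.g. fetched from the provider's JWKS endpoint) passed to the decoder.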


Comment thread implementation/processor/src/services/oauth_codex_service.py Outdated
Comment thread docs/OAUTH-CODEX-pt-BR.md
@@ -0,0 +1,244 @@
# OAuth Codex (OpenAI) — Autenticacao por Assinatura ChatGPT para Evo CRM
suggestion (typo): Consider adding Portuguese diacritics consistently throughout the pt-BR document (e.g., "Autenticacao" → "Autenticação").

The pt-BR text seems to consistently omit accents in standard Portuguese words (e.g., "Visao"/"Visão", "implementacao"/"implementação", "Solucao"/"Solução", "Mudancas"/"Mudanças", "Seguranca"/"Segurança", "conexao"/"conexão", "Reversivel"/"Reversível", "NAO sao"/"NÃO são"). Please review the entire document and apply correct diacritics for Brazilian Portuguese spelling.

Suggested implementation:

# OAuth Codex (OpenAI) — Autenticação por Assinatura ChatGPT para Evo CRM

## Visão Geral

Esta implementação adiciona **OAuth Codex da OpenAI** como método alternativo de autenticação no Evo CRM, permitindo que usuários com assinatura **ChatGPT Plus** ($20/mês) ou **ChatGPT Pro** ($200/mês) utilizem modelos GPT-5.x diretamente, sem necessidade de uma API key separada da OpenAI.

A abordagem é **híbrida**: OAuth Codex funciona ao lado das API keys existentes. Nenhuma funcionalidade atual é alterada ou removida.

## Arquitetura da Solução

### Como funciona hoje (API Keys)

In the other sections of docs/OAUTH-CODEX-pt-BR.md, review and apply the correct diacritics to terms such as:

  • "implementacao" → "implementação"
  • "Solucao" → "Solução"
  • "Mudancas" → "Mudanças"
  • "Seguranca" → "Segurança"
  • "conexao" → "conexão"
  • "Reversivel" → "Reversível"
  • "NAO sao" → "NÃO são"

Also ensure consistency for other commonly accented pt-BR words (e.g., "configuracao" → "configuração", "integracao" → "integração", "sessao" → "sessão", etc.), keeping technical terms in English unchanged.

Comment thread docs/OAUTH-CODEX-pt-BR.md
- `api_key` = token OAuth do tenant (usado como Bearer)
- `extra_headers` = ChatGPT-Account-Id, originator

O Google ADK `LiteLlm` passa `**kwargs` via `_additional_args` para `litellm.acompletion()`, confirmado no codigo-fonte (SHA 7d13696c). Cada tenant recebe sua propria instancia, sem estado global compartilhado.
nitpick (typo): Standardize the LiteLLM library name casing ("LiteLlm" → "LiteLLM").

Here you use the backticked name LiteLlm, but elsewhere the docs and the project use "LiteLLM". Please update this occurrence to LiteLLM for consistency with the official library name.

Suggested change
O Google ADK `LiteLlm` passa `**kwargs` via `_additional_args` para `litellm.acompletion()`, confirmado no codigo-fonte (SHA 7d13696c). Cada tenant recebe sua propria instancia, sem estado global compartilhado.
O Google ADK `LiteLLM` passa `**kwargs` via `_additional_args` para `litellm.acompletion()`, confirmado no codigo-fonte (SHA 7d13696c). Cada tenant recebe sua propria instancia, sem estado global compartilhado.

Comment thread docs/OAUTH-CODEX-en.md
- `api_key` = tenant's OAuth token (used as Bearer)
- `extra_headers` = ChatGPT-Account-Id, originator

Google ADK's `LiteLlm` passes `**kwargs` via `_additional_args` to `litellm.acompletion()`, confirmed in source code (SHA 7d13696c). Each tenant gets their own instance with zero shared global state.
nitpick (typo): Align the LiteLLM library name casing here with the rest of the document ("LiteLlm" → "LiteLLM").

This line uses LiteLlm while the rest of the document and the official project name use LiteLLM. Please update the casing here for consistency.

Suggested change
Google ADK's `LiteLlm` passes `**kwargs` via `_additional_args` to `litellm.acompletion()`, confirmed in source code (SHA 7d13696c). Each tenant gets their own instance with zero shared global state.
Google ADK's `LiteLLM` passes `**kwargs` via `_additional_args` to `litellm.acompletion()`, confirmed in source code (SHA 7d13696c). Each tenant gets their own instance with zero shared global state.

@NeritonDias NeritonDias changed the title feat: add OAuth Codex (OpenAI) authentication for ChatGPT subscription models feat: OAuth Codex authentication + LiteLLM security upgrade (CVE-2026-35030 & supply chain fix) Apr 16, 2026
…-35030)

Adds OAuth Codex (OpenAI) as an alternative authentication method for AI
providers in Evo CRM, allowing ChatGPT Plus/Pro subscribers to use GPT-5.x
models without a separate API key. Includes critical LiteLLM security upgrade.

Backend:
- Alembic migration: auth_type + oauth_data columns on api_keys table
- OAuth device code flow service with thread-safe token refresh
- 4 new REST endpoints under /agents/oauth/codex/*
- Model remapping: chatgpt/ -> openai/ prefix with custom api_base

Frontend:
- New provider "ChatGPT (OAuth)" with 10 GPT-5.x models
- OAuthDeviceCodeFlow and OAuthStatusBadge components
- ApiKeysDialog conditional UI for OAuth vs API key

Security:
- LiteLLM upgrade v1.68.0 -> v1.83.3 (fixes CVE-2026-35030 OIDC bypass)
- Supply chain attack mitigation (v1.82.7/v1.82.8 compromised Mar 2026)
- 22 integration tests, 3-agent debug sweep with 10 findings fixed

Docs: OAUTH-CODEX-pt-BR.md + OAUTH-CODEX-en.md
@NeritonDias
Author

Closing this PR. After working through the runtime failures it became clear the original approach could not work:

  1. The upstream Codex CLI uses a PKCE browser flow, not a device-code flow. The device-code flow against app_EMoamEEZ73f0CkXaXp7hrann is accepted by auth.openai.com but the resulting tokens do not work against the ChatGPT backend API.
  2. The chatgpt/openai/ model remapping is incompatible with LiteLLM: the openai/ provider hits api.openai.com/v1/chat/completions with an API key, while ChatGPT subscription tokens only authenticate against chatgpt.com/backend-api/codex (Responses API) and require the headers that LiteLLM's native chatgpt/ provider injects.
  3. LiteLLM 1.68.2 (the version this PR depends on) does not yet ship the chatgpt/ provider. That provider, with its Authenticator reading ~/.config/litellm/chatgpt/auth.json, only landed in 1.83.0+.

Replaced by three focused PRs:

The Sourcery findings on this PR (JWT decode without signature verification, the chk_auth_data vs. revoke_oauth conflict, and the db.commit() on the fast path of get_fresh_token) are addressed or made moot in the new implementation.
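For reference, the PKCE pair that the upstream Codex CLI's browser flow relies on (RFC 7636, S256 method) can be generated with the standard library alone. This is a generic sketch of the RFC, not the upstream CLI's actual code:

```python
import base64
import hashlib
import secrets


def make_pkce_pair() -> tuple[str, str]:
    # RFC 7636: code_verifier is a high-entropy random string;
    # code_challenge = BASE64URL(SHA256(code_verifier)), method "S256".
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    challenge = base64.urlsafe_b64encode(
        hashlib.sha256(verifier.encode()).digest()
    ).rstrip(b"=").decode()
    return verifier, challenge


verifier, challenge = make_pkce_pair()
print(len(challenge))  # 43 -- base64url of a 32-byte SHA-256 digest, padding stripped
```

The challenge goes in the authorization request; the verifier is sent only in the token exchange, so an intercepted authorization code is useless without it.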
