
Add Perplexity LLM plugin #5610

Open

jliounis wants to merge 1 commit into livekit:main from jliounis:add-perplexity-plugin

Conversation

@jliounis

Summary

Adds a livekit-plugins-perplexity plugin that exposes Perplexity's OpenAI-compatible chat completions endpoint as a LiveKit Agents LLM.

The plugin is a thin wrapper around livekit-plugins-openai, mirroring the pattern used by livekit-plugins-cerebras and livekit-plugins-baseten:

  • base_url = https://api.perplexity.ai
  • Default model: sonar-pro
  • API key from the PERPLEXITY_API_KEY env var (or argument)
  • Reuses the existing OpenAI plugin transport — no new runtime dependencies
```python
from livekit.plugins import perplexity

llm = perplexity.LLM(model="sonar-pro")  # picks up PERPLEXITY_API_KEY
```

Attribution header

Every outgoing chat request carries X-Pplx-Integration: livekit-agents/<plugin-version> (forwarded via extra_headers on the underlying OpenAI client) so usage from the LiveKit integration can be attributed upstream. A unit test asserts this header is configured.
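As a rough sketch of how such a header can be assembled before being passed as `extra_headers` (the function name and version constant below are assumptions, not the plugin's actual API):

```python
# Hypothetical version string; the plugin would use its real package version.
PLUGIN_VERSION = "0.1.0"

def attribution_headers(version: str = PLUGIN_VERSION) -> dict[str, str]:
    """Build the attribution header attached to every outgoing chat request."""
    return {"X-Pplx-Integration": f"livekit-agents/{version}"}
```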

Test plan

  • uv run --package livekit-plugins-perplexity pytest tests/test_plugin_perplexity.py — 4 passed
  • uv run ruff check livekit-plugins/livekit-plugins-perplexity tests/test_plugin_perplexity.py — clean
  • Plugin imports and LLM() instantiates with PERPLEXITY_API_KEY set; raises a clear ValueError otherwise
  • LLM().provider == "Perplexity" and _client.base_url starts with https://api.perplexity.ai

The added tests are pure mocks (no network) and cover: default model, base URL, attribution header, missing-key error, and provider name.

Files

  • livekit-plugins/livekit-plugins-perplexity/ — new package (mirrors cerebras/baseten layout)
  • tests/test_plugin_perplexity.py — unit tests
  • pyproject.toml — register new workspace member

Adds livekit-plugins-perplexity, a thin wrapper around the OpenAI plugin
that targets Perplexity's OpenAI-compatible chat completions endpoint
(`https://api.perplexity.ai`) with `sonar-pro` as the default model.

Every outgoing request carries an `X-Pplx-Integration: livekit-agents/<version>`
attribution header so usage from this plugin can be identified upstream.

Mirrors the structure of `livekit-plugins-cerebras` / `livekit-plugins-baseten`.

Signed-off-by: James Liounis <james.liounis@perplexity.ai>
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

Contributor

@devin-ai-integration (Bot) left a comment


Devin Review found 1 potential issue.

⚠️ 1 issue in files not directly in the diff

⚠️ Missing perplexity optional dependency in livekit-agents/pyproject.toml (livekit-agents/pyproject.toml:102)

Every other standalone plugin package (cerebras, xai, openai, anthropic, etc.) is registered as an optional dependency in livekit-agents/pyproject.toml:59-124 so users can install via pip install livekit-agents[perplexity]. The PR adds the workspace source in the root pyproject.toml:45 but omits the corresponding perplexity = ["livekit-plugins-perplexity>=1.5.6"] entry in livekit-agents/pyproject.toml. For reference, the cerebras plugin added its entry (cerebras = ["livekit-plugins-cerebras>=1.5.6"] at livekit-agents/pyproject.toml:77) when it was first introduced in its PR.
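Following the cerebras pattern this finding references, the missing entry would look roughly like the fragment below; the version bound is taken from the finding itself and should match whatever the sibling plugins pin at the time.

```toml
# livekit-agents/pyproject.toml — optional-dependencies section (sketch)
[project.optional-dependencies]
# ... existing entries (cerebras, xai, openai, anthropic, ...) ...
perplexity = ["livekit-plugins-perplexity>=1.5.6"]
```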

View 3 additional findings in Devin Review.


Member

@tinalenguyen left a comment


Hi @james-pplx, thank you for creating this PR! Could you sign the CLA and add the plugin to livekit-agents/pyproject.toml?

Support for [Perplexity](https://www.perplexity.ai/) LLMs via the OpenAI-compatible
chat completions endpoint at `https://api.perplexity.ai`.

See [https://docs.livekit.io/agents/integrations/llm/](https://docs.livekit.io/agents/integrations/llm/) for more information.


Suggested change
See [https://docs.livekit.io/agents/integrations/llm/](https://docs.livekit.io/agents/integrations/llm/) for more information.
See [https://docs.livekit.io/agents/models/llm/perplexity/](https://docs.livekit.io/agents/models/llm/perplexity/) for more information.

we can update the docs on our end to reflect this plugin
