Add Perplexity LLM plugin #5610
Conversation
Adds livekit-plugins-perplexity, a thin wrapper around the OpenAI plugin that targets Perplexity's OpenAI-compatible chat completions endpoint (`https://api.perplexity.ai`) with `sonar-pro` as the default model. Every outgoing request carries an `X-Pplx-Integration: livekit-agents/<version>` attribution header so usage from this plugin can be identified upstream. Mirrors the structure of `livekit-plugins-cerebras` / `livekit-plugins-baseten`. Signed-off-by: James Liounis <james.liounis@perplexity.ai>
Devin Review found 1 potential issue.
⚠️ 1 issue in files not directly in the diff
⚠️ Missing perplexity optional dependency in livekit-agents/pyproject.toml (livekit-agents/pyproject.toml:102)
Every other standalone plugin package (cerebras, xai, openai, anthropic, etc.) is registered as an optional dependency in livekit-agents/pyproject.toml:59-124 so users can install via pip install livekit-agents[perplexity]. The PR adds the workspace source in the root pyproject.toml:45 but omits the corresponding perplexity = ["livekit-plugins-perplexity>=1.5.6"] entry in livekit-agents/pyproject.toml. For reference, the cerebras plugin added its entry (cerebras = ["livekit-plugins-cerebras>=1.5.6"] at livekit-agents/pyproject.toml:77) when it was first introduced in its PR.
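For reference, the entry the finding asks for (version pin mirroring the cerebras example cited above) would look like:

```toml
# livekit-agents/pyproject.toml, under [project.optional-dependencies]
perplexity = ["livekit-plugins-perplexity>=1.5.6"]
```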
View 3 additional findings in Devin Review.
tinalenguyen left a comment
Hi @james-pplx, thank you for creating this PR! Could you sign the CLA and add the plugin to livekit-agents/pyproject.toml?
> Support for [Perplexity](https://www.perplexity.ai/) LLMs via the OpenAI-compatible
> chat completions endpoint at `https://api.perplexity.ai`.
>
> See [https://docs.livekit.io/agents/integrations/llm/](https://docs.livekit.io/agents/integrations/llm/) for more information.

Suggested change:

- See [https://docs.livekit.io/agents/integrations/llm/](https://docs.livekit.io/agents/integrations/llm/) for more information.
+ See [https://docs.livekit.io/agents/models/llm/perplexity/](https://docs.livekit.io/agents/models/llm/perplexity/) for more information.

we can update the docs on our end to reflect this plugin
Summary
Adds a `livekit-plugins-perplexity` plugin that exposes Perplexity's OpenAI-compatible chat completions endpoint as a LiveKit Agents LLM. The plugin is a thin wrapper around `livekit-plugins-openai`, mirroring the pattern used by `livekit-plugins-cerebras` and `livekit-plugins-baseten`:

- `base_url = https://api.perplexity.ai`
- default model: `sonar-pro`
- API key read from the `PERPLEXITY_API_KEY` env var (or passed as an argument)

Attribution header
Every outgoing chat request carries `X-Pplx-Integration: livekit-agents/<plugin-version>` (forwarded via `extra_headers` on the underlying OpenAI client) so usage from the LiveKit integration can be attributed upstream. A unit test asserts this header is configured.

Test plan
- `uv run --package livekit-plugins-perplexity pytest tests/test_plugin_perplexity.py` — 4 passed
- `uv run ruff check livekit-plugins/livekit-plugins-perplexity tests/test_plugin_perplexity.py` — clean
- `LLM()` instantiates with `PERPLEXITY_API_KEY` set; raises a clear `ValueError` otherwise
- `LLM().provider == "Perplexity"` and `_client.base_url` starts with `https://api.perplexity.ai`

The added tests are pure mocks (no network) and cover: default model, base URL, attribution header, missing-key error, and provider name.
Files

- `livekit-plugins/livekit-plugins-perplexity/` — new package (mirrors cerebras/baseten layout)
- `tests/test_plugin_perplexity.py` — unit tests
- `pyproject.toml` — register new workspace member