LLM access to pplx-api 3 by Perplexity Labs
Install this plugin in the same environment as LLM.
llm install llm-perplexity
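
To confirm the install, you can list LLM's installed plugins; llm-perplexity should show up in the output:

llm plugins
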
First, set an API key for Perplexity AI:

llm keys set perplexity
# Paste key here

Run llm models to list the models, and llm models --options to include a list of their options.
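
Model options are passed on the command line as -o name value pairs. As an illustrative sketch, assuming max_tokens appears in that options list for the model, you could cap the length of a response like this:

llm -m sonar -o max_tokens 100 'Fun facts about plums'
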
Run prompts like this:
llm -m sonar-reasoning 'Fun facts about plums'
llm -m sonar-pro 'Fun facts about pelicans'
llm -m sonar 'Fun facts about walruses'
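
The same models can be used from Python through LLM's programmatic API. A minimal sketch, assuming the Perplexity key has already been saved with llm keys set perplexity:

import llm

# Look up the plugin-provided model by its ID and run a single prompt.
model = llm.get_model("sonar")
response = model.prompt("Fun facts about walruses")
print(response.text())
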
You can also access these models through OpenRouter. First install the OpenRouter plugin:

llm install llm-openrouter

Then set your OpenRouter API key:
llm keys set openrouter

Use the --option use_openrouter true flag to route requests through OpenRouter:
llm -m sonar-small --option use_openrouter true 'Fun facts about pelicans'

To set up this plugin locally, first check out the code. Then create a new virtual environment:
cd llm-perplexity
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:
llm install -e '.[test]'

This plugin was modeled after the llm-claude-3 plugin by Simon Willison.
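
With the test dependencies installed, you can then run the test suite from the checkout; this assumes the project follows the usual LLM plugin convention of using pytest:

python -m pytest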