
llm-evals-plugin


Run evals against prompts using LLM

Very early alpha: everything is likely to change.

Installation

Install this plugin in the same environment as LLM.

llm install llm-evals-plugin
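
To confirm the plugin was picked up, you can list your installed plugins using the core LLM CLI:

llm plugins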

Usage

See issue 1.
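
The eval format is still being designed in that issue. Purely as an illustrative sketch (the evals sub-command, file name, and field names shown here are assumptions and may not match the final design), an eval file could look something like this:

name: simple-translate
system: Return just a single word in the requested language
prompt: Apple in Spanish
checks:
- iexact: manzana
- notcontains: apple

and might be run with:

llm evals simple-translate.yml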

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-evals-plugin
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

pytest
