---
title: "Batch LLM Evaluator"
sidebarTitle: "Batch LLM Evaluator"
description: "This example project evaluates multiple LLM models using the Vercel AI SDK and streams updates to the frontend using Trigger.dev Realtime."
---

import RealtimeLearnMore from "/snippets/realtime-learn-more.mdx";

## Overview

This demo is a full stack example that uses the following:

- A Next.js app with Prisma for the database.
- Trigger.dev Realtime to stream updates to the frontend.
- The Vercel AI SDK to work with multiple LLM models (OpenAI, Anthropic, xAI).
- The new `batch.triggerByTaskAndWait` method to distribute the evaluation across multiple tasks.
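To make the fan-out step concrete, here is a minimal sketch of using `batch.triggerByTaskAndWait` to run the same prompt against several provider-specific tasks. The task ids, payload shape, and return values are assumptions for illustration, not code from this repo:

```typescript
// Sketch only: task ids and payload shapes are hypothetical.
import { batch, task } from "@trigger.dev/sdk/v3";

// A hypothetical per-provider evaluation task. In the real project,
// the run function would call the Vercel AI SDK for that provider.
export const evaluateOpenAI = task({
  id: "evaluate-openai",
  run: async (payload: { prompt: string }) => {
    return { model: "openai", output: "..." };
  },
});

export const evaluateAnthropic = task({
  id: "evaluate-anthropic",
  run: async (payload: { prompt: string }) => {
    return { model: "anthropic", output: "..." };
  },
});

// Parent task: fan the same prompt out to each provider task
// and wait for all of the child runs to finish.
export const evaluateAll = task({
  id: "evaluate-all",
  run: async (payload: { prompt: string }) => {
    const { runs } = await batch.triggerByTaskAndWait([
      { task: evaluateOpenAI, payload },
      { task: evaluateAnthropic, payload },
    ]);
    return runs;
  },
});
```

Because `triggerByTaskAndWait` takes the task objects directly, the payloads and results stay type-checked end to end.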

## GitHub repo

<Card title="View the Batch LLM Evaluator repo" icon="GitHub" href="https://github.com/triggerdotdev/examples/tree/main/batch-llm-evaluator">
  Click here to view the full code for this project in our examples repository on GitHub. You can fork it and use it as a starting point for your own project.
</Card>

## Video

<video controls className="w-full aspect-video" src="https://content.trigger.dev/batch-llm-evaluator.mp4" />

## Relevant code